Go gRPC/ConnectRPC using Buf (v2, Protobuf 2023 Edition, No Managed Mode)

An extensively annotated reference dev setup for a gRPC/ConnectRPC service in Go, using the Buf CLI to handle .proto dependency management and config-based code generation.

This write-up takes a slightly different approach from the Buf CLI quickstart guide: it showcases .proto dependency resolution but purposely skips "Managed Mode".

This angle feels missing from the official guide. By first understanding how .proto dependency resolution is handled in Go, it becomes much easier to configure Managed Mode, because you know what problem it's trying to solve. I suspect that's why Managed Mode's Troubleshooting docs contain precisely the two issues you'll run into when you don't fully grasp Managed Mode yet.

I dedicated extra sections to laying out the .proto codegen workflow in Go and the considerations when developing on top of an RPC framework such as gRPC/ConnectRPC.

Using Buf for Go codegen of .proto with dependencies

This write-up assumes you're already familiar with .proto sources and how to define your RPC interfaces. If not, see the Language Guide (editions).

When developing with a gRPC/ConnectRPC framework, you default to using Protobuf as your IDL. Typically:

  1. You create .proto source file(s) to define your RPC interfaces
  2. You generate RPC server stub (and client) code from those .proto source file(s)
  3. You use the generated server stubs to implement your complete service

The premise laid out above seems straightforward, but understanding the relationship between dependency resolution and the generated code can make or break your setup/implementation, especially when the dependencies are external.


This project stores its .proto source(s) inside the protos folder, in line with the Protobuf recommendation to keep .proto files outside of language-specific source directories. For Go, pkg and cmd would be the language-specific directories.
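Roughly, the layout looks like this (a sketch based only on the paths referenced in this write-up, not an exhaustive listing of the repository):

.
├── buf.yaml        # Buf workspace config: declares the `protos` module and BSR deps
├── buf.gen.yaml    # code generation config for `buf generate`
├── protos/         # the Buf module: `.proto` sources only
│   └── task/v1/task.proto
├── pkg/            # Go-specific sources, incl. generated code under `pkg/gen/`
└── cmd/            # Go entrypoints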

Here's a snippet of the RPC service interface that we'll use to explore further:

// See: protos/task/v1/task.proto

edition = "2023";

package task.v1;
// [`go_package` option omitted here, to be discussed later]

import "google/type/datetime.proto";

message Task {
    string id = 1;
    string title = 2;
    bool completed = 3;
    google.type.DateTime created_at = 4;
    google.type.DateTime updated_at = 5;
    google.type.DateTime deleted_at = 6;
}

service TaskService {
    rpc CreateTask(CreateTaskRequest) returns (CreateTaskResponse) {}
    rpc GetTask(GetTaskRequest) returns (GetTaskResponse) {}
    rpc ListTasks(ListTasksRequest) returns (ListTasksResponse) {}
    rpc MarkTaskIncomplete(MarkTaskIncompleteRequest) returns (MarkTaskIncompleteResponse) {}
    rpc MarkTaskComplete(MarkTaskCompleteRequest) returns (MarkTaskCompleteResponse) {}
    rpc RenameTask(RenameTaskRequest) returns (RenameTaskResponse) {}
    rpc DeleteTask(DeleteTaskRequest) returns (DeleteTaskResponse) {}
}

// [...]

As you can see, the Task message uses an external type from Google to convey our timestamp information. It's also natural to assume that google.type.DateTime will be fulfilled by the import "google/type/datetime.proto" directive.

But how do we know that we need to add import "google/type/datetime.proto" in order to use google.type.DateTime in the first place?

Resolving import Directive by Specifying External BSR Module in buf.yaml

This annotated buf.yaml provides sufficient configuration for the current requirement; see the docs to explore more of its features.

# `buf.yaml` configures a Buf workspace; a workspace may contain multiple Buf modules.
# This project only has one Buf module, and it won't be published to the Buf Schema Registry.
# `buf.yaml` is also where to declare external BSR modules that are used in this workspace.

# For details on buf.yaml configuration, visit https://buf.build/docs/configuration/v2/buf-yaml
version: v2

modules:
  # Treat `protos` folder as Buf module for this workspace
  - path: protos
deps:
  # This is the BSR module that will resolve the
  # `google/type/datetime.proto` import
  # in `protos/task/v1/task.proto`
  - buf.build/googleapis/googleapis

# [...]

BSR module dependencies are declared in a shared fashion for all modules within a workspace. Each value must be a valid path to a BSR module (a Buf module that has been pushed to the BSR). This means that if you have a module of your own that you want to use as a dependency, it must also be pushed to the BSR.

Why this BSR module? buf.build/googleapis/googleapis is the specific module that will be used to resolve the import "google/type/datetime.proto" directive when we generate code.

In case you haven't noticed, this repository doesn't vendor/copy any file named datetime.proto anywhere. Yet the content of our .proto dependency matters: it contains the definition of the type we depend on (and sometimes that type's own transitive dependencies). So how does this BSR module help resolve the import directive?

google.type.DateTime docs

If you go through the Buf Schema Registry page for this BSR module, you'll see it contains various packages; we need the specific google.type.DateTime type from the google.type package.

The docs page specifies the google/type/datetime.proto import path that you need to use in your .proto file to access the DateTime type from the google.type package. Hence why we specify buf.build/googleapis/googleapis as our BSR module dependency.

Now, with .proto dependency resolution configured, we can start generating the Go code we need.

go_package and Dependency Resolution

Before we write the buf.gen.yaml configuration for generation, we need to talk about the go_package option, which I purposely omitted from the task.proto snippet above. To explain the purpose of go_package, let's examine the actual source of datetime.proto from the Google repository.

// from: https://github.com/googleapis/googleapis/blob/master/google/type/datetime.proto

syntax = "proto3";

package google.type;

import "google/protobuf/duration.proto";

option cc_enable_arenas = true;
option go_package = "google.golang.org/genproto/googleapis/type/datetime;datetime";
// [...]

message DateTime {
  // Optional. Year of date. Must be from 1 to 9999, or 0 if specifying a
  // datetime without a year.
  int32 year = 1;

  // Required. Month of year. Must be from 1 to 12.
  int32 month = 2;

  // [...]
}

Basically, in order to compile a .proto file that has other .proto dependencies into Go code, ALL .proto files involved MUST provide a valid Go package import path. You specify that import path using the option go_package annotation inside each .proto file.
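For our own task.proto, that means declaring a go_package pointing at the Go package where its generated code will live, matching the output directory configured in buf.gen.yaml below. A hypothetical sketch (the module path github.com/nadhifikbarw/x-go-buf-rpc is an assumption based on the repository name, not copied from the project's go.mod):

// Hypothetical `go_package` for protos/task/v1/task.proto; the module path is assumed
option go_package = "github.com/nadhifikbarw/x-go-buf-rpc/pkg/gen/genproto/task/v1;taskv1";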

If we jump ahead of the configuration and look at the generated Go code in task.pb.go, it will have an import directive using that google.golang.org/genproto/googleapis/type/datetime path to use datetime.DateTime as its dependency.

// from: pkg/gen/genproto/task/v1/task.pb.go
import (
	// This import matches `go_package` from `datetime.proto`
	datetime "google.golang.org/genproto/googleapis/type/datetime"
	// [rest of the imports omitted]
)

This approach is relatively okay from the external .proto consumer's side. Dependency resolution for Go stays simple because transitive dependencies are isolated behind a plain Go package import: the consumer simply imports that Go package and assumes a valid package providing the type is available.

However, it also means that any public .proto author you depend on can accidentally cause dependency issues in your project, for instance if they take that Go package out of public availability. On their side, keeping that package published becomes an ongoing maintenance responsibility.

Generating Go code from .proto

With the go_package behavior in mind, we now understand that buf.gen.yaml needs to place the generated code where it won't break the go_package import path contract expected for task.proto.

version: v2
# 'clean', when set to true, deletes the directories, zip files, and/or jar files specified in the
# `out` field for all plugins before running code generation.
clean: true
# 'plugins' is a list of plugin configurations used for `buf generate`.
plugins:
  # Buf allows you to use `protoc-gen-go` plugin without installing it locally
  # This plugin handles protocol buffer `Message` codegen
  - remote: buf.build/protocolbuffers/go
    # Target this folder for generation, matching the `go_package` option
    out: pkg/gen/genproto
    opt:
      - paths=source_relative
  # `*_grpc.pb.go`, the gRPC server/client code, is generated using a separate plugin.
  # Another example of a remote protoc plugin; this one is `protoc-gen-go-grpc`
  - remote: buf.build/grpc/go
    out: pkg/gen/genproto
    opt:
      - paths=source_relative
  # This one is `protoc-gen-connect-go`, the protoc plugin that generates the ConnectRPC version
  - remote: buf.build/connectrpc/go
    # `protoc-gen-connect-go` output will be generated as a sub-package
    # (`pkg/gen/genproto/task/v1/taskv1connect`) but will import
    # `task.pb.go` from the parent package correctly
    out: pkg/gen/genproto
    opt:
      - paths=source_relative
# Describe where to locate `.proto` sources
inputs:
  - directory: protos # points to `protos` folder

Commands

buf dep update # Update `buf.lock` based on deps in `buf.yaml`
buf generate # Run code generation as configured
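
After generation, the generated Go files import the packages declared via go_package (for example google.golang.org/genproto/googleapis/type/datetime shown earlier), so you'll typically also need to sync your module dependencies:

go mod tidy # Pull in the Go modules required by the generated code's imports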

Picking ConnectRPC or gRPC

In case you're only just finding out about ConnectRPC: ConnectRPC is an alternative RPC framework created by Buf. While it should be considered a separate RPC protocol, it maintains gRPC compatibility. ConnectRPC is now a Cloud Native Computing Foundation (CNCF) sandbox project.

In an actual project you obviously don't need to implement both; I implemented both here for exploration purposes. ConnectRPC provides a much more convenient mechanism for handling web-based RPC clients. If your requirements involve allowing RPC calls from the browser, consider the Connect protocol.
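To make that convenience concrete, here's a minimal sketch of mounting a ConnectRPC handler on a plain net/http server. This is not the repository's actual code: the taskv1connect import path is an assumption based on the buf.gen.yaml above, the generated names (NewTaskServiceHandler, UnimplementedTaskServiceHandler) follow protoc-gen-connect-go's naming convention, and h2c is used so gRPC-protocol clients can also connect without TLS.

package main

import (
	"log"
	"net/http"

	"golang.org/x/net/http2"
	"golang.org/x/net/http2/h2c"

	// Assumed import path for the generated ConnectRPC package; adjust to your module path.
	"github.com/nadhifikbarw/x-go-buf-rpc/pkg/gen/genproto/task/v1/taskv1connect"
)

// Embedding the generated UnimplementedTaskServiceHandler satisfies the whole
// TaskServiceHandler interface; a real implementation overrides methods one by one.
type taskServer struct {
	taskv1connect.UnimplementedTaskServiceHandler
}

func main() {
	mux := http.NewServeMux()

	// NewTaskServiceHandler returns the route prefix ("/task.v1.TaskService/") and an
	// http.Handler that serves the Connect, gRPC, and gRPC-Web protocols on that route.
	path, handler := taskv1connect.NewTaskServiceHandler(&taskServer{})
	mux.Handle(path, handler)

	// h2c enables HTTP/2 without TLS, so gRPC-protocol clients also work in local dev.
	log.Fatal(http.ListenAndServe("localhost:8080", h2c.NewHandler(mux, &http2.Server{})))
}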

Service Implementations

The service implementations are not production-grade; they're intentionally trivial, just enough to showcase the differences between the gRPC and ConnectRPC implementation styles.
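To make the difference in style concrete, here is a minimal sketch of the same RPC implemented against both generated interfaces. It is not the repository's actual code: the generated package import path is an assumption, while the names (UnimplementedTaskServiceServer, connect.Request/connect.Response) follow the standard conventions of protoc-gen-go-grpc and protoc-gen-connect-go.

package service

import (
	"context"

	"connectrpc.com/connect"

	// Assumed import path for the generated code; adjust to your module path.
	taskv1 "github.com/nadhifikbarw/x-go-buf-rpc/pkg/gen/genproto/task/v1"
)

// gRPC style: plain request/response structs; embedding the generated
// UnimplementedTaskServiceServer keeps the implementation forward compatible.
type grpcTaskServer struct {
	taskv1.UnimplementedTaskServiceServer
}

func (s *grpcTaskServer) ListTasks(ctx context.Context, req *taskv1.ListTasksRequest) (*taskv1.ListTasksResponse, error) {
	return &taskv1.ListTasksResponse{}, nil
}

// ConnectRPC style: the same messages, wrapped in connect.Request / connect.Response
// envelopes that expose headers, peer info, etc. without extra context plumbing.
type connectTaskServer struct{}

func (s *connectTaskServer) ListTasks(ctx context.Context, req *connect.Request[taskv1.ListTasksRequest]) (*connect.Response[taskv1.ListTasksResponse], error) {
	return connect.NewResponse(&taskv1.ListTasksResponse{}), nil
}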

ConnectRPC makes calling an RPC over plain HTTP easy:

curl --header "Content-Type: application/json" --data "{\"title\": \"Task Title\"}" http://localhost:8080/task.v1.TaskService/CreateTask

For the gRPC server, use grpcurl (I enabled server reflection):

grpcurl -plaintext localhost:3000 task.v1.TaskService/ListTasks
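
For completeness, here's a sketch of calling the ConnectRPC server from Go using the generated Connect client. Import paths are assumed as before; NewTaskServiceClient follows protoc-gen-connect-go's naming convention.

package main

import (
	"context"
	"fmt"
	"log"
	"net/http"

	"connectrpc.com/connect"

	// Assumed import paths for the generated packages; adjust to your module path.
	taskv1 "github.com/nadhifikbarw/x-go-buf-rpc/pkg/gen/genproto/task/v1"
	"github.com/nadhifikbarw/x-go-buf-rpc/pkg/gen/genproto/task/v1/taskv1connect"
)

func main() {
	// The Connect client speaks the Connect protocol over plain HTTP,
	// so a stock http.Client is all it needs.
	client := taskv1connect.NewTaskServiceClient(http.DefaultClient, "http://localhost:8080")

	res, err := client.ListTasks(context.Background(), connect.NewRequest(&taskv1.ListTasksRequest{}))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(res.Msg)
}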
