Protobufs for Golang

In this post I’m going to talk about various aspects of working with protobufs in golang. I’m also going to cover how to maintain modifications to grpc-gateway.

Protobuf

Let’s first talk about distributing protobufs. When you deal with protobufs you have to interact with two acting parties: the server and the client. The server describes its interface in a protobuf file. The client uses the protobuf file in order to get access to the server.

Clients either use libraries generated from the protobuf file, or they copy the protobuf file and generate the client code themselves. Both approaches are viable. Let’s discuss the pros and cons of each.

Distributing Protobufs in Libraries

The main complication with protobuf files is that they are really alien to the package manager of any language. Some package managers handle them better than others, but package managers are designed to distribute code rather than files from which the code is generated.

A plain proto file is language-agnostic and library-independent. Once the proto file is converted into code, it has all the properties of code: it depends on specific versions of certain libraries.

If your code depends on protobufs from different authors that evolve over an extended period of time, then there is a chance they use different versions of the protobuf libraries. If your package manager can’t handle incompatible versions of the same library, then you are in for a problem.

For example, Maven doesn’t allow you to have two different major versions of the same library on the classpath. You need to pick one. No matter which one you pick, there is a chance it will break at runtime, at the most unexpected moment.

If you are using golang and your dependencies (and their dependencies) follow the Go modules guidelines, then you can avoid this problem, because an incompatible major version gets its own module path. The thing is, these guidelines were introduced fairly recently, and not all modules follow them, including the golang protobuf libraries.
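To illustrate, under the Go modules convention two major versions of the same library can coexist in one project because they have distinct module paths. A hypothetical go.mod (module names and versions are illustrative only):

module github.com/example/myservice

go 1.21

require (
	github.com/example/somelib v1.4.0    // old API, imported as "github.com/example/somelib"
	github.com/example/somelib/v2 v2.1.0 // new API, imported as "github.com/example/somelib/v2"
)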

Distributing Plain Protobufs

Of course you will have all the same problems when you deal with regular libraries. If your dependencies require conflicting versions of other libraries, then you are in trouble. With protobufs we can at least opt out of distributing code. We can distribute the original proto files.

If you need to see the difference between what you use and what the main project provides, then you just diff the two files. Most of the time you don’t even want to compare anything; you just pull the latest file and regenerate the code.

Compatibility of Both Approaches

The nice thing about these two approaches is that one doesn’t exclude the other. The creator of the protobuf file can distribute the libraries generated from their proto file.

If those libraries don’t work for the client for some reason, then the client can commit the proto file into their own repository and generate the code themselves.

Generating Code

Now let’s discuss how to produce those libraries with generated code for golang.

Since code in golang isn’t distributed in binary packages, you need to commit the generated code to the source code repository along with the protobuf files. In order to keep it clear of any extra dependencies, I keep the generated code in a separate repository. If you don’t want a separate repository, at least keep it in a separate module; in golang one repository may have multiple modules.
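For example, assuming the go_package used later in this post, the proto_dir directory could carry its own go.mod so that consumers pull only the generated code and its protobuf dependency (a sketch; the version is illustrative):

// proto_dir/go.mod
module github.com/proto_repo/proto_dir

go 1.21

require google.golang.org/protobuf v1.31.0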

The necessity to commit your generated files into the source code repository renders CI useless in this situation. All you need is to call your generate script from a pre-commit hook.

Hooks are not cloned with the repository, but you can put them in a directory inside the repository and configure git to look for hooks in that directory: https://git-scm.com/docs/git-config#Documentation/git-config.txt-corehooksPath This way, every time you commit your code, the code is regenerated from the proto file.
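A minimal setup might look like this (the .githooks directory name and the hook contents are just an example):

# run once per clone: tell git to look for hooks inside the repository
git config core.hooksPath .githooks

# .githooks/pre-commit (make it executable):
#   #!/bin/bash
#   ./generate.sh                # regenerate code from the proto files
#   git add proto_dir/*.pb.go    # stage the regenerated files so they land in the commit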

In order to make builds reproducible I recommend running protobuf generation inside the docker image that you use for building your projects. It ensures that you have the same version of golang and the same version of protoc.

Here is an example of generate.sh

#!/bin/bash

if [[ $# -eq 0 ]]; then
	# Re-run this script inside the golang image with the repo mounted at /home/
	docker run --volume=`pwd`:/home/ golang /home/generate.sh docker
else
	if [[ $1 == "docker" ]]; then
		echo "Running inside docker"
		cd /home/

		# Install protoc and the Go code generator plugin
		apt-get update
		apt-get install -y protobuf-compiler
		go install google.golang.org/protobuf/cmd/protoc-gen-go@latest

		# Generate Go code from the proto file
		protoc --proto_path=proto_dir --go_out=. proto_dir/sample.proto -I.

		# protoc writes the file under the go_package path; move it next to the proto file
		mv github.com/proto_repo/proto_dir/sample.pb.go proto_dir
		rm -Rf github.com
		# The container runs as root; hand the generated file back to the host user
		chown 1000:1000 proto_dir/sample.pb.go

	fi
fi

Grpc-gateway

Excellent. We decided where to put our proto files. We reproducibly generated code from the proto file. There is one more thing you may need when you start using protobufs - grpc-gateway.

Grpc-gateway is a nice RESTful frontend for your grpc services. At runtime it boils down to a function call that gives you an http.Handler, which you can plug into any standard HTTP router in golang.
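Wiring it up looks roughly like this. This is a sketch that assumes the sample.proto shown below, for which protoc-gen-grpc-gateway emits a RegisterYourServiceHandlerFromEndpoint function, and a grpc server already listening on localhost:9090:

package main

import (
	"context"
	"log"
	"net/http"

	"github.com/grpc-ecosystem/grpc-gateway/runtime"
	"google.golang.org/grpc"

	pb "github.com/proto_repo/proto_dir" // the generated package from the sample below
)

func main() {
	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()

	// The gateway mux is a plain http.Handler that translates REST calls into grpc calls.
	mux := runtime.NewServeMux()
	opts := []grpc.DialOption{grpc.WithInsecure()}
	if err := pb.RegisterYourServiceHandlerFromEndpoint(ctx, mux, "localhost:9090", opts); err != nil {
		log.Fatal(err)
	}

	// Plug it into the standard library server or any router that accepts an http.Handler.
	log.Fatal(http.ListenAndServe(":8080", mux))
}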

There is one downside of grpc-gateway. It’s very opinionated about how to construct RESTful services. If you like the Google API design guidelines, then you are in luck: grpc-gateway follows them to the letter. Any attempt to deviate from those guidelines, even slightly, poses a problem: how do you modify the grpc-gateway?

There are several approaches that you might want to employ.

One approach is to decorate the handlers generated by protoc before passing them to your server. The handlers get wrapped into another RESTful handler that does the custom job.
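Building on the previous sketch, a thin wrapper could, for example, rewrite a legacy path before handing the request to the gateway mux (the path rewrite is just a placeholder for whatever customization you need):

// decorate wraps the grpc-gateway handler with custom pre-processing.
func decorate(gw http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Example customization: accept a path the gateway doesn't know about.
		if r.URL.Path == "/echo" {
			r.URL.Path = "/v1/example/echo"
		}
		gw.ServeHTTP(w, r)
	})
}

// usage: http.ListenAndServe(":8080", decorate(mux))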

There is another way that I found more convenient: modify the generated gateway itself. The only problem with this approach is that you need to reapply your changes every time you generate a new version of the protobufs. To solve this problem I used a tool that Debian developers use for deb packages: quilt.

Quilt allows you to store modifications as a series of patches. It keeps patches separate from the code. You can instruct it to apply patches or roll them back.

So what I suggest is to keep the quilt-managed modifications applied in the repo. When you need to generate new protobuf files, you roll back all applied patches, regenerate the code, and apply the patches again. If your changes are not too invasive, even small shifts in the patched code will be handled by the patch utility. In the worst case you will need to adjust your modifications slightly.
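Recording a change with quilt might look like this (the patch and file names are illustrative):

cd proto_dir

# start a new patch and register the file it will touch
quilt new custom-routing.patch
quilt add sample.pb.gw.go

# edit sample.pb.gw.go, then capture the diff into the patch
quilt refresh

# later: roll every patch back, regenerate the code, and reapply the patches
quilt pop -a
# ...run protoc again here...
quilt push -a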

Here is generate.sh with quilt commands and grpc-gateway generation:

#!/bin/bash

if [[ $# -eq 0 ]]; then
	# Re-run this script inside the golang image with the repo mounted at /home/
	docker run --volume=`pwd`:/home/ golang /home/generate.sh docker
else
	if [[ $1 == "docker" ]]; then
		echo "Running inside docker"
		cd /home/

		# Install protoc, quilt and the Go code generator plugins
		apt-get update
		apt-get install -y protobuf-compiler
		apt-get install -y protobuf-compiler-grpc
		apt-get install -y quilt
		go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
		go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
		go install github.com/grpc-ecosystem/grpc-gateway/protoc-gen-grpc-gateway@latest

		# The google.api.http annotations need these protos on the include path
		mkdir -p google/api
		pushd google/api
		wget https://raw.githubusercontent.com/googleapis/googleapis/master/google/api/annotations.proto
		wget https://raw.githubusercontent.com/googleapis/googleapis/master/google/api/http.proto

		popd

		# Generate the message types, the grpc client stubs (the gateway code calls them) and the gateway itself
		protoc --proto_path=proto_dir --go_out=. proto_dir/sample.proto -I.
		protoc --proto_path=proto_dir --go-grpc_out=. proto_dir/sample.proto -I.
		protoc --grpc-gateway_out=logtostderr=true:. proto_dir/sample.proto

		# Roll back the quilt patches before overwriting the generated files
		pushd proto_dir
		quilt pop -a
		popd

		# protoc writes the files under the go_package path; move them next to the proto file
		mv github.com/proto_repo/proto_dir/sample.pb.go proto_dir
		mv github.com/proto_repo/proto_dir/sample_grpc.pb.go proto_dir
		mv github.com/proto_repo/proto_dir/sample.pb.gw.go proto_dir
		rm -Rf github.com
		# The container runs as root; hand the generated files back to the host user
		chown 1000:1000 proto_dir/sample.pb.go proto_dir/sample_grpc.pb.go proto_dir/sample.pb.gw.go

		# Reapply the quilt patches on top of the freshly generated code
		pushd proto_dir
		quilt push -a
		popd

	fi
fi

Before generating the grpc-gateway file you need to make sure that the rpc methods are annotated with options:

option (google.api.http) = {
  post: "/v1/example/echo"
  body: "*"
};

This is the sample.proto file:

syntax = "proto3";
package tutorial;

import "google/api/annotations.proto";


option go_package = "github.com/proto_repo/proto_dir";

service YourService {
  rpc Echo(AddressBook) returns (AddressBook) {
    option (google.api.http) = {
      post: "/v1/example/echo"
      body: "*"
    };
  }
}


message Person {
  string name = 1;
  int32 id = 2;  // Unique ID number for this person.
  string email = 3;

  enum PhoneType {
    MOBILE = 0;
    HOME = 1;
    WORK = 2;
  }

  message PhoneNumber {
    string number = 1;
    PhoneType type = 2;
  }

  repeated PhoneNumber phones = 4;

  //google.protobuf.Timestamp last_updated = 5;
}

// Our address book file is just one of these.
message AddressBook {
  repeated Person people = 1;
}