Bazel rules for building protocol buffers and gRPC services (java, c++, go, ...)


rules_protobuf was initially written when the Bazel protobuf ecosystem was fairly immature. Now, two years later, this repository is showing its age. Rather than retrofit this set of rules, they have been re-written from the ground up to work correctly with the `native.proto_library` rule, and the result is available in a successor repository. Consequently, these rules are effectively no longer being maintained.

Please upgrade to the newer rules when appropriate and file issues if there are feature gaps that need to be filled.

Cheers, @pcj (Oct 25 2018)


Bazel Skylark rules for building protocol buffers, with optional gRPC support, on macOS and Linux :sparkles:.


How is this related to the proto_library rules within Bazel itself?

These rules sprang out of a need for protobuf support at a time when the main Bazel repository exposed and documented only limited proto generation capabilities. This is a moving target. The main goals of this project are to:

  1. Provide `protoc`, the protocol buffer compiler (v3.5.1).

  2. Provide the language-specific protoc plugins.

  3. Provide the necessary libraries and dependencies for gRPC support, when possible.

  4. Provide an extensible `proto_language` abstraction (used in conjunction with the `proto_compile` rule) to generate outputs for current and future custom protoc plugins not explicitly provided here.


| Language | Compile 1 | Build 2 | gRPC 3 |
| ---------------------------: | -----------: | --------: | -------- |
| C++ | cc_proto_compile | cc_proto_library v3.5.1 | v1.10.1 |
| C# | csharp_proto_compile | csharp_proto_library | 1.0.0 |
| Closure | closure_proto_compile | closure_proto_library | |
| Go | go_proto_compile | go_proto_library | v1.6.0 |
| Go (gogo) | gogo_proto_compile | gogo_proto_library | Nov 2017 |
| gRPC gateway | grpc_gateway_proto_compile, grpc_gateway_swagger_compile | grpc_gateway_proto_library, grpc_gateway_binary | v1.2.2+ (f2862b) |
| Java | java_proto_compile | java_proto_library | v1.9.0 |
| Node | node_proto_compile | node_proto_library | 1.10.1 |
| Objective-C | objc_proto_compile | objc_proto_library 4 | v1.10.1 |
| Python | py_proto_compile | py_proto_library | v1.6.1 |
| Ruby | ruby_proto_compile | | v1.6.1 |
| Custom proto_language | proto_compile | | |

Refer to the dependency files in this repository for a more detailed summary of workspace dependencies and versions.

  1. Support for generation of protoc outputs via a `*_proto_compile` rule.

  2. Support for generation + compilation of outputs with protobuf dependencies via a `*_proto_library` rule.

  3. gRPC support.

  4. Highly experimental (probably not functional yet). A work in progress for those interested in contributing further work.


1. Install Bazel

If you have not already installed Bazel on your workstation, follow the Bazel installation instructions.

NOTE: Bazel 0.8.0 or above is required for Go support.

2. Add rules_protobuf to your WORKSPACE

Specify the language(s) you'd like to use by loading the language-specific repository rules:

```python
git_repository(
  name = "org_pubref_rules_protobuf",
  remote = "https://github.com/pubref/rules_protobuf.git",
  tag = "v0.8.2",
  # commit = "..."  # alternatively, use the latest commit on master
)
```

```python
load("@org_pubref_rules_protobuf//java:rules.bzl", "java_proto_repositories")
java_proto_repositories()

load("@org_pubref_rules_protobuf//cpp:rules.bzl", "cpp_proto_repositories")
cpp_proto_repositories()

load("@org_pubref_rules_protobuf//go:rules.bzl", "go_proto_repositories")
go_proto_repositories()
```

Several languages have other workspace dependencies that you'll need to load before the corresponding `*_proto_repositories` function is invoked:

| Language | Requires |
| ---: | :--- |
| closure_proto_repositories | rules_closure |
| csharp_proto_repositories | rules_dotnet |
| go_proto_repositories | rules_go |
| gogo_proto_repositories | rules_go |
| grpc_gateway_proto_repositories | rules_go |
| node_proto_repositories | rules_node |
| py_proto_repositories 1 | rules_python |

1 Only needed for Python gRPC support.

3. Add *_proto_* rules to your BUILD files

To build a java-based gRPC library:

```python
load("@org_pubref_rules_protobuf//java:rules.bzl", "java_proto_library")

java_proto_library(
    name = "protolib",
    protos = ["my.proto"],
    with_grpc = True,
    verbose = 1,  # 0=no output, 1=show protoc command, 2+=more...
)
```
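For reference, the `my.proto` named in the rule above might look like the following minimal gRPC service definition (a hypothetical sketch; the package, message, and service names are illustrative):

```proto
// my.proto (hypothetical)
syntax = "proto3";

package example;

service Greeter {
  // A unary RPC; with_grpc = True generates the service stubs for it.
  rpc SayHello (HelloRequest) returns (HelloReply);
}

message HelloRequest { string name = 1; }
message HelloReply { string message = 1; }
```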


To run the examples & tests in this repository, clone it to your workstation:

```shell
# Clone this repo
$ git clone https://github.com/pubref/rules_protobuf

# Go to the examples/helloworld directory
$ cd rules_protobuf/examples/helloworld

# Run all tests
$ bazel test examples/...

# Build a server
$ bazel build cpp/server

# Run a server from the command-line
$ $(bazel info bazel-bin)/examples/helloworld/cpp/server

# Run a client
$ bazel run go/client
$ bazel run cpp/client
$ bazel run java/org/pubref/rules_protobuf/examples/helloworld/client:netty
```

Overriding or excluding WORKSPACE dependencies

To load alternate versions of dependencies, pass in a dict having the same overall structure as a deps.bzl file. Entries having a matching key will override those found in the file. For example, to load a different version of `com_github_golang_protobuf`, provide a different commit ID:

```python
load("@org_pubref_rules_protobuf//go:rules.bzl", "go_proto_repositories")

go_proto_repositories(
  overrides = {
    "com_github_golang_protobuf": {
      # Override golang/protobuf with a different commit
      "commit": "2c1988e8c18d14b142c0b472624f71647cf39adb",
    },
  },
)
```
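Conceptually, an entry in `overrides` whose key matches a default dependency replaces that default. A minimal Python sketch of this shallow-merge idea (not the actual rules_protobuf implementation; all names here are illustrative):

```python
def apply_overrides(defaults, overrides):
    """Return a deps dict where any entry with a matching key is replaced by the override."""
    merged = dict(defaults)
    merged.update(overrides or {})
    return merged

defaults = {
    "com_github_golang_protobuf": {"commit": "aaaa0000"},
    "com_github_golang_glog": {"commit": "bbbb1111"},
}
overrides = {
    "com_github_golang_protobuf": {"commit": "2c1988e8c18d14b142c0b472624f71647cf39adb"},
}

deps = apply_overrides(defaults, overrides)
print(deps["com_github_golang_protobuf"]["commit"])  # the overridden commit wins
print(deps["com_github_golang_glog"]["commit"])      # untouched default is kept
```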

You may already have some external dependencies present in your workspace that rules_protobuf will attempt to load, causing a collision. To prevent rules_protobuf from loading specific external workspaces, name them in the `excludes` list:

```python
go_proto_repositories(
  excludes = [
    # e.g. "com_github_golang_glog",
  ],
)
```
To completely replace the set of dependencies that will attempt to be loaded, pass a full deps dict object to the `lang_deps` attribute:

```python
go_proto_repositories(
  lang_deps = {
    "com_github_golang_glog": {
      # ...full dependency specification here...
    },
  },
)
```
There are several language --> language dependencies as well: some `*_proto_repositories` functions internally call other repository functions to provide the gRPC plugins. To suppress this (and gain better control in your workspace), use the `excludes` list.

Proto B --> Proto A dependencies

Use the `proto_deps` attribute to name proto rule dependencies. Use of `proto_deps` implies you're using imports, so read on...


In all cases, these rules include a `--proto_path` argument when invoking protoc. This is functionally equivalent to `--proto_path=$(bazel info execution_root)`. Therefore, when the protoc tool is invoked, it will 'see' whatever directory structure exists at the bazel execution root for your workspace. To better learn what this looks like, `cd $(bazel info execution_root)` and look around. In general, it contains all your source files as they appear in your workspace, with an additional `external` directory for all dependencies used.

This has implications for import statements in your protobuf source files, if you use them. The two cases to consider are imports within your workspace (referred to here as 'internal' imports) and imports of protobuf files in an external workspace ('external' imports).

Internal Imports

Internal imports should require no additional parameters if your import statements follow the directory structure of your workspace. For example, one of the example proto files in this repository imports another proto file by its workspace-relative path. Since the import matches the workspace directory structure, protoc can find it, and no additional arguments to the proto rule are required for the code generation step.
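To illustrate (a hypothetical layout; the file and package names are invented): if a workspace contains `proto/common.proto` and `proto/app.proto`, the import statement names the workspace-relative path, which is exactly what protoc sees from the execution root:

```proto
// proto/app.proto (hypothetical)
syntax = "proto3";

package example;

// The path mirrors the location of common.proto relative to the workspace root.
import "proto/common.proto";
```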

Obviously, importing a file does not mean that code will be generated for it. Therefore, use of the `imports` attribute implies that the generated files for the imported message or service already exist somewhere that can be used as a dependency by some other library rule.


Rather than using `imports`, it often makes more sense to declare a dependency on another proto_library rule via the `proto_deps` attribute. This makes the import available to the calling rule and also performs the code generation step. For example, in the examples a proto rule names another proto rule in its `proto_deps` attribute to accomplish both code generation and compilation of object files for the proto chain.
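A sketch of this pattern (the rule and file names are hypothetical; `app.proto` imports `base.proto`):

```python
load("@org_pubref_rules_protobuf//cpp:rules.bzl", "cc_proto_library")

cc_proto_library(
    name = "base_protos",
    protos = ["base.proto"],
)

cc_proto_library(
    name = "app_protos",
    protos = ["app.proto"],        # app.proto imports base.proto
    proto_deps = [":base_protos"], # generates + compiles the dependency chain
)
```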

External Imports

The same logic applies to external imports. The two questions to ask yourself when setting up your rules are:

[Question 1]: Can protoc "see" the imported file? To satisfy this requirement, pass in the full path of the required file(s), relative to the execution root where protoc will be run, via the `imports` attribute. For example, the well-known `descriptor.proto` could be made visible to protoc via (workspace names may differ in your setup):

```python
name = "foo_protos",
protos = ["foo.proto"],
imports = [
  # path to the imported file, relative to the execution root
  "external/com_google_protobuf/src",
],
inputs = [
  # label that makes the imported file(s) available as rule inputs
  "@com_google_protobuf//:well_known_protos",
],
```

This would be imported as `import "google/protobuf/descriptor.proto"`, given that the file `descriptor.proto` lives under `google/protobuf/` in that source tree.

[Question 2]: Can the library rule "see" the generated protobuf files (in this case, the outputs generated from `descriptor.proto`)? Just because the file was imported does not imply that protoc will generate outputs for it, so somewhere in the rule dependency chain these files must be present. This could be via another proto rule defined elsewhere, or some other filegroup or label list. If the source is another proto rule, specify that in the `proto_deps` attribute of the calling rule. Otherwise, pass a label that includes the (pregenerated) protobuf files to the `inputs` attribute, just as you would any typical dependency.

Important note about sandboxing: simply stating the path where protoc should look for imports (via the `imports` attribute) is not enough to work with the bazel sandbox. Bazel is very particular about needing to know exactly which inputs are required for a rule and exactly which output files it generates. If an input is not declared, it will not be exposed in the sandbox. Therefore, we have to provide both the import path and a label in the `inputs` attribute that names the files we want available in the sandbox.

If you are having problems, put `verbose = 2` (or higher) in your build rule and/or disable sandboxing with `--spawn_strategy=standalone`.


Contributions welcome; please create Issues or GitHub pull requests.


  • Many thanks to all contributors and to the members of the Bazel, protobuf, and gRPC teams.
