cargo-guppy: track and query dependency graphs


This repository contains the source code for:

  • guppy: a library for performing queries on Cargo dependency graphs (guppy on crates.io; documentation: latest release, main)
  • libraries used by guppy:
    • guppy-summaries: a library for managing build summaries listing packages and features (guppy-summaries on crates.io; documentation: latest release, main)
    • target-spec: an evaluator for Cargo.toml target specifications (target-spec on crates.io; documentation: latest release, main); see the sketch after this list
  • tools built on top of guppy:
    • determinator: figure out what packages changed between two revisions (determinator on crates.io; documentation: latest release, main)
    • cargo-hakari: a command-line tool to manage workspace-hack packages (cargo-hakari on crates.io; documentation: latest release, main), also available in library form as hakari (hakari on crates.io; documentation: latest release, main)
    • cargo-guppy: an experimental command-line frontend for guppy (documentation: main)
  • and a number of internal tools and test fixtures used to verify that guppy behaves correctly.
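For a flavor of what target-spec does, here is a minimal sketch that evaluates a Cargo-style target spec against two platform triples. This is an illustration rather than canonical usage; the exact return type of eval may vary between releases, so the results are simply printed with Debug formatting.

```rust
use target_spec::eval;

fn main() {
    // A Cargo-style target spec, as it might appear in a [target.'cfg(...)'] section.
    let spec = "cfg(all(unix, target_arch = \"x86_64\"))";

    // Evaluate the spec against two platform triples. A result containing
    // Some(true) means the spec is known to match that platform.
    println!("{:?}", eval(spec, "x86_64-unknown-linux-gnu"));
    println!("{:?}", eval(spec, "x86_64-pc-windows-msvc"));
}
```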

Use cases

guppy and cargo-guppy can be used to solve many practical problems related to dependency graphs in large Rust codebases. Some examples, all of which are available through the guppy library and will eventually be supported in the cargo-guppy CLI as well (a minimal query example follows this list):
  • track existing dependencies for a crate or workspace
  • query direct or transitive dependencies of a subset of packages — useful when some packages have greater assurance or reliability requirements
  • figure out what's causing a particular crate to be included as a dependency
  • iterate over reverse dependencies of a crate in topological order
  • iterate over some or all links (edges) in a dependency graph, querying if the link is a build, dev or regular dependency
  • filter out dev-only dependencies while performing queries
  • perform queries based on Cargo features
  • simulate Cargo builds and return what packages and features would be built by it
  • evaluate target specs for platform-specific dependencies
  • generate summary files for Cargo builds, which can be used to:
    • receive CI feedback if a dependency is added, updated or removed, or if new features are added
    • receive CI feedback if a package is added to a high-assurance subset, or if any new features are enabled in an existing package in that subset. This can be used to flag those changes for extra scrutiny.
  • print out a dot graph for a subset of crates, for formatting with graphviz
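As a rough illustration of the library side, here is a minimal sketch of one of the queries above: iterating the reverse dependencies of a single crate. It assumes guppy's MetadataCommand and PackageGraph query API, and a workspace whose graph contains a crate named serde; the crate name is only a placeholder.

```rust
use guppy::graph::DependencyDirection;
use guppy::{Error, MetadataCommand};

fn main() -> Result<(), Error> {
    // Run `cargo metadata` for the workspace in the current directory and build
    // guppy's PackageGraph from the output.
    let package_graph = MetadataCommand::new().build_graph()?;

    // "serde" is a placeholder: pick any package that appears in your graph.
    let serde = package_graph
        .packages()
        .find(|pkg| pkg.name() == "serde")
        .expect("placeholder package not found in this graph");

    // Query everything that transitively depends on it (its reverse dependencies)
    // and walk the resulting package set.
    let reverse_deps = package_graph
        .query_reverse(std::iter::once(serde.id()))?
        .resolve();
    for package in reverse_deps.packages(DependencyDirection::Reverse) {
        println!("{} {}", package.name(), package.version());
    }

    Ok(())
}
```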

Still to come:

  • a command-line query language

Development status

The core guppy code in this repository is considered mostly complete and the API is mostly stable.

We're building a number of tools on top of guppy, and those are still under active development. Tool requirements may cause further changes to the API, but the goal is to avoid extensive overhauls.

guppy's simulation of Cargo builds is extensively tested against upstream Cargo, and there are no known differences. Comparison testing has found a number of bugs in upstream Cargo, for example:

  • v2 resolver: different handling of inactive, optional dependencies based on how they're specified
  • v2 resolver: a proc macro specified with the key "proc_macro" vs "proc-macro" causes different results
  • specifying different versions in unconditional and target-specific dependency sections causes a "multiple rmeta candidates" error

Production users

cargo-guppy is extensively used by the Diem Core project.

guppy is used for several lint checks. These range from basic rules that look at every workspace package separately (a sketch of one such lint follows this list):

  • every package has fields like author and license specified
  • crate names and paths should use - instead of _

to more complex rules about the overall dependency graph, such as:

  • some third-party dependencies are banned from the workspace entirely, or only from default builds
  • every workspace package depends on a workspace-hack crate (similar to rustc-workspace-hack)
  • for any given third-party dependency, the workspace only depends on one version of it directly (transitive dependencies on other versions are still allowed)
  • every workspace package is categorized as either production or test-only, and the linter checks that test-only crates are not included in production builds
  • support for overlay features, which allow test-only code to be:
    • included in crates (similar to the #[cfg(test)] annotation)
    • depended on by test-only code in other crates (#[cfg(test)] does not allow this)
    • but guaranteed to be excluded from production builds

In addition, guppy-summaries is used to generate build summaries of packages and features (particularly for high-security subsets of the codebase), and changes to these sets are flagged by Diem's CI (example).

Design

guppy is written on top of the excellent petgraph library. It is a separate codebase from cargo, depending only on the stable cargo metadata format. (Some other tools in this space, like cargo-tree, use cargo internals directly.)
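Because only the stable cargo metadata JSON is consumed, a graph can also be built offline from a saved fixture. A minimal sketch, assuming output captured with cargo metadata --format-version 1 (the metadata.json path is just a placeholder):

```rust
use guppy::{CargoMetadata, Error};

fn main() -> Result<(), Error> {
    // Parse previously captured `cargo metadata` output instead of invoking cargo.
    let json = std::fs::read_to_string("metadata.json").expect("fixture should exist");
    let metadata = CargoMetadata::parse_json(&json)?;
    let package_graph = metadata.build_graph()?;

    println!("{} packages in the graph", package_graph.package_count());
    Ok(())
}
```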

Minimum supported Rust version

The minimum supported Rust version (MSRV) is 1.53.

While a crate is in pre-release status (0.x.x), its MSRV may be bumped in a patch release. Once a crate has reached 1.x, any MSRV bump will be accompanied by a new minor version.

Contributing

See the CONTRIBUTING file for how to help out.

License

This project is available under the terms of either the Apache 2.0 license or the MIT license.
