Welcome to TechEmpower Framework Benchmarks (TFB)

If you're new to the project, welcome! Please feel free to ask questions here. We encourage new frameworks and contributors to ask questions. We're here to help!

This project provides representative performance measures across a wide field of web application frameworks. With much help from the community, coverage is quite broad, and we are happy to broaden it further with contributions. The project presently includes frameworks in many languages. The current tests exercise plaintext responses, JSON serialization, database reads and writes via an object-relational mapper (ORM), collections, sorting, server-side templates, and XSS countermeasures. Future tests will exercise other components and involve greater computation.
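As a concrete reference, the two simplest test types have fixed expected response bodies per the test requirements. This sketch just states them as shell variables (the /plaintext and /json endpoint paths mentioned in the comments are the standard ones used by the suite):

```shell
# Expected response bodies for the two simplest TFB test types:
#   /plaintext -> plain text body
#   /json      -> serialized JSON object
plaintext_expected='Hello, World!'
json_expected='{"message":"Hello, World!"}'

printf '%s\n' "$plaintext_expected"
printf '%s\n' "$json_expected"
```

The verifier checks each implementation's responses against these values (along with headers and content types) during a `--mode verify` run.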

Read more and see the results of our tests on cloud and physical hardware. For descriptions of the test types that we run, see the test requirements section.

If you find yourself in a directory or file whose purpose you're unsure of, check out the file structure section of our documentation, which briefly explains the relevant directories and files.

Quick Start Guide

To get started developing, you'll need to install Docker, or see the Quick Start Guide (Vagrant) below.

  1. Clone TFB.

    $ git clone https://github.com/TechEmpower/FrameworkBenchmarks.git
  2. Change directories

    $ cd FrameworkBenchmarks
  3. Run a test.

    $ ./tfb --mode verify --test gemini

Explanation of the tfb script

The run script is pretty wordy, but each and every flag is required. If you are using Windows, either adapt the docker command at the end of the tfb shell script (replacing the Linux-style paths with paths your Docker installation understands), or use vagrant.

The command looks like this:

docker run -it --rm --network tfb -v /var/run/docker.sock:/var/run/docker.sock -v [FWROOT]:/FrameworkBenchmarks techempower/tfb [ARGS]
  • -it
    tells docker to run this in 'interactive' mode and simulate a TTY, so that Ctrl+C is propagated to the running toolset.
  • --rm
    tells docker to remove the container as soon as the toolset finishes running, meaning there aren't hundreds of stopped containers lying around.
  • --network=tfb
    tells the container to join the 'tfb' Docker virtual network.
  • The first -v flag
    mounts the Docker socket (/var/run/docker.sock) as a volume in the running container. This allows docker commands run inside this container to use the host's Docker daemon to create/run/stop/remove containers.
  • The second -v flag
    mounts the FrameworkBenchmarks source directory as a volume shared with the container, so that rebuilding the toolset image is unnecessary and any changes you make on the host system are available in the running toolset container.
  • techempower/tfb
    is the name of the toolset image to run.
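Putting those flags together, here is a minimal sketch of the invocation the wrapper assembles. FWROOT and ARGS are placeholders resolved at run time; the real logic lives in the ./tfb script itself:

```shell
# Sketch of the docker invocation behind ./tfb. FWROOT is the absolute
# path to your FrameworkBenchmarks checkout; ARGS are the toolset flags
# you would normally pass to ./tfb.
FWROOT="$(pwd)"                        # assumes you run from the repo root
ARGS='--mode verify --test gemini'

cmd="docker run -it --rm --network tfb \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v ${FWROOT}:/FrameworkBenchmarks \
  techempower/tfb ${ARGS}"

echo "$cmd"
```

Running `./tfb --mode verify --test gemini` is therefore equivalent to running the echoed command by hand.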

A note on Windows

  • Docker expects Linux-style paths. If you cloned FrameworkBenchmarks on your C: drive, then [FWROOT] would be /c/FrameworkBenchmarks.
  • Docker for Windows understands /var/run/docker.sock even though that is not a valid path on Windows, but only when using Linux containers (it does not work with Windows containers or LCOW). Docker Toolbox may not understand /var/run/docker.sock even when using Linux containers - use at your own risk.
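To illustrate the path translation, here is a hypothetical helper (not part of the toolset; actual behavior depends on your Docker version) that converts a Windows path to the Linux-style form Docker expects:

```shell
# Hypothetical helper: convert a Windows path like C:\FrameworkBenchmarks
# to the Linux-style path Docker expects (/c/FrameworkBenchmarks).
# The drive letter is lowercased and backslashes become forward slashes.
win_to_docker_path() {
  drive=$(printf '%s' "$1" | cut -c1 | tr '[:upper:]' '[:lower:]')
  rest=$(printf '%s' "$1" | cut -c3- | tr '\\' '/')
  printf '/%s%s\n' "$drive" "$rest"
}

win_to_docker_path 'C:\FrameworkBenchmarks'   # -> /c/FrameworkBenchmarks
```

This is only a sketch of the convention; when in doubt, use vagrant on Windows instead of hand-translating paths.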

Quick Start Guide (Vagrant)

Get started developing quickly by using Vagrant with TFB. Git, VirtualBox, and Vagrant are required.

  1. Clone TFB.

    $ git clone https://github.com/TechEmpower/FrameworkBenchmarks.git
  2. Change directories

    $ cd FrameworkBenchmarks/deployment/vagrant
  3. Build the vagrant virtual machine

    $ vagrant up
  4. Run a test

    $ vagrant ssh
    $ tfb --mode verify --test gemini

Add a New Test

Either on your computer, or once you open an SSH connection to your vagrant box, start the new test initialization wizard.

    ~/FrameworkBenchmarks$ ./tfb --new

This will walk you through the entire process of creating a new test to include in the suite.


Official Documentation

Our official documentation can be found in the wiki. If you find any errors or areas for improvement within the docs, feel free to open an issue in this repo.

Live Results

Results of continuous benchmarking runs are available in real time at tfb-status.techempower.com.

Data Visualization

If you have a results.json file that you would like to visualize, you can do so on the TechEmpower benchmarks results site. You can also attach a runid parameter to that URL, where runid is a run listed on tfb-status.
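For example, such a URL can be composed like this. The fragment format shown is an assumption based on the links published on tfb-status, and the uuid value is a placeholder, not a real run:

```shell
# Hypothetical example: compose the visualization URL with a runid taken
# from tfb-status. Replace the placeholder with a real run uuid.
base='https://www.techempower.com/benchmarks/'
runid='REPLACE-WITH-RUN-UUID'

echo "${base}#section=test&runid=${runid}"
```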


Contributing

The community has consistently helped in making these tests better, and we welcome any and all changes. Reviewing our contribution practices and guidelines will help keep us all on the same page. The contribution guide can be found in the TFB documentation.

Join the conversation in the Discussions tab, on Twitter, or chat with us on Freenode.

