
Aleth – Ethereum C++ client, tools and libraries

A collection of C++ libraries and tools for Ethereum, formerly known as the cpp-ethereum project. It includes the full Ethereum client, aleth.




The Ethereum Documentation site hosts the aleth homepage, which has a Quick Start section.

Operating system support

Ubuntu and macOS (built and tested on TravisCI)
Windows (built and tested on AppVeyor)


Download release binaries

Using docker images


docker run ethereum/aleth --help


docker run ethereum/testeth --help
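Beyond printing help, a node run via docker will usually want its data directory persisted on the host. A minimal sketch, assuming the container stores data in /root/.ethereum (the default data directory under the container's home; adjust paths and flags for your setup):

```shell
# Run a mainnet node, persisting chain data and keys on the host,
# and exposing the default p2p port (30303, tcp and udp).
docker run -d --name aleth-node \
  -v "$HOME/.ethereum:/root/.ethereum" \
  -p 30303:30303 -p 30303:30303/udp \
  ethereum/aleth --mainnet
```

The --mainnet flag and port 30303 come from the aleth help output; the in-container path is an assumption.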

Building from source

Get the source code

Git and GitHub are used to maintain the source code. Clone the repository with:

git clone --recursive https://github.com/ethereum/aleth.git
cd aleth


The --recursive option is important: it tells git to also clone the submodules required to build the project. If you missed the --recursive option, you can correct the mistake with:
git submodule update --init

Install CMake

CMake is used to control the build configuration of the project. A recent version of CMake is required (at the time of writing, 3.9.3 is the minimum). We strongly recommend installing CMake by downloading and unpacking the binary distribution of the latest version available on the CMake download page.

The CMake package available in your operating system can also be installed and used if it meets the minimum version requirement.

Alternative method

The repository contains a script under scripts/ that downloads a fixed version of CMake and unpacks it to the given directory prefix. Example usage:

scripts/ --prefix /usr/local


Configure the project build with the following commands, which create the build directory containing the configuration:
mkdir build; cd build  # Create a build directory.
cmake ..               # Configure the project.
cmake --build .        # Build all default targets.
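The same steps can be adapted to configure a specific build type or to build a single target; the target name aleth below is an assumption based on the project layout:

```shell
mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=RelWithDebInfo  # Configure an optimized build with debug info.
cmake --build . --target aleth              # Build only the aleth client target.
```

CMAKE_BUILD_TYPE and --target are standard CMake options and work with any generator.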

On Windows we support Visual Studio 2017 and 2019. Generate a Visual Studio solution file for the 64-bit architecture via one of the following commands:
  • Visual Studio 2017:
    cmake .. -G "Visual Studio 15 2017 Win64"
  • Visual Studio 2019:
    cmake .. -G "Visual Studio 16 2019" -A x64

After the necessary dependencies have been downloaded and built and the solution has been generated, the solution file can be found in the build directory.
Common Issues Building on Windows

LINK : fatal error LNK1158: cannot run 'rc.exe'

Rc.exe is the Microsoft Resource Compiler. It's distributed with the Windows SDK and is required for generating the Visual Studio solution file. It can be found in the following directory:

%ProgramFiles(x86)%\Windows Kits\<version>\bin\<version>\<arch>

If you hit this error, adding the directory to your path (and launching a new command prompt) should fix the issue.
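As a sketch of that fix in a cmd.exe session, assuming the Windows 10 SDK is installed (the SDK version and architecture below are examples — substitute the ones present on your machine):

```shell
rem Prepend the SDK bin directory containing rc.exe to PATH for this session.
set "PATH=%ProgramFiles(x86)%\Windows Kits\10\bin\10.0.19041.0\x64;%PATH%"
rem Verify rc.exe is now resolvable.
where rc.exe
```

To make the change permanent, add the same directory via the System Environment Variables dialog instead.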


Contributing

The current codebase is the work of many, many hands, with over 100 individual contributors over the course of its development.

Our day-to-day development chat happens on the aleth Gitter channel.

All contributions are welcome! We try to keep a list of tasks that are suitable for newcomers under the help wanted tag. If you have any questions, please do not hesitate to ask for more information.

Please read CONTRIBUTING and CODING_STYLE thoroughly before making alterations to the code base.

All development happens in the develop branch.


Note: the following is the output of ./aleth -h [--help] on Linux:

   aleth 1.7.2
   aleth [options]

WALLET USAGE:
   account list            List all keys available in wallet
   account new             Create a new key and add it to wallet
   account update          Decrypt and re-encrypt keys
   account import          Import keys from given source and place in wallet
   wallet import           Import a presale wallet

CLIENT MODE (default):
   --mainnet               Use the main network protocol
   --ropsten               Use the Ropsten testnet
   --test                  Testing mode; disable PoW and provide test rpc interface
   --config                Configure specialised blockchain using given JSON information

   --ipc                   Enable IPC server (default: on)
   --ipcpath               Set .ipc socket path (default: data directory)
   --no-ipc                Disable IPC server
   --admin                 Specify admin session key for JSON-RPC (default: auto-generated and printed at start-up)
   -K [ --kill ]           Kill the blockchain first
   -R [ --rebuild ]        Rebuild the blockchain from the existing database
   --rescue                Attempt to rescue a corrupt database

   --import-presale        Import a pre-sale key; you'll need to specify the password to this key
   -s [ --import-secret ]  Import a secret key into the key store
   -S [ --import-session-secret ]  Import a secret session into the key store
   --master                Give the master password for the key store; use --master "" to show a prompt
   --password              Give a password for a private key

CLIENT TRANSACTING:
   --ask                   Set the minimum ask gas price under which no transaction will be mined (default: 20000000000)
   --bid                   Set the bid gas price to pay for transactions (default: 20000000000)
   --unsafe-transactions   Allow all transactions to proceed without verification; EXTREMELY UNSAFE

CLIENT NETWORKING:
   -b [ --bootstrap ]      Connect to the default Ethereum peer servers (default unless --no-discovery used)
   --no-bootstrap          Do not connect to the default Ethereum peer servers (default only when --no-discovery is used)
   -x [ --peers ]          Attempt to connect to a given number of peers (default: 11)
   --peer-stretch          Give the accepted connection multiplier (default: 7)
   --public-ip             Force advertised public IP to the given IP (default: auto)
   --listen-ip             Listen on the given IP for incoming connections
   --listen                Listen on the given port for incoming connections (default: 30303)
   -r [ --remote ]         Connect to the given remote host (default: none)
   --port                  Connect to the given remote port (default: 30303)
   --network-id            Only connect to other hosts with this network id
   --allow-local-discovery Include local addresses in the discovery process. Used for testing purposes.
   --peerset               Comma delimited list of peers; element format: type:enode://pubkey@host[:port[?discport=port]]
                           Types:
                             default   Attempt connection when no other peers are available and pinning is disabled
                             required  Keep connected at all times

                                      The first port argument is the tcp port used for direct communication among peers. If the second port
                                      argument isn't supplied, the first port argument will also be the udp port used for node discovery.
                                      If neither the first nor second port arguments are supplied, a default port of 30303 will be used for
                                      both peer communication and node discovery.

   --no-discovery          Disable node discovery; implies --no-bootstrap
   --pin                   Only accept or connect to trusted peers

CLIENT MINING:
   -a [ --address ]        Set the author (mining payout) address (default: auto)
   -m [ --mining ]         Enable mining; optionally for a specified number of blocks (default: off)
   --extra-data arg        Set extra data for the sealed blocks

BENCHMARKING MODE:
   -M [ --benchmark ]      Benchmark for mining and exit
   --benchmark-warmup      Set the duration of warmup for the benchmark tests (default: 3)
   --benchmark-trial       Set the duration for each trial for the benchmark tests (default: 3)
   --benchmark-trials      Set the number of trials for the benchmark tests (default: 5)

MINING CONFIGURATION:
   -C [ --cpu ]            When mining, use the CPU
   -t [ --mining-threads ] Limit number of CPU/GPU miners to n (default: use everything available on selected platform)
   --current-block         Let the miner know the current block number at configuration time. Will help determine DAG size and required GPU memory
   --disable-submit-hashrate  When mining, don't submit hashrate to node

IMPORT/EXPORT MODES:
   -I [ --import ]         Import blocks from file
   -E [ --export ]         Export blocks to file
   --from                  Export only from block n; n may be a decimal, a '0x' prefixed hash, or 'latest'
   --to                    Export only to block n (inclusive); n may be a decimal, a '0x' prefixed hash, or 'latest'
   --only                  Equivalent to --export-from n --export-to n
   --format                Set export format
   --dont-check            Prevent checking some block aspects. Faster importing, but to apply only when the data is known to be valid
   --download-snapshot     Download Parity Warp Sync snapshot data to the specified path
   --import-snapshot       Import blockchain and state data from the Parity Warp Sync snapshot

DATABASE OPTIONS:
   --db (=leveldb)         Select database implementation. Available options are: leveldb, memorydb.
   --db-path (=$HOME/.ethereum)  Database path (for non-memory database options)

VM OPTIONS:
   --vm (=legacy)          Select VM implementation. Available options are: interpreter, legacy.
   --evmc                  EVMC option

LOGGING OPTIONS:
   -v [ --log-verbosity ] <0 - 4>  Set the log verbosity from 0 to 4 (default: 2).
   --log-channels          Space-separated list of the log channels to show (default: show all channels).
                           Channels: block blockhdr bq chain client debug discov error ethcap exec host impolite info net overlaydb p2pcap peer rlpx rpc snap statedb sync timer tq trace vmtrace warn warpcap watch
   --log-exclude-channels  Space-separated list of the log channels to hide.
   --log-vmtrace           Enable VM trace log (requires log-verbosity 4).

GENERAL OPTIONS:
   -d [ --data-dir ]       Load configuration files and keystore from path (default: $HOME/.ethereum)
   -V [ --version ]        Show the version and exit
   -h [ --help ]           Show this help message and exit
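Putting some of the options above together, a few illustrative invocations (sketches only — data directories and file names are placeholders):

```shell
# Run a test-mode node (PoW disabled, test rpc interface enabled) with the IPC server on.
aleth --test --ipc

# Run against the Ropsten testnet with a custom data directory and more verbose logging.
aleth --ropsten --data-dir "$HOME/.ethereum-ropsten" --log-verbosity 3

# Export the chain to a file, then re-import it on another machine.
aleth --export blocks.rlp
aleth --import blocks.rlp
```

All flags are taken from the help output above.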


The Aleth project includes the following tools in addition to the Aleth client:

* aleth-bootnode: A C++ Ethereum discovery bootnode implementation
* aleth-key: A rudimentary wallet
* aleth-vm: An EVM bytecode runner tool
* rlp: An RLP encoder/decoder tool
* testeth: A consensus test generator/runner tool


This project is not suitable for Ethereum mining: support for GPU mining, including the ethminer tool, was dropped some time ago. Use the standalone ethminer tool from its own repository instead.

Details on how to run and debug the tests can be found in the project's testing documentation.




All contributions are made under the GNU General Public License v3. See LICENSE.
