An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
NNI (Neural Network Intelligence) is a lightweight but powerful toolkit to help users automate Feature Engineering, Neural Architecture Search, Hyperparameter Tuning and Model Compression.
The tool manages automated machine learning (AutoML) experiments, dispatching and running the trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyper-parameters in different training environments, such as Local Machine, Remote Servers, OpenPAI, Kubeflow, FrameworkController on K8S (AKS etc.), DLWorkspace (aka DLTS), AML (Azure Machine Learning), AdaptDL (aka ADL), other cloud options and even Hybrid mode.
New demo available: Youtube entry | Bilibili 入口 - last updated on Feb-19-2021
New use case sharing: Cost-effective Hyper-parameter Tuning using AdaptDL with NNI - posted on Feb-23-2021
NNI provides a command-line tool as well as a user-friendly WebUI to manage training experiments. With the extensible API, you can customize your own AutoML algorithms and training services. To make it easy for new users, NNI also provides a set of built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms.
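As a sketch of how a trial script talks to that API, the snippet below uses NNI's `get_next_parameter()` / `report_final_result()` calls; the training function and default hyper-parameters are illustrative stand-ins, and the guarded import lets the script also run outside an NNI experiment.

```python
# Minimal trial-script sketch: NNI injects hyperparameters via
# nni.get_next_parameter() and collects the metric the tuner optimizes
# via nni.report_final_result().
try:
    import nni
except ImportError:  # allow running standalone, outside an NNI experiment
    nni = None


def get_params():
    """Fetch tuned hyperparameters from NNI, falling back to defaults."""
    params = {"lr": 0.01, "batch_size": 32}
    if nni is not None:
        params.update(nni.get_next_parameter() or {})
    return params


def train(params):
    # Placeholder for a real training loop; returns a mock accuracy
    # that peaks when lr is 0.05, just so the tuner has something to climb.
    return 1.0 / (1.0 + abs(params["lr"] - 0.05))


if __name__ == "__main__":
    params = get_params()
    accuracy = train(params)
    if nni is not None:
        nni.report_final_result(accuracy)
    print(accuracy)
```

A real trial would replace `train` with actual model training and report the validation metric.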
The following table summarizes NNI's current capabilities; we are gradually adding new ones, and we'd love to have your contribution.
<tr valign="top">
<td align="center" valign="middle">
<b>Built-in</b>
</td>
</tr>
<tr valign="top">
<td valign="middle">
<b>References</b>
</td>
<td style="border-top:#FF0000 solid 0px;">
<ul>
<li><a href="https://nni.readthedocs.io/en/latest/autotune_ref.html#trial">Python API</a></li>
<li><a href="https://github.com/microsoft/nni/blob/master/docs/en_US/Tutorial/AnnotationSpec.rst">NNI Annotation</a></li>
<li><a href="https://nni.readthedocs.io/en/latest/installation.html">Supported OS</a></li>
</ul>
</td>
<td style="border-top:#FF0000 solid 0px;">
<ul>
<li><a href="https://github.com/microsoft/nni/blob/master/docs/en_US/Tuner/CustomizeTuner.rst">CustomizeTuner</a></li>
<li><a href="https://github.com/microsoft/nni/blob/master/docs/en_US/Assessor/CustomizeAssessor.rst">CustomizeAssessor</a></li>
<li><a href="https://github.com/microsoft/nni/blob/master/docs/en_US/Tutorial/InstallCustomizedAlgos.rst">Install Customized Algorithms as Builtin Tuners/Assessors/Advisors</a></li>
</ul>
</td>
<td style="border-top:#FF0000 solid 0px;">
<ul>
<li><a href="https://github.com/microsoft/nni/blob/master/docs/en_US/TrainingService/Overview.rst">Support TrainingService</a></li>
<li><a href="https://github.com/microsoft/nni/blob/master/docs/en_US/TrainingService/HowToImplementTrainingService.rst">Implement TrainingService</a></li>
</ul>
</td>
</tr>
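The CustomizeTuner reference above describes the tuner contract. The standalone sketch below mirrors that interface with a hypothetical random-search tuner; a real tuner would subclass `nni.tuner.Tuner`, which NNI calls through the same three methods.

```python
import random

# Hypothetical sketch of the tuner interface described in CustomizeTuner.
# A real tuner subclasses nni.tuner.Tuner; this standalone class only
# mirrors the methods NNI invokes, to illustrate the contract.
class RandomSearchTuner:
    def __init__(self):
        self.search_space = {}

    def update_search_space(self, search_space):
        # NNI passes the parsed search space (search_space.json) here.
        self.search_space = search_space

    def generate_parameters(self, parameter_id, **kwargs):
        # Called when a new trial needs a hyperparameter configuration.
        params = {}
        for name, spec in self.search_space.items():
            if spec["_type"] == "choice":
                params[name] = random.choice(spec["_value"])
            elif spec["_type"] == "uniform":
                low, high = spec["_value"]
                params[name] = random.uniform(low, high)
        return params

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        # A smarter tuner would use the reported metric to guide the search;
        # pure random search ignores it.
        pass


tuner = RandomSearchTuner()
tuner.update_search_space({
    "lr": {"_type": "uniform", "_value": [0.001, 0.1]},
    "opt": {"_type": "choice", "_value": ["sgd", "adam"]},
})
print(tuner.generate_parameters(0))
```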
(The capability table above is organized into three columns: Frameworks & Libraries, Algorithms, and Training Services, with algorithm rows such as Hyperparameter Tuning.)
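Hyperparameter tuning starts from a user-defined search space, typically a `search_space.json` file. A hypothetical example using NNI's built-in sampling types (`choice`, `uniform`, `loguniform`):

```json
{
  "lr": {"_type": "loguniform", "_value": [0.0001, 0.1]},
  "batch_size": {"_type": "choice", "_value": [16, 32, 64, 128]},
  "momentum": {"_type": "uniform", "_value": [0, 1]}
}
```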
NNI supports and is tested on Ubuntu >= 16.04, macOS >= 10.14.1, and Windows 10 >= 1809. Simply run the following `pip install` in an environment that has 64-bit Python >= 3.6.
Linux or macOS

```bash
python3 -m pip install --upgrade nni
```

Windows

```bash
python -m pip install --upgrade nni
```
If you want to try the latest code, please install NNI from source code.
For detailed system requirements of NNI, please refer to here for Linux & macOS, and here for Windows.
Note:
- For Linux and macOS, add `--user` to install NNI in the user directory; this does not require any special privileges.
- If there is an error like `Segmentation fault`, please refer to the FAQ. For FAQ on Windows, please refer to NNI on Windows.
```bash
git clone -b v2.0 https://github.com/Microsoft/nni.git
```
Linux or macOS

```bash
nnictl create --config nni/examples/trials/mnist-pytorch/config.yml
```

Windows

```bash
nnictl create --config nni\examples\trials\mnist-pytorch\config_windows.yml
```
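For reference, a local-mode NNI experiment config roughly follows the shape below; the values are illustrative, so check the cloned example's `config.yml` for the exact contents.

```yaml
authorName: default
experimentName: example_mnist_pytorch
trialConcurrency: 1
maxExecDuration: 1h
maxTrialNum: 10
trainingServicePlatform: local
searchSpacePath: search_space.json
useAnnotation: false
tuner:
  builtinTunerName: TPE
  classArgs:
    optimize_mode: maximize
trial:
  command: python3 mnist.py
  codeDir: .
  gpuNum: 0
```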
Wait for the message `INFO: Successfully started experiment!` in the command line. This message indicates that your experiment has been started successfully. You can explore the experiment using the `Web UI url`.
```text
INFO: Starting restful server...
INFO: Successfully started Restful server!
INFO: Setting local config...
INFO: Successfully set local config!
INFO: Starting experiment...
INFO: Successfully started experiment!
-----------------------------------------------------------------------
The experiment id is egchD4qy
The Web UI urls are: http://223.255.255.1:8080   http://127.0.0.1:8080
-----------------------------------------------------------------------
```

You can use these commands to get more information about the experiment:
| commands | description |
| ---- | ---- |
| nnictl experiment show | show the information of experiments |
| nnictl trial ls | list all of trial jobs |
| nnictl top | monitor the status of running experiments |
| nnictl log stderr | show stderr log content |
| nnictl log stdout | show stdout log content |
| nnictl stop | stop an experiment |
| nnictl trial kill | kill a trial job by id |
| nnictl --help | get help information about nnictl |
Open the `Web UI url` in your browser to view detailed information about the experiment and all the submitted trial jobs. Here are more Web UI pages.
NNI has a monthly release cycle (major releases). Please let us know if you encounter a bug by filing an issue.
We appreciate all contributions. If you are planning to contribute any bug fixes, please do so without further discussion.
If you plan to contribute new features, new tuners, new training services, etc., please first open an issue or reuse an existing issue, and discuss the feature with us. We will respond on the issue in a timely manner or set up conference calls if needed.
To learn more about making a contribution to NNI, please refer to our How-to contribute page.
We appreciate all contributions and thank all the contributors!
Join IM discussion groups on Gitter or WeChat.
Targeting openness and advancing state-of-the-art technology, Microsoft Research (MSR) has also released a few other open source projects.
We encourage researchers and students to leverage these projects to accelerate AI development and research.
The entire codebase is under the MIT license.