FreeLB
Introduction

This repository contains the implementations of FreeLB on GLUE tasks, based on both the fairseq and HuggingFace transformers libraries, under `./fairseq-RoBERTa/` and `./huggingface-transformers/` respectively. The fairseq version also integrates our implementations of vanilla PGD, FreeAT, and YOPO. FreeLB is an adversarial training approach for improving transformer-based language models on Natural Language Understanding tasks. It accumulates the parameter gradients over the ascent steps and updates the parameters with the accumulated gradients, which is approximately equivalent to enlarging the batch size with diversified adversarial examples within different radii around the clean input. FreeLB improves the performance of BERT and RoBERTa on a variety of Natural Language Understanding tasks, including Question Answering, Natural Language Inference, and Sentiment Analysis.
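
To make the accumulate-then-update idea concrete, here is a minimal PyTorch sketch of one FreeLB training step. It is an illustration rather than the repository's actual code: it assumes a HuggingFace-style model that accepts `inputs_embeds` and returns the loss first when `labels` are given, and the hyperparameter names (`adv_steps`, `adv_lr`, `adv_init_mag`, `adv_max_norm`) are placeholders for whatever the launch scripts actually expose.

```python
import torch

def freelb_step(model, optimizer, input_ids, attention_mask, labels,
                adv_steps=3, adv_lr=1e-1, adv_init_mag=1e-2, adv_max_norm=1.0):
    """One FreeLB update: several ascent steps on an embedding-space
    perturbation, accumulating parameter gradients, then one descent step."""
    # Shape of the (batch, seq_len, hidden) embedding tensor, used to
    # initialize the perturbation uniformly inside a small ball.
    with torch.no_grad():
        clean_embeds = model.get_input_embeddings()(input_ids)
    delta = torch.zeros_like(clean_embeds).uniform_(-adv_init_mag, adv_init_mag)
    delta.requires_grad_()

    optimizer.zero_grad()
    for _ in range(adv_steps):
        # Recompute embeddings each step so every backward pass has a fresh graph.
        embeds = model.get_input_embeddings()(input_ids)
        loss = model(inputs_embeds=embeds + delta,
                     attention_mask=attention_mask, labels=labels)[0]
        # Accumulate parameter gradients across ascent steps; dividing by
        # adv_steps averages over the "virtual batch" of adversarial examples.
        (loss / adv_steps).backward()

        # Gradient ascent on the perturbation, normalized per example.
        grad = delta.grad.detach()
        g_norm = grad.view(grad.size(0), -1).norm(dim=1).view(-1, 1, 1)
        delta = (delta + adv_lr * grad / (g_norm + 1e-12)).detach()
        # Project each example's perturbation back into the epsilon ball.
        delta = delta.renorm_(2, 0, adv_max_norm)
        delta.requires_grad_()

    # One parameter update with the gradients accumulated over all ascent steps.
    optimizer.step()
    return loss.item()
```

The "free" part is that the same backward passes that craft the perturbation also produce the parameter gradients, so the ascent steps behave like a larger, diversified adversarial batch at roughly the cost of the extra forward-backward passes alone.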

For technical details and additional experimental results, please refer to our paper:

Chen Zhu, Yu Cheng, Zhe Gan, Siqi Sun, Tom Goldstein, and Jingjing Liu. FreeLB: Enhanced Adversarial Training for Language Understanding. In ICLR, 2020.

What's New

  • Feb 15, 2020: Initial release of FreeLB, based on both fairseq and HuggingFace's transformers. The fairseq version contains our implementations of FreeLB, FreeAT, and YOPO for RoBERTa, while the transformers version implements FreeLB for ALBERT.

  • May 16, 2020: Hyperparameters for ALBERT are now available at `huggingface-transformers/launch/run_glue.sh`.

Prerequisites

The code is compatible with PyTorch 1.4.0. In addition, you need to execute the following commands in order to install the prerequisites for fairseq and HuggingFace's transformers:

```
# Install apex
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" ./

# Configure fairseq
cd ../fairseq-RoBERTa
pip install --editable .

# Download and pre-process GLUE data
wget https://gist.githubusercontent.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e/raw/17b8dd0d724281ed7c3b2aeeda662b92809aadd5/download_glue_data.py
python download_glue_data.py --data_dir glue_data --tasks all
source ./examples/roberta/preprocess_GLUE_tasks.sh glue_data ALL

# Configure HuggingFace's transformers
cd ../huggingface-transformers
pip install --editable .
mkdir logs
```
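
As a quick way to verify the setup before training, the following minimal check (an illustrative snippet, not part of the repository) confirms the PyTorch version and that apex imports cleanly:

```python
import torch

print("torch", torch.__version__)        # the repo targets PyTorch 1.4.0
print("CUDA available:", torch.cuda.is_available())

import apex                              # fails if the apex build/install did not complete
from apex import amp  # noqa: F401       # apex's mixed-precision API
```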

Launch

The launch scripts are under `./fairseq-RoBERTa/launch/` and `./huggingface-transformers/launch/`, where we have included most of the running scripts for RoBERTa and ALBERT on the GLUE dev sets. We will release more details in the future.
