BERTology Meets Biology: Interpreting Attention in Protein Language Models

This repository is the official implementation of BERTology Meets Biology: Interpreting Attention in Protein Language Models.


ProVis Attention Visualizer

This section provides instructions for generating visualizations of attention projected onto 3D protein structure.



General requirements:

* Python >= 3.7

pip install biopython==1.77
pip install tape-proteins==0.4
pip install jupyterlab==3.0.14
pip install nglview
jupyter-nbextension enable nglview --py --sys-prefix
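After installing, you can sanity-check the environment before launching the notebook. The helper below is an illustrative sketch (not part of the repository); note that biopython imports as `Bio` and tape-proteins imports as `tape`.

```python
import sys
import importlib.util

def check_environment(required=("Bio", "tape", "nglview")):
    """Return a dict mapping each requirement to whether it is satisfied."""
    status = {"python>=3.7": sys.version_info >= (3, 7)}
    for mod in required:
        # find_spec returns None when the module is not importable
        status[mod] = importlib.util.find_spec(mod) is not None
    return status

print(check_environment())
```

Any `False` entry points at the package to reinstall before running the visualizer.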

If you run into problems installing nglview, please refer to their installation instructions for additional installation details and options.


cd /notebooks
jupyter notebook provis.ipynb

If you get an error running the notebook, you may need to launch it with a higher data rate limit:

jupyter notebook --NotebookApp.iopub_data_rate_limit=10000000

See nglview installation instructions for more details.

You may edit the notebook to choose other proteins, attention heads, etc. The visualization tool is based on the excellent nglview library.
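Projecting attention onto a structure amounts to assigning each residue a color based on its attention weight. The helper below is a hypothetical sketch of that mapping; the notebook's actual color scheme and nglview wiring may differ.

```python
def attention_to_hex(weights, low=(255, 255, 255), high=(0, 90, 181)):
    """Linearly blend each attention weight in [0, 1] between two RGB colors.

    Hypothetical helper: the provis notebook's real color scheme may differ.
    """
    colors = []
    for w in weights:
        w = max(0.0, min(1.0, w))  # clamp to [0, 1]
        rgb = tuple(round(l + w * (h - l)) for l, h in zip(low, high))
        colors.append("#{:02x}{:02x}{:02x}".format(*rgb))
    return colors

# Per-residue colors like these can then be fed to an nglview color scheme.
print(attention_to_hex([0.0, 0.5, 1.0]))
```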


Experiments

This section describes how to reproduce the experiments in the paper.


To install the project package, run the following from the project root:

python setup.py develop

To download additional required datasets from TAPE:

cd /data
tar -xvf secondary_structure.tar.gz && rm secondary_structure.tar.gz
tar -xvf proteinnet.tar.gz && rm proteinnet.tar.gz
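The extract-then-delete step can equivalently be scripted in Python, which may be convenient on platforms without `tar`. This is a sketch assuming the archives have already been downloaded into the data directory.

```python
import os
import tarfile

def extract_and_remove(archive_path, dest_dir="."):
    """Extract a .tar.gz archive into dest_dir, then delete the archive."""
    with tarfile.open(archive_path, "r:gz") as tf:
        tf.extractall(dest_dir)
    os.remove(archive_path)

# for name in ("secondary_structure.tar.gz", "proteinnet.tar.gz"):
#     extract_and_remove(name, dest_dir=".")
```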

Attention Analysis

The following steps will reproduce the attention analysis experiments and generate the reports currently found in /reports/attention_analysis. This includes all experiments besides the probing experiments (see Probing Analysis).

Before performing these steps, navigate to the appropriate directory:

cd /protein_attention/attention_analysis

Tape BERT Model

The following executes the attention analysis (may run for several hours):

sh scripts/

The above script creates a set of extract files in /data/cache corresponding to the various properties being analyzed. You may edit the script files to remove properties that you are not interested in. If you wish to run the analysis without a GPU, you must specify the
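At the core of this analysis is aggregating attention with respect to a residue property, e.g. what fraction of a head's attention lands on binding-site residues. The function below is an illustrative sketch of that aggregation, not the repository's API, which computes richer per-head statistics.

```python
def attention_to_property(attn, has_property):
    """Fraction of total attention directed to residues with a property.

    attn: L x L matrix where attn[i][j] is attention from residue i to j.
    has_property: length-L booleans (e.g. 'residue j is in a binding site').
    Illustrative only; names and shapes are assumptions.
    """
    total = sum(sum(row) for row in attn)
    if total == 0:
        return 0.0
    hits = sum(attn[i][j]
               for i in range(len(attn))
               for j in range(len(attn))
               if has_property[j])
    return hits / total

attn = [[0.9, 0.1],
        [0.4, 0.6]]
print(attention_to_property(attn, [False, True]))  # 0.35
```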

The following generates reports based on the files created in the previous step:

sh scripts/

If you removed steps from the analysis script above, you will need to update the reporting script accordingly.

ProtTrans Models

In order to generate reports for the ProtTrans models, follow the same instructions as for the Tape BERT model above, but substitute the following commands:


sh scripts/
sh scripts/


sh scripts/
sh scripts/


sh scripts/
sh scripts/


sh scripts/
sh scripts/

Probing Analysis

The following steps will recreate the figures from the probing analysis, currently found in /reports/probing.

Navigate to directory:

cd /protein_attention/probing


Train diagnostic classifiers. Each script will write out an extract file with evaluation results. Note: each of these scripts may run for several hours.

sh scripts/probe_ss4_0_all
sh scripts/probe_ss4_1_all
sh scripts/probe_ss4_2_all
sh scripts/
sh scripts/
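A diagnostic classifier is a simple model trained on frozen embeddings to test whether a property (such as secondary structure) is linearly decodable. Below is a minimal logistic-regression probe in plain Python to illustrate the idea; the repository's scripts train their own classifiers on real model embeddings, and every name here is illustrative.

```python
import math
import random

def train_probe(embeddings, labels, lr=0.5, epochs=200, seed=0):
    """Train a logistic-regression probe on fixed embeddings (binary labels)."""
    rng = random.Random(seed)
    dim = len(embeddings[0])
    w = [rng.uniform(-0.1, 0.1) for _ in range(dim)]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(embeddings, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # gradient of the log loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

# Toy, linearly separable 'embeddings': label is 1 when the first dim > 0.
X = [[1.0, 0.2], [0.8, -0.5], [-1.0, 0.3], [-0.7, -0.2]]
y = [1, 1, 0, 0]
w, b = train_probe(X, y)
print([predict(w, b, x) for x in X])  # [1, 1, 0, 0]
```

High probe accuracy suggests the property is encoded in the embeddings; the probe is kept deliberately simple so it measures the representation rather than its own capacity.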




License

This project is licensed under the BSD-3 License - see the LICENSE file for details.


This project incorporates code from the following repo: *


When referencing this repository, please cite this paper:

    @article{vig2020bertology,
        title={BERTology Meets Biology: Interpreting Attention in Protein Language Models},
        author={Jesse Vig and Ali Madani and Lav R. Varshney and Caiming Xiong and Richard Socher and Nazneen Fatema Rajani},
        year={2020},
        eprint={2006.15222},
        archivePrefix={arXiv}
    }
