bert-event-extraction

PyTorch solution of the event extraction task using BERT on the ACE 2005 corpus

Prerequisites

  1. Prepare ACE 2005 dataset.

  2. Use nlpcl-lab/ace2005-preprocessing to preprocess the ACE 2005 dataset into the same format as data/sample.json. Then place the output files in the data directory as follows:

    ├── data
    │     └── test.json
    │     └── dev.json
    │     └── train.json
    │...
    
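Each record in the preprocessed JSON files bundles a sentence with its gold event mentions. The sketch below is a hypothetical record and helper mirroring the structure of data/sample.json (the exact field names should be checked against the sample file); it shows how a split like data/train.json can be loaded and summarized:

```python
import json
from collections import Counter

# Hypothetical record mirroring the structure produced by
# nlpcl-lab/ace2005-preprocessing (verify against data/sample.json).
record = {
    "sentence": "Troops attacked the base.",
    "words": ["Troops", "attacked", "the", "base", "."],
    "golden-event-mentions": [
        {
            "event_type": "Conflict:Attack",
            "trigger": {"text": "attacked", "start": 1, "end": 2},
            "arguments": [
                {"role": "Attacker", "text": "Troops", "start": 0, "end": 1},
                {"role": "Target", "text": "the base", "start": 2, "end": 4},
            ],
        }
    ],
}

def event_type_counts(records):
    """Count gold event mentions per event type across one data split."""
    counts = Counter()
    for rec in records:
        for ev in rec.get("golden-event-mentions", []):
            counts[ev["event_type"]] += 1
    return counts

# With a real split: records = json.load(open("data/train.json"))
print(event_type_counts([record]))  # Counter({'Conflict:Attack': 1})
```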
  3. Install the packages.

    pip install torch==1.0 pytorch_pretrained_bert==0.6.1 numpy
    
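The pinned pytorch_pretrained_bert tokenizer splits words into WordPiece subwords, which matters here because trigger and argument labels are assigned per word and must be aligned to subword positions. A minimal sketch of the greedy longest-match-first WordPiece algorithm (the toy vocabulary is an illustration, not the real BERT vocab):

```python
def wordpiece_tokenize(word, vocab):
    """Greedy longest-match-first WordPiece, as used by BERT tokenizers.

    Non-initial pieces carry the '##' continuation prefix; a word with
    no valid segmentation maps to the single token '[UNK]'.
    """
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        match = None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece
            if piece in vocab:
                match = piece
                break
            end -= 1
        if match is None:
            return ["[UNK]"]
        tokens.append(match)
        start = end
    return tokens

vocab = {"fire", "##d", "attack", "##ed"}
print(wordpiece_tokenize("fired", vocab))   # ['fire', '##d']
print(wordpiece_tokenize("attack", vocab))  # ['attack']
```

Because one word can become several subwords, a common convention is to attach the word-level trigger/argument label to the first subword only.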

Usage

Train

python train.py

Evaluation

python eval.py --model_path=latest_model.pt
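Evaluation reports precision, recall, and F1 for trigger and argument classification. A sketch of the standard micro-averaged computation from match counts (an assumption about the repo's exact matching criteria, which live in eval.py):

```python
def prf1(num_correct, num_pred, num_gold):
    """Micro precision / recall / F1 from counts of correct predictions,
    total predictions, and gold mentions."""
    p = num_correct / num_pred if num_pred else 0.0
    r = num_correct / num_gold if num_gold else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1

# e.g. 5 correct out of 10 predicted triggers, 10 gold triggers:
print(prf1(5, 10, 10))  # (0.5, 0.5, 0.5)
```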

Result

Performance

    Method                  Trigger Classification (%)   Argument Classification (%)
                            Precision  Recall  F1        Precision  Recall  F1
    JRNN                    66.0       73.0    69.3      54.2       56.7    55.5
    JMEE                    76.3       71.3    73.7      66.8       54.9    60.3
    This model (BERT base)  63.4       71.1    67.7      48.5       34.1    40.0

This model's argument classification performance is low even though a pretrained BERT model was used. The model is currently being updated to improve it.

Reference

  • Jointly Multiple Events Extraction via Attention-based Graph Information Aggregation (EMNLP 2018), Liu et al. [paper]
  • lx865712528's EMNLP2018-JMEE repository [github]
  • Kyubyong's bertner repository [github](https://github.com/Kyubyong/bertner)
