Project author: ABaldrati

Project description:
In MT-BERT we reproduce a neural language understanding model which implements a Multi-Task Deep Neural Network (MT-DNN) for learning representations across multiple NLU tasks.
Primary language: Python
Project address: git://github.com/ABaldrati/MT-BERT.git
Created: 2021-02-11T15:55:47Z
Project community: https://github.com/ABaldrati/MT-BERT

MT-BERT

About The Project

In MT-BERT we reproduce a neural language understanding model based on the paper by Liu et al. (2019).
The model implements a Multi-Task Deep Neural Network (MT-DNN) for learning representations across multiple NLU tasks.
MT-DNN extends the model proposed in the paper by Liu et al. (2015) by incorporating a pre-trained bidirectional transformer language model, known as BERT.
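To make the architecture concrete, here is a minimal sketch of the shared-encoder, task-specific-heads design, assuming the Hugging Face transformers package; the MTDNNSketch class and its head names are illustrative and do not mirror the repo's actual model.py.

    import torch.nn as nn
    from transformers import BertModel

    class MTDNNSketch(nn.Module):
        """Illustrative MT-DNN: one shared BERT encoder, one small head per NLU task."""

        def __init__(self, task_output_sizes):
            super().__init__()
            # Shared layers: a pre-trained bidirectional transformer (BERT).
            self.bert = BertModel.from_pretrained("bert-base-uncased")
            hidden = self.bert.config.hidden_size
            # Task-specific layers: one linear output head per task (e.g. each GLUE task).
            self.heads = nn.ModuleDict({
                name: nn.Linear(hidden, size)
                for name, size in task_output_sizes.items()
            })

        def forward(self, task_name, input_ids, attention_mask):
            # The pooled [CLS] representation feeds the head of the requested task.
            pooled = self.bert(input_ids=input_ids,
                               attention_mask=attention_mask).pooler_output
            return self.heads[task_name](pooled)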

More details about the project are available in the presentation.

The original implementation is available in the authors' repo.

Built With

Getting Started

To get a local copy up and running, follow these simple steps.

Prerequisites

The project provides a Pipfile that can be managed with pipenv.
Installing pipenv is strongly encouraged in order to avoid dependency/reproducibility problems.

  • pipenv
      pip install pipenv

Installation

  1. Clone the repo
       git clone https://github.com/ABaldrati/MT-BERT.git
  2. Install Python dependencies
       pipenv install
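Once the dependencies are installed, the training scripts can be run inside the pipenv environment, for example:

    pipenv run python train_glue.py

The exact invocation is an assumption; check the scripts described below for the arguments each one expects.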

Usage

Here's a brief description of each file in the repo:

  • model.py: Model definition
  • task.py: Task dataset preprocessing and definition
  • train_glue.py: Training file for multi-task training on GLUE (the training loop is sketched below)
  • fine_tune_task.py: Fine-tuning, domain adaptation and single-task training file
  • utils.py: Utility functions

There is also an executable Jupyter notebook: train.ipynb
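As a rough illustration of the multi-task training scheme in train_glue.py, here is a hedged sketch of one MT-DNN style epoch following Liu et al. (2019): mini-batches from all tasks are packed together and visited in random order. The function and variable names are illustrative, not the repo's actual code.

    import random

    def multitask_epoch(model, optimizer, task_dataloaders, loss_fns):
        # Collect mini-batches from every task, then shuffle so the shared
        # encoder sees the tasks interleaved in random order within the epoch.
        batches = [(task, batch)
                   for task, loader in task_dataloaders.items()
                   for batch in loader]
        random.shuffle(batches)
        for task, (input_ids, attention_mask, labels) in batches:
            optimizer.zero_grad()
            logits = model(task, input_ids, attention_mask)  # task-specific head
            loss = loss_fns[task](logits, labels)            # task-specific loss
            loss.backward()
            optimizer.step()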

Authors

Acknowledgments

Machine Learning course held by Professor Paolo Frasconi - Computer Engineering Master's Degree @ University of Florence