lightning-transformers

Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.

Installation

Option 1: from PyPI

pip install lightning-transformers
# instead of: `python train.py ...`, run with:
pl-transformers-train ...

Option 2: from source

git clone https://github.com/PyTorchLightning/lightning-transformers.git
cd lightning-transformers
pip install .
python train.py ...
# the `pl-transformers-train` entry point is also available!

Quick recipes

Train bert-base-cased on the CARER emotion dataset using the Text Classification task.

python train.py \
    task=nlp/text_classification \
    dataset=nlp/text_classification/emotion
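
Because the configuration is composed with Hydra, any value in the config shown below can be overridden from the command line using standard Hydra dot notation. A quick illustrative example (the values are arbitrary; `training.batch_size` and `training.lr` are the keys from the composed config below):

python train.py \
    task=nlp/text_classification \
    dataset=nlp/text_classification/emotion \
    training.batch_size=32 \
    training.lr=3e-5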

The composed Hydra config used under the hood:

optimizer:
  _target_: torch.optim.AdamW
  lr: ${training.lr}
  weight_decay: 0.001
scheduler:
  _target_: transformers.get_linear_schedule_with_warmup
  num_training_steps: -1
  num_warmup_steps: 0.1
training:
  run_test_after_fit: true
  lr: 5.0e-05
  output_dir: .
  batch_size: 16
  num_workers: 16
trainer:
  _target_: pytorch_lightning.Trainer
  logger: true
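
Each `_target_` entry names a class or function that Hydra instantiates, passing the remaining keys as keyword arguments. A minimal standalone sketch of that mechanism (not lightning-transformers' internal code; assumes hydra-core, omegaconf, and torch are installed, and resolves the interpolated learning rate to a concrete value):

# Sketch: how a `_target_` config block becomes a live object via Hydra.
import torch
from omegaconf import OmegaConf
from hydra.utils import instantiate

# Same structure as the `optimizer` block above, with ${training.lr} resolved.
optimizer_cfg = OmegaConf.create(
    {"_target_": "torch.optim.AdamW", "lr": 5.0e-05, "weight_decay": 0.001}
)

model = torch.nn.Linear(8, 2)  # stand-in model for illustration
optimizer = instantiate(optimizer_cfg, params=model.parameters())
print(optimizer)  # AdamW(lr=5e-05, weight_decay=0.001)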
