pip install pytorch-transformers==1.2.0
Repository of pre-trained NLP Transformer models: BERT & RoBERTa, GPT & GPT-2, Transformer-XL, XLNet and XLM
Among the top 2% of packages on PyPI.
Over 202.8K downloads in the last 90 days.
pytorch-transformers
Based on how often these packages appear together in public requirements.txt files on GitHub.
pip-able S2 Geometry Bindings
pytest plugin to add diagnostic information to the header of the test output
Google Cloud Dataproc API client library
Cloud Key Management Service (KMS) API client library
Pytest plugin for controlling remote data access.
pytest plugin to help with comparing array output from tests
Meta-package containing dependencies for testing
Backport of new features in Python's os module
Python bindings for the Qt Charts library
Jupyter kernels for Spyder's console
SacreMoses
fastai simplifies training fast and accurate neural nets using modern best practices
Jupyter support for HTTP-over-ws
gmpy2 interface to GMP/MPIR, MPFR, and MPC for Python 2.6+ and 3.4+
A nested progress with plotting options for fastai
A Jupyter widget for Vega 5 and Vega-Lite 4
A refreshing functional take on deep learning, compatible with your favorite libraries
pytorch-transformers
Proportion of downloaded versions in the last 3 months (only versions over 1%):

Version | Share
1.2.0   | 60.64%
1.1.0   | 23.63%
1.0.0   | 15.62%