Bio-transformers: Documentation and Tutorial
Caution
Bio-transformers introduces a breaking change: the device and multi_gpu arguments are replaced by num_gpus. Multi-GPU inference is now managed with ray, which leverages the full computational capacity of each GPU, in contrast to torch.DataParallel.
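As a hedged sketch of this change, the snippet below contrasts the old and new constructor arguments. The backend name, argument values, and the fallback handling are illustrative assumptions, not taken from this page:

```python
# Hypothetical before/after sketch of the breaking change described above.
# The backend name and argument values are illustrative assumptions.
sequences = ["MKTVRQERLKSIVRILERSKEPVSGAQL", "KALTARQQEVFDLIRD"]

try:
    from biotransformers import BioTransformers

    # Old style (removed): BioTransformers(backend="protbert", device="cuda", multi_gpu=True)
    # New style: a single num_gpus argument; ray handles multi-GPU scheduling.
    bio_trans = BioTransformers(backend="protbert", num_gpus=2)
    embeddings = bio_trans.compute_embeddings(sequences)
except ImportError:
    bio_trans = None  # bio-transformers is not installed in this environment
```

With num_gpus=0 the model runs on CPU; any positive value lets ray distribute inference across that many GPUs.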
bio-transformers is a Python wrapper on top of the ESM and ProtBert models, which are transformer-based protein language models trained on millions of proteins and used to compute embeddings. The package also provides functionalities you can use to build apps with deepchain-apps.
Features
Note
Bio-transformers now uses Ray to manage multi-GPU inference.
Bio-transformers extends and simplifies workflows for manipulating amino acid sequences with PyTorch, and can be used to test several pre-trained transformer models without having to handle the syntactic specificities of each model.
The main features are:
- compute_loglikelihood
- compute_probabilities
- compute_embeddings
- compute_accuracy
- finetune
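A minimal usage sketch of these features is shown below. The method names follow the feature list above; the backend identifier, keyword-free call signatures, and sequences are assumptions about the library's API rather than details confirmed by this page:

```python
# Hedged sketch: method names follow the feature list above; the backend
# identifier and exact call signatures are assumptions.
sequences = ["MKTVRQERLKSIVRILERSKEP", "KALTARQQEVFDLIRDHISQTG"]

try:
    from biotransformers import BioTransformers

    bio_trans = BioTransformers(backend="esm1_t6_43M_UR50S", num_gpus=0)

    # Per-sequence embeddings of the input proteins.
    embeddings = bio_trans.compute_embeddings(sequences)

    # Per-sequence log-likelihoods under the protein language model.
    loglikelihoods = bio_trans.compute_loglikelihood(sequences)

    # Per-position amino-acid probabilities.
    probabilities = bio_trans.compute_probabilities(sequences)
except ImportError:
    embeddings = loglikelihoods = probabilities = None  # library not installed
```

Because every backend is exposed through the same entry point, swapping ESM for ProtBert is a one-line change to the backend argument.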
All development and related work on the project is public and released under the Apache 2.0 license.
Contributors
Bio-transformers is a package belonging to the DeepChainBio repository, maintained by a team of developers and researchers at InstaDeep.