All Competitors
Every biological foundation model, evaluated and ranked by the bio.rodeo team
IgLM
GrayLab
Generative language model trained on 558 million antibody sequences for infilling-based design of CDR loops and full-length immunoglobulin sequences.
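A minimal sketch of the infilling setup that IgLM-style models use: the CDR loop to be redesigned is excised, replaced by a mask token, and the decoder generates the replacement span conditioned on species and chain tags. The tag names, token layout, and helper functions below are illustrative assumptions, not the released iglm package API; the placeholder span stands in for the model's autoregressive sampling step.

```python
# Illustrative sketch of infilling-based CDR redesign in the IgLM style.
# Tag/token names and helpers are hypothetical; the real iglm package has its own API.

def build_infill_prompt(heavy_chain: str, cdr_start: int, cdr_end: int,
                        species_tag: str = "[HUMAN]", chain_tag: str = "[HEAVY]") -> str:
    """Format a conditional infilling prompt: tags + sequence with the CDR masked."""
    prefix, suffix = heavy_chain[:cdr_start], heavy_chain[cdr_end:]
    return f"{species_tag}{chain_tag}{prefix}[MASK]{suffix}[SEP]"

def splice_generated_span(heavy_chain: str, cdr_start: int, cdr_end: int, new_span: str) -> str:
    """Replace the masked CDR with a model-generated span to obtain a full-length design."""
    return heavy_chain[:cdr_start] + new_span + heavy_chain[cdr_end:]

vh = "EVQLVESGGGLVQPGGSLRLSCAASGFTFSSYAMSWVRQAPGKGLEWVS"   # toy VH fragment
prompt = build_infill_prompt(vh, cdr_start=26, cdr_end=35)
# In practice the language model samples the masked span autoregressively from `prompt`;
# a fixed placeholder stands in for that model call here.
design = splice_generated_span(vh, 26, 35, new_span="GFTFDDYAM")
print(prompt)
print(design)
```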
ABGNN
Huazhong University of Science and Technology / Microsoft Research
Graph neural network framework for antigen-specific antibody CDR design, combining a pre-trained antibody language model with one-shot sequence and structure generation.
Efficient Evolution of Human Antibodies from Protein Language Models
Stanford University
Zero-shot antibody affinity maturation using ESM pseudolikelihood scoring; improves binding affinity by up to 160-fold with no antigen-specific training data.
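A minimal sketch of the scoring step: each candidate single-point mutation is ranked by the masked-marginal log-likelihood ratio between the mutant and wild-type residue under a general protein language model. The published method ensembles ESM-1b/ESM-1v; ESM-2 via HuggingFace transformers is used here only as a stand-in, and the toy sequence is illustrative.

```python
# Hypothetical sketch: rank single-point antibody mutations by masked-marginal
# log-likelihood ratio under a protein language model (ESM-2 as a stand-in).
import torch
from transformers import AutoTokenizer, EsmForMaskedLM

MODEL = "facebook/esm2_t33_650M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = EsmForMaskedLM.from_pretrained(MODEL).eval()

def mutation_score(seq: str, pos: int, mut_aa: str) -> float:
    """log p(mutant) - log p(wild type) at `pos` (0-based), with that position masked."""
    tokens = tokenizer(seq, return_tensors="pt")
    masked = tokens["input_ids"].clone()
    masked[0, pos + 1] = tokenizer.mask_token_id          # +1 skips the <cls> token
    with torch.no_grad():
        logits = model(input_ids=masked, attention_mask=tokens["attention_mask"]).logits
    log_probs = torch.log_softmax(logits[0, pos + 1], dim=-1)
    wt_id = tokenizer.convert_tokens_to_ids(seq[pos])
    mut_id = tokenizer.convert_tokens_to_ids(mut_aa)
    return (log_probs[mut_id] - log_probs[wt_id]).item()

# Mutations scored above the wild type become candidates for the next round of
# experimental screening; no antigen structure or binding data is used.
vh = "EVQLVESGGGLVQPGGSLRLSCAASGFTFS"                      # toy VH fragment
print(mutation_score(vh, pos=10, mut_aa="Y"))
```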
ReprogBERT
IBM
Reprograms a frozen English BERT model for antibody CDR sequence infilling via learnable cross-domain projection matrices, without training a new protein language model.
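A sketch of the reprogramming idea, under assumed dimensions and names: the English BERT stays frozen, and the only trainable pieces are an input projection that expresses each amino-acid token as a mixture of BERT's existing word embeddings, plus an output head back to the amino-acid vocabulary. This is a minimal illustration, not the published ReprogBERT implementation.

```python
# Hypothetical sketch of cross-domain reprogramming: a frozen English BERT plus
# two small learnable projections between the amino-acid and word-token spaces.
import torch
import torch.nn as nn
from transformers import BertModel

class ReprogrammedEncoder(nn.Module):
    def __init__(self, n_amino_tokens: int = 25, bert_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        for p in self.bert.parameters():                        # language model stays frozen
            p.requires_grad = False
        vocab_emb = self.bert.get_input_embeddings().weight     # (V_en, d)
        self.register_buffer("vocab_emb", vocab_emb.detach().clone())
        v_en, d = vocab_emb.shape
        # theta_in: amino-acid token -> mixture over frozen English word embeddings
        self.theta_in = nn.Parameter(torch.zeros(n_amino_tokens, v_en))
        # theta_out: BERT hidden state -> logits over the amino-acid vocabulary
        self.theta_out = nn.Linear(d, n_amino_tokens)

    def forward(self, aa_ids: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
        mix = torch.softmax(self.theta_in, dim=-1)               # each row sums to 1
        inputs_embeds = mix[aa_ids] @ self.vocab_emb             # (B, L, d)
        hidden = self.bert(inputs_embeds=inputs_embeds,
                           attention_mask=attention_mask).last_hidden_state
        return self.theta_out(hidden)                            # per-residue amino-acid logits

model = ReprogrammedEncoder()
aa_ids = torch.randint(0, 25, (2, 16))                           # toy masked-CDR batch
logits = model(aa_ids, attention_mask=torch.ones(2, 16, dtype=torch.long))
print(logits.shape)                                              # torch.Size([2, 16, 25])
```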
AntiBERTa
Alchemab
A BERT-based antibody language model pre-trained on 57M BCR sequences for paratope prediction and convergent antibody discovery.
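A minimal sketch of the paratope-prediction fine-tuning step: a pretrained protein/antibody language model gets a per-residue binary head that labels each position as antigen-contacting or not. A small public ESM-2 checkpoint is used below purely as a stand-in encoder; AntiBERTa's own weights would slot in the same way, and the head is randomly initialized until fine-tuned on paratope labels.

```python
# Sketch: per-residue paratope classification head on a pretrained encoder
# (small ESM-2 checkpoint as a stand-in for an antibody language model).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

encoder_name = "facebook/esm2_t6_8M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(encoder_name)
model = AutoModelForTokenClassification.from_pretrained(encoder_name, num_labels=2)

vh = "EVQLVESGGGLVQPGGSLRLSCAASGFTFSSYAMS"
batch = tokenizer(vh, return_tensors="pt")
with torch.no_grad():
    logits = model(**batch).logits                            # (1, L+2, 2) incl. special tokens
paratope_prob = torch.softmax(logits, dim=-1)[0, 1:-1, 1]     # per-residue contact probability
print(paratope_prob.shape)                                    # torch.Size([35])
```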
AbLang
Oxford Protein Informatics Group (OPIG)
Antibody-specific language model trained on the OAS database for restoring missing residues and generating high-quality sequence representations.
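A short usage sketch for the residue-restoration task: positions missing from a sequencing read are marked and the model predicts them. The pretrained()/freeze() calls and the 'restore' mode follow the ablang package's published README as I recall it, but the exact interface should be checked against the installed version.

```python
# Sketch of residue restoration with the ablang package (interface assumed
# from its README; verify against the installed release).
import ablang

heavy_ablang = ablang.pretrained("heavy")     # heavy-chain model trained on OAS
heavy_ablang.freeze()

# '*' marks residues missing from the sequencing read; the model predicts them.
truncated = ["EV*LVESGGGLVQPGGSLRLSCAASGFTF*SYAMS"]
restored = heavy_ablang(truncated, mode="restore")
print(restored)
```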
Parapred
University of Cambridge
Sequence-based deep learning model for antibody paratope prediction using convolutional and recurrent neural networks. Identifies antigen-contacting residues from CDR sequences alone.
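An illustrative re-implementation sketch of a Parapred-style architecture: a 1D convolution over per-residue features followed by a bidirectional LSTM, with a sigmoid output per residue marking predicted antigen contacts. Layer sizes and the 21-dimensional residue encoding are placeholders, not the published hyperparameters.

```python
# Sketch of a convolution + bidirectional-LSTM paratope predictor over CDR sequences.
import torch
import torch.nn as nn

class ParapredLike(nn.Module):
    def __init__(self, n_features: int = 21, conv_channels: int = 64, rnn_hidden: int = 128):
        super().__init__()
        self.conv = nn.Conv1d(n_features, conv_channels, kernel_size=3, padding=1)
        self.rnn = nn.LSTM(conv_channels, rnn_hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * rnn_hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, n_features) encoded CDR residues
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        h, _ = self.rnn(h)
        return torch.sigmoid(self.head(h)).squeeze(-1)        # per-residue contact probability

cdr = torch.randn(4, 30, 21)          # toy batch: 4 CDR loops, 30 residues, 21-dim encoding
print(ParapredLike()(cdr).shape)      # torch.Size([4, 30])
```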