Every biological foundation model, evaluated and ranked by the bio.rodeo team
GrayLab
A generative language model trained on 558 million antibody sequences for infilling-based design of CDR loops and full-length immunoglobulin sequences.
Alchemab
A BERT-based antibody language model pre-trained on 57M BCR sequences for paratope prediction and convergent antibody discovery.
Oxpig
An antibody-specific language model trained on the OAS database for restoring missing residues and generating high-quality sequence representations.