Every biological foundation model, evaluated and ranked by the bio.rodeo team
EvolutionaryScale
A family of protein language models (300M, 600M, and 6B parameters) for representation learning that substantially outperforms ESM-2 at matched or smaller parameter counts.
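To make the representation-learning use case concrete, here is a minimal sketch of pulling per-residue embeddings from a transformer protein language model. The tokenizer, `TinyProteinLM` class, and all sizes are hypothetical stand-ins, not EvolutionaryScale's actual API.

```python
# Hypothetical sketch: per-residue embeddings from a tiny transformer
# protein language model. Names and sizes are illustrative only.
import torch
import torch.nn as nn

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
VOCAB = {aa: i + 1 for i, aa in enumerate(AMINO_ACIDS)}  # 0 reserved for padding


class TinyProteinLM(nn.Module):
    def __init__(self, vocab_size=21, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer-encoded residues
        mask = tokens == 0  # ignore padding positions in attention
        return self.encoder(self.embed(tokens), src_key_padding_mask=mask)


seq = "MKTAYIAKQR"
tokens = torch.tensor([[VOCAB[aa] for aa in seq]])
model = TinyProteinLM()
with torch.no_grad():
    per_residue = model(tokens)            # (1, len(seq), d_model)
    per_protein = per_residue.mean(dim=1)  # mean-pool for a sequence-level vector
print(per_residue.shape, per_protein.shape)
```

In practice the per-residue matrix feeds structure- or function-prediction heads, while the pooled vector serves sequence-level tasks such as classification or retrieval.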
Kakao Brain
A multimodal protein pre-training framework that jointly learns sequence, 3D structure, and surface representations, encoding the surface modality with implicit neural representations.
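For readers unfamiliar with implicit neural representations: the idea is to store geometry as a continuous function rather than a mesh or point cloud. Below is a minimal sketch of an INR, a small MLP with Fourier features mapping 3D coordinates to a signed-distance-like value whose zero level set is the surface. The `SurfaceINR` architecture and sizes are illustrative assumptions, not the framework's actual design.

```python
# Hypothetical sketch of an implicit neural representation (INR): an MLP
# mapping 3D query points to a signed-distance-like scalar, so a molecular
# surface is a continuous function. Architecture is illustrative only.
import torch
import torch.nn as nn


class SurfaceINR(nn.Module):
    def __init__(self, hidden=128, n_freqs=6):
        super().__init__()
        # Fourier positional features help MLPs fit high-frequency geometry.
        self.register_buffer("freqs", 2.0 ** torch.arange(n_freqs) * torch.pi)
        in_dim = 3 * 2 * n_freqs  # sin and cos per frequency per coordinate
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # signed distance to the surface
        )

    def forward(self, xyz):
        # xyz: (n_points, 3) query coordinates
        proj = xyz.unsqueeze(-1) * self.freqs               # (n, 3, n_freqs)
        feats = torch.cat([proj.sin(), proj.cos()], dim=-1).flatten(1)
        return self.mlp(feats).squeeze(-1)                  # (n,) distances


inr = SurfaceINR()
points = torch.randn(1024, 3)  # query points around a protein
sdf = inr(points)              # surface = zero level set of sdf
print(sdf.shape)
```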
BioMap
An asymmetric encoder-decoder transformer for single-cell RNA-seq data that reduces FLOPs by one to two orders of magnitude while achieving state-of-the-art performance.
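A common way such asymmetric designs cut compute, sketched below under that assumption rather than as BioMap's actual code: scRNA-seq profiles are mostly zeros, so the heavy encoder attends only over the few hundred expressed genes while a lightweight decoder scores the full gene panel. The `AsymmetricAutoencoder` class and all sizes are hypothetical.

```python
# Hypothetical sketch of the asymmetric idea: run a heavy encoder only over
# the nonzero (expressed) genes of a sparse scRNA-seq profile, then use a
# lightweight decoder to score every gene. Names and sizes are illustrative.
import torch
import torch.nn as nn

N_GENES, D_ENC, D_DEC = 20000, 256, 64


class AsymmetricAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.gene_embed = nn.Embedding(N_GENES, D_ENC)
        self.value_proj = nn.Linear(1, D_ENC)
        layer = nn.TransformerEncoderLayer(D_ENC, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)  # heavy
        self.to_dec = nn.Linear(D_ENC, D_DEC)
        self.decoder = nn.Linear(D_DEC, 1)  # light per-gene head

    def forward(self, gene_ids, values):
        # gene_ids: (batch, k) indices of expressed genes; k << N_GENES
        # values:   (batch, k) their expression levels
        tok = self.gene_embed(gene_ids) + self.value_proj(values.unsqueeze(-1))
        h = self.encoder(tok)                # attention over k tokens, not 20k
        pooled = self.to_dec(h.mean(dim=1))  # (batch, D_DEC) cell embedding
        all_genes = self.to_dec(self.gene_embed.weight)  # (N_GENES, D_DEC)
        # Lightweight reconstruction of the full panel from the cell embedding.
        return self.decoder(pooled.unsqueeze(1) + all_genes).squeeze(-1)


model = AsymmetricAutoencoder()
gene_ids = torch.randint(0, N_GENES, (2, 300))  # ~300 expressed genes per cell
values = torch.rand(2, 300)
recon = model(gene_ids, values)                 # (2, N_GENES) predictions
print(recon.shape)
```

Because self-attention is quadratic in sequence length, shrinking the encoder's input from the full gene panel to only the expressed genes is where the bulk of the FLOP savings comes from in this kind of design.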