All Competitors
Every biological foundation model, evaluated and ranked by the bio.rodeo team
Showing all 12 models
TranscriptFormer
Chan Zuckerberg Initiative
A generative cross-species foundation model for single-cell transcriptomics, trained on 112 million cells from 12 species spanning 1.5 billion years of evolution.
Evo 2
Arc Institute
Genomic foundation model trained on 9.3 trillion DNA base pairs spanning all domains of life, with 40B parameters and a 1-million-token context window.
BioEmu-1
Microsoft
Generative deep learning model that emulates protein equilibrium ensembles at 100,000x the speed of molecular dynamics simulations.
CryoFM
ByteDance Seed
Generative foundation model for cryo-EM density maps using flow matching, enabling zero-shot denoising, map sharpening, and missing wedge restoration.
GenerRNA
Preferred Networks
Transformer-based generative language model for de novo RNA sequence design, pre-trained on 16 million sequences to generate novel, structurally stable RNAs.
ESM-3
EvolutionaryScale
Multimodal generative protein language model that jointly reasons over sequence, structure, and function. Trained at 98B parameters on 2.78 billion proteins.
scMulan
Tsinghua University
A 368M-parameter generative language model for single-cell transcriptomics, enabling zero-shot cell type annotation, batch integration, and conditional cell generation.
scDiffusion
Tsinghua University
Generative diffusion model for single-cell RNA-seq data synthesis, enabling controlled generation of specific cell types, rare cells, and developmental trajectories.
Chroma
Generate:Biomedicines
Generative diffusion model for programmable protein design that jointly samples novel structures and sequences, conditioned on symmetry, shape, and natural language.
ProGen2
Salesforce
Family of autoregressive protein language models (151M–6.4B parameters) trained on over a billion sequences for protein generation and zero-shot fitness prediction.
DNAGPT
Tencent AI Lab Healthcare
A GPT-based foundation model pre-trained on 200B+ base pairs from mammalian genomes, supporting DNA sequence generation, classification, and regression.
ProtGPT2
University of Bayreuth
Autoregressive protein language model based on GPT-2 that generates de novo protein sequences, sampling unexplored regions of protein space.