All Competitors
Every biological foundation model, evaluated and ranked by the bio.rodeo team
Showing 1–24 of 158 models
RFdiffusion3
Institute for Protein Design
All-atom diffusion model for de novo protein design conditioned on ligands, nucleic acids, and arbitrary non-protein atoms, enabling enzyme and DNA binder design.
RFdiffusion2
Institute for Protein Design
Atom-level generative diffusion model for de novo enzyme design. Scaffolds arbitrary functional group geometries, solving all 41 benchmark active sites vs. 16/41 for prior methods.
AlphaGenome
Google DeepMind
Sequence-to-function model that predicts thousands of functional genomic tracks at single base-pair resolution from megabase-scale DNA sequences.
Boltz-2
MIT CSAIL / Recursion Pharmaceuticals
Open model that jointly predicts biomolecular structure and small-molecule binding affinity, approaching FEP+ accuracy in seconds on a single GPU.
Cellpose-SAM
HHMI Janelia Research Campus
Generalist cell segmentation model combining SAM's ViT-L backbone with Cellpose flow fields. First model to surpass average human annotators on the Cellpose benchmark.
TranscriptFormer
Chan Zuckerberg Initiative
A generative cross-species foundation model for single-cell transcriptomics, trained on 112 million cells from 12 species spanning 1.5 billion years of evolution.
OmniEM
Peking University
Unified electron microscopy image analysis toolkit built on EM-DINO, a vision foundation model pretrained on 5 million diverse EM images.
mLLMCelltype
Texas A&M University
Multi-LLM consensus framework for automated cell type annotation in scRNA-seq data, outperforming prior methods by ~15% in mean accuracy.
scPRINT
Institut Pasteur / CNRS
Foundation model pre-trained on 50 million single cells for robust gene network inference, with zero-shot denoising, batch correction, and cell type prediction.
Pinal
Westlake University
A 16B-parameter framework for de novo protein design from natural language, converting text descriptions into functional protein sequences via two-stage structure-conditioned generation.
LigandMPNN
Institute for Protein Design
Protein sequence design method that explicitly models small molecules, nucleotides, and metals at atomic resolution; over 100 of its ligand-aware designs have been experimentally validated.
Evo 2
Arc Institute
Genomic foundation model trained on 9.3 trillion DNA base pairs spanning all domains of life, with 40B parameters and a 1-million-token context window.
NatureLM
Microsoft Research AI for Science
Unified science foundation model treating molecules, proteins, RNA, DNA, and materials as a shared sequence language for cross-domain generation.
Cellpose 3
HHMI Janelia Research Campus
Generalist cell segmentation framework with a super-generalist cyto3 model and one-click image restoration networks optimized for downstream segmentation quality.
Protenix
ByteDance AI Lab
Open-source PyTorch reproduction of AlphaFold 3 (Apache 2.0) that matches or exceeds AF3 performance on protein-ligand, protein-protein, and protein-nucleic acid benchmarks.
Borzoi
Calico Life Sciences
Deep learning model predicting cell-type-specific RNA-seq coverage at 32 bp resolution from 524 kb of DNA sequence, jointly modeling transcription, splicing, and polyadenylation.
Evolla
Westlake University
An 80B-parameter multimodal protein-language model that decodes protein function through natural language dialogue, integrating sequence, structure, and evolutionary context.
ProteinDT
UC Berkeley
A multimodal framework for text-guided protein design, enabling sequence generation, zero-shot editing, and property prediction via contrastive learning.
SubCell
Chan Zuckerberg Initiative / Human Protein Atlas / Lundberg Lab
Self-supervised Vision Transformer models trained on proteome-wide fluorescence microscopy images from the Human Protein Atlas for subcellular protein localization.
BioEmu-1
Microsoft Research
Generative deep learning model that emulates protein equilibrium ensembles at 100,000x the speed of molecular dynamics simulation.
ESM Cambrian
EvolutionaryScale
A family of protein language models (300M, 600M, 6B parameters) for representation learning that substantially outperforms ESM-2 at equivalent or smaller scale.
BiomedParse
Microsoft Research
A biomedical foundation model for joint segmentation, detection, and recognition across nine imaging modalities using natural language prompts.
Evo
Arc Institute
A 7B-parameter genomic foundation model using the StripedHyena architecture to model prokaryotic DNA, RNA, and proteins at single-nucleotide resolution with 131k token context.
Boltz-1
MIT
Open-source deep learning model for biomolecular structure prediction achieving AlphaFold3-level accuracy, trained entirely on publicly available data.