All Competitors
Every biological foundation model, evaluated and ranked by the bio.rodeo team
TranscriptFormer
Chan Zuckerberg Initiative
A generative cross-species foundation model for single-cell transcriptomics, trained on 112 million cells from 12 species spanning 1.5 billion years of evolution.
GeneCompass
Chinese Academy of Sciences
Knowledge-informed cross-species foundation model pre-trained on 101M human and mouse single-cell transcriptomes to decipher universal gene regulatory mechanisms.
CellFM
Sun Yat-sen University
An 800M-parameter single-cell foundation model built on a RetNet architecture and pre-trained on 100 million human cells for cell annotation, perturbation prediction, and gene analysis.
scFoundation
BioMap Research
A 100M-parameter foundation model trained on 50M+ human single-cell transcriptomic profiles, achieving state-of-the-art performance across diverse downstream tasks.
Nicheformer
Helmholtz Munich / Technical University of Munich
Transformer foundation model pretrained on 110M single-cell and spatially resolved transcriptomics profiles, enabling spatial context prediction for dissociated cells.
GPTCelltype
Columbia University / Duke University
An R package that uses GPT-4 to annotate cell types in scRNA-seq data from marker genes, matching expert accuracy across hundreds of cell types and tissues.
scGPT
Bo Wang Lab
A generative pre-trained transformer for single-cell multi-omics, pretrained on 33 million human cells for cell annotation, batch correction, and perturbation prediction.
scDisInFact
Zhang Lab
Disentangled VAE framework for joint batch correction, condition-associated key gene detection, and perturbation prediction in multi-batch, multi-condition scRNA-seq data.
Geneformer
Broad Institute / Dana-Farber Cancer Institute
Transformer-based foundation model pretrained on ~30 million single-cell transcriptomes for context-aware gene network predictions and therapeutic target discovery.