All Competitors
Every biological foundation model, evaluated and ranked by the bio.rodeo team
TranscriptFormer
Chan Zuckerberg Initiative
A generative cross-species foundation model for single-cell transcriptomics, trained on 112 million cells from 12 species spanning 1.5 billion years of evolution.
mLLMCelltype
Texas A&M University
Multi-LLM consensus framework for automated cell type annotation in scRNA-seq data, outperforming prior methods by ~15% in mean accuracy.
scPRINT
Institut Pasteur / CNRS
Foundation model pre-trained on 50 million single cells for robust gene network inference, with zero-shot denoising, batch correction, and cell type prediction.
GeneCompass
Chinese Academy of Sciences
Knowledge-informed cross-species foundation model pre-trained on 101M human and mouse single-cell transcriptomes to decipher universal gene regulatory mechanisms.
PINNACLE
Harvard University
Geometric deep learning model generating context-aware protein representations across 156 cell-type contexts from a multi-organ single-cell atlas.
Cell2Sentence
Yale University
Framework that converts single-cell gene expression profiles into ranked gene-name sequences, enabling standard LLMs to generate, annotate, and analyze cells.
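The core transformation Cell2Sentence describes — ranking a cell's genes by expression and emitting their names as a text sequence — can be sketched as follows; the gene names and counts are illustrative toy data, not drawn from the paper:

```python
def cell_to_sentence(expression, gene_names, top_k=100):
    """Convert one cell's expression vector into a 'cell sentence':
    gene names ordered by decreasing expression, zero-count genes dropped."""
    ranked = sorted(
        (i for i, x in enumerate(expression) if x > 0),
        key=lambda i: -expression[i],
    )
    return " ".join(gene_names[i] for i in ranked[:top_k])

# Toy cell: raw counts for five genes.
genes = ["CD3D", "MS4A1", "NKG7", "LYZ", "CD8A"]
counts = [12, 0, 3, 45, 7]
print(cell_to_sentence(counts, genes))  # "LYZ CD3D CD8A NKG7"
```

Once cells are plain strings of gene names, any off-the-shelf LLM can be fine-tuned on them, which is what makes the framework model-agnostic.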
CellFM
Sun Yat-sen University
An 800M-parameter single-cell foundation model pre-trained on 100 million human cells via a RetNet architecture for cell annotation, perturbation prediction, and gene analysis.
scFoundation
BioMap Research
A 100M-parameter foundation model trained on 50M+ human single-cell transcriptomic profiles, achieving state-of-the-art performance across diverse downstream tasks.
CellPLM
OmicsML
Single-cell transformer that treats cells as tokens and tissues as sentences, encoding cell-cell relationships with 100x faster inference than prior pre-trained models.
Nicheformer
Helmholtz Munich / Technical University of Munich
Transformer foundation model pretrained on 110M single-cell and spatially resolved transcriptomics profiles, enabling spatial context prediction for dissociated cells.
GPTCelltype
Columbia University / Duke University
An R package that uses GPT-4 to annotate cell types in scRNA-seq data from marker genes, matching expert accuracy across hundreds of cell types and tissues.
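GPTCelltype itself is an R package, but the marker-genes-to-prompt step it relies on is easy to sketch; the wording below is a hypothetical approximation of such a prompt, not the package's actual template, and the marker lists are illustrative:

```python
def build_annotation_prompt(markers_by_cluster, tissue="human PBMC"):
    """Assemble a cell-type annotation prompt from per-cluster marker genes.
    Hypothetical template, not GPTCelltype's exact one."""
    lines = [
        f"Identify the cell type of each {tissue} cluster from its marker genes.",
        "Answer with one cell type per line.",
    ]
    for cluster, markers in sorted(markers_by_cluster.items()):
        lines.append(f"Cluster {cluster}: {', '.join(markers)}")
    return "\n".join(lines)

prompt = build_annotation_prompt({0: ["CD3D", "CD3E", "IL7R"], 1: ["MS4A1", "CD79A"]})
print(prompt)
```

The resulting string would then be sent to an LLM API; the model's line-per-cluster reply maps directly back onto cluster labels.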
scGPT
Bo Wang Lab, University of Toronto
A generative pre-trained transformer for single-cell multi-omics, pretrained on 33 million human cells for cell annotation, batch correction, and perturbation prediction.
scDisInFact
Zhang Lab
Disentangled VAE framework for joint batch correction, condition-key-gene detection, and perturbation prediction in multi-batch multi-condition scRNA-seq data.
scMulan
Tsinghua University
A 368M-parameter generative language model for single-cell transcriptomics, enabling zero-shot cell type annotation, batch integration, and conditional cell generation.
scDiffusion
Tsinghua University
Generative diffusion model for single-cell RNA-seq data synthesis, enabling controlled generation of specific cell types, rare cells, and developmental trajectories.
scPROTEIN
Tencent AI Lab Healthcare
Deep graph contrastive learning framework for single-cell proteomics embedding, handling peptide uncertainty, missingness, and batch effects.
scPML
Shenzhen University
Pathway-based multi-view learning for cell type annotation from single-cell RNA-seq data, integrating biological pathway knowledge through graph neural networks.
UCE
Stanford University
Zero-shot foundation model for single-cell gene expression that generates species-agnostic cell embeddings using protein language model representations of gene products.
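The reason protein-derived gene embeddings make UCE species-agnostic is that orthologous genes land near each other in a shared embedding space. The sketch below is a deliberate simplification — UCE actually runs gene tokens through a transformer, whereas this just pools toy embeddings weighted by expression — but it illustrates the shared-space idea:

```python
import numpy as np

def cell_embedding(expression, gene_embeddings):
    """Expression-weighted mean of per-gene embeddings.
    Simplified stand-in for UCE's transformer: any species whose genes
    map into the same protein-embedding space yields comparable vectors."""
    expression = np.asarray(expression, dtype=float)
    weights = expression / expression.sum()
    return weights @ gene_embeddings  # (n_genes,) @ (n_genes, d) -> (d,)

rng = np.random.default_rng(0)
E = rng.normal(size=(4, 8))           # toy embeddings for 4 genes, d = 8
emb = cell_embedding([2, 0, 1, 1], E)
print(emb.shape)                      # (8,)
```

Because the gene representations come from protein sequence rather than a species-specific vocabulary, no retraining is needed to embed cells from a new organism.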
A Deep Dive into scRNA-seq Foundation Models
MIT CSAIL / Broad Institute
A rigorous benchmarking study of scBERT and scGPT for cell type annotation, comparing foundation models against logistic regression baselines.
Geneformer
Broad Institute / Dana-Farber Cancer Institute
Transformer-based foundation model pretrained on ~30 million single-cell transcriptomes for context-aware gene network predictions and therapeutic target discovery.
DPI
Xiamen University
End-to-end single-cell multimodal analysis framework using deep parametric inference to integrate RNA and protein data into a unified latent space.
scBERT
Tencent AI Lab
Pretrained transformer for cell type annotation of scRNA-seq data. Trained on 1.1M cells; outperforms supervised methods on cross-dataset transfer.
scVAE
Technical University of Denmark / University of Copenhagen
Variational autoencoder for single-cell RNA-seq that models raw count distributions directly, producing latent cell representations without normalization preprocessing.