Every biological foundation model, evaluated and ranked by the bio.rodeo team
Chan Zuckerberg Initiative
A generative cross-species foundation model for single-cell transcriptomics, trained on 112 million cells from 12 species spanning 1.5 billion years of evolution.
Chinese Academy of Sciences
Knowledge-informed cross-species foundation model pre-trained on 101M human and mouse single-cell transcriptomes to decipher universal gene regulatory mechanisms.
Stanford University
Zero-shot foundation model for single-cell gene expression that generates species-agnostic cell embeddings using protein language model representations of gene products.
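The idea behind this entry, representing each gene by a protein-language-model embedding of its product and pooling those embeddings per cell, can be sketched as follows. This is an illustrative sketch only, not the model's code; the expression-weighted mean pooling, the gene names, and the 4-dimensional embeddings are assumptions.

```python
# Minimal sketch (not the model's actual code): a species-agnostic cell
# embedding built by pooling protein-language-model gene embeddings,
# weighted by expression. Pooling rule and dimensions are assumptions.
import numpy as np

def cell_embedding(expression: dict[str, float],
                   protein_embeddings: dict[str, np.ndarray]) -> np.ndarray:
    """Expression-weighted mean of per-gene protein embeddings.

    Because genes are represented by embeddings of their protein products
    rather than by species-specific gene IDs, cells from any species map
    into the same embedding space.
    """
    genes = [g for g in expression if g in protein_embeddings]
    weights = np.array([expression[g] for g in genes], dtype=float)
    weights = weights / weights.sum()
    vectors = np.stack([protein_embeddings[g] for g in genes])
    return weights @ vectors

# Toy usage with made-up 4-dimensional "protein embeddings".
rng = np.random.default_rng(0)
emb = {g: rng.normal(size=4) for g in ["TP53", "GAPDH", "ACTB"]}
print(cell_embedding({"TP53": 5.0, "GAPDH": 1.0, "ACTB": 2.0}, emb))
```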
MAGICS Lab
Multi-species genomic foundation model that replaces k-mer tokenization with byte-pair encoding (BPE), achieving state-of-the-art performance with 21x fewer parameters than prior leading models.
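The tokenization contrast this entry refers to can be shown with a toy example. This is a minimal sketch, not the model's tokenizer: real BPE vocabularies are learned over large corpora, whereas here a single sequence and a small merge budget are used purely for illustration.

```python
# Toy contrast between overlapping k-mer tokenization and a greedy BPE pass
# over DNA. Sequence and merge count are illustrative assumptions.
from collections import Counter

def kmer_tokenize(seq: str, k: int = 6) -> list[str]:
    """Overlapping k-mers: every position starts a token, so adjacent tokens
    share most of their bases and the token count is close to the base count."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def bpe_tokenize(seq: str, num_merges: int = 8) -> list[str]:
    """Greedy BPE on one sequence: repeatedly merge the most frequent adjacent
    token pair, yielding variable-length, non-overlapping tokens."""
    tokens = list(seq)
    for _ in range(num_merges):
        pairs = Counter(zip(tokens, tokens[1:]))
        if not pairs:
            break
        (a, b), _count = pairs.most_common(1)[0]
        merged, i = [], 0
        while i < len(tokens):
            if i + 1 < len(tokens) and tokens[i] == a and tokens[i + 1] == b:
                merged.append(a + b)
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens

seq = "ATGCGATATATATCGCGATATATAT"
print(kmer_tokenize(seq)[:4])   # ['ATGCGA', 'TGCGAT', 'GCGATA', 'CGATAT']
print(bpe_tokenize(seq))        # far fewer, variable-length tokens
```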
Technical University of Munich
Masked DNA language model trained on 800+ species spanning 500M years of evolution, using explicit species conditioning to capture conserved regulatory elements.
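Species conditioning here means the species identity is fed to the model alongside the DNA tokens. A rough sketch of what such a masked-LM input pipeline could look like is below; the special token names, the non-overlapping k-mer tokenization, and the masking rate are assumptions, not the published implementation.

```python
# Minimal sketch (assumed input pipeline, not the published code): building a
# masked-language-modelling example in which a species token is prepended to
# the DNA tokens, so predictions can be conditioned on the species.
import random

SPECIES_TOKENS = {"saccharomyces_cerevisiae": "[SPECIES_0]",
                  "candida_albicans": "[SPECIES_1]"}  # assumed vocabulary

def make_mlm_example(seq: str, species: str, k: int = 6,
                     mask_prob: float = 0.15, seed: int = 0):
    """Return (input_tokens, target_tokens) for masked-LM training.

    The first position carries the species identity; a random subset of the
    DNA tokens is replaced with [MASK] and must be reconstructed, pushing the
    model to learn species-conditioned sequence statistics.
    """
    rng = random.Random(seed)
    tokens = [seq[i:i + k] for i in range(0, len(seq) - k + 1, k)]
    inputs = [SPECIES_TOKENS[species]] + tokens
    targets = list(inputs)
    for i in range(1, len(inputs)):          # never mask the species token
        if rng.random() < mask_prob:
            inputs[i] = "[MASK]"
    return inputs, targets

inp, tgt = make_mlm_example("ATGCGATATATATCGCGATATATATTTGACG",
                            "saccharomyces_cerevisiae")
print(inp)
print(tgt)
```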