All Competitors
Every biological foundation model, evaluated and ranked by the bio.rodeo team
NatureLM
Microsoft Research AI for Science
A unified science foundation model that treats molecules, proteins, RNA, DNA, and materials as a shared sequence language, enabling cross-domain generation.
BioT5+
Microsoft Research Asia
An enhanced T5-based encoder-decoder that unifies molecule, protein, and text understanding through IUPAC name integration and multi-task instruction tuning.
ChatCell
ZJUNLP
A T5-based conversational framework that converts scRNA-seq data into cell sentences, enabling cell type annotation, pseudo-cell generation, and drug sensitivity prediction via natural language.
BioT5
Renmin University of China
A pre-training framework bridging molecules, proteins, and natural language, built on T5 with SELFIES molecular representations for cross-modal biological understanding.
DARWIN Series
MasterAI EAM
A series of domain-specific large language models for natural science, fine-tuned on physics, chemistry, and materials science literature using automated instruction generation.
scMoFormer
Michigan State University
Transformer framework for single-cell multi-omics that predicts cross-modality relationships using heterogeneous graphs of cells, genes, and proteins.
Galactica
Meta AI
A large language model trained on 48 million scientific papers and knowledge bases to store, combine, and reason about scientific knowledge.
BioSeq-BLM
Beijing Institute of Technology
An integrated platform implementing 155 biological language models for analyzing DNA, RNA, and protein sequences across residue-level and sequence-level tasks.