All Competitors

Every biological foundation model, evaluated and ranked by the bio.rodeo team

Showing 8 of 18 filtered models

Multimodalities

NatureLM

Microsoft Research AI for Science

Unified science foundation model treating molecules, proteins, RNA, DNA, and materials as a shared sequence language for cross-domain generation.

834
See the scorecard
Multimodalities

BioT5+

Microsoft Research Asia

An enhanced T5-based encoder-decoder that unifies molecule, protein, and text understanding via IUPAC name integration and multi-task instruction tuning.

See the scorecard
Multimodalities

ChatCell

ZJUNlp

A T5-based conversational framework that converts scRNA-seq data into cell sentences, enabling cell type annotation, pseudo-cell generation, and drug sensitivity prediction via natural language.

52612
See the scorecard
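The "cell sentences" idea in the ChatCell description can be illustrated with a small sketch: rank a cell's genes by expression and keep the top-k gene names as a whitespace-joined sentence a text model can consume. The gene names and counts below are illustrative placeholders, not ChatCell's actual data or code.

```python
# Sketch of a cell-sentence conversion: rank genes by expression and
# emit the top-k gene names as a sentence. Values are illustrative.

def cell_to_sentence(expression: dict[str, float], top_k: int = 5) -> str:
    """Rank genes by expression (descending) and join the top_k names."""
    ranked = sorted(expression, key=expression.get, reverse=True)
    return " ".join(ranked[:top_k])

cell = {"CD3D": 12.0, "MALAT1": 88.0, "CD8A": 7.5,
        "GNLY": 30.1, "NKG7": 25.0, "ACTB": 60.2}
print(cell_to_sentence(cell, top_k=4))  # MALAT1 ACTB GNLY NKG7
```

Rank order rather than raw counts is what survives the conversion, which is why such sentences tokenize cleanly for a T5-style model.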
Multimodalities

BioT5

Renmin University of China

Pre-training framework bridging molecules, proteins, and natural language using T5 with SELFIES representations for cross-modal biological understanding.

125
See the scorecard
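The SELFIES representation mentioned in the BioT5 description is a sequence of bracketed symbols, which makes molecules tokenize naturally for a sequence model. Below is a minimal sketch of that tokenization; the SELFIES string is the standard encoding of benzene, and the regex tokenizer is an illustration, not BioT5's code.

```python
import re

def tokenize_selfies(s: str) -> list[str]:
    """Split a SELFIES string into its bracketed symbols."""
    return re.findall(r"\[[^\]]*\]", s)

benzene = "[C][=C][C][=C][C][=C][Ring1][=Branch1]"
print(tokenize_selfies(benzene))  # eight tokens, '[C]' through '[=Branch1]'
```

Every valid SELFIES token sequence decodes to a valid molecule, which is the property that makes the representation attractive for generative models.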
Multimodalities

DARWIN Series

MasterAI EAM

Domain-specific large language models for natural science, fine-tuned on physics, chemistry, and materials science literature using automated instruction generation.

24748
See the scorecard
Multimodalities

scMoFormer

Michigan State University

Transformer framework for single-cell multi-omics that predicts cross-modality relationships using heterogeneous graphs of cells, genes, and proteins.

2716
See the scorecard
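The heterogeneous graph in the scMoFormer description can be sketched as a typed edge list keyed by (source type, relation, target type). All entity names below are illustrative placeholders, not scMoFormer's actual data or API.

```python
from collections import defaultdict

# Edges keyed by (source type, relation, target type); values are
# (source node, target node) pairs. Names are illustrative only.
edges: dict[tuple[str, str, str], list[tuple[str, str]]] = defaultdict(list)

def add_edge(src_type: str, rel: str, dst_type: str, src: str, dst: str) -> None:
    edges[(src_type, rel, dst_type)].append((src, dst))

add_edge("cell", "expresses", "gene", "cell_1", "CD3D")
add_edge("cell", "expresses", "gene", "cell_1", "GNLY")
add_edge("gene", "encodes", "protein", "CD3D", "CD3_delta")

def neighbors(src_type: str, rel: str, dst_type: str, src: str) -> list[str]:
    """Targets reached from src across one typed relation."""
    return [d for (s, d) in edges[(src_type, rel, dst_type)] if s == src]

print(neighbors("cell", "expresses", "gene", "cell_1"))  # ['CD3D', 'GNLY']
```

Keying edges by type triple is what lets a transformer attend separately over cell-gene and gene-protein relations when predicting one modality from another.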
Multimodalities

Galactica

Meta AI

A large language model trained on 48 million scientific papers and knowledge bases to store, combine, and reason about scientific knowledge.

See the scorecard
Multimodalities

BioSeq-BLM

Beijing Institute of Technology

An integrated platform implementing 155 biological language models for analyzing DNA, RNA, and protein sequences across residue-level and sequence-level tasks.

14209
See the scorecard