Every biological foundation model, evaluated and ranked by the bio.rodeo team
Broad Institute / Dana-Farber Cancer Institute
Transformer-based foundation model pretrained on ~30 million single-cell transcriptomes for context-aware gene network predictions and therapeutic target discovery.
HHMI Janelia Research Campus
Human-in-the-loop cell segmentation framework enabling custom model training from as few as 100–200 corrected annotations.
UC Berkeley
Benchmark suite of five biologically relevant tasks for evaluating protein sequence representation learning, covering structure, homology, and engineering.