Every biological foundation model, evaluated and ranked by the bio.rodeo team
University of Missouri: A unified multi-task framework that converts diverse protein prediction tasks into autoregressive next-token prediction using pre-trained protein language model encoders.
PKU-YuanGroup: A 7B-parameter protein language model built on LLaMA-2 that performs both protein sequence generation and superfamily classification in a unified framework.
Tsinghua University: A 368M-parameter generative language model for single-cell transcriptomics, enabling zero-shot cell type annotation, batch integration, and conditional cell generation.
TencentAILabHealthcare: A GPT-based foundation model pre-trained on 200B+ base pairs from mammalian genomes, supporting DNA sequence generation, classification, and regression.