Every biological foundation model, evaluated and ranked by the bio.rodeo team
Bowang Lab
A generative pre-trained transformer for single-cell multi-omics, pretrained on 33 million human cells; supports cell-type annotation, batch correction, and perturbation prediction.
Zhang Lab
A disentangled VAE framework for joint batch correction, detection of condition-associated key genes, and perturbation prediction in multi-batch, multi-condition scRNA-seq data.