Every biological foundation model, evaluated and ranked by the bio.rodeo team
Harvard University / MIT
Genomic language model trained on metagenomic scaffolds that learns protein co-regulation and function by modeling gene context and operon structure.
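The core idea is masked modeling over gene context: genes in the same operon share regulatory signal, so a masked gene's representation is predictable from its neighbors. A minimal sketch of that intuition, using random toy vectors and mean-pooling as the "decoder" (the real model reportedly consumes protein language-model embeddings and uses a transformer; everything below is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 16

def make_contig(n_genes=5):
    # Toy operon: genes share a latent "function" vector plus small noise,
    # standing in for co-regulated genes on a metagenomic scaffold.
    operon = rng.normal(size=dim)
    return operon + 0.1 * rng.normal(size=(n_genes, dim))

contig = make_contig()
masked = 2                                # hide the middle gene
context = np.delete(contig, masked, axis=0)
prediction = context.mean(axis=0)         # simplest possible masked "decoder"

err_context = np.linalg.norm(prediction - contig[masked])
err_random = np.linalg.norm(rng.normal(size=dim) - contig[masked])
```

Because the neighbors carry the shared operon signal, reconstructing the masked gene from context beats a random guess by a wide margin.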
MAGICS Lab
Species-aware DNA embedding model built on DNABERT-2, using contrastive learning to cluster and differentiate genomic sequences by species without labeled data.
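The contrastive objective here can be sketched with an InfoNCE-style loss: two views of a sequence from the same genome are pulled together, while fragments from other species are pushed apart. A toy NumPy version with random stand-in embeddings (not the model's actual loss or encoder):

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    # InfoNCE-style contrastive loss: the anchor should be more similar
    # to its positive (another fragment of the same genome) than to
    # negatives (fragments from other species).
    def cos(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    sims = np.array([cos(anchor, positive)] + [cos(anchor, n) for n in negatives])
    logits = sims / temperature
    # softmax cross-entropy with the positive at index 0
    return -logits[0] + np.log(np.exp(logits).sum())

rng = np.random.default_rng(0)
species_a = rng.normal(size=8)
anchor = species_a + 0.05 * rng.normal(size=8)      # view 1 of the same sequence
positive = species_a + 0.05 * rng.normal(size=8)    # view 2
negatives = [rng.normal(size=8) for _ in range(5)]  # other species
loss = info_nce_loss(anchor, positive, negatives)
```

Minimizing this loss is what lets embeddings cluster by species with no labels: the species identity is supplied implicitly by which fragments came from the same genome.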
Burstein Lab
Word2vec-based language model trained on 360 million microbial genes that predicts gene function from genomic context, without relying on sequence homology.
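Treating each gene family as a "word" and a contig's gene order as the "sentence" is the key move. A lightweight sketch of the distributional idea using co-occurrence counts with a PPMI transform as a stand-in for word2vec's skip-gram objective; the gene names below are hypothetical toy tokens, not the model's vocabulary:

```python
import numpy as np

# Toy "contigs": ordered lists of gene-family identifiers.
contigs = [
    ["capsid", "portal", "terminase", "lysin"],
    ["portal", "capsid", "terminase", "holin"],
    ["reca", "lexa", "umuc"],
    ["lexa", "reca", "ruva"],
]

vocab = sorted({g for c in contigs for g in c})
idx = {g: i for i, g in enumerate(vocab)}
window = 2
counts = np.zeros((len(vocab), len(vocab)))
for contig in contigs:
    for i, g in enumerate(contig):
        for j in range(max(0, i - window), min(len(contig), i + window + 1)):
            if i != j:
                counts[idx[g], idx[contig[j]]] += 1

# PPMI transform: each row becomes a context-based gene embedding.
total = counts.sum()
row = counts.sum(1, keepdims=True)
col = counts.sum(0, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((counts * total) / (row * col))
ppmi = np.maximum(pmi, 0)

def sim(a, b):
    va, vb = ppmi[idx[a]], ppmi[idx[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9))
```

Genes that share genomic context (the phage-like tokens) end up similar to each other and dissimilar to genes from an unrelated context (the SOS-response tokens), which is exactly the homology-free signal the model exploits at scale.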