Every biological foundation model, evaluated and ranked by the bio.rodeo team
Beijing Institute of Genomics / Chinese Academy of Sciences
Self-supervised generative foundation model jointly trained on 91.7M nucleotide sequences and their structured annotations, spanning 1.076 trillion bases in total, achieving state-of-the-art (SOTA) performance on 23 nucleotide-language benchmarks.