Every biological foundation model, evaluated and ranked by the bio.rodeo team
Bowang Lab
A Mamba-based foundation model for mature RNA, trained with contrastive learning on splice isoforms and orthologous transcripts from 400+ mammalian species, for mRNA property prediction.
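The contrastive setup described above can be sketched with a minimal InfoNCE-style loss, where embeddings of two related transcripts (e.g. splice isoforms of the same gene, or orthologs) form a positive pair and the rest of the batch serves as negatives. This is an illustrative sketch under assumed shapes and names, not the model's actual training code.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """Contrastive loss: each anchor embedding should match its own
    positive (a related transcript) against all others in the batch."""
    # L2-normalize so the dot product is cosine similarity
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # The correct pairing is the diagonal: anchor i with positive i
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# Positives that are slight perturbations of the anchors score a low loss;
# unrelated random vectors score a high one.
loss_matched = info_nce_loss(z, z + 0.01 * rng.normal(size=(8, 16)))
loss_random = info_nce_loss(z, rng.normal(size=(8, 16)))
print(loss_matched < loss_random)  # True
```

Minimizing this loss pulls representations of biologically related transcripts together while pushing unrelated ones apart, which is what makes the resulting embeddings useful for downstream property prediction.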
HazyResearch
A genomic foundation model that uses the Hyena operator to model DNA at single-nucleotide resolution, with context lengths of up to 1 million tokens, roughly 500x longer than prior dense-attention transformer models.
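Single-nucleotide tokenization means one token per base, so context length equals sequence length, and the subquadratic Hyena long-convolution (roughly O(n log n) via FFT) is what makes a 1M-token context tractable where dense O(n²) attention is not. A back-of-the-envelope sketch, with the vocabulary and scaling constants assumed for illustration:

```python
import math

# Assumed character-level vocabulary, not the model's actual tokenizer
VOCAB = {"A": 0, "C": 1, "G": 2, "T": 3, "N": 4}

def tokenize(seq):
    """One token per nucleotide: nothing is merged, nothing is lost."""
    return [VOCAB[base] for base in seq.upper()]

seq = "ACGTN" * 3
tokens = tokenize(seq)
print(len(tokens) == len(seq))  # True: context length == sequence length

# Rough operation counts at a 1M-token context (constants omitted):
n = 1_000_000
attention_ops = n * n            # dense self-attention scales quadratically
hyena_ops = n * math.log2(n)     # FFT-based long convolution scales ~n log n
print(attention_ops / hyena_ops)  # dense attention needs ~50,000x more ops
```

The asymptotic gap, not a cleverer attention kernel, is what buys the order-of-magnitude jump in usable context.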
AIRI Institute
A family of transformer-based DNA language models supporting context lengths up to 36,000 bp via BPE tokenization and BigBird sparse attention.
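BPE tokenization is the other route to long genomic context: instead of one token per base, frequent adjacent pairs are merged so that each token covers several nucleotides, letting a few thousand transformer tokens span tens of thousands of bp. A toy BPE trainer, purely illustrative and unrelated to the model's actual learned vocabulary:

```python
from collections import Counter

def bpe_train(seq, num_merges):
    """Greedily merge the most frequent adjacent token pair, num_merges times."""
    tokens = list(seq)
    merges = []
    for _ in range(num_merges):
        pairs = Counter(zip(tokens, tokens[1:]))
        if not pairs:
            break
        (a, b), _ = pairs.most_common(1)[0]
        merges.append((a, b))
        merged, i = [], 0
        while i < len(tokens):
            # Replace each non-overlapping occurrence of the pair with one token
            if i + 1 < len(tokens) and tokens[i] == a and tokens[i + 1] == b:
                merged.append(a + b)
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens, merges

seq = "ACGTACGTACGTTTTT" * 50
tokens, merges = bpe_train(seq, num_merges=8)
compression = len(seq) / len(tokens)
print(compression > 2)  # True: each token now spans multiple bp on average
```

With tokens averaging several bp, a sparse-attention transformer with a few thousand token positions can cover sequences in the tens-of-kb range, which is how BPE plus BigBird attention reaches 36,000 bp.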