Every biological foundation model, evaluated and ranked by the bio.rodeo team
Chinese Academy of Sciences
Autoregressive generative model pretrained on 30 million full-length natural mRNA sequences; it jointly optimizes the 5' UTR, CDS, and 3' UTR for therapeutic mRNA stability and translation efficiency.
GENTEL Lab
Long-context generative RNA foundation model trained on 114 million full-length RNA sequences; it supports de novo design of tRNAs, aptamers, CRISPR guide RNAs, mRNAs, and circular RNAs.