Models (9)
NatureLM
Microsoft Research AI for Science
Unified science foundation model from Microsoft Research that treats molecules, proteins, RNA, DNA, and materials as a shared sequence language, enabling cross-domain generation.
BioEmu-1
Microsoft Research
Generative deep learning model from Microsoft Research that emulates protein equilibrium ensembles at up to 100,000x the speed of molecular dynamics simulation.
BiomedParse
Microsoft Research
A biomedical foundation model for joint segmentation, detection, and recognition across nine imaging modalities using natural language prompts.
SFM-Protein
Microsoft Research
A transformer protein language model using integrative co-evolutionary pre-training to capture both short-range and long-range residue interactions from sequence alone.
BioT5+
Microsoft Research Asia
An enhanced T5-based encoder-decoder that unifies molecule, protein, and text understanding via IUPAC integration and multi-task instruction tuning.
Prov-GigaPath
Microsoft Research
Whole-slide pathology foundation model pretrained on 1.3 billion tiles from 171,189 clinical whole-slide images (WSIs). Achieves state-of-the-art on 25 of 26 pathology benchmark tasks.
ABGNN
Huazhong University of Science and Technology / Microsoft Research
Graph neural network framework for antigen-specific antibody CDR design, combining a pre-trained antibody language model with one-shot sequence and structure generation.
BiomedCLIP
Microsoft Research
Multimodal biomedical foundation model trained on 15M PubMed Central figure-caption pairs via contrastive learning, achieving state-of-the-art zero-shot performance across imaging modalities.
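The contrastive objective behind models like BiomedCLIP can be sketched with a symmetric InfoNCE loss: matched figure-caption pairs in a batch are pulled together while all other pairings act as negatives. This is a minimal NumPy illustration of the general CLIP-style loss, not BiomedCLIP's actual training code; the temperature value and embedding sizes are illustrative.

```python
import numpy as np

def clip_contrastive_loss(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of paired image/text embeddings.

    Row i of img_emb and row i of txt_emb are a matched pair; every other
    pairing in the batch serves as a negative example.
    """
    # L2-normalize so the dot product is cosine similarity.
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)

    # Pairwise similarity matrix, scaled by temperature: (batch, batch).
    logits = img @ txt.T / temperature
    labels = np.arange(len(logits))  # correct "class" for row i is column i

    def cross_entropy(l):
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        log_probs = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -log_probs[labels, labels].mean()

    # Average the image-to-text and text-to-image directions.
    return 0.5 * (cross_entropy(logits) + cross_entropy(logits.T))
```

Training pushes this loss down, which is what makes zero-shot classification possible: at inference, an image is assigned to whichever text prompt has the highest cosine similarity.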
CARP
Microsoft Research
CNN-based protein language model series showing that convolutions match transformer performance on sequence pretraining while scaling linearly, rather than quadratically, with sequence length.
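The linear-versus-quadratic scaling contrast can be made concrete with a back-of-envelope operation count: self-attention computes a full L x L interaction map, while a 1-D convolution touches only a fixed window per position. The kernel size and hidden dimension below are illustrative, not CARP's actual hyperparameters.

```python
def attention_ops(seq_len, dim):
    # Self-attention forms an L x L similarity map over d-dim tokens: O(L^2 * d).
    return seq_len * seq_len * dim

def conv_ops(seq_len, dim, kernel=9):
    # A 1-D convolution applies a fixed k-wide window at each position: O(L * k * d).
    return seq_len * kernel * dim

# The cost ratio grows linearly with sequence length (ratio = L / k).
for length in (128, 1024, 8192):
    print(length, attention_ops(length, 64) / conv_ops(length, 64))
```

For long protein sequences this gap dominates: an 8,192-residue input costs roughly 900x more interaction operations under full attention than under a kernel-9 convolution, which is the practical motivation for the convolutional design.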