Difference between revisions of "Science Agents"

From GISAXS
 
* 2025-02: [https://arxiv.org/pdf/2502.18449 Protein Large Language Models: A Comprehensive Survey]

* [https://x.com/vant_ai/status/1903070297991110657 2025-03]: [https://www.vant.ai/ Vant AI] [https://www.vant.ai/neo-1 Neo-1]: atomistic foundation model (small molecules, proteins, etc.)

* 2025-03: [https://arxiv.org/abs/2503.16351 Lyra: An Efficient and Expressive Subquadratic Architecture for Modeling Biological Sequences]

===Successes===

Revision as of 11:50, 22 March 2025

==AI Use-cases for Science==
===Literature===
====LLM extract data from papers====
====AI finding links in literature====
===Explanation===
===Autonomous Ideation===

==Adapting LLMs to Science==

==AI/ML Methods tailored to Science==
===Regression (Data Fitting)===
===Tabular Classification/Regression===
===Symbolic Regression===
===Literature Discovery===
====Commercial====

==AI/ML Methods in Science==
===Chemistry===
===Biology===
===Successes===

==AI/ML Methods co-opted for Science==
===Mechanistic Interpretability===

Train a large model on science data, then apply mechanistic interpretability methods (e.g. sparse autoencoders, SAEs) to the feature/activation space.
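The SAE idea above can be sketched minimally: encode captured activation vectors into an overcomplete feature space with a ReLU encoder, decode linearly, and train with reconstruction error plus an L1 sparsity penalty on the features. The sketch below is an illustrative assumption, not the method of any linked work; the synthetic data, dimensions, learning rate, and L1 coefficient are arbitrary placeholders.

```python
import numpy as np

# Hypothetical setup: X stands in for activation vectors captured from a
# large model trained on science data (in practice, recorded from a chosen
# layer during forward passes). Here it is just synthetic Gaussian data.
rng = np.random.default_rng(0)
n, d_act, d_feat = 512, 32, 128          # samples, activation dim, overcomplete feature dim
X = rng.normal(size=(n, d_act))

# SAE parameters: ReLU encoder into an overcomplete feature space, linear decoder.
W_e = rng.normal(scale=0.1, size=(d_act, d_feat))
b_e = np.zeros(d_feat)
W_d = rng.normal(scale=0.1, size=(d_feat, d_act))
b_d = np.zeros(d_act)

def forward(X):
    F = np.maximum(X @ W_e + b_e, 0.0)   # sparse feature activations
    X_hat = F @ W_d + b_d                # reconstruction of the activations
    return F, X_hat

def sae_loss(X, F, X_hat, l1):
    # Per-sample reconstruction error plus L1 sparsity penalty on features.
    recon = np.sum((X - X_hat) ** 2, axis=1).mean()
    sparsity = np.sum(np.abs(F), axis=1).mean()
    return recon + l1 * sparsity

lr, l1 = 1e-2, 1e-3
losses = []
for _ in range(200):
    F, X_hat = forward(X)
    losses.append(sae_loss(X, F, X_hat, l1))
    # Manual full-batch gradients for this tiny sketch (no autodiff dependency).
    g_Xhat = 2.0 * (X_hat - X) / n
    g_Wd = F.T @ g_Xhat
    g_bd = g_Xhat.sum(axis=0)
    g_F = g_Xhat @ W_d.T + l1 * np.sign(F) / n
    g_pre = g_F * (F > 0)                # ReLU gate
    g_We = X.T @ g_pre
    g_be = g_pre.sum(axis=0)
    W_e -= lr * g_We; b_e -= lr * g_be
    W_d -= lr * g_Wd; b_d -= lr * g_bd

F, X_hat = forward(X)
sparsity_frac = float((F == 0).mean())   # fraction of inactive features per sample
```

After training, individual columns of ``W_d`` (decoder directions) are the candidate interpretable features one would inspect; the L1 term is what pushes most feature activations to exactly zero.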

===Uncertainty===

==Science Benchmarks==

==Science Agents==
===Reviews===
===Specific===
===Science Multi-Agent Setups===

==AI Science Systems==
===Inorganic Materials Discovery===
===Chemistry===

==Impact of AI in Science==

==Related Tools==
===Literature Search===
===Data Visualization===
===Generative===
====Chemistry====

==See Also==