Difference between revisions of "Science Agents"

From GISAXS
 
* 2019-07: [https://doi.org/10.1038/s41586-019-1335-8  Unsupervised word embeddings capture latent knowledge from materials science literature]
 
* 2024-11: [https://doi.org/10.1038/s41562-024-02046-9  Large language models surpass human experts in predicting neuroscience results]
 
* 2024-11: [https://doi.org/10.1038/s41562-024-02046-9 Large language models surpass human experts in predicting neuroscience results]
===(Pre) Generate Articles===
* 2025-03: [https://arxiv.org/abs/2503.18866 Reasoning to Learn from Latent Thoughts]
* 2025-03: [https://arxiv.org/abs/2503.19065 WikiAutoGen: Towards Multi-Modal Wikipedia-Style Article Generation]
  
 
==Explanation==
 

Revision as of 13:58, 26 March 2025

=AI Use-cases for Science=

==Literature==

===LLM extract data from papers===

===AI finding links in literature===

===(Pre) Generate Articles===

==Explanation==

==Autonomous Ideation==

==Adapting LLMs to Science==

=AI/ML Methods tailored to Science=

==Regression (Data Fitting)==

==Tabular Classification/Regression==

==Symbolic Regression==

==Literature Discovery==

==Commercial==

=AI/ML Methods in Science=

==Chemistry==

==Biology==

==Successes==

=AI/ML Methods co-opted for Science=

==Mechanistic Interpretability==

Train a large model on science data, then apply mechanistic-interpretability techniques (e.g. sparse autoencoders, SAEs) to the model's feature/activation space.
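The SAE step above can be sketched as follows: a linear encoder with a ReLU maps model activations into an overcomplete, mostly-zero feature vector, and a linear decoder reconstructs the activation; training minimizes reconstruction error plus an L1 sparsity penalty. This is a minimal illustrative sketch only; the dimensions, random "activations", and penalty weight are assumptions, not values from any cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): activation dim of the probed model layer,
# and an overcomplete dictionary of candidate features.
d_model, d_dict = 16, 64
l1_coeff = 1e-3  # sparsity penalty weight (assumed hyperparameter)

# Toy stand-in for activations collected from a model run on science text.
acts = rng.normal(size=(128, d_model))

# SAE parameters: linear encoder and decoder.
W_enc = rng.normal(scale=0.1, size=(d_model, d_dict))
b_enc = np.zeros(d_dict)
W_dec = rng.normal(scale=0.1, size=(d_dict, d_model))
b_dec = np.zeros(d_model)

def sae_forward(x):
    """Encode activations into (mostly zero) features, then reconstruct."""
    f = np.maximum(x @ W_enc + b_enc, 0.0)  # ReLU keeps features non-negative and sparse
    x_hat = f @ W_dec + b_dec
    return f, x_hat

f, x_hat = sae_forward(acts)
# Training objective: reconstruction error plus an L1 penalty on the features,
# which pushes each activation to be explained by a few dictionary entries.
loss = np.mean((acts - x_hat) ** 2) + l1_coeff * np.abs(f).mean()
```

After training, individual dictionary directions (rows of the decoder) are inspected as candidate interpretable features of the science model.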

==Uncertainty==

=Science Benchmarks=

=Science Agents=

==Reviews==

==Specific==

==Science Multi-Agent Setups==

=AI Science Systems=

==Inorganic Materials Discovery==

==Chemistry==

=LLMs Optimized for Science=

=Impact of AI in Science=

=Related Tools=

==Literature Search==

==Data Visualization==

==Generative==

===Chemistry===

=See Also=