Difference between revisions of "AI research trends"

From GISAXS
 
=Memory=
 
==Reviews==
 
* 2024-04: [https://arxiv.org/abs/2404.13501 A Survey on the Memory Mechanism of Large Language Model based Agents]
* 2026-01: [https://arxiv.org/abs/2601.09113 The AI Hippocampus: How Far are We From Human Memory?]
 
==Context Remaking==
* 2021-01: [https://arxiv.org/abs/2101.00436 Baleen: Robust Multi-Hop Reasoning at Scale via Condensed Retrieval]
* 2025-08: [https://blog.plasticlabs.ai/blog/Memory-as-Reasoning Memory as Reasoning (Memory is Prediction)]
* 2025-09: [https://arxiv.org/abs/2509.25140 ReasoningBank: Scaling Agent Self-Evolving with Reasoning Memory]
 
* 2025-11: [https://research.google/blog/introducing-nested-learning-a-new-ml-paradigm-for-continual-learning/ Introducing Nested Learning: A new ML paradigm for continual learning]
* 2026-01: [https://arxiv.org/abs/2601.16175 Learning to Discover at Test Time]
* 2026-01: [https://arxiv.org/abs/2601.19897 Self-Distillation Enables Continual Learning]
* 2026-02: [https://arxiv.org/abs/2602.07755 Learning to Continually Learn via Meta-learning Agentic Memory Designs]

Latest revision as of 10:25, 18 February 2026

System 2 Reasoning

See: Increasing AI Intelligence

Memory

Reviews

Big Ideas

LLM Weights Memory

Context Length

Extended Context

Context Remaking

Retrieval beyond RAG

See also: AI tools: Retrieval Augmented Generation (RAG)

Working Memory

Long-Term Memory

Storage and Retrieval

Episodic Memory

Continual Learning

Updating Weights at Inference-time

Parameters as Tokens

Internal Thought Representation Space

Visual Thinking

Neural (non-token) Latent Representation

Altered Transformer

Tokenization

Generation Order

Diffusion Language Models

Related: Image Synthesis via Autoregression/Diffusion

Sampling

Daydreaming, brainstorming, pre-generation

Pre-generation

Missing Elements

* Memory
* Continuous learning/update
* Robust contextual model
* Long-time-horizon coherence
* Fluid intelligence
* Agency
* Modeling of self
* Daydreaming

Memes

See Also