AI research trends


System 2 Reasoning

See: Increasing AI Intelligence

Memory

Reviews

LLM Weights Memory

Context Length

Extended Context

Context Remaking

Retrieval beyond RAG

See also: AI tools: Retrieval Augmented Generation (RAG)

Working Memory

Long-Term Memory

Storage and Retrieval

Episodic Memory

Continual Learning

* 2025-10: [https://arxiv.org/abs/2510.15103 Continual Learning via Sparse Memory Finetuning]
* 2025-11: [https://research.google/blog/introducing-nested-learning-a-new-ml-paradigm-for-continual-learning/ Introducing Nested Learning: A new ML paradigm for continual learning]
* 2026-01: [https://arxiv.org/abs/2601.16175 Learning to Discover at Test Time]

Updating Weights at Inference-time

Parameters as Tokens

Internal Thought Representation Space

Visual Thinking

Neural (non-token) Latent Representation

Altered Transformer

Tokenization

Generation Order

Diffusion Language Models

Related: Image Synthesis via Autoregression/Diffusion

Sampling

Daydreaming, brainstorming, pre-generation

Pre-generation


Missing Elements

  • Memory
  • Continuous learning/update
  • Robust contextual model
  • Long-time-horizon coherence
  • Fluid intelligence
  • Agency
  • Modeling of self
  • Daydreaming

Memes

See Also