AI research trends
 

=System 2 Reasoning=

See: Increasing AI Intelligence

=Memory=

==LLM Weights Memory==

* 2025-10: [https://arxiv.org/abs/2510.15103 Continual Learning via Sparse Memory Finetuning]
* 2026-01: [https://developer.nvidia.com/blog/reimagining-llm-memory-using-context-as-training-data-unlocks-models-that-learn-at-test-time/ Reimagining LLM Memory: Using Context as Training Data Unlocks Models That Learn at Test-Time] (Nvidia)
* 2026-01: [https://arxiv.org/abs/2601.02151 Entropy-Adaptive Fine-Tuning: Resolving Confident Conflicts to Mitigate Forgetting]
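The entries above share a common mechanic: treat the tokens already sitting in the context window as supervised training data, and update some of the model's weights at inference time. Below is a minimal, self-contained PyTorch sketch of that idea. The <code>TinyLM</code> architecture, the choice to unfreeze only the output head (a stand-in for a small, writable block of "weights memory"), and all hyperparameters are illustrative assumptions, not the procedure of any paper listed above.

<syntaxhighlight lang="python">
# Sketch: test-time learning by finetuning on the context itself.
# All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

VOCAB, DIM = 256, 64  # byte-level vocabulary; tiny sizes for illustration

class TinyLM(nn.Module):
    """Toy causal language model: embedding -> GRU -> next-token logits."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.rnn = nn.GRU(DIM, DIM, batch_first=True)
        self.head = nn.Linear(DIM, VOCAB)

    def forward(self, tokens):           # tokens: (batch, seq)
        hidden, _ = self.rnn(self.embed(tokens))
        return self.head(hidden)         # logits: (batch, seq, vocab)

model = TinyLM()

# Sparse-update stand-in: freeze everything except one small block that
# serves as writable memory (here, arbitrarily, the output head).
for p in model.parameters():
    p.requires_grad_(False)
for p in model.head.parameters():
    p.requires_grad_(True)

trainable = [p for p in model.parameters() if p.requires_grad]
opt = torch.optim.SGD(trainable, lr=1e-2)

# The "context" is just the bytes of text the model should absorb.
context = torch.tensor([list(b"the access code for the vault is 7261")])

# Test-time learning: a few steps of next-token prediction on the context.
model.train()
for step in range(20):
    logits = model(context[:, :-1])                    # predict token t+1
    loss = F.cross_entropy(logits.reshape(-1, VOCAB),  # per-token CE loss
                           context[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"context next-token loss after 20 updates: {loss.item():.3f}")
</syntaxhighlight>

In the actual methods, the updated subset of weights may be chosen per example (e.g. the memory slots receiving the largest gradients), and the per-token loss may be reweighted by predictive entropy so that confidently-known facts are not overwritten; the sketch above omits both refinements.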

==Context Length==

===Extended Context===

===Context Remaking===

==Retrieval beyond RAG==

See also: AI tools: Retrieval Augmented Generation (RAG)

==Working Memory==

==Long-Term Memory==

===Storage and Retrieval===

===Episodic Memory===

=Continual Learning=

==Updating Weights at Inference-time==

==Parameters as Tokens==

=Internal Thought Representation Space=

==Visual Thinking==

==Neural (non-token) Latent Representation==

=Altered Transformer=

==Tokenization==

==Generation Order==

==Diffusion Language Models==

Related: Image Synthesis via Autoregression/Diffusion

=Sampling=

=Daydreaming, brainstorming, pre-generation=

==Pre-generation==


=Missing Elements=

* Memory
* Continuous learning/update
* Robust contextual model
* Long-time-horizon coherence
* Fluid intelligence
* Agency
* Modeling of self
* Daydreaming

=Memes=

=See Also=