Difference between revisions of "AI research trends"

From GISAXS
* 2026-01: [https://developer.nvidia.com/blog/reimagining-llm-memory-using-context-as-training-data-unlocks-models-that-learn-at-test-time/ Reimagining LLM Memory: Using Context as Training Data Unlocks Models That Learn at Test-Time] (Nvidia)
 
* 2026-01: [https://arxiv.org/abs/2601.02151 Entropy-Adaptive Fine-Tuning: Resolving Confident Conflicts to Mitigate Forgetting]
* 2026-02: Sakana AI: [https://pub.sakana.ai/doc-to-lora/ Instant LLM Updates]: Train a hypernetwork to generate LoRA adapters on the fly
** 2026-02: [https://arxiv.org/abs/2602.15902 Doc-to-LoRA: Learning to Instantly Internalize Contexts] ([https://github.com/SakanaAI/Doc-to-LoRA code])
** 2025-06: [https://arxiv.org/abs/2506.06105 Text-to-LoRA: Instant Transformer Adaption] ([https://github.com/SakanaAI/Text-to-LoRA code])
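The test-time-learning entry above treats the prompt as training data rather than as read-only input: facts stated in context are absorbed into the model's parameters before generation. A minimal sketch of that idea, using a toy count-based bigram model in place of an LLM (all names here are illustrative, not taken from the linked post):

```python
from collections import defaultdict

class BigramLM:
    """Toy count-based bigram model standing in for an LLM's weights."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, tokens):
        # "Weight update": accumulate bigram counts from the token stream.
        for a, b in zip(tokens, tokens[1:]):
            self.counts[a][b] += 1

    def predict_next(self, token):
        following = self.counts.get(token)
        if not following:
            return None
        return max(following, key=following.get)

def answer_with_test_time_training(model, context, query_token):
    # Core idea of context-as-training-data: the prompt itself is used
    # to update the model's parameters (here: bigram counts) at test
    # time, so the answer reflects information seen only in the context.
    model.train(context.split())
    return model.predict_next(query_token)
```

For example, a model pretrained on unrelated text can still continue `"kyoto"` with `"has"` after absorbing the context `"kyoto has many temples"`, because the context was folded into its counts rather than merely attended over.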
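The Doc-to-LoRA entries above describe a hypernetwork that maps a document directly to a LoRA adapter, specializing the base weights without running gradient descent at update time. A toy sketch of the shape of that pipeline (the encoder and hypernetwork below are fixed illustrative maps, not the Sakana architecture):

```python
import hashlib

def embed(doc, dim=4):
    # Hypothetical stand-in for a document encoder: hash bytes -> floats in [0, 1].
    digest = hashlib.sha256(doc.encode()).digest()
    return [digest[i] / 255.0 for i in range(dim)]

def hypernetwork(z, d=3, r=1):
    # Maps a document embedding z to LoRA factors A (r x d) and B (d x r).
    # In Doc-to-LoRA this is a trained network; here it is a fixed map
    # used only to illustrate the data flow.
    A = [[z[(i + j) % len(z)] for j in range(d)] for i in range(r)]
    B = [[z[(i + 1) % len(z)] * 0.1 for _ in range(r)] for i in range(d)]
    return A, B

def apply_lora(W, A, B):
    # W' = W + B @ A : a low-rank delta is injected without retraining W.
    d, r = len(W), len(A)
    delta = [[sum(B[i][k] * A[k][j] for k in range(r)) for j in range(d)]
             for i in range(d)]
    return [[W[i][j] + delta[i][j] for j in range(d)] for i in range(d)]
```

The point of the low-rank factorization is that the hypernetwork only has to emit `r * d * 2` numbers per adapted matrix instead of `d * d`, which is what makes generating an adapter per document cheap enough to do on the fly.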
  
 
==Context Length==

Revision as of 14:59, 28 February 2026

System 2 Reasoning

See: Increasing AI Intelligence

Memory

Reviews

Big Ideas

LLM Weights Memory

Context Length

Extended Context

Context Remaking

Retrieval beyond RAG

See also: AI tools: Retrieval Augmented Generation (RAG)

Working Memory

Long-Term Memory

Storage and Retrieval

Episodic Memory

Continual Learning

Updating Weights at Inference-time

Parameters as Tokens

Internal Thought Representation Space

Visual Thinking

Neural (non-token) Latent Representation

Altered Transformer

Tokenization

Generation Order

Diffusion Language Models

Related: Image Synthesis via Autoregression/Diffusion

Sampling

Daydreaming, brainstorming, pre-generation

Pre-generation

Missing Elements

* Memory
* Continuous learning/update
* Robust contextual model
* Long-time-horizon coherence
* Fluid intelligence
* Agency
* Modeling of self
* Daydreaming

Memes

See Also