Difference between revisions of "AI research trends"

From GISAXS

* 2024-12: [https://www.arxiv.org/abs/2412.18069 Improving Factuality with Explicit Working Memory]

==Long-Term Memory==

* 2025-04: [https://arxiv.org/abs/2504.19413 Mem0: Building Production-Ready AI Agents with Scalable Long-Term Memory]

===Episodic Memory===

* 2024-03: [https://arxiv.org/abs/2403.11901 Larimar: Large Language Models with Episodic Memory Control]
Latest revision as of 08:58, 30 April 2025

=System 2 Reasoning=

See: Increasing AI Intelligence

=Memory=

==LLM Weights Memory==

==Context Length==

===Extended Context===

===Retrieval beyond RAG===

See also: AI tools: Retrieval Augmented Generation (RAG)

==Working Memory==

==Long-Term Memory==

===Episodic Memory===

==Updating Weights at Inference-time==

==Parameters as Tokens==

=Internal Thought Representation Space=

==Visual Thinking==

==Neural (non-token) Latent Representation==

=Altered Transformer=

==Tokenization==

==Generation Order==

=Diffusion Language Models=

Related: Image Synthesis via Autoregression/Diffusion

=Sampling=

=Missing Elements=

* Memory
* Continuous learning/update
* Robust contextual model
* Long-time-horizon coherence
* Fluid intelligence
* Agency

=See Also=