Difference between revisions of "AI research trends"

 
Line 59:

=Updating Weights at Inference-time=

* 2025-01: [https://arxiv.org/abs/2501.06252 Transformer<sup>2</sup>: Self-adaptive LLMs]
+ * 2025-08: [https://arxiv.org/abs/2508.14143 Beyond Turing: Memory-Amortized Inference as a Foundation for Cognitive Computation]
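A rough sketch of what "updating weights at inference-time" can look like, loosely modeled on the singular-value rescaling idea described in the Transformer<sup>2</sup> abstract. This is not code from either paper: the layer sizes, task names, and helper names below are invented for illustration. A frozen weight matrix is decomposed once offline, and at inference time a small task-conditioned vector rescales its singular values, so the effective weights change per request without retraining.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Toy "pretrained" weight matrix for a single linear layer.
W = rng.normal(size=(8, 16))

# Decompose once, offline: W = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(W, full_matrices=False)

# Hypothetical per-task "expert" vectors that rescale the singular values.
# In a real system these would be learned; here they are random placeholders.
experts = {
    "math":   1.0 + 0.1 * rng.normal(size=s.shape),
    "coding": 1.0 + 0.1 * rng.normal(size=s.shape),
}

def adapted_forward(x: np.ndarray, task: str) -> np.ndarray:
    """Apply the layer with singular values rescaled for the given task.

    The base weights are never overwritten; the adaptation is applied
    on the fly, per request, at inference time.
    """
    z = experts[task]                      # task-conditioned scaling vector
    W_task = U @ np.diag(s * z) @ Vt       # adapted weight matrix
    return W_task @ x

x = rng.normal(size=16)
print(adapted_forward(x, "math")[:3])
print(adapted_forward(x, "coding")[:3])
</syntaxhighlight>

The actual Transformer<sup>2</sup> method trains such vectors and selects or mixes them per prompt; the sketch above only shows the mechanical part of swapping adapted weights in at inference time.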
  
 
==Parameters as Tokens==

Latest revision as of 10:14, 25 August 2025

System 2 Reasoning

See: Increasing AI Intelligence

Memory

LLM Weights Memory

Context Length

Extended Context

Retrieval beyond RAG

See also: AI tools: Retrieval Augmented Generation (RAG)

Working Memory

Long-Term Memory

Episodic Memory

Updating Weights at Inference-time

Parameters as Tokens

Internal Thought Representation Space

Visual Thinking

Neural (non-token) Latent Representation

Altered Transformer

Tokenization

Generation Order

Diffusion Language Models

Related: Image Synthesis via Autoregression/Diffusion

Sampling

Missing Elements

  • Memory
  • Continuous learning/update
  • Robust contextual model
  • Long-time-horizon coherence
  • Fluid intelligence
  • Agency
  • Modeling of self
  • Daydreaming

Memes

See Also