AI research trends

From GISAXS
* 2025-02-18: [https://arxiv.org/abs/2502.12962 Infinite Retrieval: Attention Enhanced LLMs in Long-Context Processing]
 
* 2025-02-19: [https://github.com/MoonshotAI/MoBA MoBA: Mixture of Block Attention for Long-Context LLMs]
 
* 2025-02-27: [https://arxiv.org/abs/2502.20082 LongRoPE2: Near-Lossless LLM Context Window Scaling]
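The MoBA entry above concerns block-sparse attention for long contexts: keys/values are partitioned into blocks, and each query attends only to a few gated blocks instead of the full sequence. A minimal NumPy sketch of this idea (an illustrative toy, not the paper's actual algorithm; the mean-pooled gating and top-k choice here are simplifying assumptions):

```python
import numpy as np

def block_sparse_attention(q, k, v, block_size=4, top_k=2):
    """Toy block-sparse attention for one query vector.

    Keys/values are split into fixed-size blocks; each block is scored by
    its mean-pooled key (the gate), and dense softmax attention is computed
    only over the top_k highest-scoring blocks.
    """
    seq_len, dim = k.shape
    n_blocks = seq_len // block_size
    k_blocks = k[: n_blocks * block_size].reshape(n_blocks, block_size, dim)
    v_blocks = v[: n_blocks * block_size].reshape(n_blocks, block_size, dim)

    # Gate: score each block by the dot product of q with its mean-pooled key.
    gate = k_blocks.mean(axis=1) @ q            # shape (n_blocks,)
    chosen = np.argsort(gate)[-top_k:]          # indices of the top_k blocks

    # Dense attention restricted to the selected blocks.
    k_sel = k_blocks[chosen].reshape(-1, dim)
    v_sel = v_blocks[chosen].reshape(-1, dim)
    scores = k_sel @ q / np.sqrt(dim)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ v_sel

rng = np.random.default_rng(0)
q = rng.standard_normal(8)
k = rng.standard_normal((16, 8))
v = rng.standard_normal((16, 8))
out = block_sparse_attention(q, k, v)
print(out.shape)  # (8,)
```

With block_size=4 and top_k=2, the query here attends to 8 of 16 positions; the cost of the dense attention step scales with top_k * block_size rather than with the full sequence length.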
  
 
==Extended Context==
 
Revision as of 11:22, 3 March 2025

Contents:
* Novel Tokenization and/or Sampling
* System 2 Reasoning
** See: Increasing AI Intelligence
* Memory
* LLM Weights Memory
* Context Length
* Extended Context
* Retrieval beyond RAG
** See also: AI tools: Retrieval Augmented Generation (RAG)
* Working Memory
* Episodic Memory
* Updating Weights at Inference-time
* Parameters as Tokens
* Internal Thought Representation Space
* Visual Thinking
* Neural (non-token) Latent Representation
* Diffusion Language Models
* See Also