AI tricks

From GISAXS
 
* 2024-11: [https://arxiv.org/abs/2411.01101 Self-Consistency Falls Short! The Adverse Effects of Positional Bias on Long-Context Problems]
* 2025-02: [https://arxiv.org/abs/2502.01951 On the Emergence of Position Bias in Transformers]
* 2025-07: [https://arxiv.org/abs/2507.22887 Where to show Demos in Your Prompt: A Positional Bias of In-Context Learning]
 
* '''Testing models:'''
** [https://github.com/gkamradt/LLMTest_NeedleInAHaystack Needle-in-a-Haystack tests]
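The needle-in-a-haystack tests linked above probe position bias by hiding a known fact ("needle") at varying depths inside long filler text and measuring whether the model recovers it. A minimal sketch of that harness is below; `call_llm` is a hypothetical placeholder (a substring search standing in for a model call), and the function names and prompt format are assumptions, not the linked repository's API:

```python
# Sketch of a needle-in-a-haystack position-bias sweep.
# `call_llm` is a placeholder; swap in a real model API for an actual test.

def build_haystack(needle: str, depth: float, n_filler: int = 100) -> str:
    """Embed `needle` at fractional `depth` (0.0 = start, 1.0 = end) of filler text."""
    filler = ["The sky was clear and the day was uneventful."] * n_filler
    pos = int(depth * len(filler))
    return " ".join(filler[:pos] + [needle] + filler[pos:])

def call_llm(prompt: str) -> str:
    # Placeholder "model": recovers the needle by substring search,
    # so this sweep always scores 1.0. A real run would query an LLM here.
    marker = "The secret code is"
    start = prompt.find(marker)
    return prompt[start:prompt.find(".", start)] if start != -1 else ""

def run_sweep(needle: str, answer: str, depths=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """Return per-depth retrieval score (1.0 = needle recovered)."""
    results = {}
    for d in depths:
        context = build_haystack(needle, d)
        prompt = f"{context}\n\nQuestion: What is the secret code?"
        results[d] = float(answer in call_llm(prompt))
    return results

scores = run_sweep("The secret code is 7421.", "7421")
print(scores)
```

With a real model behind `call_llm`, a dip in the scores at middle depths would reproduce the "lost in the middle" effect the papers above describe.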

Latest revision as of 11:22, 7 August 2025

Prompt Engineering

In-Context Learning

Chain of Thought (CoT)

Multi-step

Tool-use, feedback, agentic

Retrieval-Augmented Generation (RAG)

Input/Output Formats

Brittleness

Position Bias

Generation