Posted on X
PROMPT 1 - The Contradiction Finder
Most researchers miss this. This prompt doesn't:
"Across all papers uploaded, identify every point where two or more authors directly contradict each other. For each contradiction: - State both positions - Name the papers - Explain WHY they likely disagree (methodology, dataset, era) Format as a table."
Surface every point where multiple uploaded papers directly disagree — and explain why.
Prompt
Across all papers uploaded, identify every point where two or more authors directly contradict each other.
For each contradiction:
- State both positions
- Name the papers
- Explain WHY they likely disagree (methodology, dataset, era)
Format as a table.

Why it works
Most LLM-based paper analysis defaults to summarizing each document in isolation. By explicitly asking for cross-document contradictions, the prompt forces the model to hold multiple sources in working memory simultaneously and compare them rather than treat each as a standalone input.
The structured sub-instructions (state both positions, name papers, explain why) prevent vague outputs. Without them, a model might note a tension between papers without attributing it to specific sources or explaining its root cause. Requiring a table further compresses the output into scannable, comparable rows, making gaps in consensus immediately visible.
The "WHY they likely disagree" step is the highest-value part. Differences in methodology, dataset size, time period, or domain scope are exactly the editorial judgment a researcher needs in order to decide which result to trust, and this prompt makes the model do that diagnostic work explicitly.
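The assembly described above can be sketched in code. This is a minimal illustration, assuming the papers are already available as plain text; the function name, file names, and paper contents below are hypothetical stand-ins, and in practice you would pass the resulting string to whatever LLM SDK or upload mechanism you use:

```python
# Sketch: assembling the Contradiction Finder prompt over multiple papers.
# Paper titles and texts are hypothetical placeholders.

CONTRADICTION_PROMPT = """\
Across all papers uploaded, identify every point where two or more authors \
directly contradict each other.
For each contradiction:
- State both positions
- Name the papers
- Explain WHY they likely disagree (methodology, dataset, era)
Format as a table."""

def build_request(papers: dict[str, str]) -> str:
    """Concatenate each paper under a labeled header, then append the
    prompt, so the model holds all sources in one context window and
    can compare them rather than summarize each in isolation."""
    sections = [f"=== Paper: {title} ===\n{text}" for title, text in papers.items()]
    return "\n\n".join(sections) + "\n\n" + CONTRADICTION_PROMPT

# Usage with two stand-in papers that directly contradict each other:
request = build_request({
    "smith_2019.pdf": "We find technique X improves accuracy on benchmark B.",
    "lee_2023.pdf": "We find technique X degrades accuracy on benchmark B.",
})
```

Placing the instruction after all documents, rather than before, mirrors how the original prompt addresses "all papers uploaded": the comparison request arrives once every source is in context.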
When to use
- Conducting a literature review where you need to map the state of disagreement in a field
- Fact-checking a draft that synthesizes multiple sources, to catch places where you may have inadvertently favored one position
- Preparing for a research presentation or debate and needing to anticipate counterarguments from cited literature