Why choosing a topic that produces evaluation matters

If you’re staring at a blank page wondering how to pick an Extended Essay (EE) topic, breathe. The single most useful lens to use when choosing is not whether a topic feels exciting in the abstract, but whether it invites evaluation—meaning it lets you weigh evidence, compare perspectives, judge claims, and arrive at a reasoned conclusion. Your Internal Assessments (IAs), Extended Essay, and Theory of Knowledge (TOK) work best when they’re connected through critical thinking. An EE that produces evaluation doesn’t just describe facts; it tests ideas, analyzes trade-offs, and shows the examiner you can think like a researcher.

Photo Idea: A focused student surrounded by books and notes, sketching a research question on a whiteboard

What we mean by “evaluation” in an EE

Evaluation in the context of an Extended Essay is the act of judging the value, credibility, or significance of evidence, methods, or arguments. It’s not merely saying something is right or wrong; it’s examining how strong the evidence is, where it might be biased, which assumptions underlie a claim, and how alternative explanations might change your conclusion. An evaluative EE asks “how well?” or “to what extent?” rather than just “what?”.

Evaluation looks like this in practice

  • Comparing two explanations to see which better fits the data.
  • Assessing the reliability of sources and explaining their limitations.
  • Testing a hypothesis and discussing how experimental design affects confidence in results.
  • Weighing competing theories, showing nuance rather than a one-sided claim.
  • Reflecting on how methodology and perspective shape the outcome.

Characteristics of EE topics that invite evaluation

When you scan ideas, look for features that naturally demand judgment. These features make it easier to plan research that leads to analysis rather than description.

Key characteristics

  • Multiple plausible explanations or perspectives exist (so you can compare and weigh them).
  • Evidence is accessible (data, texts, experiments, surveys, case studies) so claims can be tested.
  • The question can be framed in comparative or degree-focused terms: “to what extent,” “how effective,” “what impact”.
  • The topic allows operational definitions—variables can be measured or criteria defined.
  • There is room to discuss limitations, counterarguments, and the implications of findings.

Questions to ask when narrowing your topic

Before locking in a topic, run it through a quick reality-check. Each of these questions pushes you toward evaluation-friendly choices.

Self-interview checklist

  • Does this topic have at least two competing views or explanations?
  • Can I find primary or robust secondary evidence to test claims?
  • Is the question specific enough to be answered within the word limit?
  • Can I define clear criteria for judging evidence (e.g., validity, reliability, relevance)?
  • Is the methodology feasible with the resources and time I have?
  • Will the question allow me to discuss limitations and alternative interpretations?
  • Does the topic connect in some way with TOK ideas—knowledge, evidence, bias, or perspective?

Turning ideas into an evaluative research question: step-by-step

Generating a research question that produces evaluation is a stepwise process. You can treat it like a mini-investigation that prepares the rest of the EE.

Step 1 — Start with curiosity, not with a method

Identify what genuinely interests you: a phenomenon, a literary motif, a social issue, a biological process, an economic policy. Interest sustains motivation. From that curiosity, ask “why” or “to what extent” questions rather than “what happened?”

Step 2 — Do a quick literature scan

Skim articles, books, lab reports, or case studies to find recurring debates, contradictory findings, or gaps. The presence of debate is gold—debates mean evaluative work is possible.

Step 3 — Make the question comparative, causal, or evaluative

Transform a descriptive idea into an evaluative one by framing it around degree, impact, or comparison. For example, instead of “What are the causes of X?” consider “To what extent is X caused by A rather than B?” or “How effective is intervention Y in reducing X?”

Step 4 — Define your criteria and scope

Decide how you will judge evidence: accuracy, predictive power, explanatory scope, ethical implications, cost-effectiveness, etc. These criteria become the backbone of your evaluation.

Step 5 — Test feasibility and refine

Ask whether you can access the evidence you need. Try a miniature pilot—collect one dataset, analyze a short text, or run a small experiment. If that pilot produces usable material for analysis and shows multiple plausible interpretations, your question is on the right track.

Examples across subject areas (and why they invite evaluation)

Below are example research questions crafted to encourage evaluation. They’re written broadly so you can adapt them to your interests and resources.

Subject | Example research question | Why it invites evaluation | Suggested methods
History | To what extent did economic factors drive policy X compared to ideological motives? | Allows weighing different causes and evaluating documentary evidence. | Primary source analysis, comparative archival study.
Biology | How effective is treatment A at reducing measurable symptom X under controlled conditions? | Invites experimental data collection and critical discussion of controls and limitations. | Lab experiment, statistical analysis, discussion of validity.
Economics | To what extent did policy Y influence consumer behavior compared with global trends? | Requires data interpretation and causal inference, with room for counter-explanations. | Quantitative analysis, regression or comparative time-series review.
English/Literature | How convincingly does author Z use narrative structure to critique social theme T? | Encourages comparative textual evidence and assessment of literary devices. | Close reading, comparative analysis, theoretical framework application.
Chemistry | Which catalyst yields the highest efficiency for reaction R under comparable conditions, and why? | Experimental results invite explanation and assessment of error and mechanism. | Controlled experiments, measurement of yields, error analysis.
Psychology | To what extent does variable M affect behavior N in a sample population? | Allows hypothesis testing, statistical evaluation, and discussion of generalizability. | Survey/experiment, statistical tests, ethical reflection.
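For the quantitative subjects above (Biology, Economics, Psychology), a small pilot analysis can show early on whether your data will actually support evaluation. Here is a minimal sketch in Python using the standard library; the treatment/control scores are hypothetical placeholder values, and a real EE would use your own measurements and a full statistical test rather than just the test statistic.

```python
# Pilot-analysis sketch for an experimental EE question such as
# "How effective is treatment A at reducing symptom X?"
# Computes Welch's t statistic for two independent samples with
# possibly unequal variances; a larger |t| suggests a stronger
# difference between conditions. Data below are hypothetical.
from statistics import mean, variance
from math import sqrt

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    return (mean(sample_a) - mean(sample_b)) / sqrt(va / na + vb / nb)

# Hypothetical pilot data: symptom scores under treatment vs. control.
treatment = [4.1, 3.8, 4.5, 3.9, 4.2]
control = [5.0, 5.3, 4.8, 5.1, 4.9]

t = welch_t(treatment, control)
print(f"Welch's t = {t:.2f}")
```

Even a rough statistic like this tells you whether your design produces material worth evaluating; if the pilot shows almost no signal, that is itself useful feedback on scope and method before you commit.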

Common pitfalls and how to avoid them

Even well-intentioned topics can lead to descriptive essays if you don’t guard against common traps. Here’s how to steer clear of them.

Pitfall: A question that’s too descriptive

“What happened?” questions often produce narrative summaries. Redraft them into evaluative forms: “How significant was X in causing Y?” or “To what extent did X contribute to Y?”

Pitfall: Too broad or too narrow

Overly broad topics drown you in material; overly narrow ones give nothing to evaluate. Use geographic, temporal, or conceptual limits and ensure there is enough evidence within those bounds.

Pitfall: No clear criteria for judgment

Without criteria you’ll end up with unsupported assertions. Define what counts as convincing evidence for your question and apply those criteria transparently throughout the essay.

Pitfall: Methodology mismatches question

Choose methods that can answer the question. Qualitative questions need interpretive evidence; quantitative questions need measurable data. If you mix methods, explain why and how that strengthens evaluation.

Working with your supervisor and weaving in TOK

Your supervisor is an ally in shaping a question that produces evaluation. Use their feedback to tighten scope, test feasibility, and refine the criteria you’ll use. Supervisors can point you to sources and warn you when a question will be difficult to assess fairly.

Make TOK connections explicitly

TOK offers useful language and structures for evaluation: consider knowledge claims, evidence, methods, perspectives, and the role of bias. If your EE discusses how methodology shapes conclusions, or compares ways of knowing in different disciplines, you’re also deepening TOK connections—use that to enrich evaluation rather than distract from it.

Time management: milestones and a sample plan

A topic that produces evaluation requires time for collecting, testing, and critically analyzing evidence. Below is a simple milestone table you can adapt to your timeline.

Stage | Goal | Suggested duration
Idea generation & feasibility check | Find topics, do a quick literature scan, run a pilot | Short block up front
Research question refinement | Define criteria, methods, and scope | Focused period after the initial scan
Data collection / close reading / experimentation | Gather evidence to test claims | Largest block; allow for revisions
Analysis & evaluation | Apply criteria, compare perspectives, run stats or syntheses | Ample time for rework
Write-up & refinement | Draft, get supervisor feedback, refine evaluation and limitations | Final block; leave margin for editing

How to present evaluation clearly in your essay

Presentation matters. When you structure the EE so that evaluation is explicit, readers (and examiners) can follow your reasoning.

Structure that highlights evaluation

  • Introduction: state the research question and the criteria you will use to judge evidence.
  • Literature review / background: summarize existing arguments and identify the debate.
  • Methodology: explain how your approach tests the question and what limitations might distort results.
  • Results / findings: present data or close readings objectively.
  • Evaluation / analysis: apply your criteria, compare explanations, question the strength of evidence, and discuss alternative readings.
  • Conclusion: weigh the evidence in a balanced way and state the degree of confidence you have in your claim.

Language tips for strong evaluation

  • Use measured qualifiers: “the evidence suggests,” “this indicates,” “there is limited support for.”
  • State counterarguments and show how they affect your conclusion.
  • Explicitly discuss limitations and how they reduce or modify confidence in your findings.
  • Make your criteria visible; when you say something is “better,” explain according to which standard.

When to consider extra support

Sometimes you’ll need expert feedback to shape a rigorous evaluative question—especially for complex methods or subjects where you lack prior experience. Targeted, one-on-one guidance can help you translate a promising idea into a workable question and research design. For personalized tutoring that emphasizes tailored study plans, focused feedback, and methodological clarity, platforms like Sparkl can offer support. If you use additional help, ensure it strengthens your own analytical skills rather than doing the thinking for you.

Final checklist before you submit

Run through this checklist to make sure your EE demonstrates genuine evaluation:

  • Is the research question explicitly evaluative (e.g., “to what extent”, “how effective”)?
  • Have you defined clear criteria for judging evidence?
  • Do your methods actually test the question you posed?
  • Have you presented alternative explanations and addressed their strengths and weaknesses?
  • Have you been transparent about limitations and bias?
  • Is the conclusion proportionate to the evidence (no overclaiming)?
  • Have you linked your discussion back to TOK ideas where appropriate?
  • Have you sought and applied constructive supervisor feedback?

Small examples of phrasing that shift a question toward evaluation

Sometimes a tiny change in wording makes the difference between summary and critical analysis. Here are quick edits that push questions into evaluative territory.

  • From: “What are the causes of X?” To: “To what extent is X caused by A rather than B?”
  • From: “How does author Y present theme Z?” To: “How effectively does author Y use technique T to critique theme Z?”
  • From: “What happens when A is applied?” To: “How effective is A at achieving outcome B, and under what conditions?”

Closing thoughts

Choosing an Extended Essay topic that produces evaluation is part curiosity, part craft. Start with a question you care about, aim for a formulation that demands judgment, make your criteria explicit, and design methods that actually test the claims you make. If you plan carefully and keep the focus on weighing evidence and perspectives, your EE will move beyond description and become a piece of original, critical scholarship that truly reflects the skills IB DP aims to develop.

Photo Idea: A student and supervisor discussing draft notes over coffee, with open notebooks and highlighted passages
