
Most advice on test cases pushes for more structure than you need. Here's Testpad's take on writing test cases that are clear, easy to maintain, and suited to the way your team actually tests.
A test case is an instruction for what to test. Writing one is as simple as writing down what you want to check – "verify password reset email sends correctly" is a perfectly good test case.
Most guides will tell you otherwise. They'll push formal formats with multiple required fields, elaborate metadata, and step-by-step instructions for every possible action. At Testpad, we think that level of detail creates more work than it saves for most teams.
This guide covers what test cases genuinely need – and how to think about structure, detail, and maintenance in a way that keeps testing moving rather than slowing it down.
A test case is a single test instruction. One thing to check as part of testing your software.
That instruction might be a brief prompt – "check login with valid credentials" – or a more detailed set of steps with expected outcomes. Both are test cases. The format depends on who's testing, how well they know the product, and how much guidance they need.
At Testpad, we use the term test prompts rather than test cases. The idea is the same – an instruction for what to test – but the emphasis is on keeping things outcome-focused rather than prescriptive. Instead of spelling out every step, a test prompt points testers at what to investigate and trusts them to use their judgment. It's a deliberately lighter approach, and it works well for teams doing regular release testing or exploratory work.
Read the Testpad guide on how to write effective test prompts.
At minimum, a test case needs to be clear enough that the right person can act on it.
In practice, most test cases benefit from three things:

- A name or prompt that says what to check
- An expected outcome, so the tester knows what a pass looks like
- Enough context for the intended tester to act on it without asking
The more formal version also includes steps to execute, test data, priority, and metadata like affected components or linked requirements. In Testpad, you can add all of that if you need it. For most tests, you won't. A short prompt with a clear outcome is enough to direct a tester's attention and record a meaningful result.
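The difference in weight is easy to see when you write both versions down. A sketch of the same test as a one-line prompt and as a fully specified case – the field names here are illustrative, not Testpad's schema:

```python
# A short prompt: one line is the whole test case.
prompt = "Valid credentials log in successfully"

# The formal version of the same test, with the optional fields filled in.
# Every one of these fields is something to keep up to date later.
formal_case = {
    "title": "Login with valid credentials",
    "priority": "high",
    "preconditions": ["User account exists"],
    "steps": [
        "Open the login page",
        "Enter a valid username and password",
        "Click submit",
    ],
    "expected_outcome": "User lands on the dashboard",
    "linked_requirements": ["AUTH-1"],
}
```

Both describe the same check; the second just costs more to write and maintain.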
As detailed as it needs to be, and no more. The instinct to add detail comes from a good place. More detail feels safer. But detailed test cases take longer to write, longer to read, and significantly longer to maintain. When your product changes, every overly specific step becomes something to update.
If your tester knows the product, keep test cases short. A one-line prompt is enough. If your tester is unfamiliar with the product – a new joiner, a client doing UAT, a guest tester – more detail helps them know what to do.
In Testpad, you can adjust the level of detail per test or per section within the same script. Most things stay brief and detail gets added where it genuinely helps.
Before worrying about format, get the ideas down. What are the most important things to check? Start with the main features and core user flows – login, key actions, anything that would be embarrassing to break.
In Testpad, this usually means opening a new script and typing out a rough list. A rough list that exists is more useful than a polished template waiting to be filled in.
The most common test case mistake is describing every click rather than the outcome you're checking.
Instead of: "Click the login button, enter username 'test@example.com', enter password '12345', click submit, verify you see the dashboard"
Write: "Valid credentials log in successfully"
The second version trusts the tester to know how to log in. It focuses on what matters – does it work? – and stays useful even if the UI changes. This is the thinking behind Testpad's test prompt approach: point testers at the outcome, let them figure out the path.
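The same principle carries over if you ever automate a check. A minimal sketch in Python – the `log_in` function is a stand-in invented for illustration, not part of Testpad or any real app:

```python
# Stand-in credential store and login logic, purely for illustration.
VALID_USERS = {"test@example.com": "12345"}

def log_in(username, password):
    """Returns True when the credentials match a known user."""
    return VALID_USERS.get(username) == password

# Outcome-focused: one assertion about what matters, no UI choreography.
# If the login flow's steps change, this test still describes the outcome.
def test_valid_credentials_log_in_successfully():
    assert log_in("test@example.com", "12345")

def test_invalid_password_is_rejected():
    assert not log_in("test@example.com", "wrong")
```

The test names read like test prompts: they say what should be true, not which buttons to press.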
Edge cases matter, but you don't need to test every conceivable one. Focus on edges that are reasonably likely to happen and would cause real problems if they failed – invalid inputs, empty states, boundary values, things that have broken before.
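Boundary values are easy to pin down once you name the limits. A sketch using a hypothetical validation rule (usernames of 3–20 characters, assumed for illustration only):

```python
# Hypothetical rule, invented for this example: usernames must be
# between 3 and 20 characters inclusive.
def username_is_valid(name):
    return 3 <= len(name) <= 20

# The edge cases worth testing sit at and just beyond each limit,
# plus the empty state.
cases = {
    "": False,        # empty state
    "ab": False,      # one below the minimum
    "abc": True,      # exactly the minimum
    "a" * 20: True,   # exactly the maximum
    "a" * 21: False,  # one above the maximum
}

for value, expected in cases.items():
    assert username_is_valid(value) == expected
```

Five cases cover the edges that matter; testing every length from 1 to 100 would add nothing.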
Organize test cases by feature, user flow, or area of the product – whatever makes it easy to see what's covered and what isn't. Testpad's Scripts > Folders > Sub-Folders structure handles this naturally. You get a clear hierarchy without any extra configuration.
Once you have a draft list, read back through it. Are there duplicates? Tests so obvious they add no value? Steps that are really one test written twice? Cut them. A shorter list that gets used is more valuable than a long one that doesn't.
When do more detailed test cases make sense?
Testpad isn't against detailed test cases. There are genuine situations where they earn their keep:

- Testers who don't know the product – new joiners, clients doing UAT, guest testers
- Tests that depend on specific data, setup, or preconditions that aren't obvious
- Regulated or audit-heavy environments where recorded steps are required
Outside of these situations, shorter is usually better. The goal isn't thoroughness for its own sake – it's finding problems before users do.
A traditional test case is a formal document with prescribed fields – title, preconditions, steps, expected outcomes, and metadata. It's built for completeness and traceability.
A test prompt is Testpad's lighter version: a specific instruction that points testers at what to investigate without prescribing every step. The tester brings judgment to how they check it.
The practical difference is speed. Prompts are faster to write, easier to maintain, and leave more room for testers to find things a step-by-step approach would miss. For teams without a formal QA function – developers, product managers, or small teams sharing testing responsibilities – that difference matters.
Enough to cover the things that matter. There's no target number. A focused set of 30 test cases covering your critical paths is more valuable than 300 that include things nobody reads. Quality of coverage matters more than quantity.
Testpad's starting point: every important user journey, every area that's changed in the current release, and every bug that's been fixed and could plausibly come back.
Test cases age. Features change, the UI changes, and test cases written six months ago start pointing at things that no longer exist.
A few habits that help:

- Review the list each time you test a new release, and cut tests for features that no longer exist
- Keep prompts outcome-focused, so UI changes don't invalidate every step
- When a test fails because the product changed rather than broke, update the test there and then
In Testpad, duplicating a script for a new release copies the tests but not the results. Your test library stays current without starting from scratch each time. Have a look here at how that works in practice.
In Testpad, a script is your test plan: a checklist of everything you want to cover in a release. You open one, write down the first few things you want to check, and go from there. Testpad works just as well with a short list as it does with a detailed, organized test plan – you add structure as you need it.