from Hacker News

Ask HN: How to measure QA in a startup?

by muratk on 1/25/16, 5:10 AM with 2 comments

We have a web-app and a QA engineer who's helping to ensure we don't ship outages and that we know about severe regressions. (We're not aiming for pixel-perfect at the moment, but for good enough.) She's doing that by building up a Selenium suite that catches regressions just fine, doing spec-driven testing for new features, and doing exploratory testing for the unexpected.

How can I give her a sense of “a job well done”? I can and do /tell/ her, because she's doing great, but I'd like to do better and /show/ her: here's how we've been doing, and this has improved over the last weeks; you're doing great.

We sat together to think about this, and came up with these numbers to collect per week:

* #severe bugs (bugs in our core functionality that are encountered by customers for whom there are no workarounds)

* #reported bugs (reported by customers)

* #caught severe bugs (bugs found in staging that hold up deployment — basically bugs in the above “severe” category that we caught before they went live)
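To make those three counts concrete, here is a minimal sketch of how they could be tallied per week from a bug log. The record format and category names ("severe", "reported", "caught") are assumptions for illustration, not anything from the post's actual tooling:

```python
from collections import Counter

# Hypothetical bug log: (ISO week the bug was logged, category).
# Categories mirror the post: "severe"   = hit customers, no workaround;
#                             "reported" = reported by a customer;
#                             "caught"   = severe bug stopped in staging.
bugs = [
    ("2016-W03", "severe"),
    ("2016-W03", "reported"),
    ("2016-W04", "caught"),
    ("2016-W04", "caught"),
    ("2016-W04", "reported"),
]

def weekly_counts(bug_log):
    """Tally each category per week, filling in zeros for missing ones."""
    counts = Counter(bug_log)  # keyed by (week, category) pairs
    weeks = sorted({week for week, _ in bug_log})
    return {
        week: {cat: counts[(week, cat)]
               for cat in ("severe", "reported", "caught")}
        for week in weeks
    }

print(weekly_counts(bugs))
# e.g. week 2016-W04 shows 2 caught severe bugs and 1 reported bug
```

Plotting these per-week dictionaries over time would give the "here's how we've been doing" view the post asks for: #severe trending down while #caught stays up is the good-news shape.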

If those numbers look good, she can feel she's doing a great job.

How are you doing that? Do you have better ideas to measure the success and progress of QA in a startup?

  • by brudgers on 1/25/16, 6:13 PM

    The measure of everything in a startup is growth. The message to QA should be the same as for everyone else: We are growing! Trust that your employee will do a good job figuring out when something bad happens from a technology standpoint.

    If the employee is unhappy, it is more likely to be for cultural reasons than anything else. Management by walking around can develop rapport. Asking "what can I do to help you?" is a good place to start.

    If you're looking for metrics, ask an expert: e.g. ask the employee what the employee tracks.

    Good luck.

  • by Isammoc on 1/25/16, 8:18 AM

    * #regressions (bugs already seen, already fixed, but coming back) => should be zero (but sadly, not everywhere)

    * Is she confident when there is a release?

    * If there is an API: #questions (from devs) not already answered in the documentation

    There are many more metrics you could add... but what's relevant is business-dependent...