Top (Christmas) testing tips

There isn’t much that screams ‘Christmas’ quite like ‘Quality Control’... right?! OK, maybe not, but the Kineo elves have been busy cooking up some Top Tips for a warm winter’s testing.

You may find yourself becoming your own QA department if you’re producing e-learning in-house, or setting expectations with a partner on QA standards – but whatever you do, you’ll always be QA-in-chief for your own work. In all cases, remember to get a full lowdown on both clients’ and co-workers’ expectations for each project: technical specs, structure and functionality, style and branding, tone of voice, any other nuances, etc. The info you need is out there somewhere, so don’t be a Scrooge – get the team sharing!

With that in mind, here are a few tips. Apologies in advance for any seasonal punnery – QA doesn’t extend to removing gags, it transpires.


#1. Take thyme to marinate your text

It’s an important (if seldom voiced) aspect of QA’s job to actively ensure that everyone’s working from the same page. When dealing with particularly complex material you’ll often be trying to keep up with pages of specialised industry jargon, tricky terms, acronyms and abbreviations, or odd capitalisation rules – and every tiny detail has to be right for each customer. Consistency is the first law of testing; make sure you get agreement from all relevant parties on how client-specific terms should appear. A style guide is well worth the upfront investment if it allows you to avoid a five-way capitalisation fight during review.
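If your style guide lives in a document, some of that consistency checking can even be automated. Here’s a minimal sketch of the idea – the glossary terms and function names are purely illustrative, not part of any real tool:

```python
import re

# Hypothetical style guide: each approved spelling mapped to the off-style
# variants that should be flagged if they appear in course text.
STYLE_GUIDE = {
    "e-learning": ["elearning", "eLearning", "E-Learning"],
    "LMS": ["lms", "Lms"],
}

def find_style_issues(text):
    """Return (variant, approved) pairs for every off-style term found."""
    issues = []
    for approved, variants in STYLE_GUIDE.items():
        for variant in variants:
            # Whole-word, case-sensitive match, so the approved form passes.
            if re.search(r"\b" + re.escape(variant) + r"\b", text):
                issues.append((variant, approved))
    return issues
```

Running `find_style_issues("Our eLearning course sits on the LMS.")` would flag `eLearning` and suggest the agreed `e-learning` instead. It won’t replace a human proofread, but it catches the mechanical slips early.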

Choose the correct present to continue

Imagine the disappointment of opening up a Christmas pressie (that big box with a bow and ribbon you’ve been secretly eyeing) whose tag is made out to you, only to discover it’s for Aunt Brenda instead. Classic case of misleading instructional text...

That’s the kind of dilemma you’ll be faced with if scripted instructions don’t match button labels within a course, or don’t provide clear and accurate signposts for interactivity. It’s something to look out for early on. A quick chat between the relevant designer, developer, tester and graphic artist (and/or art director) will ensure your instruction text is clear, consistent and relevant.

Try to see scripts as early as possible, and request a thorough brief upfront (or at least have an in-house, default style guide to go by). That way, if anything is questioned further down the line, you’ll have a pre-agreed reference to fall back on.

Proofing scripts while looking out for hot issues – like instruction text, which may be transferred into the build – will save you time reporting issues later on.


#2. Go cold turkey!

We all know how important it is to develop good habits and sound testing practices. An efficient process is key to handling larger and more complex e-learning productions. Errors left in at the start can quickly scale up and get out of hand. But sometimes you just have to break out of those old habits.

We don’t often talk about the more reactive or intuitive aspects of QA. Hopefully, testing will be there to catch any unwanted snowflakes before they start to snowball, but suggesting a good pre-testing review of the course by the writers, Subject Matter Experts (SMEs) and designers can make a world of difference. Here’s why...

A trifle subjective...

Before getting your Xmas testing mittens on again, remember that QA’s job at each stage of a project lifecycle includes:

  • Spotting potential issues that can arise given the unique structure of each project
  • Accounting for change, and for subjective design matters
  • Sensing when a course just isn’t ready for testing and pushing back for more production work if necessary.

All of which explains why a good mid-way review can be helpful.

Time and budget must be accounted for, of course. But designers, SMEs and writers generally have a better insight into the agreed learning outcomes, and/or more subjective design matters that QA may not be able to scrutinise objectively, even with a good brief.

Once scripts are finalised and any changes have been agreed, the first version of the course may well be a rough prototype. If so, testing is likely to pull up a ton of issues. That’s OK! It’s early days – log as much as possible and where appropriate ask the design and development teams to feed back to your client (internal or external) with any queries or concerns.


#3. Make it a tough nut to crack

As you get closer to a final version, things will have to look and feel much more polished. No room for errors; by this stage there should ideally be no major changes going in. If there are, they can cause problems and knock-on effects. For example, changes to functionality late in the game could totally alter the way a course tracks and scores on an LMS, or typos in client-amended text could inadvertently be copied into the course. Ideally, proof comments and amends before developers implement them. Even if a document is proofed first, copy/paste errors or odd characters can still creep back in.

If you’re working in a small team, it’s still worth spending some quality time actively trying to break the course – in both orthodox and obscure ways! Your technical team can be a true source of wisdom here – they get to see the nuts and bolts. If you can break the functionality intentionally within a day or two, chances are your hundreds or thousands of learners will break it accidentally over the coming weeks or months.
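One cheap way to do that breaking systematically is a "monkey test": hammer the course’s navigation with random clicks and check that nothing ends up in an impossible state. The sketch below uses a toy stand-in for a course – the class and its rules are invented for illustration, not real course code:

```python
import random

class MockCourse:
    """A toy stand-in for course navigation (purely illustrative)."""
    def __init__(self, pages=10):
        self.pages = pages
        self.current = 0

    def next(self):
        # Deliberately simple: clamp at the last page.
        self.current = min(self.current + 1, self.pages - 1)

    def back(self):
        self.current = max(self.current - 1, 0)

    def menu(self):
        self.current = 0

def monkey_test(steps=1000, seed=42):
    """Hammer navigation with random clicks and check the invariant holds."""
    random.seed(seed)
    course = MockCourse()
    for _ in range(steps):
        random.choice([course.next, course.back, course.menu])()
        assert 0 <= course.current < course.pages, "navigation escaped bounds"
    return True
```

A thousand random clicks in a loop is a crude approximation of what a thousand learners will do to your course in week one – if the invariant survives the loop, you’ve bought yourself a little confidence.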

So don’t be haunted by the Ghost of Christmas Past – innovate, keep up with new tools and techniques and keep your approach to testing fresh! Otherwise, unanticipated bugs can creep in, and blaming new functionality (complex branching, anyone?!) won’t wash. No excuse for blaming it on the elves (or that second glass of sherry) this year...

That’s all for now. Have a wonderful, bug-free Christmas!

The Testing Team x