Writing your first manual testcase

This page will walk through writing and contributing a manual testcase.

Requirements

There are some things you need to set up before you can contribute; they are documented here.

An optional tool is bzr-explorer, a graphical front end for bzr:

sudo apt-get install bzr-explorer
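If you have not used bzr with Launchpad before, part of that initial setup is telling bzr who you are and logging in to Launchpad. A minimal sketch follows; the name, email address, and Launchpad username are placeholders, and pushing a branch later also assumes you have an SSH key registered with your Launchpad account.

bzr whoami "Your Name <you@example.com>"

bzr launchpad-login your-launchpad-id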

Should you get stuck / need help

Don't forget the wonderful Quality community you can get help from if you get stuck! Here's a list of resources you can use to connect with others for help.

Use these resources to help you!

Where Manual Testcase data is kept

Launchpad holds the source code for all manual testcases in the manual testcase project. Requests for currently needed testcases, and bugs against already written testcases, can all be found on Launchpad. Code contributions are submitted as merge requests.

Bzr is the version control system used. You can see a reference chart here, as well as a quick start guide, if you need help with the basic commands required to contribute a testcase.

The QATracker is the master repository for all of our testing within Ubuntu QA. It records results and helps coordinate our testing events. Learn more about it here. Code committed to the manual testcase project will be used for testing on the QATracker.

Style Guide

A quick review of how the testcases should look and tips to keep in mind while you're writing.

Writing the testcase

Get the current testcases

Open up a terminal and make a new folder for you to develop under.

For this example, let's utilize ~/manualtests.

mkdir ~/manualtests

Change directory into the new folder, then branch the current testcases:

cd ~/manualtests

bzr branch lp:ubuntu-manual-tests

This will grab a copy of the current testcases used by the project. We can then use bzr to add the new testcase we'll write below and finally commit it back to the project.
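Once the new testcase file is written (the sections below walk through that), the add-and-commit side of the workflow looks roughly like this. This is only a sketch; the file name and commit message are placeholders.

cd ubuntu-manual-tests
bzr add My_New_Testcase
bzr commit -m "Add a manual testcase for My Application"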

Choose an application

Choose an application you wish to write the testcase for. Here's a list of needed testcases that you can choose from. Don't see something you like on the list? We still welcome your contribution. Testcases for any image, package, or piece of hardware that runs on Ubuntu or an official flavour are welcome!

Assigning the work to you

Each testcase request is listed as a bug report. Once you've chosen a bug (test) or two to work on, assign yourself to that bug in Launchpad and mark it as "In Progress". You will use the bug later on as part of your merge request.

Write the test

When you write your test, it is quite important that you don't assume what number the testcase will be given; the number is created when the Testcase Admins add your work onto the QA tracker(s).

Create Action Steps

Run the application you've chosen and pick a few of its features. Document each feature you've chosen and write down step-by-step instructions for exercising it. Let's quickly take a look at a current testcase to get an idea of how this works.

The first testcase is simple enough. Its goal is to 'check that Update Manager opens correctly'. You can see that each step is laid out with an action and expected result.

Notice the action-followed-by-expected-result formatting, and notice the HTML markup of dt and dd that is used. You can see the proper format information here.

Create Expected Results

Now, run through the steps you wrote down to ensure they exercise the feature you targeted. As you step through your instructions, record what happens for each step so you can add them to the test case. These will become your expected results in the testcase. In our example, you can see this like so:

ACTION: Open Update Manager

RESULT: The Software Updater starts, checking for updates as it opens.

Finish Formatting and verify your test

Format your testcase steps and expected results in the proper testcase format. You should use the 'normal' testcase format, where each step represents an 'action' followed by an 'expected result'. Note that you have to use the HTML markup as shown below. The tracker will interpret the HTML to display and number your items properly.
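As a rough illustration of that markup (the format reference linked earlier is authoritative), the Update Manager step from our example might be written as follows, with the action in a dt element and the expected result in a dd element; further steps are added as additional dt/dd pairs inside the same dl.

<dl>
    <dt>Open Update Manager</dt>
        <dd>The Software Updater starts, checking for updates as it opens.</dd>
</dl>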

Run through your test as if you were executing it as a testcase. Ensure you didn't miss any steps, or make any assumptions. It can be helpful to have someone else run the test for you to ensure it's successful for them as well. Opening the file in a word processor is an easy way to check for spelling mistakes.

Check the formatting

Once you have saved your testcase, you can check that the formatting is correctly applied with the testcase format script.

If the formatting is correct, the script will report that by showing the testcase name:

test-case-format 1596_Update\ Manager 
1596_Update Manager

If an error is found, the script will report that by showing where the first line with errors can be found:

 test-case-format 1656_StartupDiskCreator 
1656_StartupDiskCreator
line 1 column 1 - Warning: discarding unexpected plain text
line 5 column 15 - Error: unexpected </dt> in <dl>
1 warning, 1 error were found!

Fix the formatting error found and retest until there are no errors in the file. Now you are in a position to push your new testcase to the branch for verification.

What next?

Before your new testcase can be verified and merged, it needs to be pushed so that others can see it, comment on any changes that might be required, and of course approve it.

After that it will enter the pool of testcases available for use on the various QA trackers. From there it can be used for all sorts of manual testing events: a special call for testing, or the normal cadence testing that occurs throughout the cycle.

To push your new testcase to your branch, check out the QA Tool Usage page, where more detail regarding bzr and, importantly, linking your branch to bugs can be found.
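As a rough sketch, and assuming placeholder names for your Launchpad ID and branch, publishing your committed testcase for review might look like this:

bzr push lp:~your-launchpad-id/ubuntu-manual-tests/my-new-testcase

Passing --fixes lp:<bug number> when you commit is one way to link your work to the bug you assigned yourself earlier; you can then propose the branch for merging into lp:ubuntu-manual-tests from Launchpad. The QA Tool Usage page remains the authoritative reference for these steps.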

Video Screencast of Tutorial

A video version of the previous version of this tutorial can be seen here.
