activities

Differences between revisions 10 and 29 (spanning 19 versions)
Revision 10 as of 2012-04-25 14:17:30
Size: 12239
Editor: 87
Comment:
Revision 29 as of 2016-06-08 11:01:09
Size: 16388
Editor: localhost
Comment:
= Activities Planner and QQ Release Schedule =
 * '''Period:''' QQ (12.10) Development Cycle (May to October 2012).
 * '''Start date:''' Release date of Precise Pangolin (26th April 2012).

= Activities Planner and Development Release Schedule =
 * '''Period:''' YY (16.10) Development Cycle (May 2016 to October 2016).
 * '''Start date:''' Release date of Xenial Xerus (May 2016).
 * '''Weeks:''' Start on a Thursday.
 * '''Aim:''' To identify testing opportunities that fit in with the development release process.

||<tablestyle="float:right; font-size: 0.9em; width:100%; margin: 2.5em 0 1em 0em; border: none; border-radius: 10px 10px 10px 10px;" style="margin: 0; padding:0.5em;" style="width:10%;">||<style="width:15%;">||<style="width:15%;">||<style="width:40%;">||<style="width:20%;">||
||<rowbgcolor="#cccccc"> '''Week''' || '''Date''' ~-(Thursday)-~|| '''Work Item Iteration''' || '''Status''' || '''Notes''' ||
||<-5 #eeeeee> May 2012 ||
|| 1 || May 3rd || A-2 ||<#CCFFCC> || <!> Toolchain Uploaded ||
|| 2 || May 10th || A-2 ||<#CCFFCC> || {*} '''Developer Summit''' ||
||<style="color:#FF0000;"> '''U+1''' || date TBC || ||<#CCFFCC> '''Testing the tool chain''' || ||
|| 3 || May 17th || A-2 ||<#CCFFCC> || ||
|| 4 || May 24th || A-2 ||<#CCFFCC> || ||
|| 5 || May 31st || A-2 ||<#CCFFCC> /!\ FeatureDefinitionFreeze || ||
||<-5 #eeeeee> June 2012 ||
||<style="color:#FF0000;"> '''U+1''' || June 2 - 6 || ISO testing ||<#FFFFCC> '''Live session and install tests of pre-alpha 1 ISO''' || ||
|| 6 || June 7th || A-2 ||<#FFFFCC> || Alpha 1 ||
||<style="color:#FF0000;"> '''U+1''' ||||<(> June 7 - August 09 ||<#FFFFCC> '''Identifying and testing packages subject to Partner Upload Deadline''' || ||
||<style="color:#FF0000;"> '''U+1''' ||||<(> June 7 - August 23 ||<#FFFFCC> '''Identifying and testing packages subject to a Feature Freeze Deadline''' || ||
||<style="color:#FF0000;"> '''U+1''' ||||<(> June 7 - August 30 ||<#FFFFCC> '''Identifying and testing packages subject to User Interface Freeze''' || ||
|| 7 || June 14th || A-2 ||<#FFFFCC> || ||
|| 8 || June 21st || A-2 ||<#FFFFCC> /!\ DebianImportFreeze || ||
||<style="color:#FF0000;"> '''U+1''' ||||<(> June 21 onwards ||<#FFFFCC> '''Identifying and testing Debian packages post Debian Import Freeze''' || ||
||<style="color:#FF0000;"> '''U+1''' || June 23 - 27 || ISO testing ||<#FFFFCC> '''Live session and install tests of pre-alpha 2 ISO''' || ||
|| 9 || June 28th || A-2 ||<#FFFFCC> || Alpha 2 ||
||<-5 #eeeeee> July 2012 ||
|| 10 || July 5th || Iteration Planning ||<#FFEBBB> || ||
|| 11 || July 12th || A-3 ||<#FFEBBB> || Rally ||
|| 12 || July 19th || A-3 ||<#FFEBBB> || ||
|| 13 || July 26th || A-3 ||<#FFEBBB> || ||
||<style="color:#FF0000;"> '''U+1''' || July 28 - Aug 01 || ISO testing ||<#FFEBBB> '''Live session and install tests of pre-alpha 3 ISO''' || ||
||<-5 #eeeeee> August 2012 ||
|| 14 || August 2nd || A-3 ||<#FFEBBB> || Alpha 3 ||
|| 15 || August 9th || Iteration Planning ||<#FFCCCC> || ||
|| 16 || August 16th || Beta UI ||<#FFCCCC> || {*} '''Ubuntu 12.04.1'''||
|| 17 || August 23rd || Beta UI ||<#FFCCCC> /!\ FeatureFreeze || ||
|| 18 || August 30th || Beta UI ||<#FFCCCC> /!\ UserInterfaceFreeze, /!\ Beta 1 Freeze || ||
||<-5 #eeeeee> September 2012 ||
||<style="color:#FF0000;"> '''U+1''' || September 01 - 05 || ISO testing ||<#FFCCCC> '''Live session and install tests of pre-beta 1 ISO''' || ||
|| 19 || September 6th || Beta UI||<#FFCCCC> || Beta 1 ||
|| 20 || September 13th || Quality ||<#E47A7A> || ||
|| 21 || September 20th || Quality ||<#E47A7A> /!\ Beta 2 Freeze || ||
||<style="color:#FF0000;"> '''U+1''' || September 22 - 26 || ISO testing ||<#E47A7A> '''Live session and install tests of pre-beta 2 ISO''' || ||
|| 22 || September 27th || Quality ||<#E47A7A> || Beta 2 ||
||<-5 #eeeeee> October 2012 ||
|| 23 || October 4th || Quality ||<#E47A7A> /!\ KernelFreeze, /!\ DocumentationStringFreeze, /!\ NonLanguagePackTranslationDeadline || ||
||<style="color:#FF0000;"> '''U+1''' || Date TBC || ||<#E47A7A> '''Identifying and testing kernel packages post kernel freeze''' || ||
|| 24 || October 11th || Quality ||<#BB3333> /!\ FinalFreeze, /!\ ReleaseCandidate, /!\ LanguagePackTranslationDeadline || ||
||<style="color:#FF0000;"> '''U+1''' || October 13 - 17 || ISO testing ||<#BB3333> '''Live session and install tests of pre-final release''' || ||
||<style="color:#FF0000;"> '''U+1''' || Date TBC || ||<#BB3333> '''Identifying and testing packages relative to post release candidate''' || ||
|| 25 || October 18th || Quality ||<#BB3333> /!\ '''FinalRelease''' || {*} '''Ubuntu 12.10''' ||
 * '''QA''' Cadence weeks run from Saturday to Saturday.
 * Information on this page is drawn from various sources freely available to the public.

[[https://wiki.ubuntu.com/YakketyYak/ReleaseSchedule|Yakkety Yak Release Schedule]]

= Methods and Means =


== ISO Image Testing - Cadence Testing ==

 * Every 2 weeks, as part of our testing cadence, we download copies of the latest daily ISO images, burn them to CDs (or load them into VMs) and test them. This brings to light many issues that might have been missed by other early adopters and developers, especially in the CD builds and installers.

 * If you are interested in this kind of testing, see the [[https://wiki.ubuntu.com/Testing/ISO|ISO information page]] for instructions on getting started with ISO testing and directions for using the [[http://iso.qa.ubuntu.com/|test tracker]].

 * The [[https://wiki.ubuntu.com/Testing/ISO/Walkthrough|ISO Testing Walkthrough]] is a good place to start.

 * Also ensure you've joined the [[https://lists.ubuntu.com/mailman/listinfo/ubuntu-quality|Ubuntu QA mailing list]] to know when the testing weeks are occurring.
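Before burning or booting a freshly downloaded daily image, it is worth confirming the download is intact. A minimal sketch of checksum verification (the ISO name here is hypothetical; for a real image you would fetch the `SHA256SUMS` file published alongside the ISO, while the demo below fabricates a stand-in file so the commands can be tried end to end):

```shell
# Stand-in for a downloaded daily image (hypothetical name):
echo "fake iso payload" > yakkety-desktop-amd64.iso

# Stand-in for the published SHA256SUMS file:
sha256sum yakkety-desktop-amd64.iso > SHA256SUMS

# The actual verification step -- prints "<name>: OK" on a match:
sha256sum -c SHA256SUMS
# prints: yakkety-desktop-amd64.iso: OK
```

A mismatch here usually means a corrupted or incomplete download, and re-downloading the image is faster than chasing phantom installer bugs.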
-----

== ARM Testing ==

 * Additionally, if you have the hardware for it, we are actively helping push Ubuntu forward onto ARM architectures. Check out the [[https://wiki.ubuntu.com/ARM/QA|ARM pages]] for more information on testing Ubuntu ARM images, including testing on a [[https://wiki.ubuntu.com/ARM/QA/Pandaboard|pandaboard]].

-----

== Daily Smoke Testing ==

 * Between milestones, it is also good (but not release critical) to test the daily ISOs and as many applications as possible. This kind of testing can be done in a Virtual Machine or on spare hardware.

-----

== Stable Release Update (SRU) Testing ==

 * All [[https://wiki.ubuntu.com/StableReleaseUpdates|stable release updates]] are first uploaded to the Proposed repository before they are released live. An important principle behind Ubuntu is that we should never make a working system worse; stable release updates need to be extensively tested to make sure that there are no regressions.

 * If you are interested in this kind of testing, there is a public Launchpad [[https://launchpad.net/~sru-verification|SRU Verification team]] that you can join, as well as a fantastic [[https://wiki.ubuntu.com/QATeam/PerformingSRUVerification|SRU wiki page]] with helpful tools for finding bugs to work on.
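Testing a package from Proposed first means enabling that pocket. One common way (substitute your own mirror and release codename for `precise` below; the exact component list is an assumption and can be trimmed) is to add a line like the following to `/etc/apt/sources.list`, then run `sudo apt-get update`:

```
deb http://archive.ubuntu.com/ubuntu precise-proposed restricted main multiverse universe
```

Remember to disable the pocket again after verification, or you will keep receiving unverified updates.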

-----

== Feature Testing ==

 * One of the objectives each cycle is to provide testing for all the major features in the release. We organize periodic calls for testing to focus on new features, such as a new video driver or a big change to an important tool like Software Center.

 * If this kind of testing interests you, please join the [[https://lists.ubuntu.com/mailman/listinfo/ubuntu-quality|Ubuntu QA mailing list]]. You will receive periodic emails about calls for testing a specific feature and how to participate.

-----

== General Testing ==

 * Ubuntu is more than its default installation and basic tasks - it's an entire repository of software and possible configurations. We need to test as much of it as possible, and some things aren't well covered by established test cases and procedures. General testing is as simple as attempting to use the development release and reporting whatever problems you run into as a bug.

 * If you are interested in this kind of testing, install the development version of Ubuntu, and report and follow up on any bugs you encounter. The mailing list and weekly QA meetings are another great place to discuss any bugs found that you feel may be of a more critical nature or have a widespread effect.

== Application Testing ==

 * Application testing is the manual testing of specific things (test cases) in applications. Regression tests are specific tests for potential breakages from one release to another (they're also relevant for SRU testing, above).
 * If you are interested in this kind of testing, head to the [[http://packages.qa.ubuntu.com/|Packages QA Tracker]] and run through the application testcases and report your results. NOTE: The application testcases are currently in the process of being migrated during the quantal cycle.
 * Now Testing
  * Week of 15th November - 12.10 kernel 3.5.0.18 on 12.04 userspace.

'''Install Quantal kernel'''
Prerequisites: Make sure you are running the latest version of Precise and that all your packages are up to date.

'''1)''' Add the [[https://launchpad.net/~ubuntu-x-swat/+archive/q-lts-backport|X-team ppa]]

{{{sudo add-apt-repository ppa:ubuntu-x-swat/q-lts-backport}}}

'''2)''' Update apt and install the new kernel

{{{sudo apt-get update && sudo apt-get install linux-image-generic-lts-quantal linux-headers-generic-lts-quantal}}}

'''3)''' Restart your computer to boot into the new kernel

'''Uninstall Quantal kernel'''
You may remove the 3.5 kernel by using ppa-purge

'''1)''' Install ppa-purge

{{{sudo apt-get install ppa-purge}}}

'''2)''' Remove the ppa

{{{sudo ppa-purge ppa:ubuntu-x-swat/q-lts-backport}}}

'''3)''' PPA Purge will find the installed packages and offer to downgrade them.
Say yes, and ppa-purge will remove the upgraded versions and reinstall the versions from the archive.

'''4)''' Remove the meta packages

{{{sudo apt-get remove --purge linux-image-generic-lts-quantal linux-headers-generic-lts-quantal}}}

'''5)''' Remove the kernel itself. While running the new kernel, enter the following command

{{{uname -r}}}

This returns the running kernel's version string, like '3.5.0-8-generic'. Use the version number (3.5.0-8) to replace the word KERNEL in the commands below.

{{{sudo apt-get remove --purge linux-image-KERNEL-generic linux-headers-KERNEL linux-headers-KERNEL-generic}}}

{{{sudo apt-get autoremove}}}

Eg, for '3.5.0-8':

{{{sudo apt-get remove --purge linux-image-3.5.0-8-generic linux-headers-3.5.0-8 linux-headers-3.5.0-8-generic}}}
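The KERNEL substitution in step 5 can be scripted rather than done by hand. A small sketch (a fixed sample string stands in for a real `$(uname -r)`, and the final command is only echoed so nothing is actually removed):

```shell
# uname -r prints something like "3.5.0-8-generic".
# Strip the "-generic" flavour suffix to get the version used
# in the package names (use release="$(uname -r)" on a real system):
release="3.5.0-8-generic"
version="${release%-generic}"          # -> 3.5.0-8

# Echo the removal command instead of running it, so it can be
# inspected before being pasted back with sudo:
echo "sudo apt-get remove --purge \
linux-image-${version}-generic \
linux-headers-${version} \
linux-headers-${version}-generic"
```

Printing the command first is a deliberate safety step: purging the wrong kernel image can leave a machine unbootable, so check the echoed package names against `dpkg -l | grep linux-image` before running it.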


__Notes based upon experience__

During periods of transition from testing one kernel to testing the next, we get the error message: "Unable to locate linux-image-generic-lts-quantal." The answer is to wait a few days and then run the update command again. It also takes the QA tracker a few days to be updated to allow reporting for that new kernel image.

[[http://packages.qa.ubuntu.com/qatracker/milestones/223/builds/25321/buginstructions|Bug reporting instructions]]
/* each cell has a place holder with the same colour as the background. This hides the text. Replace the place holder with your text and remove the style=color format and change the background color or remove it */
/* Unused cells have #F1F1DD as the background color; Test dates yet to be held should have a red background. Test dates completed should have a green background. */

||Alpha 1||June 02-06||
||Alpha 2||||<style="background-color:#F1F1DD;"> ||June 23 - 27 ||
||Alpha 3||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||July 28 - August 01||
||Beta 1 ||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||September 01 - 05||
||Beta 2 ||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||September 22 - 26||
||12.10 ||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||<style="background-color:#F1F1DD;"> ||||October 13 - 17||

||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||
||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||
||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||

= Coordinated =

/* Each cell has a place holder with the same colour as the background. This hides the text. Replace the place holder with your text and remove the style=color format for the text and change the background color or remove it */
/* Unused cells have #F1F1DD as the background color; Test dates yet to be held should have a red background. Test dates completed should have a green background. */

||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||
||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||
||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||
||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Start date||

= Regular =

/* each cell has a place holder with the same colour as the background. This hides the text. Replace the place holder with your text and remove the style=color format and change the background color or remove it */
/* Unused cells have #F1F1DD as the background color; Test dates yet to be held should have a red background. Test dates completed should have a green background. */

||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||
||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||
||New task||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||<style="background-color:#F1F1DD;" style="color:#F1F1DD;">Period||

||Daily ISO testing||<style="background-color:#FFFF80;" width="75%"> || 18 October ||

Color ranges being tested here. I want paler colors than these.

||<style="background-color:#FFFF00;">placeholder||<style="background-color:#00FF00;">Placeholder||<style="background-color:#FF0000;">Placeholder||<style="background-color:#0000FF;">Placeholder||<style="background-color:#00FFFF;">Placeholder||<style="background-color:#FF00FF;">Placeholder||Placeholder||Placeholder||
||placeholder||Placeholder||Placeholder||Placeholder||Placeholder||Placeholder||Placeholder||Placeholder||

= Ideas =
== Automated Testing ==

 * Automated testing is the conversion of large numbers of test cases into simple scripts. They can often be run in bulk with a single command. UTAH is the main way of automating test cases for Ubuntu. In addition, the program [[https://wiki.ubuntu.com/Testing/Automation/Checkbox|Checkbox]] also allows for automated test cases to be run and recorded.

 * If you are interested in this kind of testing, start by running checkbox. It is now installed by default. You can find it in the Dash by searching for ''System Testing''. If you wish to go further and write UTAH test cases, you can visit the [[https://wiki.ubuntu.com/QATeam/AutomatedTesting|Automated Testing]] page for more information.

=== Autopilot Unity Testing ===

The Unity team has built Autopilot as a testing tool for Unity. However, Autopilot has broader applications beyond Unity, helping us do automated testing on a grander scale. Source: http://qa.ubuntu.com/ (see the blog posted 20th November 2012).

 '''To Install Autopilot'''

{{{sudo apt-get install python-autopilot unity-autopilot}}}

 '''Caution:''' Autopilot tests should be run in a Guest session. Any data saved during a guest session is lost when we log out of the guest session. To prevent the loss of test results:

1) Do not log out of the test session.

2) Instead, use Switch User to change to your normal user account.

3) Use the gksudo command to load Gedit or Nautilus.

4) Browse to the /tmp folder and look for the guest folder. That is where you will find any documents saved whilst using the guest session. The guest folder will have a name similar to guest-v4GnNa; the second part of the name will be different each time you run a guest session.

5) Copy the data into a new document and save it. Now you will not lose the data when you log out of the guest session.
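The rescue steps above can also be done from a terminal. A sketch, assuming the usual /tmp/guest-XXXXXX naming (the demo builds its own scratch folder so it is safe to try anywhere; on a real system the guest folders live directly under /tmp):

```shell
# Scratch setup standing in for /tmp on a real system:
tmproot=$(mktemp -d)
mkdir -p "$tmproot/guest-v4GnNa"
echo "test results" > "$tmproot/guest-v4GnNa/results.log"

# Pick the most recently modified guest-* folder
# (ls -td sorts directories newest first):
guest_dir=$(ls -td "$tmproot"/guest-*/ | head -n 1)

# Copy everything out before the guest session is closed:
dest=$(mktemp -d)
cp -r "$guest_dir"/. "$dest"/
ls "$dest"
```

On a real machine you would replace `"$tmproot"` with `/tmp` and `"$dest"` with a folder in your own home directory.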

 '''Autopilot Unity Tests'''

This command will list all the Unity tests available.

{{{autopilot list unity}}}

There are at present 461 Unity tests. We can run them one at a time. For example:

{{{autopilot run unity.tests.test_dash.DashRevealTests.test_alt_f4_close_dash}}}

Or, we can run all the tests at the same time with this command:

{{{autopilot run unity}}}

'''Caution:''' It will take a very long time to run all 461 Unity tests.

I have grouped the Unity tests into batches, and I have added a command argument that causes Autopilot to save the results into a log file. These files will be found in the /tmp/guest-###### folder. The log files have names based on the 'computer-name_date_time.log' format, so each log file has a slightly different name that records the day and time the batch test was run. Autopilot will give the log file name when the test begins to run.
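The naming pattern described above looks roughly like this (an approximation only; the exact format Autopilot writes may differ):

```shell
# Build a log name of the form computer-name_date_time.log,
# as described for the batch test runs (uname -n prints the
# machine's hostname):
logfile="$(uname -n)_$(date +%Y-%m-%d_%H%M%S).log"
echo "$logfile"
```

Because the date and time are baked into the name, repeated runs of the same batch never overwrite each other's results.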

 '''Autopilot Unity batch tests commands'''

Copy and paste each command into a terminal. The greater the number of tests in a batch, the longer the test will run.


        __Batches with 1 test__
{{{autopilot run -o . unity.tests.test_dash.CategoryHeaderTests}}}

{{{autopilot run -o . unity.tests.test_dash.DashDBusIfaceTests}}}

{{{autopilot run -o . unity.tests.test_dash.DashLensBarTests}}}

{{{autopilot run -o . unity.tests.test_dash.DashSearchInputTests}}}

{{{autopilot run -o . unity.tests.test_dash.DashVisualTests}}}

{{{autopilot run -o . unity.tests.test_home_lens}}}

{{{autopilot run -o . unity.tests.test_ibus.IBusTestsAnthy}}}

{{{autopilot run -o . unity.tests.test_ibus.IBusTestsHangul}}}

{{{autopilot run -o . unity.tests.test_ibus.IBusTestsPinyin}}}

        __Batches with 2 tests__
{{{autopilot run -o . unity.tests.test_dash.DashBorderTests}}}

{{{autopilot run -o . unity.tests.test_dash.DashCrossMonitorsTests}}}

{{{autopilot run -o . unity.tests.test_dash.DashKeyboardFocusTests}}}

{{{autopilot run -o . unity.tests.test_hud.HudAlternativeKeybindingTests}}}

{{{autopilot run -o . unity.tests.test_hud.HudCrossMonitorsTests}}}

{{{autopilot run -o . unity.tests.test_hud.HudLauncherInteractionsTests}}}

{{{autopilot run -o . unity.tests.test_ibus.IBusTestsAnthyIgnore}}}

{{{autopilot run -o . unity.tests.test_ibus.IBusTestsPinyinIgnore}}}

{{{autopilot run -o . unity.tests.test_switcher.SwitcherDetailsTests}}}

{{{autopilot run -o . unity.tests.xim.test_gcin.GcinTestHangul}}}

        __Batches with 3 tests__
{{{autopilot run -o . unity.tests.test_hud.HudLockedLauncherInteractionsTests}}}

{{{autopilot run -o . unity.tests.test_panel.PanelIndicatorEntryTests}}}

{{{autopilot run -o . unity.tests.test_spread.SpreadTests}}}

{{{autopilot run -o . unity.tests.test_switcher.SwitcherWindowsManagementTests}}}

        __Batches with 4 tests__
{{{autopilot run -o . unity.tests.launcher.test_shortcut}}}

{{{autopilot run -o . unity.tests.launcher.test_visual}}}

{{{autopilot run -o . unity.tests.test_dash.DashLensResultsTests}}}

{{{autopilot run -o . unity.tests.test_dash.PreviewInvocationTests}}}

{{{autopilot run -o . unity.tests.test_ibus.IBusActivationTests}}}

{{{autopilot run -o . unity.tests.test_panel.PanelGrabAreaTests}}}

{{{autopilot run -o . unity.tests.test_showdesktop.ShowDesktopTests}}}

{{{autopilot run -o . unity.tests.test_switcher.SwitcherDetailsModeTests}}}

{{{autopilot run -o . unity.tests.test_switcher.SwitcherWorkspaceTests}}}

{{{autopilot run -o . unity.tests.test_unity_logging.UnityLoggingTests}}}

        __Batches with 5 tests__
{{{autopilot run -o . unity.tests.launcher.test_capture}}}

{{{autopilot run -o . unity.tests.test_dash.DashMultiKeyTests}}}

{{{autopilot run -o . unity.tests.test_panel.PanelKeyNavigationTests}}}

{{{autopilot run -o . unity.tests.test_shortcut_hint.ShortcutHintTests}}}

        __Batches with 6 tests__
{{{autopilot run -o . unity.tests.test_command_lens}}}

{{{autopilot run -o . unity.tests.test_dash.DashClipboardTests}}}

{{{autopilot run -o . unity.tests.test_shopping_lens.ShoppingLensTests}}}

        __Batches with 7 tests__
{{{autopilot run -o . unity.tests.test_dash.PreviewNavigateTests}}}

{{{autopilot run -o . unity.tests.test_shortcut_hint.ShortcutHintInteractionsTests}}}

        __Batches with 8 tests__
{{{autopilot run -o . unity.tests.launcher.test_reveal}}}

{{{autopilot run -o . unity.tests.test_panel.PanelCrossMonitorsTests}}}

{{{autopilot run -o . unity.tests.test_panel.PanelTitleTests}}}

        __Batches with 9 tests__
{{{autopilot run -o . unity.tests.test_dash.DashKeyNavTests}}}

{{{autopilot run -o . unity.tests.test_panel.PanelHoverTests}}}

{{{autopilot run -o . unity.tests.test_quicklist.QuicklistActionTests}}}

        __Batches with 10 tests__
{{{autopilot run -o . unity.tests.test_hud.HudVisualTests}}}

{{{autopilot run -o . unity.tests.test_panel.PanelMenuTests}}}

        __Batches with 12 tests__
{{{autopilot run -o . unity.tests.launcher.test_icon_behavior}}}

{{{autopilot run -o . unity.tests.test_switcher.SwitcherTests}}}

        __Batches with 13 tests__
{{{autopilot run -o . unity.tests.launcher.test_switcher}}}

{{{autopilot run -o . unity.tests.test_quicklist.QuicklistKeyNavigationTests}}}

        __Batches with 14 tests__
{{{autopilot run -o . unity.tests.test_dash.DashRevealTests}}}

        __Batches with 21 tests__
{{{autopilot run -o . unity.tests.launcher.test_keynav}}}

        __Batches with 28 tests__
{{{autopilot run -o . unity.tests.test_hud.HudBehaviorTests}}}

        __Batches with 30 tests__
{{{autopilot run -o . unity.tests.test_panel.PanelWindowButtonsTests}}}

Some of these Unity batch tests finish very quickly because they only have one, two or three tests to run. Other batch tests will take a lot longer because they have 6 to 10 or more tests to run. The return of the terminal command prompt will tell you that the test has finished.
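One way to work through the batches without babysitting each one is a small wrapper that runs a command and keeps a timestamped copy of its output. A sketch (the `echo` stand-in replaces a real `autopilot run ...` invocation so the wrapper can be tried anywhere; the wrapper itself and its log naming are illustrative, not part of Autopilot):

```shell
# Run a command, show its output live, and keep a timestamped
# log file of everything it printed:
run_logged() {
    log="batch_$(date +%Y%m%d_%H%M%S)_$$.log"
    "$@" 2>&1 | tee "$log"
    echo "saved output to $log"
}

# Stand-in for a real invocation such as:
#   run_logged autopilot run unity.tests.test_dash.DashBorderTests
run_logged echo "pretend autopilot output"
```

The `tee` keeps the output visible on screen while it is written to disk, so you still see the command prompt return when a batch finishes.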

Tests that detect failures will take longer to finish than those that do not fail. The failure of a test is not an indication of a fault in the hardware or even of a bug in Unity. At this stage of development, the failure might be a result of a fault in the coding of the test.

 '''Reporting Failure'''

Reports should be posted here: http://packages.qa.ubuntu.com/qatracker/milestones/246/builds/28425/testcases.

Others who are already pasting results are using http://paste.ubuntu.com/ or http://pastebin.com/ or some other type of paste bin to get their results to the QA Team. We can do the same by pasting the contents of each log file as we run the tests.

See here for examples of this method: http://packages.qa.ubuntu.com/qatracker/milestones/246/builds/28425/testcases/1466/results





-----

== Laptop Testing ==

 * Laptop Testing is about the manual testing of specific things (test cases) mainly related to laptop hardware, using milestone releases of the development version (alphas, betas and release candidates). The goal is to get Ubuntu to work great on as many different makes and models of laptops as possible, and this can be done by knowing which hardware works straight off the install CD and which hardware needs configuring or is poorly supported.

If you are interested in this kind of testing, head to the [[https://wiki.ubuntu.com/Testing/Laptop|laptop testing]] Wiki page.

-----

Team Structure
align="middle"

Home
History of U+1
Team FAQ
Contact U+1
Join U+1

Team WorkIconsPage/picto_engineering_48.png

Blog
Staff
Roles
Activities
Agenda

Docs & ToolsIconsPage/picto_articles_48.png

Testers FAQ
Testers Wiki
Tools
Library
Ubuntu Forums

Ideas & ProjectsIconsPage/picto_education_48.png

Brainstorming
ToDo
Ongoing
Instructional Development

The U+1 Team is looking for new members. Only basic skills are needed for most tasks. This is an opportunity to join a friendly and talented community, learn fast and be an active part in Ubuntu future. Click here to know more.

Activities Planner and Development Release Schedule

  • Period: YY (16.10) Development Cycle (May 2016 to October 2016).

  • Start date: Release date of Xenial Xerus (May 2016).

  • Weeks: Start on a Thursday.

  • Aim: To identify testing opportunities that fit in with the development release process.

  • QA Cadence weeks run from Saturday to Saturday.

  • Information on this page is drawn from various sources freely available to the public.

Yakkety Yak Release Schedule

Methods and Means

ISO Image Testing - Cadence Testing

  • Every 2 weeks as part of our testing cadence,, we download copies of the latest daily ISO images, burn them to CDs (or load them into VM's) and test them. This brings to light many issues that might have been missed by other early adopters and developers, especially in the CD builds and installers.
  • If you are interested in this kind of testing, see the ISO information page for instructions on getting started with ISO testing and directions for using the test tracker.

  • The ISO Testing Walkthrough is a good place to start.

  • Also ensure you've joined the Ubuntu QA mailing list to know when the testing weeks are occurring.


ARM Testing

  • Additionally, if you have the hardware for it, we are actively helping push forward ubuntu onto ARM architectures. Check out the ARM pages for more information on testing ubuntu ARM images, including testing on a pandaboard.


Daily smoke Testing

  • Between milestones, it is also good (but not release critical) to test the daily ISOs and as many applications as possible. This kind of testing can be done in a Virtual Machine or on spare hardware.


Stable Release Update (SRU) Testing

  • All stable release updates are first uploaded to the Proposed repository before they are released live. An important principle behind Ubuntu is that we should never make a working system worse; stable release updates need to be extensively tested to make sure that there are no regressions.

  • If you are interested in this kind of testing, there is a public Launchpad SRU Verificaton team that you can join as well as a fantastic SRU wiki page with helpful tools for finding bugs to work on.


Feature Testing

  • One of the objectives each cycle is to provide testing for all the major features in the release. We organize periodic calls for testing to focus on new features, such as a new video driver or a big change to an important tool like Software Center.
  • If this kind of testing interests you, please join the Ubuntu QA mailing list. You will receive periodic emails about calls for testing a specific feature and how to participate.


General Testing

  • Ubuntu is more than its default installation and basic tasks - it's an entire repository of software and possible configurations. We need to test as much of it as possible, and some things aren't well covered by established test cases and procedures. General testing is as simple as attempting to use the development release and reporting whatever problems you run into as a bug.
  • If you are interested in this kind of testing, install the development version of ubuntu, and report and follow-up on any bugs you encounter. The mailing list and weekly QA meetings are another great place to discuss any bugs found that you feel may be of a more critical nature or have a widespread effect.

Application Testing

  • Application testing is the manual testing of specific things (test cases) in applications. Regression tests are specific tests for potential breakages from one release to another (they're also relevant for SRU testing, above).
  • If you are interested in this kind of testing, head to Packages QA Tracker and run through the application testcases and report your results. NOTE: The application testcases are currently in the process of being migrated during the quantal cycle.

  • Now Testing
    • Week of 15th November - 12.10 kernel 3.5.0.18 on 12.04 userspace.

Install the Quantal kernel. Prerequisites: make sure you are running the latest version of Precise and that all your packages are up to date.

1) Add the X-team ppa

sudo add-apt-repository ppa:ubuntu-x-swat/q-lts-backport

2) Update apt and install the new kernel

sudo apt-get update && sudo apt-get install linux-image-generic-lts-quantal linux-headers-generic-lts-quantal

3) Restart your computer to boot into the new kernel

Uninstall the Quantal kernel. You can remove the 3.5 kernel by using ppa-purge.

1) Install ppa-purge

sudo apt-get install ppa-purge

2) Remove the ppa

sudo ppa-purge ppa:ubuntu-x-swat/q-lts-backport

3) ppa-purge will find the installed packages and offer to downgrade them. Say yes, and ppa-purge will remove the upgraded versions and reinstall the versions from the archive.

4) Remove the meta packages

sudo apt-get remove --purge linux-image-generic-lts-quantal linux-headers-generic-lts-quantal

5) Remove the kernel itself. While running the new kernel, enter the following command

uname -r

This returns a version string, such as '3.5.0-8-generic'. Use the version number (3.5.0-8) in place of the word KERNEL below:

sudo apt-get remove --purge linux-image-KERNEL-generic linux-headers-KERNEL linux-headers-KERNEL-generic

sudo apt-get autoremove

Eg, for '3.5.0-8'

sudo apt-get remove --purge linux-image-3.5.0-8-generic linux-headers-3.5.0-8 linux-headers-3.5.0-8-generic
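The substitution above can also be scripted. A sketch, assuming you are still booted into the backport kernel you want to remove, so that uname -r reports its version:

```shell
# Strip the "-generic" flavour suffix from the running kernel release,
# e.g. "3.5.0-8-generic" becomes "3.5.0-8".
KERNEL=$(uname -r | sed 's/-generic$//')
echo "Removing kernel $KERNEL"
sudo apt-get remove --purge \
    "linux-image-${KERNEL}-generic" \
    "linux-headers-${KERNEL}" \
    "linux-headers-${KERNEL}-generic"
sudo apt-get autoremove
```

Note that removing the kernel you are currently running only takes full effect after you reboot into another installed kernel.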

Notes based upon experience

During periods of transition from testing one kernel to testing the next, we get the error message: "Unable to locate linux-image-generic-lts-quantal." The answer is to wait a few days and then run the update command again. It also takes the QA Tracker a few days to be updated to allow reporting for that new kernel image.

Bug reporting instructions


Automated Testing

  • Automated testing is the conversion of large numbers of test cases into simple scripts that can often be run in bulk with a single command. UTAH is the main framework for automating testcases for Ubuntu. In addition, the Checkbox program also allows automated test cases to be run and recorded.

  • If you are interested in this kind of testing, start by running Checkbox, which is now installed by default. You can find it in the Dash by searching for System Testing. If you wish to go further and write UTAH test cases, visit the Automated Testing page for more information.

Autopilot Unity Testing

The Unity team built Autopilot as a testing tool for Unity. However, Autopilot has broader applications beyond Unity and can help us do automated testing on a grander scale. Source: http://qa.ubuntu.com/ (see the blog post of 20th November 2012).

  • To Install Autopilot

sudo apt-get install python-autopilot unity-autopilot

  • Caution: Autopilot tests should be run in a Guest session. Any data saved during a guest session is lost when you log out of it. To prevent the loss of test results:

1) Do not log out of the test session.

2) Instead, use Switch User to switch to your normal user account.

3) Use the gksudo command to load Gedit or Nautilus.

4) Browse to the /tmp folder and look for the guest folder; that is where you will find any documents saved while using the guest session. The guest folder will have a name similar to guest-v4GnNa; the second part of the name will be different each time you run a guest session.

5) Copy the data into a new document and save it. Now you will not lose the data when you log out of the guest session.
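The copy step can also be done from a terminal. A sketch with hypothetical paths (the destination folder name is only an example), run from your normal account after switching users:

```shell
# Copy the Autopilot log files out of the guest session's /tmp folder
# before the guest session ends and they are cleaned up.
mkdir -p "$HOME/autopilot-logs"
sudo cp /tmp/guest-*/*.log "$HOME/autopilot-logs/"
ls "$HOME/autopilot-logs"
```

The glob /tmp/guest-*/ matches the randomly named guest folder, so you do not need to know its exact suffix.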

  • Autopilot Unity Tests

This command will list all the Unity tests available.

autopilot list unity

There are at present 461 Unity tests. We can run them individually. For example:

autopilot run unity.tests.test_dash.DashRevealTests.test_alt_f4_close_dash

Or, we can run all the tests at the same time with this command:

autopilot run unity

Caution: It will take a very long time to run all 461 Unity tests.

I have grouped the Unity tests into batches and added a command argument that causes Autopilot to save the results into a log file. These files will be found in the /tmp/guest-###### folder. The log files are named in the 'computer-name_date_time.log' format, so each log file will have a slightly different name reflecting the day and time that the batch was run. Autopilot will print the log file name when the test begins to run.

  • Autopilot Unity batch tests commands

Copy and paste each command into a terminal. The greater the number of tests in a batch, the longer the test will run.

  • Batches with 1 test

autopilot run -o . unity.tests.test_dash.CategoryHeaderTests

autopilot run -o . unity.tests.test_dash.DashDBusIfaceTests

autopilot run -o . unity.tests.test_dash.DashLensBarTests

autopilot run -o . unity.tests.test_dash.DashSearchInputTests

autopilot run -o . unity.tests.test_dash.DashVisualTests

autopilot run -o . unity.tests.test_home_lens

autopilot run -o . unity.tests.test_ibus.IBusTestsAnthy

autopilot run -o . unity.tests.test_ibus.IBusTestsHangul

autopilot run -o . unity.tests.test_ibus.IBusTestsPinyin

  • Batches with 2 tests

autopilot run -o . unity.tests.test_dash.DashBorderTests

autopilot run -o . unity.tests.test_dash.DashCrossMonitorsTests

autopilot run -o . unity.tests.test_dash.DashKeyboardFocusTests

autopilot run -o . unity.tests.test_hud.HudAlternativeKeybindingTests

autopilot run -o . unity.tests.test_hud.HudCrossMonitorsTests

autopilot run -o . unity.tests.test_hud.HudLauncherInteractionsTests

autopilot run -o . unity.tests.test_ibus.IBusTestsAnthyIgnore

autopilot run -o . unity.tests.test_ibus.IBusTestsPinyinIgnore

autopilot run -o . unity.tests.test_switcher.SwitcherDetailsTests

autopilot run -o . unity.tests.xim.test_gcin.GcinTestHangul

  • Batches with 3 tests

autopilot run -o . unity.tests.test_hud.HudLockedLauncherInteractionsTests

autopilot run -o . unity.tests.test_panel.PanelIndicatorEntryTests

autopilot run -o . unity.tests.test_spread.SpreadTests

autopilot run -o . unity.tests.test_switcher.SwitcherWindowsManagementTests

  • Batches with 4 tests

autopilot run -o . unity.tests.launcher.test_shortcut

autopilot run -o . unity.tests.launcher.test_visual

autopilot run -o . unity.tests.test_dash.DashLensResultsTests

autopilot run -o . unity.tests.test_dash.PreviewInvocationTests

autopilot run -o . unity.tests.test_ibus.IBusActivationTests

autopilot run -o . unity.tests.test_panel.PanelGrabAreaTests

autopilot run -o . unity.tests.test_showdesktop.ShowDesktopTests

autopilot run -o . unity.tests.test_switcher.SwitcherDetailsModeTests

autopilot run -o . unity.tests.test_switcher.SwitcherWorkspaceTests

autopilot run -o . unity.tests.test_unity_logging.UnityLoggingTests

  • Batches with 5 tests

autopilot run -o . unity.tests.launcher.test_capture

autopilot run -o . unity.tests.test_dash.DashMultiKeyTests

autopilot run -o . unity.tests.test_panel.PanelKeyNavigationTests

autopilot run -o . unity.tests.test_shortcut_hint.ShortcutHintTests

  • Batches with 6 tests

autopilot run -o . unity.tests.test_command_lens

autopilot run -o . unity.tests.test_dash.DashClipboardTests

autopilot run -o . unity.tests.test_shopping_lens.ShoppingLensTests

  • Batches with 7 tests

autopilot run -o . unity.tests.test_dash.PreviewNavigateTests

autopilot run -o . unity.tests.test_shortcut_hint.ShortcutHintInteractionsTests

  • Batches with 8 tests

autopilot run -o . unity.tests.launcher.test_reveal

autopilot run -o . unity.tests.test_panel.PanelCrossMonitorsTests

autopilot run -o . unity.tests.test_panel.PanelTitleTests

  • Batches with 9 tests

autopilot run -o . unity.tests.test_dash.DashKeyNavTests

autopilot run -o . unity.tests.test_panel.PanelHoverTests

autopilot run -o . unity.tests.test_quicklist.QuicklistActionTests

  • Batches with 10 tests

autopilot run -o . unity.tests.test_hud.HudVisualTests

autopilot run -o . unity.tests.test_panel.PanelMenuTests

  • Batches with 12 tests

autopilot run -o . unity.tests.launcher.test_icon_behavior

autopilot run -o . unity.tests.test_switcher.SwitcherTests

  • Batches with 13 tests

autopilot run -o . unity.tests.launcher.test_switcher

autopilot run -o . unity.tests.test_quicklist.QuicklistKeyNavigationTests

  • Batches with 14 tests

autopilot run -o . unity.tests.test_dash.DashRevealTests

  • Batches with 21 tests

autopilot run -o . unity.tests.launcher.test_keynav

  • Batches with 28 tests

autopilot run -o . unity.tests.test_hud.HudBehaviorTests

  • Batches with 30 tests

autopilot run -o . unity.tests.test_panel.PanelWindowButtonsTests
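Rather than pasting each batch command by hand, the batches above can be run back to back from a small loop. A sketch; the three suite names are taken from the batch lists above, and you would edit the list to suit your run:

```shell
# Run a selection of the Unity Autopilot batches in sequence, saving
# each log into the current directory (the -o . argument).
for suite in \
    unity.tests.test_dash.CategoryHeaderTests \
    unity.tests.test_home_lens \
    unity.tests.test_hud.HudVisualTests
do
    autopilot run -o . "$suite"
done
```

Each iteration produces its own log file, so a long unattended run still leaves one result file per batch.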

Some of these Unity batch tests finish very quickly because they only have one, two, or three tests to run. Other batch tests will take a lot longer because they have six to ten or more tests to run. The return of the terminal command prompt tells you that the test has finished.

Tests that detect failures will take longer to finish than tests that do not fail. The failure of a test is not an indication of a fault in the hardware, or even of a bug in Unity; at this stage of development, the failure might be the result of a fault in the coding of the test itself.

  • Reporting Failure

Reports should be posted here: http://packages.qa.ubuntu.com/qatracker/milestones/246/builds/28425/testcases.

Others who are already posting results use http://paste.ubuntu.com/ or http://pastebin.com/ or some other paste bin to get their results to the QA Team. We can do the same by pasting the contents of each log file as we run the tests.

See here for examples of this method: http://packages.qa.ubuntu.com/qatracker/milestones/246/builds/28425/testcases/1466/results


Laptop Testing

  • Laptop Testing is the manual testing of specific things (test cases) mainly related to laptop hardware, using milestone releases of the development version (alphas, betas, and release candidates). The goal is to get Ubuntu working great on as many different makes and models of laptop as possible; this can be done by recording which hardware works straight off the install CD and which hardware needs configuring or is poorly supported.

If you are interested in this kind of testing, head to the laptop testing Wiki page.


U+1/activities (last edited 2016-06-08 11:01:09 by localhost)