Meeting started by brendand at 15:02:08 UTC. The full logs are available at http://ubottu.com/meetingology/logs/ubuntu-meeting/2011/ubuntu-meeting.2011-10-24-15.02.log.html .

Meeting summary

Scoring system

Jedimike explained changes to be made to the Ubuntu Friendly scoring system: submissions that would be awarded 1 star due to tests that weren't run will instead be rejected by the system. This will avoid the large number of systems currently on the site (over 250) that were not tested properly.

Everyone agreed this was a good idea. Some issues were raised around how this change will be communicated to the user. Making essential tests unskippable in Checkbox would be the best solution, but would be too large a change to SRU into Oneiric. Guidance on the UF website seems to be the agreed approach for the moment, but brendand and akgraner concurred that some users don't necessarily make first contact with Ubuntu Friendly through the website, so they might miss this guidance.

  • AOB

ACTION: cr3 to add gremlins/detect to checkbox (brendand, 15:47:42)
ACTION: jedimike to prepare a small usability survey about the site (brendand, 16:01:24)

Meeting ended at 16:01:59 UTC.


Action items

  • cr3 to add gremlins/detect to checkbox
  • jedimike to prepare a small usability survey about the site

Action items, by person

  • cr3
  • * cr3 to add gremlins/detect to checkbox
  • jedimike
  • * jedimike to prepare a small usability survey about the site

People present (lines said)

  • brendand (75)
  • jedimike (32)
  • cr3 (27)
  • ara (22)
  • roadmr (19)
  • akgraner (15)
  • meetingology (5)

Full Log

  • 15:02:08 <brendand> #startmeeting Ubuntu Friendly Squad

    15:02:08 <meetingology> Meeting started Mon Oct 24 15:02:08 2011 UTC. The chair is brendand. Information about MeetBot at http://wiki.ubuntu.com/AlanBell/mootbot.

    15:02:08 <meetingology>

    15:02:08 <meetingology> Available commands: #accept #accepted #action #agree #agreed #chair #commands #endmeeting #endvote #halp #help #idea #info #link #lurk #meetingname #meetingtopic #nick #progress #rejected #replay #restrictlogs #save #startmeeting #subtopic #topic #unchair #undo #unlurk #vote #voters #votesrequired

    15:02:33 <ara> hello!

    15:02:36 <jedimike> hi

    15:02:36 <roadmr> hey :)

    15:02:45 <brendand> The agenda for today is:

    15:02:47 <brendand> Scoring system - jedimike

    15:02:47 <brendand> AOB

    15:02:58 <brendand> #topic Scoring system - jedimike

    15:03:10 <jedimike> o/

    15:03:11 <brendand> jedimike - proceed :)

    15:03:29 <jedimike> currently, the rules for rejection and failure of submissions are this:

    15:03:29 <jedimike> * If you skip all tests in a core category, we reject the submission

    15:03:30 <jedimike> * If you skip an individual test in a core category that required external hardware that not everyone might have (writable media, for example), we ignore that test and apply a penalty to the score, but still allow that category to pass if all other tests pass

    15:03:30 <jedimike> * If you skip an individual test in a core category that you should have run, we fail the category and score the system 1 star

    15:03:30 <jedimike> That last rule seems far too harsh. I think that skipping an individual test in a core category that we don't allow you to skip should result in the submission being rejected, as we have a lot of 1 star systems that are only 1 star because someone skipped a test.

    15:03:43 <jedimike> 251 systems, to be precise

    15:04:03 <ara> o/

    15:04:06 <jedimike> ..

    15:04:13 <brendand> ara - go ahead

    15:04:48 <ara> I agree that we should change 3rd rule to reject the submission

    15:04:56 <ara> if a non-skippable test was skipped

    15:05:04 <jedimike> o/

    15:05:14 <ara> and I would put something in the participate page to make things clearer

    15:05:27 <ara> ..

    15:05:41 <brendand> back to you jedimike

    15:06:34 <jedimike> I'd also put something in that intermediate "report a problem" page that says, "Can't find your results?" and explains why we reject submissions, and directs them towards either results tracker, or where we need them to look, to view their submission

    15:06:36 <jedimike> ..

    15:06:54 <ara> o/

    15:07:09 <brendand> ara - shoot

    15:07:41 <ara> yes, I agree. I think that we don't need to be really fancy on letting people know exactly why it was rejected

    15:07:53 <ara> a nice explanation of the basics should be fine

    15:08:07 <ara> and yes, that explanation should be at Participate and Report a problem

    15:08:28 <brendand> o/

    15:08:29 <ara> not sure if pointing to the submissions is needed, though

    15:08:29 <ara> ..

    15:09:01 <roadmr> o/

    15:09:04 <brendand> just a quick piece, to say that in the long term if we could enforce running of the tests through the checkbox UI that would be nice

    15:09:05 <brendand> ..

    15:09:21 <brendand> roadmr - go ahead

    15:09:29 <roadmr> we should make sure that reasons for rejecting submissions are made abundantly clear

    15:09:31 <cr3> o/

    15:09:43 <roadmr> we already see people wondering why their submissions take so long to appear in UF

    15:09:54 <roadmr> so if we just start "eating" them there's bound to be some complaints

    15:10:00 <roadmr> it goes to being as transparent as possible

    15:10:16 <jedimike> o/

    15:10:29 <roadmr> ideally I'd like to see checkbox say "this submission is not UF-complete so it'll be rejected" or something to the point (i.e. instant feedback) but it may not be possible to do :(

    15:10:37 <roadmr> still this is something to be looked into, for transparency's sake.

    15:10:38 <roadmr> ..

    15:10:40 <ara> o/

    15:10:55 <brendand> cr3 - your turn

    15:10:57 <cr3> brendand: dude, I want to write a testing game! seriously though, it doesn't really need to contribute to UF but might be nice :)

    15:11:00 <cr3> ..

    15:11:20 <brendand> ok, after that, back to ara

    15:11:34 <cr3> not jedimike?

    15:11:45 <ara> jedimike it is

    15:11:47 <brendand> cr3 - you're right

    15:11:57 <brendand> jedimike?

    15:12:02 <cr3> brendand: /unignore jedimike :)

    15:12:18 <jedimike> I agree with roadmr that we do need something to tell the users at least which of their submissions made it and which didn't

    15:12:25 <jedimike> we don't need to break down scores or anything

    15:12:36 <jedimike> but it might improve the quality of submissions

    15:12:54 <jedimike> if we were able to say "X submission didn't get accepted because you skipped audio/xyz"

    15:13:17 <jedimike> and it would cut out the "why didn't my results get listed" questions totally

    15:13:18 <jedimike> ..

    15:13:40 <brendand> ara, your turn (and apologies to jedimike)

    15:14:19 <ara> so, I agree with roadmr, but those are ideas for 12.04 LTS, when we improve the UI

    15:14:32 <ara> but we need to take them into account, of course

    15:15:12 <ara> with the UI changes we can do a much better job of letting people know what they have to run for the submission to be accepted in UF

    15:15:33 <jedimike> o/

    15:15:44 <ara> for now, an explanation of what is a core category and how it works might help

    15:15:44 <ara> ..

    15:15:58 <brendand> jedimike, go ahead

    15:16:45 <jedimike> if we're not going to make the submissions available to the users through UF, can I put in a feature request for results tracker so that a user's test runs are linked to from their page on there?

    15:16:45 <jedimike> ..

    15:17:03 <brendand> o/

    15:18:17 <brendand> we need to remember that even though ubuntu-friendly is in beta, if we don't keep things transparent we risk creating frustration for users

    15:18:53 <jedimike> o/

    15:19:39 <brendand> one challenge i think we have is that we are able to update UF pretty much as we want, but not the source of the tests (i.e. checkbox)

    15:20:10 <brendand> so we need to avoid making changes in UF which really require support from checkbox

    15:20:26 <brendand> this is one of those i think

    15:20:27 <brendand> ...

    15:20:37 <brendand> jedimike - you can go now

    15:20:46 <akgraner> oops sorry I am late

    15:21:33 <brendand> akgraner - no problem. we're discussing a possible change to the scoring system.

    15:21:50 * akgraner catches up

    15:22:00 <brendand> akgraner - thanks

    15:22:03 <brendand> jedimike?

    15:22:05 <jedimike> just to say, transparency is good :) and at the moment it's not clear if your submission has made it or not, and I think if it's possible to make that change on results tracker to link the user's page to their test submissions it would help (even helps me respond to bug reports!)

    15:22:11 <jedimike> ..

    15:22:41 <brendand> o/

    15:22:45 <akgraner> o/

    15:23:02 <brendand> akgraner - you go first

    15:23:47 <akgraner> Is there any way to not let the skipped question count against the over scoring, and note somewhere that these scores don't include skipped questions

    15:24:06 <akgraner> overall scoring I meant

    15:24:11 <akgraner> ...

    15:25:35 <akgraner> I mean I might have skipped a test just b/c I didn't have a USB stick handy that doesn't mean it didn't pass

    15:25:43 <brendand> akgraner - the issue is that, at the moment tests like the audio ones are skippable. and it seems a lot of people are skipping them. but if we only have one submission for a system then we need to make a call about what that means.

    15:25:58 <ara> me needs to leave now

    15:26:02 <brendand> bye ara

    15:26:06 * ara will read the minutes tomorrow

    15:26:07 <ara> cheers

    15:26:19 * brendand continues

    15:26:21 <akgraner> brendand, I go back and re-do the test for the ones I skip once I find all my stuff :-)

    15:26:22 <jedimike> o/

    15:27:04 <brendand> we don't want to say that a skipped test means that the component must work, but neither can we say for sure that it means it doesn't

    15:27:20 <brendand> so the best thing to do is probably to not accept these submissions at all

    15:27:44 <akgraner> but some people don't have external monitors

    15:27:57 <akgraner> so you would disregard their test on that one point

    15:28:00 <brendand> akgraner - but for tests which need special equipment we already have a different rule, so your system can still get a good score if you didn't test external monitor or usb

    15:28:14 <cr3> o/

    15:28:21 <akgraner> ah ok :-)

    15:28:30 <jedimike> akgraner: if a test requires special equipment, like a USB stick or external monitor, we allow people to skip it without failing that component

    15:29:06 <brendand> ..

    15:29:19 <brendand> jedimike - you can go ahead now

    15:29:28 <akgraner> gotcha - sorry I didn't know that. /me is quiet now :-)

    15:29:46 <jedimike> was just going to say that :) and that we need to make that clear on the participate page

    15:30:21 <jedimike> and implement what ara said about making it clear that if you skip tests that don't require external equipment

    15:30:28 <jedimike> your submission may not be included in the site

    15:30:29 <jedimike> ..

    15:31:10 <brendand> akgraner - an example of a test you *can't* skip is audio/alsa_record_playback_internal, since it doesn't require extra equipment

    15:31:18 <brendand> cr3, your turn

    15:31:20 <cr3> if I understand correctly, skippable tests may affect scoring between 3-5, but non-skippable tests are those that may affect scoring between 1-3

    15:31:23 <cr3> I think someone made a point that non-skippable tests should be enforced in the checkbox UI so that people don't get surprised with a crappy score between 1-3

    15:31:26 <cr3> ..

    15:31:34 <brendand> cr3 - that was me :)

    15:31:49 <brendand> cr3 - but the problem is we'd need to SRU that change in

    15:32:22 <brendand> o/

    15:32:55 <cr3> brendand: oneiric is beta, it could be argued that we're really targeting precise with the ultimate ninja solution

    15:33:27 <brendand> actually, if i could expand on this. i'm not sure i feel too awesome about the 'skippableness' of tests being encoded in the u-f site itself

    15:33:42 <brendand> it should really be encoded in checkbox

    15:34:25 <brendand> ..

    15:35:43 <brendand> it seems that most people agree we need to reject submissions that don't have all the tests run that must be run

    15:36:03 <brendand> the question is to what extent do we guide the users about this?

    15:36:36 <brendand> ideal would be to enforce it in checkbox, but in the short term it needs to be stated on the UF website

    15:36:40 <brendand> but...

    15:37:19 <brendand> it's important to remember that the assumption that everyone will engage with UF directly through the site is probably wrong

    15:37:46 <brendand> so some users may not even read the participate page

    15:37:47 <brendand> ...

    15:38:32 <akgraner> o/

    15:38:46 <brendand> akgraner - go ahead

    15:39:22 <akgraner> brendand, your assumption is right - many people find out about system testing on their computer, run the test, *then* find out about the site

    15:40:03 <akgraner> and I know people who never read documentation (sadly) but it does happen...

    15:40:22 <cr3> o/

    15:40:32 * brendand points to self

    15:40:38 <brendand> cr3 - your turn

    15:41:13 <cr3> it would be nice to see the number of submissions to launchpad before and after ubuntu friendly was announced, there was already a large number of submissions coming in before probably from people just discovering checkbox and running it for the heck of it

    15:41:17 <cr3> ..

    15:42:39 <brendand> cr3 - indeed

    15:43:44 <cr3> brendand: modulo hardware certification submissions, of course :)

    15:45:14 <cr3> brendand: is it time for aob?

    15:45:32 <brendand> seems like everyone is done on this topic

    15:45:36 <brendand> #topic AOB

    15:46:17 <brendand> AOB?

    15:46:31 <cr3> o/

    15:47:12 <brendand> cr3 - yep

    15:47:13 <cr3> checkbox needs tests for gremlins because they've been misbehaving in my computer lately :)

    15:47:16 <cr3> ..

    15:47:42 <brendand> #action cr3 to add gremlins/detect to checkbox

    15:47:42 * meetingology cr3 to add gremlins/detect to checkbox

    15:48:10 <cr3> "Does your gremlin have a shiny coat, and I don't mean a pimp coat!"

    15:49:12 <roadmr> o/

    15:49:51 <brendand> roadmr - is this still about gremlins?

    15:49:58 <roadmr> nope

    15:50:07 <brendand> roadmr - then please, go ahead :)

    15:50:28 <roadmr> heheh, just wanted to get a feel for how useful people think the number of "raters" is in the UF front page

    15:50:58 <cr3> between 1 and 10, I'd say 11 :)

    15:51:11 <roadmr> ... basically just that, should that information be available at a glance, or is it ok to move it to the system's detail page for instance?

    15:51:14 <roadmr> ..

    15:52:31 <brendand> roadmr - i think it depends on whether people are using it the way we imagine they would (i.e. to get an idea of how 'reliable' the results are)

    15:53:54 <roadmr> hmm maybe at some point we could conduct a poll on how useful people visiting UF think each bit of information is

    15:54:06 <brendand> maybe at UDS?

    15:54:39 <roadmr> I was thinking something like those "would you like to answer a poll to help improve our site?" - to get a feel from normal, average users

    15:54:57 <roadmr> I think the UDS crowd may be too biased towards preferring a lot of information :)

    15:55:15 <cr3> roadmr: maybe it can be used as an excuse to make people aware of the site in the first place

    15:55:31 <brendand> roadmr - i hate those things :) but yeah, you're right it should be from normal site users

    15:55:34 <cr3> roadmr: although, since ara will be making a presentation, everyone will inevitably know about it. maybe ask her to announce the poll?

    15:55:36 <roadmr> cr3: that too! and to encourage a bit more participation, which is always good

    15:56:28 <cr3> I think that deserves an action item for ara if we all agree a poll would be useful. I don't see how it could hurt

    15:57:08 <cr3> from the little usability testing I've done, it's been tremendously useful

    15:57:35 <cr3> if anyone intends to prepare some usability testing sessions, you might like to read: Rocket Surgery Made Easy

    15:58:21 <roadmr> we'd have to have the poll ready before UDS (i.e. in about 10 days)

    15:58:21 <cr3> ... and don't let jedimike run the session otherwise he'll jedi mind trick everyone to say what he wants to hear :)

    15:59:04 <brendand> who'd like to prepare some questions?

    16:00:07 <brendand> considering we're nearly out of time, i propose cr3

    16:00:29 <cr3> brendand: I thought jedimike would be better placed for writing the questions, no?

    16:00:43 <brendand> jedimike - do you want to do that?

    16:01:01 <jedimike> brendand: yeah

    16:01:24 <brendand> #action jedimike to prepare a small usability survey about the site

    16:01:24 * meetingology jedimike to prepare a small usability survey about the site

    16:01:37 <brendand> ok, i think our time here is up

    16:01:49 <brendand> thanks everyone for your participation

    16:01:59 <brendand> #endmeeting

UbuntuFriendly/Meetings/20111024 (last edited 2011-10-27 15:46:11 by brendan-donegan)