Evaluation: Questionnaire on first release

In this post we explain how we collected and processed the feedback on the first release (version 0.1). We analyze the results from the questionnaire and discuss what should be changed in the next release.

Goal

We want to get answers to some questions that came up during development. These questions concern which services our users use (for login and calendar) and how they expected certain parts of the interface to work (e.g. whether they would select all available time slots, and when notifications would be sent). Apart from that, we want a general idea of how usable (more specifically: how learnable, effective and satisfying) the first version is.

Method

We made a questionnaire with Google Docs for users to fill in after they used the app for a bit. This questionnaire consists of four parts:

  1. general questions for everyone
  2. questions for the host of a cooking party
  3. questions for an invitee of a cooking party
  4. the questions of the System Usability Scale (SUS)

Rationale

The answers to the questions in the first three parts will allow us to make informed decisions for the next release. The SUS questions give us a high-level idea of how usable the app is right now. They can help us localize problems and serve as a baseline for comparison with future releases.

Who, what, where

The first version of our app has been tested so far by 21 people (friends, family and classmates). We contacted them either in person or through announcements on social media (Facebook and Twitter). 12 of them responded to our questionnaire.

Results

The results are published online at this URL. Additionally, we plotted the results of the SUS questions as boxplots to allow easier comparison with upcoming releases. Our total SUS score comes out at 71.875 out of 100 (according to the scoring scheme from Usability.gov).

Evaluation release 1: box plot SUS 1-3
Evaluation release 1: box plot SUS 4-6
Evaluation release 1: box plot SUS 7-9
Evaluation release 1: box plot SUS 10
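As a reference for future evaluations, the standard SUS scoring scheme can be sketched in a few lines of Python. The function below implements the usual rules (odd-numbered items score as rating − 1, even-numbered items as 5 − rating, and the 0–40 sum is scaled by 2.5); the sample responses are illustrative, not our actual questionnaire data.

```python
def sus_score(responses):
    """Compute the SUS score (0-100) for one respondent's answers,
    given as a list of ten ratings from 1 to 5."""
    total = 0
    for i, rating in enumerate(responses):
        if i % 2 == 0:              # odd-numbered SUS items (1, 3, 5, 7, 9)
            total += rating - 1     # positively worded: higher is better
        else:                       # even-numbered items (2, 4, 6, 8, 10)
            total += 5 - rating     # negatively worded: lower is better
    return total * 2.5              # scale the 0-40 sum to 0-100

# Hypothetical respondent who rates every positive item 5
# and every negative item 1 gets the maximum score.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A neutral respondent (all ratings 3) lands at exactly 50, which is a handy sanity check for the implementation.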

Conclusions

General questions

Facebook is, as expected, the most popular way to log in. The most popular calendar service is Google Calendar. It might be more convenient to switch to Google+ as the login method, in order to integrate more easily with Google Calendar when we implement the “add party to calendar” functionality. On the other hand, Google Calendar can also integrate with Android’s built-in calendar app, so we might just use the built-in app and let the user handle the synchronization with Google Calendar outside of our app.

Most users (both hosts and invitees) selected all time slots that they were available for (12 out of 18). We think that some users only selected one time slot because they knew they were only testing, so the number will probably be higher when the app is used in real situations.

Questions for hosts

A (small) majority of users expected the invites to be sent when clicking ‘Done’. This is not how it is currently implemented, and it is also not how we want it to work. We don’t want invites to be “fire and forget”: we want to encourage the host to respond to declined invites by inviting other candidates. We will make sure this is clearer in the next release.

Questions for invitees

Both hosts (6/8) and invitees (3/4) want the host to receive notifications when an invitation is declined. We decided to implement this in release 2.

All invitees understood the meaning of disabled time slots. In the next release we will probably simply remove these time slots, as there is really no point in still showing them.

Most (4/6) of the invitees understood the meaning of the photos of people on the top of the screen.

SUS

The first SUS question, on usage frequency, received an average rating of 3.08/5. This middling score is partly explained by the fact that the first version of the app only has one stripped-down feature, so it seems reasonable that users don’t expect to use something that basic very often.

The testers didn’t find the app too complex (avg. 2.0/5), inconsistent (avg. 2.2/5) or cumbersome (avg. 2.1/5). They also didn’t feel the need to learn a lot before using the app (avg. 1.8/5), nor did they feel they would need technical support (avg. 1.8/5).

The testers found the app relatively easy to use (avg. 3.9/5), well integrated (avg. 3.7/5) and quick to learn (avg. 4.1/5). They felt generally confident using it (avg. 3.8/5).

Final thoughts

We think the questionnaire really helped us track down some issues (such as the confusion around the ‘Done’ button) and select features for the next release (declined invite notifications). The SUS results are not that spectacular, but we think that’s normal for a first release based on a minimal scenario.
