Usability Testing Tool Review: Usabilla
We’re seeing a boom period in the development of new tools for usability testing, which is a great trend that foretells better web experiences in the coming years. Today I’m going to look at Usabilla, which my friend Keri Morgret and I used on her site Strike Models, which sells products for building remote controlled battleships.
Service description: Usabilla shows screenshots of your choosing to testers and asks them questions which they answer by clicking and/or annotating over the image.
You choose what questions to ask from a preset list and/or provide your own.
You can provide the screenshots or just input the URL and Usabilla will take the screenshots for you.
Usabilla hosts the test (it appears in an overlay).
Here’s what the marked-up screenshot looked like, with points (circles) and notes (squares).
Value of insights obtained: Pretty high in many cases, perhaps 8.5/10.
1. The header graphic, which I took to be just ordinary branding, was seen as displaying products from Strike Models’ catalogue. Likewise, an image of the products in action lower down drew notes suggesting that the boats be linked to their associated products.
2. The basic navigation menu above the header graphic, which is standard on the Thesis theme, is subtle and takes a while to notice (the average time recorded by Usabilla was 15.8 seconds). Accordingly, a few people suggested making it larger, and one thought moving it under the main header graphic would be helpful.
Nevertheless, the large majority of users clicked on the menu item labeled Products when told to find product info and try to order.
3. One tester said he was a news addict and wanted more news about the model warship field. This obviously presents an opportunity to become the content authority in the field and build up an SEO advantage accordingly.
4. Another tester was interested in seeing more featured products, and several users generally commented that they would like it to be easier to find information about the products.
5. Product images and lighting could often be improved, and when they were, this sparked interest in finding out more about the products. Action shots in particular were really interesting to several users, since after all the point is to have the battleships duke it out on a mini-pond.
6. Testers suggested removing pictures of products that aren’t carried.
Usability of Usabilla: It’s a little ironic, but there were a few stumbling blocks to making use of this tool.
First, once you upload your pictures and get started, it’s not clear that you need to check the checkboxes next to the questions you want to ask. I assumed I would be asking all of them and tried to click the Activate button, but couldn’t because no questions were yet selected.
Second, Usabilla provides no panel of users, which means you need to recruit people yourself. In this case, I was fortunate to be able to rely on Keri’s network of contacts, but otherwise this can be problematic.
For example, a usability client has a target demographic of women in their 50s to early 70s. I know few women that meet the description, so recruiting subjects has been a challenge.
Third, the results are displayed in a way that isn’t intuitive and instead relies on mystery meat navigation. You need to hover the mouse over the squares to read the attendant notes… but you don’t know that to begin with, let alone that squares differ from circles because they represent notes.
Similarly, in IE7 (and possibly IE8), the full notes themselves are displayed below a large whitespace area that creates a false-bottom impression.
In fairness to Usabilla, their support was very fast and helpful in clarifying things, and I was able to figure out the whole notes issue after a little initial frustration. Additionally, Usabilla support told me they’re working on redesigning the back end to make it more intuitive, so hopefully these relatively minor problems will be resolved quickly.
The panel question remains an issue, however.
Miscellaneous: Having data on the average speed of task completion is very valuable in some cases. For example, once Strike Models gets its ecommerce functionality implemented (and hopefully a proper redesign), I’d retest the homepage to see how fast people can get to product details. I’d guess this ties back into bounce rate as well…
Another thing that has its pros and cons is that in contrast to many tools currently out there, Usabilla is driven by click data, not screen sharing or recording.
The end result is that you can review certain aspects faster (since you don’t need to watch video) and have a more reliable idea (assuming you recruit enough users) of what works and what doesn’t on a statistical basis. Also, in contrast to video, where you’d have to fast-forward or mark particular points (though some usability tools do enable this), you can review particular questions’ answers together as a group, which also speeds up analysis.
Conclusion: Usabilla is a valuable addition to the usability professional’s toolkit. The qualitative insights it yields are valuable and practical, and it has the advantage of being somewhat faster to analyze.
The main disadvantages are that you need to recruit testers yourself and you get somewhat poorer qualitative data than the full thought stream provided with audio/video recording.
Interested in more reviews? I’m writing a series of reviews on these tools, which started with a review of UserTesting.com.