I experienced the same effectiveness (95%-99%) of getting the "guy/gal reading the newspaper" to volunteer for the study first hand, although my experience was mostly in cafe environments.
Our team was trying to get beta users signed up for a location-based iPhone app in the Boston/Cambridge area, and people reading newspapers in coffee shops or cafes were frequently willing participants. I even noticed that 7 times out of 10 these people will engage you in conversation about feedback, UX improvements, etc.
In parallel with Rodyancy's experience, some of the user feedback was arbitrary (one guy started comparing our application to time/space continuum theories, etc.), but nonetheless I would advocate approaching the newspaper guy/gal, as they are already on a knowledge quest and are usually more open-minded.
Oh, and avoid the book/textbook readers. These people are either really engaged in a novel (imagine tapping on someone's shoulder during a movie) or are stressed-out students.
I've done the sober equivalent to this (coffee shops). Not only did it point out some usability issues, like, "oh, he thinks THAT'S a button," it also proved to be a great way to generate ideas. Most of the input we received was dead wrong, but we had to think about why it was wrong, and that got us thinking in ways we hadn't thought before, which led to good ideas. I suspect the drunken version would be more fun and just as useful, assuming you are producing a consumer product. As for B2B, if your ___domain is specific, this might not be the most valuable use of your time.
What other types of guerrilla testing have people tried before? This sounds like a clever way to gather user-interaction information, but I wonder how well it actually works in practice.
Thanks, that's good to know. I know someone once mentioned using Mechanical Turk and that those testers sometimes ended up being their most loyal users, but I can't recall the context. Has anyone used these services, and what results did you see?
I've tried Feedback Army and was disappointed w/ the results. Not that it isn't a cool idea, but I think the give and take of a good usability test requires that it be done in person. If you watch usability expert Steve Krug doing a demo test (http://network.businessofsoftware.org/video/steve-krug-on-th...), you'll see what I mean.
I recently saw a blog post where someone wrote about the feedback they received and they weren't happy with it. The same person then revised their questions, tried a more task-oriented approach, and was much happier.
Then again, Feedback Army is just one usability testing tool; it's a matter of picking the right one for the job. I think it's best when a breadth of ideas is needed quickly, or when you need some ideas about where to start improving your site.
You should use other methods to do testing that requires a lot of depth from one or two people.
Here are a few more posts where folks have written up their experiences:
kartme published a good post about changes they made after usability testing. They mentioned uservoice, which is on Turk if I remember correctly, and said they'd do a follow-up post explaining more about what they did during testing.
I like their approach of using a variety of tools at different stages of product development to get the job done.
I think something our field may benefit from is a guide that explains the different usability services and at what point in the project cycle each should be used.
Anything run by a third party strikes me as immediately questionable for accurate testing. They necessarily use people who are more geared towards using applications; otherwise they wouldn't be there in the first place, getting paid to do something most people wouldn't touch. They can certainly provide genuinely useful insight, so running through a UI-review company is still a good idea, but can there really be a guerrilla testing company? The idea of guerrilla testing is to get totally untainted results, isn't it?
This is popular in some design circles, particularly for evaluating reactions to two or three designs prior to launch.
I'm not sure you'd get exactly the same quality of results as from a formal user test, and demographics are key - after all, the blue-collar workers at the pub aren't the people to ask about your B2B conversion tool - but it's certainly the type of thing we should be doing.
I'm kind of disappointed that Steve Krug didn't think up something like this for _Rocket Surgery Made Easy_ (http://www.sensible.com/rocketsurgery/index.html); this would be a very convenient source of manpower for his model of "hallway usability testing," and would allow even more tests, set up even more easily, than the model he describes.
It's good to see how usability testing is becoming de-mythologized these days; all of us should keep in mind, for current and future program development, that you really don't need a very sophisticated apparatus.
Start off approaching a group of the same sex as you so that the encounter isn't sexualized. Next, move on to a mixed-sex group, and then finish the night with an opposite-gender interaction so you get a nice demographic spread.
I wonder if that last bit is intentionally a double-entendre. Especially since the proposed environment involves a bar.
It's an interesting technique, and I'll definitely keep the strategy in mind for future endeavors.
The double entendre was completely unintentional. Testing with women is important, but one of the big hangups a lot of people have is the sexual component. This is one technique I've discovered to help ease out of that mindset.
As a strategy for picking up women, this is somewhere near the bottom in terms of effectiveness. As it turns out, asking someone to participate in a user study is about the least sexy thing imaginable.
FTA: "Bars are wonderful at segmenting by demographic. Match the bar you’re going to with the user population you want to target. Different bars will produce slightly different results but the variation is not huge."
Sophisticated users will reveal sophisticated usability flaws in your application but even unsophisticated users can reveal the unsophisticated flaws.
One really great tip (for sober testing, not drunk) is to use role playing. If you were soliciting feedback for a blog analytics package for example, you could tell users "Imagine you're the CEO of a lollipop manufacturer". Most people end up getting really into the role you're asking them to play and ask surprisingly insightful questions.
That being said, you still need to test with sophisticated users. They're much harder to get and their time is expensive, so it's always a good idea to test with a few unsophisticated users first and fix the obvious usability flaws; that way you maximize the amount of relevant feedback you get from your sophisticated users.
Well, that was sort of the point of my original comment. The test will work if bar-people represent your target customer base. This won't be the case for B2B products and may not be the case for some Consumer products.
Why no usage whatsoever? Any process of getting user feedback is better than doing nothing. We are so biased from using our own apps that we think everything is perfect. Nothing ever is. I think it's a great way to get more usage data. If I try it I will let you know how it went. I am thinking of using some screen-recording software like SilverBackApp to archive the sessions and analyze the results later, when not drunk :)
Interesting site... if it works like it appears to, I wouldn't be one of the survey testers, but I'm sure it'd nab a lot of people, enough to at least get some ideas. Good luck with it!
Because success is not measured by the usability of your product.
There are countless examples of way-too-complex products that are still successful, and almost any successful company you can think of DIDN'T do usability testing.
Companies like Amazon do A/B testing; Apple don't do usability testing at all.
Usability starts to matter later on, when you can measure actual customer behavior rather than user opinion.
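To make concrete what I mean by measuring behavior rather than opinion, here is a minimal sketch of the kind of check an A/B comparison comes down to. Everything in it is hypothetical (the function name, variant numbers, and traffic figures are made up for illustration); it's just a plain two-proportion z-test on conversion rates:

    # Minimal sketch: did variant B's checkout convert better than variant A's?
    # All numbers are hypothetical; this is only the shape of the calculation.
    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z(conversions_a, visitors_a, conversions_b, visitors_b):
        """z statistic and two-sided p-value for the difference in conversion rate."""
        p_a = conversions_a / visitors_a
        p_b = conversions_b / visitors_b
        p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    # Hypothetical experiment: 10,000 visitors per arm, 312 vs. 356 checkouts.
    z, p = two_proportion_z(312, 10_000, 356, 10_000)
    print(f"z = {z:.2f}, two-sided p = {p:.3f}")

The point is that the input is what customers actually did, not what they said they would do.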
Down vote me all you want. I challenge anyone to prove that usability tests are actually making more successful products.
Which is obvious for a couple of reasons. Apple is paranoid about anything regarding what they work on getting out,
and Apple use the Genius design method. Apple do new things, not me-too products.
Other than that you'll have to take my word for it, and that of the people I know at Apple. But again, if you can find anything that proves that Apple do usability testing, then by all means please share.
With regard to Amazon: I know that what matters at Amazon is A/B testing.
And looking at their checkout process, it doesn't look like anything that has been anywhere near a usability test.
But you are missing the point here.
These companies didn't become successful because of usability testing; that is the crux of the matter, and it's the claim behind the $5 beer that is wrong. To the extent that testing your product matters, you can ask ANYONE. If it's that bad, why spend $5 on a beer?
Most likely, though, it won't be totally unusable, and then where is the actual evidence that usability testing has any positive influence on your likely success?
I notice that you are a design professional and have written lengthy articles on interface design. The point of this article is that buying a handful of beers will improve your product design immensely compared to doing nothing at all. It is an amateur move that probably looks useless from a professional standpoint, but, like 24-grit sandpaper, it makes a rough job look more polished.
There is a whole list of issues with asking people what they think.
Again, the success of a product is not secured by these kinds of tests. Sure, they'll catch it if you are doing something that is outright useless, but then asking anyone (like your girlfriend) is enough.
What he is suggesting is the equivalent of doing a code review with a first-year programmer.
Wouldn't you call BS if I suggested something like that?
I am telling you, as a professional, that's no way to go about testing your product. Unless it's outright useless. But then you don't have to spend $5 on a beer. You can ask ANYONE for free.
You seem to have a mistaken idea of what user testing is. It's not asking people what they think of the software; it's asking them to perform specific tasks and then observing where they make mistakes or have issues.
The benefits of user testing are well documented. You're most definitely in the minority position for claiming this.
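To make that concrete, here is a minimal sketch of the artifact a task-based session produces: a fixed task list per participant, with completion, rough timing, and observation notes. The participant label, task, and numbers below are hypothetical, purely for illustration:

    # Minimal sketch of a task-based usability session log. Everything
    # here (participant, task, timings, notes) is made up for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class TaskResult:
        task: str          # what the participant was asked to do
        completed: bool    # did they finish without help?
        seconds: float     # rough time to completion or abandonment
        notes: str = ""    # where they hesitated, misread, or gave up

    @dataclass
    class Session:
        participant: str
        results: list[TaskResult] = field(default_factory=list)

    session = Session("participant-03")
    session.results.append(TaskResult(
        task="Create an account and upload a profile photo",
        completed=False,
        seconds=140,
        notes="Thought the avatar placeholder was decorative, not a button."))

    failed_tasks = [r.task for r in session.results if not r.completed]
    print("Tasks with failures:", failed_tasks)

You're not collecting opinions; you're collecting where real people get stuck on real tasks.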
I am quite aware of what a usability test is, as I have spent quite some time observing them.
But what I have also learned over the years is that a usability test does not equal a more successful product.
You are welcome to prove the benefits claim.
I don't care if you can prove some academic, self-referencing point about usability testing improving customer sentiment. I care about successful products, and there is absolutely nothing that proves that usability tests or the UCD method are better than any other design process.
Which company's success is based on them doing usability tests? By all means, please share.
Show me examples of companies who wouldn't be a success if it weren't for the usability testing they did up front.
If it's so well proven, that should be easy for you.
You are confusing improvements to an established, already successful e-commerce company with creating one from scratch.
It's quite obvious that if you already have a successful site you can adjust accordingly, which is exactly what Amazon do all the time, because that is what e-commerce sites must do to keep sales up.
"The best you're going to get is anecdotal evidence that early stage usability testing caused significant pivots around the core concept and assume that these pivots are generally positive.
Much of the customer development stuff Steve Blank talks about fall into this category."
In other words. There is no evidence what so ever. Thank you for proving my point.
I have 15 years of anecdotal evidence that there is no correlation between Usability testing and successful products.
And I am still asking you to provide evidence.
Twitter, Facebook, Apple, Amazon, LastFM, 37Signals and so on. None of them to my knowledge did usability testing before they launched their products.
Again, you are welcome to prove me wrong. With regard to Steve Blank, I don't think that is what he is saying.
Look, I'd love to discuss this further with you but this format is getting too annoying. Feel free to ping me at AIM:[email protected] MSN:[email protected] if you want to continue this.
Oh, I just noticed that we sparred before around this post: http://000fff.org/getting-to-the-customer-why-everything-you... about the exact same issue (I am Xianhang Zhang). I don't know if we're going to get any more productive the second time around.
Proving that usability testing works when creating a product has the same problem as proving that something like agile development works: every project is unique, so you never know how things would have turned out if you had done things another way.
The best you're going to get is anecdotal evidence that early-stage usability testing caused significant pivots around the core concept, and to assume that these pivots are generally positive.
Much of the customer development stuff Steve Blank talks about falls into this category.
I don't think it is; after all, I wrote that article. Usability testing can be done at all stages of the design cycle, from paper prototyping to maintenance.