CHI-WEB Archives

ACM SIGCHI WWW Human Factors (Open Discussion)

CHI-WEB@LISTSERV.ACM.ORG

Sender: "ACM SIGCHI WWW Human Factors (Open Discussion)" <[log in to unmask]>
X-To: Kay Corry Aubrey <[log in to unmask]>
Date: Fri, 8 Oct 2010 07:44:06 -0400
Reply-To: Ron Perkins <[log in to unmask]>
From: Ron Perkins <[log in to unmask]>
Message-ID: <[log in to unmask]>
In-Reply-To: <019c01cb668e$eac5c590$c05150b0$@[log in to unmask]>
Hello Kay,

I think I missed this the first time around, and a couple of important points didn't come up--perhaps some of the tools already mentioned address them.

There are two things I think of when doing A/B testing.

If you are not going to use large numbers of participants, you need a within-subjects design, presenting both approaches to each participant for comparison.  You must then counterbalance the order in which the two designs are introduced, so that learning from seeing one design does not carry over and affect reactions to the other.  Half of the participants get A then B; the other half get B then A.  This controls for order effects.  With large enough numbers you can use separate groups instead, but the groups should be screened to make sure they are similar on whatever user-profile or persona dimensions matter for the product.
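The counterbalancing scheme above can be sketched in a few lines. This is a minimal illustration only; the participant IDs and the fixed random seed are invented for the example:

```python
import random

def assign_orders(participant_ids, seed=0):
    """Randomly split participants so half see A then B, half see B then A."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {pid: ("A", "B") if i < half else ("B", "A")
            for i, pid in enumerate(ids)}

orders = assign_orders(["p%d" % n for n in range(1, 9)])
ab_first = sum(1 for o in orders.values() if o == ("A", "B"))
```

With eight participants this yields exactly four in each order, which is the balance the design needs.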

If you want to know which design 'works' better, you need behavioral performance data on the two designs, especially with a small number of participants.  You can ask debriefing questions about opinions (which one they like better), but from a small group those answers won't mean much, because people vary so widely in their opinions.  If you ask 'why' questions, you can start to fish for patterns in the answers and perhaps spot some trends.

People will sometimes say they like something a great deal but perform poorly while trying to use it.
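As a minimal sketch of what behavioral performance data might look like, a paired within-subjects comparison of task times catches exactly this preference/performance mismatch. The timings below are invented purely for illustration:

```python
# Invented within-subjects data: seconds each participant took to complete
# the same task on design A and on design B.
times = {
    "p1": {"A": 42.0, "B": 35.5},
    "p2": {"A": 51.2, "B": 40.1},
    "p3": {"A": 38.9, "B": 41.0},
    "p4": {"A": 47.3, "B": 36.8},
}

# A positive difference means that participant was faster on design B.
diffs = [t["A"] - t["B"] for t in times.values()]
faster_on_b = sum(1 for d in diffs if d > 0)
mean_saving = sum(diffs) / len(diffs)
```

Even if every participant said they preferred A, data like this (three of four faster on B, a mean saving of several seconds) would tell a different story.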

Hey, let us know how it turned out, what methods you used, and how you liked the tools if you can.  It would be interesting!

Ron Perkins
Principal, Design Perspectives


On Oct 7, 2010, at 10:17 PM, Kay Corry Aubrey wrote:

> Thanks to Susan, Nancy, and Toby for their excellent responses to a question
> I posted to this list last week:
> 
> ****************************************
> 
> Hi - can anyone on the  Chi-web list suggest innovative usability test
> design approaches for A/B testing? Both A and B will be presented on the
> same screen and the user needs to peruse each and choose the one they
> prefer. I am looking for ideas on the best UIs to present text and pictures
> most effectively for this type of purpose and was wondering if folks on the
> list have suggestions. 
> 
> Please send your ideas to me and I will collect, summarize, and send out to
> the list.
> 
> Kay
> 
> Kay Corry Aubrey, user-centered research and design 
> Usability Resources Inc | www.UsabilityResources.net |
> [log in to unmask]
> Phone: 781-275-3020 | Fax: 1-781-998-0325
> 
> ************************************************
> -----Original Message-----
> From: Susan Price [mailto:[log in to unmask]] 
> Sent: Wednesday, September 29, 2010 10:18 AM
> To: Kay Corry Aubrey
> Cc: [log in to unmask]
> Subject: Re: Usability test design approaches for A/B testing
> 
> Kay, 
> 
> Your proposed test is not what I think of as A/B testing. Our A/B testing is
> showing one or the other variant to the same user segment (usually in the
> live production environment) and watching which performs better.
> 
> Presenting on the same screen side by side is an attempt to not lead the
> participants? This seems to invite careful perusal of the differences, and
> that's usually not what we're going for. What we seek in the usability lab
> are the untutored, knee-jerk reactions that we can't anticipate. Inviting
> careful perusal invites participants to play "designer." Our team doesn't
> suffer from a lack of such opinions - what we seek are the revelations that
> we weren't able to anticipate.
> 
> I believe there will be unavoidable left-right bias, and the test
> experience doesn't compare to the actual "one solution per screen"
> experience.
> 
> Seems like it might be better to show each variant (and perhaps some
> dummies) several times and have the participant rank each with a numerical
> scale, with duplications of each to weed out the biases.
> 
> -Susan
> 
> 
> 
> -----Original Message-----
> From: Nancy Frishberg [mailto:[log in to unmask]] 
> Sent: Wednesday, September 29, 2010 12:46 PM
> To: Kay Corry Aubrey
> Subject: Re: Usability test design approaches for A/B testing
> 
> I agree with Susan Price's comment:
> 
> What you propose is not A/B testing.
> 
> A/B testing is a method that relies on behavioral data, not attitudes
> or preferences. It uses large numbers of responses in place of the richer
> data you get from a few participants in a usability study. And it
> measures which of the two (or more) designs -- you might actually be
> testing A/B/C/D -- achieves the better business outcome (typically
> conversion to the next step in a purchase sequence). You need not present
> the two designs to equal numbers of participants, only to enough that you
> can say with confidence that design A converted N% and design B converted
> M%, with N > M (or the reverse).
> 
> Doing several usability sessions with one design and several more with
> the other is unlikely to yield the statistically valid results that a
> true A/B test can provide.  However, you might consider creating two very
> similar tasks, using one design for one task and the other design for the
> other, and watching timing as well as other reactions to see which design
> performs better. Of course you still need enough participants to vary
> which design is presented first, so you can separate learning and
> familiarity effects from a genuinely better-performing design.
> 
>  -- Nancy
> 
> 
> -----Original Message-----
> From: ACM SIGCHI WWW Human Factors (Open Discussion)
> [mailto:[log in to unmask]] On Behalf Of Toby Biddle
> Sent: Wednesday, September 29, 2010 7:34 PM
> To: [log in to unmask]
> Subject: Re: Usability test design approaches for A/B testing
> 
> Hi Kay,
> 
> Our tool, Loop11 (www.Loop11.com), has been used for A/B testing by many
> of our customers. We've also written a case study (http://bit.ly/dzmm93)
> that outlines how one customer set up such a project.  I also agree with
> Susan's point that inviting careful perusal of the differences doesn't
> encourage natural behaviour.
> 
> Creating task-based scenarios to see which design performs best is the
> better way to run A/B testing. If you haven't already signed up to
> Loop11, feel free to do so. Your first project is free, so you can use it
> to see whether A/B testing will work for you.
> 
> I hope this helps.
> Regards,
> 
> Toby Biddle
> Director
> 
> t: (03) 9684 3470
> m: 0402 113 104
> f: (03) 9684 3434
> e: [log in to unmask]
> skype: toby.biddle
> 
> 119 Ferrars Street, South Melbourne, Victoria, 3205
> 
> 
> 
> 
> On Sep 29, 2010, at 7:54 AM, Kay Corry Aubrey wrote:
> 
>> Hi - can anyone on the  Chi-web list suggest innovative usability test
>> design approaches for A/B testing? Both A and B will be presented on the
>> same screen and the user needs to peruse each and choose the one they
>> prefer. I am looking for ideas on the best UIs to present text and
> pictures
>> most effectively for this type of purpose and was wondering if folks on
> the
>> list have suggestions. 
>> 
>> Please send your ideas to me and I will collect, summarize, and send out
> to
>> the list. 
>> 
>> Thanks!
>> 
>> Kay 
>> 
>> Kay Corry Aubrey, user-centered research and design 
>> Usability Resources Inc | www.UsabilityResources.net |
>> [log in to unmask]
>> Phone: 781-275-3020 | Fax: 1-781-998-0325
>> 
>>   --------------------------------------------------------------
>>   Tip of the Day: Suspend your subscription if using auto replies
>>           CHI-WEB: www.sigchi.org/resources/web/faq.html
>>             MODERATOR: mailto:[log in to unmask]
>>   --------------------------------------------------------------
> 
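One footnote on Nancy's point about saying "with confidence" that design A converted N% and design B converted M%: that claim comes down to a two-proportion z-test on the conversion counts. A minimal sketch, with conversion numbers invented for illustration (nothing here comes from the thread):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates,
    using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented example: design A converts 120/1000 (12%), design B 90/1000 (9%).
z = two_proportion_z(120, 1000, 90, 1000)
significant = abs(z) > 1.96  # two-sided test at the 5% level
```

With those numbers z comes out around 2.2, so the 12% vs. 9% difference would clear the conventional 5% bar -- which is exactly the kind of statement a handful of lab sessions cannot support.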

Ron Perkins
Principal, Design Perspectives
Web Design and Usability
www.DesignPerspectives.com	

978-465-6083	Office

