Wednesday, April 7, 2010

Stage 3: Multivariate and Remote Testing

Finally, it is time to drop the ubiquitous Google Analytics JavaScript snippet into the final design and send the URL out to the people most interested in using the site: specifically, Kyle's clients, prospective clients, and friends.

However, the usability testing is not complete! In fact, there are a couple of cheap (potentially free) and very powerful tools I can still use before I put the finishing touches on the site and get it out of my hair forever (or until something breaks, or Kyle moves, or the documentary actually gets made, or hell freezes over).

These two techniques work hand in hand and can offer some very useful final feedback.

First, there is “multivariate” testing, also called A/B or split testing. (Strictly speaking, A/B and split testing compare two versions of a single element, while multivariate testing varies several elements at once, but in practice the terms are often used loosely and interchangeably.) There are a couple of ways to apply the technique, which usually focuses on minimizing “bounce” and maximizing “conversion”. Bounce refers to the rate at which users leave a site moments after arriving, and conversion refers to the rate at which users complete a desired outcome, e.g., donating to the documentary, signing up for the news feed, or linking to the site from their Facebook or Twitter account.
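To make those two metrics concrete, here is a quick back-of-the-envelope calculation. The numbers are invented purely for illustration:

// Hypothetical traffic numbers, for illustration only.
var visits = 200;            // total visits to the homepage
var singlePageVisits = 90;   // visitors who left without viewing a second page
var donations = 8;           // visitors who completed a donation

var bounceRate = singlePageVisits / visits;   // 0.45, i.e. a 45% bounce rate
var conversionRate = donations / visits;      // 0.04, i.e. a 4% conversion rate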

The first way to achieve this is to set up an account at Visual Website Optimizer. Another way is to combine the power of Google Analytics with a simple piece of JavaScript written into the site's “index.html”. The code would go something like this:



// Randomly assign each visitor to one of two homepage variants.
// Math.floor(Math.random() * 2) + 1 yields a 1 or a 2 with roughly
// equal probability; visitors who draw a 2 are redirected to the
// alternate homepage, while the rest stay on this one.
var variant = Math.floor(Math.random() * 2) + 1;
if (variant === 2) {
    window.location.replace('http://web.simmons.edu/~lague/LIS467/index.html');
}
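One caveat worth flagging (my own addition, not something the snippet above handles): Math.random() re-rolls on every visit, so a returning visitor might see variant A one day and variant B the next, muddying the numbers. A cookie can pin each visitor to whichever bucket they drew first; a minimal sketch, with the cookie name invented for illustration:

// Re-use a previously assigned bucket if one exists; otherwise assign
// one at random and remember it for thirty days.
var match = document.cookie.match(/(?:^|; )abVariant=(\d)/);
var variant = match ? parseInt(match[1], 10) : Math.floor(Math.random() * 2) + 1;
document.cookie = 'abVariant=' + variant + '; path=/; max-age=' + (60 * 60 * 24 * 30);
if (variant === 2) {
    window.location.replace('http://web.simmons.edu/~lague/LIS467/index.html');
}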



What this does is let the researcher or developer send users to two different homepages at random. Why is this useful? Well, in my current development quandary, I'd like to test the effect of having one homepage open with a Flash animation as a sort of introduction to the site, and compare it against a version without the animated introduction. I can see the effect these two differing homepages have on the overall “conversion” and “bounce” rates by checking Google Analytics. One way to achieve this is to set up two sites, identical but for the homepages, in two separate directories on my web server, with each directory carrying its own distinct Google Analytics account, as sketched below. Alternatively, I could do some serious sleuthing and just look at the pages being accessed from the referring pages, that is, the two unique homepages, and see which one is more effective that way.
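Here is roughly what that two-account setup might look like, using Google's standard asynchronous tracking snippet. The UA-XXXXXXX-1 and UA-XXXXXXX-2 account IDs are placeholders for whatever IDs Google actually assigns:

// In the variant-A directory's index.html:
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']);  // variant B's copy would use 'UA-XXXXXXX-2'
_gaq.push(['_trackPageview']);

(function() {
    // Load ga.js asynchronously so tracking doesn't block page rendering.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
})();

Comparing the bounce and conversion numbers across the two accounts then gives a direct read on which homepage performs better.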

But the really cool thing (and this brings me back to why I mentioned that many of Kyle's friends own Macs) is that I can use this technique in conjunction with a piece of software called “Silverback”. Readers can investigate it for themselves here: http://silverbackapp.com/

Silverback is free for the first thirty days, and what it does is simultaneously record a screencast, video of the participant via the built-in iSight camera, and audio via the built-in microphone on MacBooks, MacBook Pros, and iMacs. It effectively turns a $900 MacBook into a mobile usability lab for free (for thirty days) or $50 (forever!). By enlisting the help of friends and letting them run their own test of the site on their own time, one can conduct truly remote usability testing, capturing the screencast, participant video, and audio in one integrated file.

Obviously, I could still draft a list of tasks for these participants. At this point in the testing, though, I'd be more inclined to gather feedback on the overall aesthetic and interactivity of the site. For instance, I would probably prompt the users (half of whom would see the animated introduction before reaching the homepage, while the other half would be taken directly to the site) with questions about their first impressions, such as:

1. "Do you feel the site initially grabbed and held your interest?"

(It will be interesting to compare the responses from the A group and the B group on this question. Will users like the animation? Will they prefer to get right into the content of the site?)

2. "Do you like the format of the gallery and the method by which you select and view pictures?"

(I've included this question because it might be another chance to use multivariate testing techniques. One group would be directed to a gallery that uses a JavaScript "rollover", so that hovering over a thumbnail calls up a larger version of the image and its accompanying description. The other group would be directed to a more traditional point-and-click format, where clicking directly on an image calls up the larger image and description; a rough sketch of both approaches appears after these questions. Knowing which version the participants preferred would be extremely beneficial.)

3. "Did you enjoy using this site?"

(Hey, why not? As informative as the site is supposed to be, it is supposed to be enjoyed. It is supposed to be entertaining.)
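For the gallery comparison mentioned in question 2, here is a minimal sketch of how the two versions might differ. The markup, element IDs, and image names are hypothetical, invented just to illustrate the idea:

// Swap the large preview image and its description.
// Assumes the page contains <img id="preview"> and <p id="caption">.
function showPreview(src, caption) {
    document.getElementById('preview').src = src;
    document.getElementById('caption').innerHTML = caption;
}

// Version A (rollover): hovering over a thumbnail triggers the swap.
//   <img src="thumb1.jpg" onmouseover="showPreview('large1.jpg', 'On location, day one')">
//
// Version B (point-and-click): the same swap, but only on a click.
//   <img src="thumb1.jpg" onclick="showPreview('large1.jpg', 'On location, day one')">

Everything else about the two galleries would stay identical, so any difference in how participants rate question 2 can be attributed to the interaction style alone.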

These questions would be supplemented by some of the more quantitative, task-oriented instructions found throughout this paper, as well as in the Chalkmark example.

After I'd finished this here test, I'd say that I was done.

Done.
