Analyzing BMTN’s Color Contrast

Some people have hobbies like knitting graffiti or extreme ironing. One of mine is quantifying web design variables.

Along these lines, one of my criticisms of the newly launched BMTN website is that it’s hard to read. To my eyes, the contrast between the type and the background browns is not high enough for comfortable reading. Considering that the site is meant to be read, this seemed like a legitimate problem – at least for me. Here’s an example of a paragraph from BMTN:

BMTN Color Scheme
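That “not high enough” feeling can be put on a numeric footing with the WCAG 2.0 contrast-ratio formula. Here’s a short Python sketch; the brown/tan RGB values below are hypothetical stand-ins, since the post doesn’t give BMTN’s actual palette:

```python
def srgb_to_linear(channel):
    """Linearize one 0-255 sRGB channel per the WCAG 2.0 definition."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """WCAG relative luminance of an (R, G, B) tuple."""
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, from 1:1 (identical) to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black on white hits the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0

# A hypothetical dark-brown-on-tan pairing (illustrative values only,
# not BMTN's actual colors) lands well below that maximum.
print(round(contrast_ratio((90, 60, 30), (210, 190, 150)), 1))
```

WCAG’s accessibility guidelines use this same ratio: 4.5:1 is the usual minimum for body text.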

Just because something is a problem for me doesn’t mean it’s a problem for everyone, so I set out to measure whether font and color choices affect a person’s reading speed or comprehension.

To do this, I tapped into one of my hobby businesses, 3rd Party Feedback, to design a test that could quantify this sort of thing. At first, it seemed nearly impossible to determine how long it would take someone to read a piece of text. In the end, I settled on an indirect measurement: transcription.

The final test asked 10 independent web users to transcribe four snippets of text. There were two snippets screen-grabbed from BMTN, and two snippets with the same copy presented in a black on white color scheme like this:

Google Docs Default Font Scheme

As I mentioned in my previous critique, I find the latter of the two above examples much easier to read. However, I was curious to find out if I was the only one.

The task 3rd Party Feedback workers were asked to complete was to transcribe one of the four examples. Each task would be completed by 10 workers, and all tasks would be timed. At the end of the project, 3rd Party Feedback had timed results for 40 tasks. When averaged out, the results (surprisingly to me) looked like this:

Transcription Times
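The averaging step is straightforward to sketch in Python. The timings in the usage example below are purely illustrative, since the raw per-worker numbers aren’t published in the post:

```python
from statistics import mean

def average_times(results):
    """Average per-worker completion times (seconds) for each task variant."""
    return {variant: mean(times) for variant, times in results.items()}

# Illustrative numbers only -- not the actual test data.
sample = {
    "bmtn_colors": [180, 200, 190, 210],
    "black_on_white": [170, 185, 175, 190],
}
print(average_times(sample))
```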

So what this is saying is that it took people an average of 40 seconds longer to transcribe the black on white version of the Tim Pawlenty snippet above than the same copy when presented in BMTN’s color scheme. To me, this didn’t seem intuitive. How could brown on brown beat black on white?

I pondered the results and discussed them with some friends. Then it dawned on me that there might be a lurking variable in the results. Typically, when we run tests at 3rd Party Feedback, we include a comments field after the assigned task asking for feedback. That feedback can be very constructive and helps improve the way future tasks are worded. However, when running a timed test, it can skew results. The black on white versions generated more comments on average, which would have extended their completion times:

would you want leagues to be possessive? league’s, right?

I typed it as is, but there was a mistake. “leagues” should be “league’s”

The black on white version drew out the grammar nazis.

Having discovered this, we removed the comments field from the task and ran it against a new group of testers. Here are the revised results without the comments field distraction:

Transcription Times

That’s what I was expecting from the first test.

Is this a case of tweaking the test to create results that match expectations, or of creating a better test? I’m clearly biased, so let me know what you think. As I see it, the latest version of the test shows a 6-9% improvement in transcription times when using black on white vs. BMTN’s default font and color scheme. However, it’s still not a perfect test, since more than one variable was changed: the color scheme and the font size were both adjusted.
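The 6-9% figure is just the relative difference in average transcription times. A quick sketch of that arithmetic (the 200 s and 185 s values are made up for illustration, not taken from the test):

```python
def improvement_pct(baseline_s, variant_s):
    """Relative speedup of variant_s over baseline_s, as a percentage."""
    return (baseline_s - variant_s) / baseline_s * 100

# A 200 s transcription dropping to 185 s is a 7.5% improvement,
# squarely inside the 6-9% band reported above.
print(round(improvement_pct(200, 185), 1))  # 7.5
```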

So that’s my little experiment.

A few people asked me how much it costs to run split tests like this through 3rd Party Feedback. Generally, it’s around $1 per response, so a test like this would run $40 (10 respondents to each of 4 tasks). That doesn’t include analysis.
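That pricing works out as a simple product; a sketch using the rough $1-per-response figure quoted above:

```python
respondents_per_task = 10  # workers completing each task
num_tasks = 4              # two BMTN snippets + two black on white snippets
cost_per_response = 1.00   # rough dollars per response quoted above

total_cost = respondents_per_task * num_tasks * cost_per_response
print(f"${total_cost:.2f}")  # $40.00
```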


If you’d like to give this a try, here’s a coupon worth $10 that you can use as many times as you’d like between now and the end of 2009 on orders of $50 or more:

Coupon code: FOTD

If you’ve stuck with this post this long, this is something you should be doing. Feel free to pass this coupon code along to friends and family, but not your competition.

2 thoughts on “Analyzing BMTN’s Color Contrast”

  1. Ed, this is great information to have, and much appreciated. We put the highest priority on user experience (easy to access quality content, no intrusive advertising, direct outbound links, etc.), so we’re always open to possible enhancements to improve on this.

    Here’s an odd argument: our color scheme is much closer to green on yellow than brown on brown.

    Yeah, you heard me right. Green on yellow. Is that any better you might ask? Maybe. Here’s a study you might find interesting:

    The study shows (and the page itself follows) that a green on yellow italicized font has the fastest “reaction time” of any color combination. Not the same green and yellow that we employ mind you, but still.

    Since green on yellow and black on white both have the fastest reaction times (at least in this research), how about a third sample with a combination known to read poorly, such as white on black or red on green? This might help us determine whether we need to simply tweak our colors or switch entirely to black and white.

    Again, thanks for your time spent analyzing this. We love the (third-party) feedback.

  2. Hi Ed,

    Nice post. My hobbies are extreme ironing and online marketing. And agreed, that color contrast makes it difficult to read. And I like your thinking on why you came back with unexpected results.

    Here is an article from Website Mag you might enjoy that reviews usability tools. (I am affiliated with one of them, but the article covers other tools as well.)
