Now You See It, Now You Don’t: Is it an A/B Test?

We recently got an inadvertent peek behind the curtain of the process for evolving legal research database interfaces. Early this fall, while working on research exercises for our incoming 1L students, we found ourselves cross-editing each other’s instructions for how to run a simple Lexis+ search. Why tell students to click on “content” when the label says “categories”? And why not just tell students the icon for editing looks like a pencil? Thanks to Zoom’s screen-sharing function, we discovered we were simultaneously looking at different versions of the same interface. After polling our colleagues, about half of us were on “team content” and the other half on “team categories.” It turns out we were unknowingly part of an “A/B” interface test:

This kind of testing is a common way for developers to compare two versions of a design and see how the variations change user behavior. Some companies use A/B testing quietly to see whether subtle changes in font size, color, position, or wording increase visits, clicks, or purchases. We reached out to Lexis and learned from the product development team that this is standard practice, intended to test variables and improve user experience:
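For the technically curious, the mechanics behind this kind of test are simple. Here is a minimal sketch in Python (the label names are from our experience above, but the function names and traffic numbers are invented for illustration) of how a site might deterministically split users between two variants and compare click rates:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "edit-menu-label") -> str:
    """Deterministically bucket a user into one of two variants.

    Hashing the user ID together with the experiment name means each
    user always sees the same label, and roughly half see each version.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "content" if int(digest, 16) % 2 == 0 else "categories"

def click_rate(clicks: int, impressions: int) -> float:
    """Fraction of users shown a label who actually clicked it."""
    return clicks / impressions if impressions else 0.0

# Hypothetical metrics: which label earned more clicks?
rate_a = click_rate(clicks=420, impressions=1000)  # users shown "content"
rate_b = click_rate(clicks=365, impressions=1000)  # users shown "categories"
winner = "content" if rate_a > rate_b else "categories"
```

The deterministic hash is what makes half an office land on “team content” and the other half on “team categories”: assignment follows the user account, not the session, so each of us kept seeing the same version every time we logged in.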

LexisNexis uses online experimentation or A/B testing to improve our products by evaluating potential changes before rolling those changes out to the entire user population. For Law Schools we take steps to avoid disruptive testing during times of peak usage during the school year to minimize any challenge to your preparation and teaching of legal research with our products.

Bloomberg Law also uses beta testing of its interface:

Bloomberg Law occasionally might engage in beta testing where we enlist specific firm/school accounts. We won’t do it with just random individual users, however. Users who participate in beta testing are enlisted by a Client Service Partner or someone from our Bloomberg Law team. Random users are not selected to participate in our testing.

We reached out to Westlaw as well, but we had not received a statement about interface testing by the time of this posting.

As a practical matter, the variations we saw were subtle and unlikely to cause confusion, and as of this afternoon, we are all on “team content.” We were never actually asked which term we preferred, so we assume website metrics showed “content” got more clicks than “categories.” Legal researchers are constantly watching for and adjusting to changes in research database interfaces; each new academic year, our vendors seem to roll out yet another menu of changes. Some changes are significant redevelopments, while others, like the ones we discovered, are far more subtle. A word to the wise for all legal instructors for the spring: even if you are not alerted to a major interface change, be sure to double-check your screenshots.
