Increasing the barrier to entry by simplifying the process
The more we browse the web, the more our experiences influence how we design and develop our own products. We isolate and extract the best experiences and replicate them to help guide and inform users as best we can. But are we really in the best position to decide what works and what doesn't? The simple answer is: no!
Over the last few months I have been working on a major new section of a website, and I was recently invited to sit in on one of the user testing sessions that had been set up to test the work I had done. In the run-up to the user testing sessions we had identified that parts of the user journey were quite complicated for new users, and we decided to pre-empt any confusion by implementing IntroJS so we could highlight the main parts of the product and tell the user what they were expected to do.
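For context, a tour like ours is typically wired up by passing IntroJS a list of steps, each pairing a page element with a short message. The sketch below is illustrative only: the element IDs and the copy are invented, not the real product's, and the options shown are just a plausible configuration.

```javascript
// Hypothetical tour definition; the selectors and messages here are
// invented for illustration and do not reflect the actual product.
const tourSteps = [
  { element: '#plan-picker',  intro: 'Choose the plan that suits you best.' },
  { element: '#billing-form', intro: 'Enter your billing details here.' },
  { element: '#order-summary', intro: 'Review your order before confirming.' },
  { element: '#confirm-btn',  intro: 'Click Confirm to complete the process.' },
];

// introJs is the global provided by the intro.js library when it is loaded
// in the browser; the guard lets this snippet be read outside a page.
if (typeof introJs !== 'undefined') {
  introJs()
    .setOptions({ steps: tourSteps, showProgress: true })
    .start();
}
```

Each step dims the rest of the page and highlights the named element with the dialog alongside it, which, as it turned out, is exactly what invited our testers to try clicking the highlighted element rather than reading the message.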
The cardinal sin of programming is to allow developers to test their own code: by the time the code is at a point where we can test it, we have been through the process several times, and without much thought we will enter semi-valid values that let us continue to the next step. Building and testing the user journey falls into a similar category. By the time we are able to test the journey, we have a clear interpretation in our minds of what the system is supposed to do. But, and here is the key point, is this how a user would actually use it?
We had some expectations of what would come out of the user testing session, but the biggest piece of feedback was something completely unexpected.
Our implementation of IntroJS was meant to highlight the four or five most important points of each page, and as the user navigated the website the IntroJS dialogs popped up to advise them what that section was about. We quickly saw that our testers did not read the messages and instead tried to interact with the highlighted component. We had not anticipated this behaviour, and our testers quickly grew frustrated that they could not do what they were expecting.
This was then compounded by them treating the IntroJS dialogs like an irritating advert: they clicked Next, then Next, then Skip so they could summarily dismiss the dialogs and get to playing with the system underneath, all without reading any of the help messages that were trying to explain what was going on. Once the IntroJS messages had been dismissed, our testers were able to work their way around the system, albeit with a few puzzled looks as they figured things out, but in general they were much happier campers.
The feedback we received at the end of these user testing sessions was largely positive, but a few of our testers did mention that if this had been a live product rather than a testing session, we would have lost them very early on in the process, and because of the one thing we had put in place to try to make the process simpler.
It is also worth mentioning that although IntroJS has been highlighted as the main issue in this instance, this is down to our implementation of it and is not representative of IntroJS itself.
- October 14th, 2015