RESEARCH - USER JOURNEY
RESEARCH - COMPETITIVE ANALYSIS
RESEARCH - USER TESTING
UPDATED DESIGN INTERVIEWEES SAW
SOLUTION - ADDING A TIMELINE
SOLUTION - ADDING TEXT HIERARCHY
SOLUTION - NEW ACTIVITY EVALUATION
Labeled total number of questions; one question shown at a time
Clear labeling that shows users the next step toward their main goal (receiving credit)
USER TESTING THE DESIGNS
Users responded to the new designs noticeably more positively.
Users were more likely to complete the activity evaluation in the one-question-at-a-time format.
Users showed less confusion about where buttons led.
More specifically, I found:
The progress bar was the new feature users wanted the most and had the most positive things to say about.
Users were satisfied with the current linear flow and preferred following it through to the end rather than being offered branching options.
Some users preferred an activity evaluation with all questions on one page, but those who preferred the one-by-one format held their preference more strongly.
The final deliverables for this project were four high-fidelity designs, each adjusted for different breakpoints. Shown below are the final designs for the test success page and the activity evaluation.
WHAT I LEARNED / FUTURE SUGGESTIONS
These are some of the recommendations I gave my team for continuing to explore the project beyond what I could accomplish in just four weeks.
Consider how user preferences found from the end of the activity experience can be applied to the rest of the activity experience.
How can we make the entire activity cohesive?
Are there usability issues present there that were also present in this investigation?
Consider interviewing people who use Medscape's CME specifically and regularly.
If the designs are improved, do we see more regular users?
Are there needs that regular users have that first-time users don’t?
Evaluate user preferences between desktop and mobile CME.
Do users use our mobile CME? Is it more or less user-friendly than the desktop version?