Usability testing interactive prototypes gives designers profound insight into user experience early on, whilst saving time and money. In turn, this improves client trust and protects stakeholder funds and confidence in the project.
When delivering an app or website, we need to make sure every user journey flow is seamless. If we discover barriers to conversion or bugs in our design that we need to remedy further down the production line, it can be an extremely costly exercise. Our developers can end up spending weeks refactoring existing code, which is not only expensive to do but puts the project timeline and milestones in jeopardy.
Usability testing interactive prototypes shows us which areas we need to rework and which designs we need to update before any coding has taken place. By making changes during the design phase rather than in development, we save time and money whilst delivering the most user-friendly solution possible. By usability testing our client work, Appscore has proven time and again that this method is best practice for delivering digital solutions.
We recently performed an exemplary usability testing exercise while developing a mobile-responsive eCommerce platform for a leading beverage retailer. The usability test was on the mobile web portal rather than a desktop site, as we wanted to create a mobile-first experience.
The prototype design was created in line with industry standards and best practice. For example, the login screen and flow were inspired by the Google account experience, giving users an immediate sense of familiarity with the platform. However, we wanted to gauge the success of the prototype’s overall user experience: to understand how users would perform a variety of set actions and tasks, and to identify, through the usability tests, any changes we should make as early as possible.
The high-fidelity prototype we created in Adobe XD covered all the main user flows that users are taken through when engaging with an eCommerce platform. The design was linked up via hotspots, giving it the feel of a functional piece of software.
Test users could see where they could navigate, as buttons appeared in blue when they tapped the screen.
Our user base consisted of four participants spanning a range of ages and positions within the client’s customer base, from Secretary to Operations Manager. They possessed varied levels of expertise and competency with smartphones, and the differences in the participants’ online shopping habits were immediately apparent. We chose to engage four users because testing with more than four or five participants tends to surface the same issues again rather than new ones.
We gave participants 10 open-ended scenarios to complete (‘how would you do this?’ rather than ‘click here’). This allowed users to determine the navigational steps required first to reach the target page and then to complete the target action. These actions included signing up, changing a password, adding or removing a product from Favourites, and more. Once they had completed all the tasks, we asked them four open-ended questions to gather the best possible qualitative data.
We measured success by asking each user to score each question on a scale of 1 to 5. We collated all of the findings from the User Experience Survey and calculated an average score for each question, which helped us pinpoint the common problems with the design.
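As a hypothetical sketch of this scoring step (the question names and scores below are illustrative, not the actual survey data), averaging the 1–5 ratings per question and surfacing the weakest areas first might look like:

```python
# Illustrative example only: the questions and scores are made up,
# not taken from the real User Experience Survey.
survey_responses = {
    "Ease of navigation":     [4, 5, 3, 4],
    "Clarity of terminology": [5, 4, 4, 5],
    "Checkout flow":          [3, 2, 3, 4],
    "Overall satisfaction":   [4, 4, 5, 4],
}

# Average the four participants' scores for each question.
averages = {
    question: sum(scores) / len(scores)
    for question, scores in survey_responses.items()
}

# Sorting ascending puts the lowest-scoring (most problematic)
# areas of the design at the top of the list.
for question, avg in sorted(averages.items(), key=lambda item: item[1]):
    print(f"{question}: {avg:.2f}")
```

With only four participants this is simple enough to do by hand or in a spreadsheet; the point is that a single average per question makes the common problem areas immediately visible.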
The findings from the test were fairly consistent across the board. Most of the users were comfortable with the user registration process and understood the terminology throughout this flow, and most found the platform user-friendly and in line with standard eCommerce trends.
The key findings, which have since been actioned, include:
Users found that the Filter button on the catalogue page wasn’t easily identifiable and wasn’t in a prominent enough position for easy navigation. It floated at the bottom of the page rather than sitting in a fixed position, which proved less user-friendly than we had hoped. We analysed existing successful eCommerce site designs before updating the position of the button, moving it to a more recognisable spot where people are used to seeing it.
We A/B tested the login screen background image to define what users expect to see and what they’re most comfortable with when entering the platform. The client wanted to use a particular image that our UX team disagreed with, so we A/B tested a few options. All of the participants chose the image we had recommended, confirming our suggestion with real user feedback.
We took these results on board and, based on the findings, changed certain user flows and aspects of the design. Design and navigation changes are easily made during this early phase of the project, well before development begins, which is far cheaper than having to make them later down the line.
User experience is a vital component of our adoption strategy. We aim to ensure that the solutions we develop are based on the end-user’s needs rather than on what the client thinks they need. The client may think they need A, but what the end-user actually wants may be B. Objective usability tests can confirm what’s best and get the project moving in the right direction more quickly.
Years in the business have shown us that when it comes to journey flow, the end-user knows best. When we create websites and applications with end-user flows at the forefront of design, we deliver better solutions faster.
Follow our latest ventures and PoCs through regular updates on the Appscore blog!