Overview
MOT tests ensure that vehicles are checked at least once a year for compliance with roadworthiness and environmental standards. The service allows Nominated Testers (NT) to record MOT test results electronically, alongside additional services such as printing MOT certificates, buying test slots, checking a test station’s balance, and viewing testing performance.
The problem
When the online MOT testing service replaced the old VTS system in 2015, many of the MOT test requirements, functionality, and services were in an alpha / initial release state. With live analytical data and user testing recommendations now available, the service needed to be enhanced and iterated upon, with new features introduced to support all user tasks, meet system requirements, and align with the future business strategy.
Users and audience
The MOT testing service is used by Authorised Examiners, Authorised Examiner Designated Managers, Nominated Testers, and others at approved vehicle testing stations (VTS), as well as by DVSA staff who monitor and support the MOT testing community.
In addition, the electronic record created is:
- Checked when a vehicle is taxed
- Used by the police and certain enforcement agencies
- Made available publicly online, where it can be used for a variety of road safety related purposes, such as:
  - helping motorists make informed decisions when considering a car purchase
  - helping owners look after their vehicle
  - ensuring they get their vehicle tested on time
- Used by third parties:
  - to check the validity of vehicle mileages
  - as part of car insurance considerations
Roles and responsibilities
As a lead designer in the UX Service Design team, I delivered improvements to user journeys and interactions across the service, ensuring accessibility and inclusivity were at the forefront of each design. I also provided guidance on, and supported the development and maintenance of, the MOT Design System; in line with the Government Digital Service (GDS) design system, I introduced new components and patterns that were essential to our service requirements and user needs.
Examples of designs and workflows delivered:
- Training and certificates hub
- Performance dashboard and Test Quality Information
- Service reports
- Roadworthiness directive implementation, e.g. new rules on deficiencies
- Improving the ‘search for a defect’ feature
- Special notices and messages centre
- Concept designs of a future service
Service team:
- Data Analysts
- User Researchers
- Content Designer
- UX Designers
- UX Architect and Designer (me)
- Frontend Developers
- Backend Developers
- Business Analysts
- Product Owners
Process and what I did
Scoping
- Understanding business requirements
- Planning research activities and the time necessary to perform them
- Working with User Researchers to identify user goals and needs based on the known scenario(s) and context
- Identifying user touchpoints
Analysis
- Performing task analysis and creating task flows based on the scenarios identified
- Reviewing business and system requirements and aligning them with user goals and needs
- Identifying recurring issues, then categorising and prioritising fixes
Design
- Mapping workflows – whiteboards, sticky notes, digital diagrams
- Prototyping from low fidelity to high fidelity – paper, Illustrator, digital screen flows, coded prototypes
- Designing content, with guidance and reviews from Content Designers
- Documenting design decisions
Test and iteration cycle
- Preparing prototypes for user testing
- Supporting test planning and script writing
- Facilitating and supporting user testing sessions
- Reviewing feedback with the team
- Questioning issues to fully understand any problems
- Iterating on and updating designs
Delivery
- Writing stories with the BA and PO
- Supporting story refinement sessions during pointing activities
- Reviewing feature development prior to delivery
- Supporting product testing when required
Outcome and results
With rapid prototyping techniques in place, supported by the Design System, we were able to test functionality and workflows quickly and gain prompt user feedback. By identifying issues early, we had the opportunity to enhance, iterate, and test again prior to development.
Working this way, I successfully designed and improved many areas of the service, e.g. the ‘search for a defect’ feature, where incorrect defects were being selected. Together with researchers and analysts, we reviewed the data and spoke with users to understand the problem. Following several user testing sessions of my designs, we were able to release a design that met user needs and system requirements while improving test result accuracy. Once launched, we monitored its use to identify any patterns or issues that would help us enhance the solution further, improve defect selection accuracy, reduce user frustration, and make the service easier to use.