Assessment Platform

An end-to-end online assessment management platform built to help employers screen candidates at scale.

Duration:

2020 - 2021 (launched in Q3 2020)

Team Members:

1 Product Manager, 4 Engineers, 1 Sales Manager, 1 Product Designer (me)

Tasks:

Stakeholder interviews, user journey mapping, user stories, wireframing, prototyping, usability testing, visual design

Result:

85.2 SUS, 92% new feature adoption, 385% increase in pageviews

Overview

I designed the flows for assessment creation, question creation, assessment management, and the assessment-taking experience itself. This resulted in 92% new feature adoption, a 385% increase in pageviews, and an SUS score of 85.2.

This project focused on overhauling the experience of three user journeys: 1. Test creators manage existing assessments and create new assessments for test owners. 2. Clients manage assessment production, invitations, and results tracking. 3. Candidates receive the test invitation and complete the assessment. I explored mid-term and longer-term design solutions; we shipped the new platform in late 2020 and have gradually shipped new features ever since.

The latest interfaces of the assessment, the assessment builder, and the individual report.

Context

Our online assessment platform aims to help employers evaluate candidates scientifically at scale. It is fully integrated into our product pipeline and is used in conjunction with all c1 services; for example, a candidate who wants to participate in a Terminal competition must first complete a short assessment. Initially, the platform ran very manually: we handled all assessment management work internally, including assessment creation, invitation performance tracking, and talent resource delivery. This not only increased the internal workload but also failed to give customers a reliable experience, because most of the work was invisible to them. That was acceptable in the short term but not a scalable solution for long-term growth, so I continuously facilitated internal communication and explored a long-term, scalable solution with a cross-functional team.

Goal

As our business grew, more and more clients relied on our platform to evaluate candidates at scale. But the existing assessment platform wasn't designed to be customer-facing, so our end users were unclear on how to use it. We decided to overhaul the assessment platform from both the engineering and the design perspectives.

Target Users

I worked with the product manager to understand the target users and their most important tasks on the assessment platform. Below is a brief summary of the people we aimed to help.

Service mapping

By mapping out the holistic service landscape, we defined the flows that needed improvement (orange cards). This map helped align the team on the direction the design was heading.

User story #1

I am a test creator trying to deliver high-quality assessments to clients, but it's very difficult because I don't know how to create new assessments, add new questions, or manage my deliveries.


Solution 01

Straightforward assessment creation

A simple, intuitive assessment creation experience helps test creators build new tests faster and more easily. Even a beginner can create an assessment without reading lengthy instructions.

Assessment creation interface (Before)

Assessment creation interface (After)

Solution 02

Intuitive question creation

A consistent question creation experience lowers the cost of learning, and a live preview window on the right side gives creators immediate feedback.

Question creation interface (Before)

Question creation interface (After)

User story #2

I am a hiring manager (client) trying to evaluate our candidates at scale accurately and effectively, but the process isn't smooth because we spend too much time managing our assessments: reviewing assessments, sending invitations, and tracking performance at both the aggregate and the individual level.


Solution 01

Seamless review and comment experience

Delivering an assessment for review as a PDF or Doc can be very painful: it not only requires tons of manual work to compile all the questions, but is also hard to maintain (every change on the assessment platform has to be manually updated in the doc). I designed this in-platform reviewing experience (prototype), which simplified the workflow and reduced the cost of communication.

This feature is currently in development.

The Google Doc assessment draft delivery (Before)

Online assessment draft delivery (After)

Solution 02

Tracking assessment progress

Explicit counts of how many people have been invited to, started, and completed the assessment are very helpful for tracking progress, and an aggregated category view provides a high-level understanding of what each assessment tests for.

All assessments page (Before)

All assessments page (After)

Solution 03

Sending invitations in the platform

Making the invitation editor self-service and available to test owners simplified the workflow between owners and creators. Before, sending assessment invitations was challenging because lots of manual work happened offline and required heavy collaboration between test owners and creators. For example, test owners relied on creators to send invitations for them by delivering a list of recipients in an Excel sheet, which added unnecessary workload for test creators.

Solution 04

Interpreting data from all angles

Tracking performance and interpreting data at both the individual level and the assessment level helps test owners build a holistic view of their candidates' strengths and weaknesses, so I made sure clients can access the data without friction.

User story #3

I am a candidate trying to complete my online assessment, but I feel exhausted because it took me 3.5 hours without a break.


Solution 01

Allow candidates to take a break

Allowing candidates to take a break in the middle of the assessment provides a more pleasant experience, because some of our tests can run longer than usual (2-3 hours), and completing them in one sitting is unrealistic.

Solution 02

Content-first approach

A simple interface and a concise, linear user flow with metadata gathering are how we approached the content-first experience.

Assessment interface (Before)

Assessment interface (After)

Results

We successfully onboarded 3 new clients after launching the new platform, and 62% of existing clients are willing to switch to the new platform next year (old assessment data cannot be transferred).

Test creators

An SUS score of 85.2 for the test creation and question creation experience.

350% time saved on onboarding new test creators.

5.5 hours saved per week on producing new assessments.

Reduced time spent in client meetings from 3 hours to 0.5 hours per week on average.

Clients

385% increase in pageviews.

397% increase in time spent on the platform.

91% new feature adoption (invitation sender).

Weekly return users increased from 0.7 to 17.5.

Candidates

Fewer complaints about the assessment-taking experience.

(Numbers may exceed the industry average due to the limited data sample: 9 test creators in total.)