Language Proficiency Test

This project was designed for a platform that connects companies looking to hire with job seekers. The test is part of a premium service that assures employers that the language level stated in a candidate's profile is accurate.
Role
UX/UI Design, UX Research
Tools
Figma, Miro
The issue identified
After a set of user interviews conducted with recruiters and talent seekers, one issue came up repeatedly: candidates were reaching interview stages only for the recruiter to realize that the language level stated in their profile was higher than the level they actually possessed, and that they had applied to a job requiring a language level they clearly lacked. This wasted the companies' time and caused a lot of dissatisfaction among the platform's premium users.
The solution
To create a tool to measure the candidates' language level and show the result on their profile for companies to review and rule out candidates with lower language levels than the ones required for the role.
Research
User interviews
The first part of the research consisted of user interviews to identify the real issue. A recurring complaint about the service during the hiring process was that candidates would get all the way to the interview stages only for recruiters to realize that their language level was nowhere near the one stated in their profile. This wasted a lot of time on the recruiter's and the company's side. The platform promised to match companies with the ideal candidate, and it was doing its job well; the problem was that candidates were not being truthful.
Since these were premium clients, and the situation was causing them considerable inconvenience and wasted time, we set out to find a way to keep it from happening.
Tools for language verification
We first brainstormed ideas to keep users from lying in their profiles, but realized there was no way to prevent this outright. What we could do was require users already enrolled in a hiring process to verify their language level so that, if they had exaggerated it, the hiring company or recruiter would be aware and the candidate could be disqualified before reaching the interview stages.
ELSA Speak
We found an awesome conversation-practice tool whose technology was trained on audio from English speakers with many different accents, allowing it to understand, grade, and even correct grammar, pronunciation, and more. We decided to use this tool in the second iteration of this project.
GPT
Aside from speaking, we knew we had to test reading, writing, and listening in order to verify the language level accurately. Reading and listening posed no major issue: multiple-choice questions could be graded automatically to obtain the results. Writing was a different matter. We wanted the process to be as automated as possible, with no external party needed to check any part of the test, thus reducing verification times.
For writing, we developed a very long prompt for ChatGPT to grade eight aspects of the user's answer: grammar and syntax, vocabulary and word choice, punctuation and mechanics, organization and structure, content relevance, coherence and cohesion, register and style, and task fulfillment. Each aspect would be given a grade from 0 to 5 based on the parameters in the prompt. We tested this with over 50 answers, because results seemed to vary slightly, but the variation was not significant, so the prompt was adopted for checking the writing part of the test.
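As a rough illustration of how this kind of automated grading could work, here is a hypothetical sketch, not the team's actual prompt or code: the rubric aspects are assembled into a grading prompt, and the model's per-aspect scores (assumed to come back as JSON) are validated and averaged. The function names and reply format are assumptions.

```python
import json

# The eight rubric aspects, each graded 0-5. Aspect names come from the
# case study; everything else here is an illustrative assumption.
ASPECTS = [
    "grammar_and_syntax", "vocabulary_and_word_choice",
    "punctuation_and_mechanics", "organization_and_structure",
    "content_relevance", "coherence_and_cohesion",
    "register_and_style", "task_fulfillment",
]

def build_grading_prompt(answer: str) -> str:
    """Assemble the rubric prompt to send to the model."""
    rubric = "\n".join(f"- {a.replace('_', ' ')}: grade 0-5" for a in ASPECTS)
    return (
        "Grade the candidate's written answer on each aspect below.\n"
        f"{rubric}\n"
        "Reply with a JSON object mapping each aspect to an integer 0-5.\n\n"
        f"Answer:\n{answer}"
    )

def aggregate_scores(model_reply: str) -> float:
    """Parse the model's JSON reply and average the eight aspect scores."""
    scores = json.loads(model_reply)
    values = [scores[a] for a in ASPECTS]  # KeyError if an aspect is missing
    if any(not 0 <= v <= 5 for v in values):
        raise ValueError("aspect score out of range")
    return sum(values) / len(values)
```

Averaging is the simplest aggregation; a real rubric might weight aspects differently or map the mean onto a proficiency scale.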
Insights
After our first research phase, and a small brainstorming session, the team had many questions to consider:
Test placement
- Will the language test be a part of the application process to a job?
- Will it be placed as screening questions of a job?
- Will the user receive an invitation from the recruiter to answer the test?
- Will users have to manually enter the test to answer it? And if so, will they enter from the language section in their profile? From the 'enhance your profile' checklist?
Administering the test
- Will the test be answered by all job seekers?
- Will it be answered only by candidates already enrolled in a hiring process?
- Will it be answered only by candidates near the interview stages of their hiring process?
- Will candidates be manually invited by recruiters to answer the test once the company is interested in them?
- Will all the languages in the platform have a test?
Automatic validation?
- Is there a way to give users an automatic validation without them having to answer the test?
- If the user's past work experience was in a country/region that speaks the language, can we assume they speak it?
- If the user's location indicates they live in a country/region that speaks the language, can we assume they speak it?
- Would an automatic validation look the same as a regular validation (user takes the test)?
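To make the automatic-validation questions concrete, here is a hypothetical sketch of a location-based heuristic. The country-to-language mapping, the requirement that both location and work history agree, and the function name are all illustrative assumptions, not a platform decision.

```python
# Hypothetical heuristic: certify a language without a test only when both
# the user's current location AND at least one past job are in a country
# where that language is spoken. The mapping below is a tiny sample.
LANGUAGES_BY_COUNTRY = {
    "US": {"English"}, "MX": {"Spanish"}, "BR": {"Portuguese"},
    "CA": {"English", "French"},
}

def auto_validate(language: str, location_country: str,
                  work_countries: list[str]) -> bool:
    """True when location and work history both support the language."""
    spoken_at_home = language in LANGUAGES_BY_COUNTRY.get(location_country, set())
    spoken_at_work = any(
        language in LANGUAGES_BY_COUNTRY.get(c, set()) for c in work_countries
    )
    return spoken_at_home and spoken_at_work
```

Requiring both signals keeps the heuristic conservative; relaxing it to either signal would certify more users at the cost of more false positives.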
Profile status
- Will users obtain a sort of badge on their profile if their language has been validated?
- Will the language level of users who scored lower than what they originally stated in their profile be automatically lowered?
- Will we prevent users who scored lower than what's stated in their profile from manually increasing their level in their profile?
- Will having a validated language help users rank higher in job opportunities?
The test itself
- Is it necessary to include all 4 aspects of language in the test?
- Should the time the user took to answer count toward the final score?
- What will happen if a user leaves mid-test? Would we let them continue or would they have to start over? If they continue, would we show a different question to avoid cheating?
- Is there a way to keep users from switching windows, to prevent cheating?
- What is the minimum number of questions necessary per exercise?

Ideas and sketches
With all the information gathered about testing, effective questions, the recommended number of answer options, section division, required elements, and examples of language certifications and online awards, we went to the drawing board to decide where all the important elements would go and what the final layout would look like.
We had decided that all sections of the test would include a timer so the user would know how much time had passed. We would create at least three sections for now: Listening, Reading, and Writing, along with an introduction screen and the language certification, which could be shared with peers or on social media.

Where would the test be placed?
The next, and very important, question to tackle was: where would users access the test? We had already established that the test did not need to be too long, and we wanted to link it to something job seekers were already required to do, to increase the completion rate.
- The first thought was to place it in the middle of the application process. This idea was explored deeply but ultimately abandoned, because the application process already consisted of many other steps to ensure that candidates would not reach later stages without being qualified for the job.
- Another idea was to let users access the test from the language section of their profile. That would give all users access (everyone has a profile, not only job seekers) and would greatly increase costs, since the test relied on GPT, which was charged per use. Still, we decided to implement this as well, so users would have a way to reach the test other than answering screening questions during a job application.
- The last idea we considered before reaching the chosen solution was to have recruiters send the test link directly to candidates they were interested in. This was also abandoned, since it would add manual work for recruiters, who hired the platform precisely to automate their hiring process and save time.
- The solution was to add the three sections of the test to the screening questions required to apply to a job. It was not exactly inside the application process, but still part of it. Since not all users end up answering screening questions, this would reduce the number of test takers, and job seekers would not feel they had entered a language test; they would simply be answering more questions within the screening flow they were already completing. We tested this and other ideas with UX research tools and testers, and confirmed that this was the best solution.
Meet our users

What is the job to be done here?
To hire an employee that possesses the language level required for the role.
Who is our main user?
Recruiters and Human Resources managers who are among the platform's premium users.
Most premium users belong to medium to large companies in Latin America and the United States, and their premium subscription is paid for by the company itself. These companies benefit from a premium recruitment service because they are constantly looking to fill different roles, and the hiring process is usually very time-consuming for them.
Wireframing
Knowing all the elements we would have to include in each section, we created a few layout proposals.
All the test sections would include a timer, a progress bar, a title, instructions, the actual exercises, and the bar with the continue button.
The certification screen had to include the user's name, the level obtained, the language, the bars to show the level of each skill, information about how the results were obtained, the date the test was taken, and a way to share the certification with contacts and social media.
We also created a screen with a very brief introduction, in case users needed to access the test from somewhere other than its assigned place, the screening questions of a job opening. It would include only a short explanation of the test and the time it would take to complete, giving users a moment to prepare.
We created a few proposals for layouts and these were the approved ones.


Style guide
We selected the elements from the company's design system that would be used in the test.
Since it was decided that the design would contain only the most necessary elements to simplify the process, these were the components we chose for the test and the certification screen.
The End Product
Users can access the language certification test as they are answering screening questions as part of their job application, or by clicking on an uncertified language in the language section of their profile.
The test will take them through the listening exercise, where they will have 2 opportunities to listen to the audio and will then answer 6 questions: 2 of beginner difficulty, 2 of medium, and 2 that only advanced speakers can answer. Next, they will be asked to read a paragraph and answer 6 questions with the same difficulty distribution.
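One way the 2/2/2 difficulty split could map to a per-section level estimate is sketched below. The thresholds and level names are purely illustrative assumptions, not the platform's actual scoring rules.

```python
# Hypothetical level mapping for a 6-question section with 2 questions at
# each of three difficulties. `correct` maps difficulty -> number right (0-2).
def section_level(correct: dict[str, int]) -> str:
    """Estimate a coarse level from per-difficulty correct counts."""
    if correct["advanced"] == 2 and correct["medium"] == 2:
        return "advanced"
    if correct["medium"] >= 1 and correct["beginner"] == 2:
        return "intermediate"
    return "beginner"
```

A production version would likely use a finer scale (e.g. CEFR bands) and combine the listening, reading, and writing sections into one overall level.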
The third step will be the writing exercise. Users will then be asked to wait for a notification when their score is ready; this gives the GPT scoring step buffer time in case the system is down. Users will then receive an email and a push notification to open their certification.
The certification will have an option to be shared, and other users will be able to take the certification test from the shared screen.
If the results from the test are different from the level the user had stated in their profile, the profile level will be automatically adjusted to match the results.
Users will be allowed to take the test twice in any 6-month period. Each section of the test will contain at least 6 variations so that a user retaking the test does not see the same exercises. After 6 months, the user will be allowed to take the test a third time.
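The retake limit described above could be enforced with a simple sliding-window check. This is a hypothetical sketch: the 182-day window approximation and the function name are assumptions, not the platform's implementation.

```python
from datetime import datetime, timedelta

MAX_ATTEMPTS = 2            # attempts allowed inside the window
WINDOW = timedelta(days=182)  # roughly 6 months

def can_take_test(past_attempts: list[datetime], now: datetime) -> bool:
    """Allow a new attempt if fewer than MAX_ATTEMPTS fall in the last window."""
    recent = [t for t in past_attempts if now - t < WINDOW]
    return len(recent) < MAX_ATTEMPTS
```

Because old attempts simply age out of the window, a user who exhausted both attempts automatically becomes eligible again about 6 months after the earlier one.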
A language certification will allow recruiters to be more confident that the person they are about to interview really possesses the language level they are stating in their profile, without the recruiter having to add any extra steps to the process from their side.



Final prototype

This is a prototype of the language certification test accessed from the languages section of the user's profile. If a user is not automatically certified through their past experience and location, they will have to take all 3 sections of the test (reading, writing, and listening), wait a few minutes for GPT to score their written answers, and then receive an email and push notification when their results are ready.
Users will be able to share their certification with others, who can then open it from the person's post and take the test to obtain their own certification.