
Career Readiness Report
As a Senior UX/UI Designer on this project, I led the end-to-end design process, ensuring that user needs, business objectives, and technical feasibility were seamlessly aligned. My responsibilities included:
- Driving user research to uncover pain points, optimize workflows, and enhance the overall user experience of all four company products.
- Collaborating closely with cross-functional teams, including stakeholders, product managers, the analytics team, and engineers, to ensure design decisions were strategically aligned.
- Crafting intuitive user flows, wireframes, and high-fidelity prototypes that effectively addressed user challenges and improved usability.
- Conducting usability testing and iterative refinements, validating design solutions through real user feedback before development.
- Overseeing the design-to-development handoff while maintaining consistency, accessibility, and a scalable design.
Overview:
SkillSurvey, in collaboration with the National Association of Colleges and Employers (NACE), developed the Career Readiness platform to provide students with data-driven feedback on their career preparedness. Designed for students completing internships, co-ops, or on-campus jobs, the platform enabled them to assess their professional skills and compare their competencies against a national benchmark of students. This project had three goals: first, to develop a second norm group based on student self-ratings to enhance the comparison insights within student reports; second, to redesign the main report to include the new norm group, improve the overall visualization, and add new functionality; and lastly, to introduce a new way for students to view the Career Readiness report. With this new design, we aimed to provide students and institutions with a more accurate and meaningful benchmark for evaluating their self-assessment results.
This improvement was designed to:
- Expand and break down the data so students could compare their results, increasing the relevance of their feedback.
- Improve benchmarking accuracy, ensuring that students are assessed within a more representative peer group.
- Enhance the value of self-assessments by contextualizing student performance alongside others who had completed the same process.
- Refine reporting metrics, helping institutions and students make better-informed decisions based on clearer comparative data.
Understanding the Product First
The SkillSurvey Career Readiness platform helped students and institutions evaluate and enhance workforce readiness by offering personalized feedback and benchmarking their skills against national standards.
How it Works:
- Students with internships complete a self-assessment, evaluating their proficiency in key professional skills.
- Supervisors provide evaluative feedback, offering real-world insights into students’ strengths and areas for improvement.
- Students, supervisors, and institutions receive a comprehensive report, allowing students to identify gaps in their skill set and track progress over time.
Key Features:
- Individual Career Readiness Reports – Personalized reports highlight strengths and growth areas based on NACE competencies.
- National Benchmarking – Students compare their skills against a national dataset, offering valuable context for professional development.
- Supervisor Feedback – Direct input from employers and supervisors helps students understand how their skills translate into the workplace.
- Competency-Based Assessment – The platform aligns with NACE’s eight core competencies, widely recognized as essential for career success.
NACE's Role with SkillSurvey
NACE, a leading authority in career services, played a crucial role in defining and validating the competencies that underpin career readiness. In collaboration with SkillSurvey, they developed this framework to ensure students graduate with marketable, real-world skills that align with employer expectations.
Understanding the Problem
Previously, the "How Do I Compare" section of the report relied on a norm group of 12,380 students, with ratings solely based on evaluator assessments. However, this approach presented a critical gap—students lacked meaningful benchmarking data, making it difficult for them to gauge their competencies against a relevant peer group.

The challenge was to introduce a new norm group that incorporated self-assessed students while ensuring the comparison model remained accurate, transparent, and user-friendly. This required:
- Defining a new norm group that included students who completed self-assessments but had not yet received evaluator feedback.
- Revising the comparison methodology to present meaningful insights while avoiding confusion between self-assessment and evaluator ratings.
- Maintaining clarity in the user interface, ensuring students could easily interpret their standing relative to both self-assessed peers and evaluator-rated students.
- Minimizing disruptions to existing user workflows, integrating the new norm group seamlessly without overcomplicating the experience.
By addressing these challenges, the goal was to enhance the value of student reports, allowing users to gain more relevant, actionable insights into their skills and competencies.
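The comparison math itself lived with the Analytics team, but the underlying idea is a percentile rank against whichever norm group applies. Here is a minimal sketch in Python, assuming a simple "percent of norm-group scores below the student's score" definition; the function name and scores are hypothetical, not the platform's actual implementation:

    from bisect import bisect_left

    def percentile_rank(student_score: float, norm_group_scores: list[float]) -> float:
        """Percent of norm-group scores that fall below the student's score."""
        ordered = sorted(norm_group_scores)
        below = bisect_left(ordered, student_score)  # count of scores < student_score
        return 100.0 * below / len(ordered)

    # Hypothetical: a student's averaged rating against a self-assessment norm group.
    self_assessed_norm = [3.1, 3.4, 3.8, 4.0, 4.2, 4.5, 4.6]
    print(f"{percentile_rank(4.1, self_assessed_norm):.0f}th percentile")

The same calculation serves both norm groups; only the score list behind the "How Do I Compare" columns changes, which is what made a clear label for the active group so important.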
Discussions with Stakeholders and Product Owners
To ensure I understood the Career Readiness report effectively, I conducted research through group discussions and interviews with stakeholders, product analysis, and a review of industry best practices. This phase allowed me to gain a deep understanding of user pain points and data visualization needs.
I collaborated with the Analytics team and Product Owners on the Career Readiness (CR) team to gain insights into:
- Data structure – How comparison data was stored, processed, and retrieved.
- Current challenges – Understanding technical constraints and identifying gaps in the existing comparison model.
- Success metrics – Defining key KPIs to measure improvement post-implementation.


Student User Workshop

As a new hire at the time, I needed an opportunity to understand our users, so I suggested we hold a workshop for our student users to surface their pain points. After conducting interviews and testing with 12 of our student users, I came to understand that they wanted a deeper dive into the survey data from their Intern Supervisors’ evaluations, and into how they compared to other students in their career path or norm group.
We decided to focus on the software industry, specifically students interested in software engineering, as it is a fast-growing field where information quickly becomes outdated and constant learning is required to keep up with emerging trends. I included all the areas that were a priority on the surveys and had each table discuss how they viewed and valued each category, along with their expectations of not only their supervisors but also their colleges and universities in helping break down their Career Readiness score.

Insights from Student Interviews
After the workshop with the students, a few clear frustrations stood out:
- Unrealistic job expectations: Students shared how discouraging it is to find “entry-level” jobs asking for years of experience. It’s a classic catch-22 — they need experience to get a job, but need a job to get experience.
- The education-to-industry gap: Many students felt unprepared for real-world work because what’s taught in college doesn’t always match what employers expect. Online courses and bootcamps help a little, but don’t completely close the gap.
- Recruiter knowledge and keyword hiring: Some students mentioned frustration with recruiters who don’t fully understand technical skills. A lot of hiring seems to come down to keyword matching, rather than truly evaluating their abilities.
- Messy, inconsistent application processes: Applying to different companies felt like running an obstacle course — every company had a different system, set of assessments, and expectations, making the process long and exhausting.
- Bias and culture fit challenges: Non-native speakers expressed feeling like they were judged unfairly based on culture or language. Plus, a few female students mentioned concerns about being hired just to meet diversity quotas rather than for their skills.

This column represents where the student falls based on their Evaluator ratings and the norm group with similar surveys.

This column represents the national average across all other students in the program and where the student falls based on their overall Career Readiness Level.
The Ideation Process
After gathering insights from our research, our team came together to clearly define the challenge we were solving.
We found that many career-minded students and professionals felt stuck. They were eager to grow but lacked a clear view of the skills they were missing, what steps they needed to take, and how to measure their progress. Without that guidance, reaching career goals felt more like guesswork than a clear path forward.

The Journey Mapping Process
From those sessions, we landed on four key features:
- Set a Goal – Users can choose a career path and define the level they’re aiming for, giving them a clear destination to work toward.
- Personalized Recommendations – The platform suggests a mix of courses, events, and more to help close any skill gaps.
- Internship Integration – Users can pull in their past activities from supervisors, saving them from manually tracking every detail.
- Progress Tracking – A visual tracker lets users see not only their results but also how they compare to their peers, how far they’ve come, and what’s left to reach their career goal.

Using Adobe XD, I created low-fidelity wireframes for the condensed report, using the colors in our Style Guide to expedite the design process since we were on an aggressive timeline. The goal was to explore how we could update the "How Do I Compare" section in a way that felt intuitive and easy for students to understand.


My focus was on usability and clean data visualization—I wanted students to quickly grasp how their scores stacked up without feeling overwhelmed by numbers or technical language.
One of the main challenges was figuring out how to introduce the new norm group—which was based on student self-assessments—while still preserving the original evaluator-based comparison. I explored different layout ideas that would keep the information clear and well-organized, without crowding the page.
Challenges I Ran Into
- The CR team stressed that students must be able to see how they rated themselves alongside how their Evaluators rated them; in the current mockup, those two columns appeared in different tabs.
- To solve this, the product team recommended a third tab, My Ratings, comparing three columns: What Employers Deem as Essential, How Your Evaluators Rated You, and How You Rated Yourself.

High-Fidelity Designs
I kicked things off by updating the condensed report, adding the "My Ratings" tab to the comparison columns to match the new logic behind the self-assessment norm group. When students were being compared to their peers instead of evaluators, we swapped the label “How Your Evaluators Rated You” for “How Other Students Rated Themselves.” It was a simple change, but it made a big difference—giving students clearer context so they could better understand what their results really meant.


This is Where I Proved My Value
I pulled everything together into a clickable prototype that was used in both internal design reviews and user testing. Knowing that we’d be supporting multiple comparison models, I introduced a color-coded key system. I made sure each norm group was visually differentiated with consistent use of color accents, which helped reduce confusion and set the foundation for a future toggle feature where students could actively switch between different benchmarking views.
Those earlier interviews and tests with our 12 student users had shown they wanted a deeper dive into the data behind their scores, so I made each section of the report interactive: clicking a category expands it to break down that score.

While adhering to the Branding Style Guide, and with the help of our Graphic Designer, we created icons to match the category titles. This helped me expedite the design process and fine-tune micro-interactions and language before handing the work off to development—ensuring a polished, user-centered experience all the way through.
Overall Numeric Ratings on the Side
The Overall Career Readiness Level reflects the average of a student’s evaluator ratings across all eight NACE competencies captured in the Career Readiness Survey. These competencies serve as a broad measure of how prepared a student is to successfully transition from college into the workforce. In short, it gives students a high-level snapshot of how their skills stack up when it comes to real-world expectations.
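Put another way, the level is just a mean over the eight competency scores. Here is a minimal sketch, assuming a dictionary of evaluator ratings keyed by competency; the competency names and rating scale are illustrative assumptions, not the platform's actual data model:

    # Illustrative only: competency names and the rating scale are assumptions.
    NACE_COMPETENCIES = [
        "Career & Self-Development", "Communication", "Critical Thinking",
        "Equity & Inclusion", "Leadership", "Professionalism",
        "Teamwork", "Technology",
    ]

    def overall_readiness_level(evaluator_ratings: dict[str, float]) -> float:
        """Average the evaluator rating across all eight NACE competencies."""
        return sum(evaluator_ratings[c] for c in NACE_COMPETENCIES) / len(NACE_COMPETENCIES)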
To bring the idea to life, I ran a few workshops and sketching sessions with the team, where we mapped out the user flow and built a rough storyboard to capture what the MVP should look like. The proposed solution was an interactive list of skills mapped to the student’s profession, specialization, and level.

Expanding Categories
For example, if a section such as 'Professionalism' is clicked, the entire section drops down to reveal details about that topic. The student can drill into each question asked and get a sense of what they need to work on to be "Career Ready."
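Behind that interaction is a simple nested structure: each competency section holds the individual survey questions and their ratings. A hypothetical Python sketch of the shape the expanded view reads from (names and fields are illustrative):

    from dataclasses import dataclass

    @dataclass
    class QuestionResult:
        text: str                 # the survey question shown in the expanded view
        evaluator_rating: float
        self_rating: float

    @dataclass
    class CompetencySection:
        title: str                       # e.g. "Professionalism"
        overall_score: float             # shown in the collapsed row
        questions: list[QuestionResult]  # revealed when the section expands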



In the next section, to make things easier to read and less overwhelming, I polished up the typography, spacing, and layout across the entire comparison section. The goal was to help students quickly understand where they stand without needing to decode tricky stats or dense text. Now, key insights like percentile rankings are easier to spot at a glance, even for those who aren't familiar with technical language.
Final High-Fidelity Designs
We successfully introduced a new feature with a scalable design into the platform without disrupting the existing experience. The updated “Career Readiness” report now includes:
- Clear labeling that helps users quickly see which norm group their results are being compared against.
- A refreshed column layout that makes side-by-side comparisons easier to read.
- A toggle-ready design that allows for switching between norm groups in the future without a full redesign.
- Helpful messaging that keeps students informed—like how long it might take for evaluator ratings to come in, or which comparison group is being shown.
This update not only made the comparison feature more useful for self-assessing students but also set the foundation for future flexibility.

Overall Numeric Ratings on the Side

After a review with the team, we cleaned up the design and made the title easier to read.

I made this section stand out by inverting the colors to differentiate its results from the Evaluator Ratings column above.
Expanding Categories
With the structure and logic of the experience mapped out, I moved into high-fidelity design to bring the concept to life with visual clarity and thoughtful interaction.
I started by updating the comparison columns to reflect the revised logic of the new norm group. When students view their results in each category, they can still see where they fall in the Overall Career Readiness column from both an Evaluator and a National Student average view. This consistent framing played a big role in ensuring that users always had the right context—helping them interpret their data in a more meaningful way.

Results and Takeaways
Joining a startup as a new hire came with a fast-paced learning curve, but it pushed me to grow in all the right ways. I learned to be scrappy, focused, and intentional with my time—especially while balancing full-time work. This project was more than just a design challenge; it was a crash course in working lean, staying focused, and building with purpose.
Here are a few key lessons I walked away with:
- Build smart, not big. Time and resources were tight, so identifying the core features that would bring the most value to users helped us stay focused and actually ship something impactful.
- Zoom out when you need to. I initially got caught up in polishing the visuals, but stepping back to reassess the user flow reminded me that good UX should always lead the way.
- Stay anchored to the problem. It’s easy to get swept up in details or timelines, but grounding every decision in the user’s needs kept the work meaningful and clear.
In the end, this experience taught me how to balance quality with speed, and how to keep the user at the center—even when the pace is fast and the pressure is high.
Executive Team Huddle After Successful Deployment
