ROLE AND CONTRIBUTION
I led the visuals and technical documentation for a centralized, digital-first admission examination system at National University - Manila.
PROJECT BACKGROUND
To understand the problem space, I studied how digital admission examinations are used, and where they fall short.
Schools and universities rely on large volumes of data to identify qualified applicants, with entrance exams playing a critical role. As technology evolves, many institutions are adopting digital exams with features such as real-time scoring, question banks, and performance tracking.
However, managing and making sense of this data remains a significant challenge.
RESEARCH AND DISCOVERY
To validate the problem and define the system requirements, the team conducted stakeholder interviews, user surveys, and workflow analysis.
Our research confirmed that entrance exams generate valuable data beyond student selection: data that can measure applicant readiness, inform evaluation methods, and reveal how well each test question performs.
Interviews with the NU Manila Admissions Office revealed three major pain points:
Difficulty managing exams at scale
"Once the number of applicants increases, everything becomes harder to control."
Limited applicant performance reports
"We are able to see the final scores, but not how applicants actually performed at the exam."
Lack of question quality evaluation
"Once a question is used, we have no significant data to show if it actually worked."
A pre-survey with NU students validated the need for modernization: over 90% believed that a digital admission exam would help the university attract more applicants and better understand their performance.
These insights shaped NU AdEx into a system focused on reliability, data-driven evaluation, and seamless exam administration, supported by item banking, item analysis, three distinct user interfaces, and built-in recovery from network interruptions.
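The recovery behavior can be pictured as client-side checkpointing: the applicant's answers are saved after every action, so a dropped connection resumes from the last checkpoint rather than restarting the exam. This is a minimal sketch under assumed names (`ExamSession`, a dict-like durable store), not NU AdEx's actual implementation.

```python
import json
import time


class ExamSession:
    """Illustrative sketch: checkpoint an applicant's answers so a
    network interruption does not lose exam progress."""

    def __init__(self, applicant_id, storage):
        self.applicant_id = applicant_id
        self.storage = storage  # any dict-like durable store (hypothetical)
        self.answers = {}

    def record_answer(self, question_id, choice):
        # Save a checkpoint after every answer, not only at submission.
        self.answers[question_id] = choice
        self.checkpoint()

    def checkpoint(self):
        # Serialize state with a timestamp for later conflict resolution.
        self.storage[self.applicant_id] = json.dumps(
            {"answers": self.answers, "saved_at": time.time()}
        )

    @classmethod
    def resume(cls, applicant_id, storage):
        # Restore the last checkpoint after an interruption.
        session = cls(applicant_id, storage)
        if applicant_id in storage:
            session.answers = json.loads(storage[applicant_id])["answers"]
        return session
```

In practice the store would be a server-side database keyed by applicant, so the same pattern also covers a browser crash, not just a dropped connection.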
GOALS
The system is envisioned as a complete upgrade from the old pen-and-paper process, designed to make admissions smarter, faster, and more data-driven.
CORE ACTIVITIES
Brand Definition & Direction
I explored both direct and indirect brand references to understand the product’s identity and tone. I reviewed similar platforms, industry patterns, and existing touchpoints to make sure the direction felt familiar, credible, and still distinctive.
Experience Design & Prototyping
I turned insights into flows, wireframes, and interactive prototypes, focusing on making things easy to understand and use. I tested different layouts and interactions early on so I could spot issues quickly and improve the experience through iteration.
Project Documentation & Handoff
I led the team in writing a full thesis document highlighting the project’s research, system logic and flow, and outcomes. It served as both an academic requirement and a practical reference for implementation.
Feedback-Driven Iteration & Refinement
The team gathered feedback from stakeholders and users, paid attention to what worked and what didn’t, and made improvements based on real input. We treated the product as something that evolves over time, not something that’s ever truly finished.
FINAL DESIGNS
CHALLENGES AND SOLUTIONS
Following development and initial rollout, these challenges surfaced during real-world use and system evaluation.
Limited data visibility on the Dashboard
We redesigned the dashboard to surface the most relevant metrics upfront. By analyzing user needs, we prioritized key data points, introduced visual hierarchy with charts and summaries, and made the interface more interactive so admins could filter and drill down for deeper insights.
Exam Integrity at Risk
We integrated features like timed exam sections, question randomization, and restricted access to prevent cheating. Proctors could monitor attempts in real time, ensuring assessment integrity.
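The randomization described above can be sketched as a seeded shuffle: each applicant gets a unique but reproducible paper, so a proctor can reconstruct exactly what any applicant saw. The function name, salt, and data shape below are illustrative assumptions, not the production scheme.

```python
import random


def randomized_paper(question_bank, applicant_id, n_questions, salt="nu-adex"):
    """Sketch of per-applicant question randomization.

    Seeding the RNG from the applicant ID makes the paper deterministic
    per applicant (auditable) while differing between applicants.
    """
    rng = random.Random(f"{salt}:{applicant_id}")  # deterministic per applicant
    paper = rng.sample(question_bank, n_questions)  # pick without replacement
    shuffled = []
    for q in paper:
        choices = list(q["choices"])  # copy so the bank is never mutated
        rng.shuffle(choices)          # shuffle answer options as well
        shuffled.append({**q, "choices": choices})
    return shuffled
```

Because the seed is derived from the applicant ID, regenerating a paper for an integrity review requires no stored copy of the shuffled exam.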
WHAT WE DELIVERED
The final system delivered measurable improvements in efficiency, reliability, and data visibility across the admission examination process.
An intuitive digital exam user interface for students with clean layout, minimal distractions, and easy-to-use navigation.
An admin dashboard with a robust question bank and analytics / item-analysis tools, allowing data-driven evaluation of question effectiveness.
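Item analysis of the kind these tools provide is typically classical test theory: a difficulty index (proportion of applicants answering correctly) and a discrimination index (how much better top scorers do on the item than bottom scorers). The sketch below uses the standard upper/lower 27% formulation; it mirrors textbook psychometrics, not NU AdEx's exact code.

```python
def item_analysis(responses):
    """Classical item-analysis sketch.

    responses: list of dicts, one per applicant, mapping
    question id -> 1 (correct) or 0 (incorrect).
    Returns per-item difficulty and discrimination indices.
    """
    n = len(responses)
    # Rank applicants by total score to form upper/lower 27% groups.
    ranked = sorted(responses, key=lambda r: sum(r.values()), reverse=True)
    k = max(1, round(0.27 * n))
    upper, lower = ranked[:k], ranked[-k:]

    stats = {}
    for item in responses[0]:
        # Difficulty: proportion of all applicants who got the item right.
        p = sum(r[item] for r in responses) / n
        # Discrimination: upper-group minus lower-group correct rate.
        d = (sum(r[item] for r in upper) - sum(r[item] for r in lower)) / k
        stats[item] = {"difficulty": round(p, 2), "discrimination": round(d, 2)}
    return stats
```

Items with very high or very low difficulty, or near-zero discrimination, are the ones the admissions office had no way to flag under the paper-based process.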
A unified system replacing pen-and-paper, ensuring faster results, better data tracking, and easier exam management.
TAKEAWAYS
I learned how critical it is to align system design with real-world workflows, and how small interface decisions can significantly impact usability and data reliability.
WE'VE GOT MORE TO EXPLORE!
Next Project