My reflection

This project reinforced how to pay attention, not just to data, but to what people need, even when they don't say it directly. I discovered that I genuinely enjoy the analysis process 🗂 working through Braun & Clarke's six-phase framework and uncovering those easter eggs!
A personal lesson came when things didn't go as planned. Our recruitment pipeline stalled, and it was already close to finals week. I was tired, but I pushed and pivoted. And what surprised me most was how much I ended up enjoying it: the spontaneity, the little conversations, the unexpected curiosity from people I'd just met. It reminded me that research isn't just about collecting data; it's about connecting with people. Being present. Asking well. And also learning to enjoy the parts that don't always follow the script!
This project also became the foundation for my first academic paper, which I co-authored as co–first author. We've submitted it for review, and while I'm hopeful it gets accepted, I'm already proud, not just of the research but of the growth it represents ✨

🔍 Market & Competitive Research
So... what tools exist, what do they lack, and where is the opportunity?
📈 Understanding the Market Need:
The interview prep space is saturated with tools that support problem-solving, but not performance. Many platforms focus on solving coding problems (LeetCode, HackerRank) or offer peer-based simulations (Pramp, Interviewing.io), but few are designed to help students:
- Practice speaking out loud
- Build confidence over time
- Reduce the stress of live human feedback
This highlighted a broader market gap:
There was no tool offering verbal, repeatable, solo practice tailored to early-stage interviewees, especially college students preparing for internships.
📊 Visualizing the Competitive Landscape:
To clarify where this tool could stand out, I mapped current platforms across two key dimensions:
- Interaction Level (Code-only ←→ Conversational)
- Pressure Level (High-pressure ←→ Low-pressure)
Most tools clustered around code-only environments or peer-dependent simulations, offering limited opportunities for solo, verbal, low-pressure practice. Very few, if any, delivered a space where students could rehearse technical interviews by speaking out loud without fear of judgment or scheduling constraints.
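To make that mapping concrete, here is a minimal sketch of how the landscape could be plotted; the coordinates are illustrative assumptions for each platform on the two dimensions, not measured scores:

```python
# Minimal sketch of the 2D competitive map; positions are illustrative
# assumptions, not measured scores.
import matplotlib.pyplot as plt

# (interaction, pressure) on 0-1 scales:
# 0 = code-only / low-pressure, 1 = conversational / high-pressure.
platforms = {
    "LeetCode": (0.1, 0.3),
    "HackerRank": (0.2, 0.4),
    "Pramp": (0.7, 0.8),
    "Interviewing.io": (0.8, 0.9),
    "Our tool": (0.9, 0.2),  # conversational + low-pressure: the open quadrant
}

fig, ax = plt.subplots()
for name, (interaction, pressure) in platforms.items():
    ax.scatter(interaction, pressure)
    ax.annotate(name, (interaction, pressure),
                textcoords="offset points", xytext=(5, 5))

ax.set_xlabel("Interaction level (code-only → conversational)")
ax.set_ylabel("Pressure level (low → high)")
ax.set_title("Competitive landscape (illustrative positions)")
plt.show()
```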
Recommendations for Future Work
Following usability testing, we identified key product opportunities that directly address usability gaps, emotional friction, and prep-stage flexibility. These recommendations focus on improving clarity, confidence, and control.
Dual Modes: Practice & Exam
Let users toggle between two modes (a minimal config sketch follows this list):
- Practice Mode: guided, forgiving, repeatable
- Exam Mode: timed, high-fidelity, no hints
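As a rough illustration, the two modes could be expressed as a single session configuration; the field names and defaults here are hypothetical, not taken from the actual product:

```python
# Hypothetical session config for the two proposed modes; field names
# and defaults are illustrative, not from the actual product.
from dataclasses import dataclass

@dataclass
class SessionConfig:
    timed: bool            # enforce a hard time limit
    hints_enabled: bool    # allow the AI interviewer to offer nudges
    retries_allowed: bool  # let users re-attempt a question

PRACTICE_MODE = SessionConfig(timed=False, hints_enabled=True, retries_allowed=True)
EXAM_MODE = SessionConfig(timed=True, hints_enabled=False, retries_allowed=False)
```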
Built-In Code Compiler
Users strongly expected to write and test code during the session. An integrated compiler would bridge the gap between explanation and execution.
Smarter Flow & Turn-Taking Cues
Introduce subtle visual or audio pacing signals to guide when to speak or listen, reducing hesitation and confusion.
Difficulty Level Settings
Participants wanted more control over challenge intensity. Adding adjustable difficulty (e.g., Beginner, Intermediate, Advanced) helps align the tool with users’ confidence levels and technical depth.
Visual Anchors for Focus
Incorporate timers, progress bars, or question counts to ground users and reduce uncertainty during sessions.

📍Positioning the Product
The competitive gap revealed more than just a missing feature: it revealed a missing experience. While existing platforms support problem-solving or peer simulations, none serve the student who wants to practice performing under pressure, without the fear of being watched or judged.
This product was positioned to own that space by offering:
- High interaction through voice-based prompts
- No peer dependency, enabling solo, repeatable practice
- Emotional safety, reducing anxiety around performance
- A bridge between personal LeetCode drills and formal mock interviews
This wasn't about creating just another prep tool; it was about redefining what confidence-building could look like for students practicing for technical interviews.
Key Takeaways & Product Impact: The 7 Core Themes Identified
1. AI Felt Natural & Human-Like: This reduced intimidation and helped sessions feel more like actual interviews, without the social pressure.
2. Perceived Skill Improvement: Many students reported that the tool helped them practice speaking their thought process aloud, something they rarely get to rehearse, leading to increased self-awareness and pacing.
3. Realistic Practice, Not Just Simulation: The experience was perceived as close to a real technical interview, especially due to the conversational prompts and structured problem flow.
4. Confidence Boost Through Repetition: Several users shared that having a judgment-free, repeatable experience helped reduce anxiety and made them feel more prepared.
5. Issues with Pacing & Response Flow: Some participants struggled with the AI's timing, feeling rushed, delayed, or unsure when to respond, which disrupted focus.
6. Desire for Visual Support & Feedback: Users wanted progress indicators, time cues, or light guidance to manage expectations and stay grounded during the session.
7. Need for Personalization: Students wanted more control over difficulty, pacing, and AI tone, preferring a tailored experience that aligned with their interview stage.

🎯 Target Audience Research
In defining who the product should serve in its earliest iteration, I focused on identifying a specific, high-need user group that would benefit most from solo, verbal technical interview practice.
While the broader market for interview prep includes everyone from bootcamp graduates to senior engineers, our Alpha prototype needed to start with a more focused segment, one with clear friction and unmet needs.
📍 Immediate Reach:
To test early assumptions, I focused on upperclassmen in computer science at Drexel University, specifically juniors, seniors, and graduate students preparing for technical internship or entry-level roles. This group became our initial prototype testing audience.


Next, Recruiting Candidates...
While working under Dr. Tiffany Do, our original plan was to recruit through Drexel's internal system: mass emails sent by CCI admin staff. But like many institutional processes, it moved at a snail's pace 🐌. With just 2–3 weeks reserved for usability testing, we couldn't afford to wait.
So I pivoted 🧭 Plan B:
I went grassroots, walking through CS buildings, lounges, and labs with my elevator pitch ready. It almost felt like being a real estate agent cold door-knocking. No scripts, just energy, adaptability, and purpose!
It pushed me out of my comfort zone; I had to be proactive, clear, and quick on my feet. I even made a few new friends in the process 🤭
Within those two weeks, I personally recruited 11 students who matched our core user segment, and their feedback shaped everything that followed.

🎯 Usability Testing (Quantitative + Qualitative data)
With our participant pool locked in, I conducted a round of usability testing to evaluate how real students experienced the alpha prototype, not just in terms of surface-level functionality, but in terms of mental models, emotional friction, and interview behavior.
This wasn’t about catching system bugs.
It was about asking:
Does this tool feel like interview prep?
Does it support the way students think, speak, and react under pressure?
🧩 Pre-Screening Survey
Before testing, participants completed a survey capturing:
- Major + academic level
- Past technical interview experiences
- Goals & pain points
- Demographic context


🎙️ Live Session + Post-Interview
Participants engaged in a real-time AI-led technical interview simulation.
I followed with a structured interview exploring:
- What felt real, awkward, or helpful?
- How did it compare to peer mock interviews or LeetCode practice?
- Would you want to use something like this again?
🔍 What I Focused On
In addition to capturing survey responses and verbal feedback during the session, I observed:
- Behavioral signals – pausing, repetition, filler words
- Mental models – how they interpreted AI prompts, responded to questions, or moved through the task
- Emotional cues – visible nerves, moments of ease, confidence dips or post-session relief
These observations helped surface insights that go beyond usability, capturing how well the tool aligned with real student behavior and mindset during technical prep.
Analysis & Thematic Synthesis
From raw transcripts to research-backed insights
Once testing was complete, I collaborated with my research partner to analyze participant interviews using a grounded thematic analysis approach. We developed a shared qualitative codebook (clustering responses into themes) to capture recurring patterns across sessions.

Researching Confidence in Technical Interview Prep
Research Assistant @ Diva Lab, Drexel University CCI | Duration: Dec 2024 – Apr 2025
This project explores how AI can support students in practicing technical interviews in a more human, conversational way. Most platforms focus on solving problems, but not on articulating thinking under pressure. We designed a tool that simulates real technical interviews through an AI voice agent, giving students low-pressure, repeatable practice that feels real.
As a Research Assistant, I contributed across the research pipeline: from competitor analysis and survey design to participant recruitment, moderated testing, and codebook-driven analysis, culminating in a co-authored academic paper.
🗂 Using Braun & Clarke’s six-phase framework
We reviewed session transcripts, notes, and post-interview responses, tagging recurring behaviors, emotional responses, and feedback trends. Using a structured analysis framework (often referred to as a “codebook” in academic settings), we grouped these into clear, actionable insight categories.
In total, we:
- Identified 42 unique behavioral and emotional patterns
- Synthesized them into 7 core user themes that reflected consistent needs, frustrations, and moments of impact (a small sketch of this coding step follows the list)
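To illustrate the mechanics of that synthesis step, here is a minimal sketch of how coded excerpts could be rolled up into theme counts; the low-level codes and sample data are hypothetical stand-ins, not our actual codebook:

```python
# Minimal sketch: rolling coded excerpts up into theme counts.
# The low-level codes and sample data are hypothetical stand-ins,
# not the study's actual codebook.
from collections import Counter

# Each coded excerpt: (participant_id, code)
coded_excerpts = [
    ("P01", "felt_natural"),
    ("P01", "unsure_when_to_speak"),
    ("P02", "felt_natural"),
    ("P02", "wants_difficulty_levels"),
    ("P03", "less_anxious_on_repeat"),
]

# Codebook mapping: low-level code -> higher-level theme
code_to_theme = {
    "felt_natural": "AI Felt Natural & Human-Like",
    "unsure_when_to_speak": "Issues with Pacing & Response Flow",
    "wants_difficulty_levels": "Need for Personalization",
    "less_anxious_on_repeat": "Confidence Boost Through Repetition",
}

theme_counts = Counter(code_to_theme[code] for _, code in coded_excerpts)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} excerpt(s)")
```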
75% Said the AI Felt Realistic
Participants were surprised by how natural the AI interviewer sounded, especially in tone, pacing, and prompt delivery.
78% Reported a Confidence Boost
The judgment-free format helped reduce anxiety and made students feel more prepared to speak in real interviews.
80% Found the Tool Useful
The vast majority of participants described the experience as helpful, realistic, and aligned with how they want to prepare.
IMPACT
🧭 Understanding the Gap
While most platforms focus on problem-solving (LeetCode, HackerRank), few prepare students for thinking out loud, handling pressure, or verbalizing technical decisions.

