Electrical & Computer Engineering • Neuroscience
I'm a 4.0 GPA student double majoring in Electrical & Computer Engineering and Neuroscience, passionate about AI, neurotechnology, and neural prosthetics that bridge the gap between biological and engineered systems.
How can we build intelligent systems that understand and improve human life?
I love building hardware and software, but I'm most excited when engineering meets biology. Whether that's neural prosthetics, brain-computer interfaces, or AI-driven neurotech, I'm drawn to problems where circuits and computation interact with living systems.
The brain is the most sophisticated computing system we know. Neural prosthetics translate thought into action. AI enables adaptive, responsive medical technology.
I'm fascinated by the intersection of these fields.
My goal is to help design technologies that restore function, enhance capability, and genuinely improve quality of life.
I approach problems analytically, but I build with intention. I care about clean architecture, thoughtful design, and meaningful impact.
Being immersed in both engineering and neuroscience has shaped how I think: systems-first, interdisciplinary, and always curious.
I've led fundraising initiatives, collaborated on community projects, and taught coding and STEM to students across different age groups. Teaching has strengthened my ability to break down complex systems and build them better.
Projects in robotics, AI, and intelligent systems
Designed a research-grade autonomous mobile robot that performs real-time environment mapping, path planning, and adaptive obstacle avoidance using sensor fusion, embedded control, and algorithmic decision-making.
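To give a feel for the path-planning stage, here is a minimal sketch of a breadth-first planner on an occupancy grid. The grid layout and the function name `plan_path` are illustrative assumptions for this example, not the robot's actual implementation.

```python
# Illustrative sketch only: breadth-first path planning on an
# occupancy grid (0 = free cell, 1 = obstacle), standing in for
# the robot's path-planning stage.
from collections import deque

def plan_path(grid, start, goal):
    """Shortest 4-connected path from start to goal, or None if blocked."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # cell -> predecessor, for path reconstruction
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]  # reverse: start -> goal
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable
```

In the real system this sits downstream of mapping: sensor fusion builds the occupancy grid, and the planner runs each time the map updates.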
An AI assistant that reduces decision fatigue by optimizing your daily schedule. Transcribes and structures your day, analyzes photos (gym exercises for time estimates, skincare products for compatibility and timing), tracks task completion history, and builds reports on productivity patterns. In short, it helps you get through your day without overthinking every small decision.
Building a behavioral ML system that infers human intent from motion and proximity patterns alone: no camera, no face detection. The robot learns to distinguish someone approaching from someone passing by, intentional blocking from accidental, and predicts future motion trajectories.
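A toy sketch of what proximity-only intent inference can look like, assuming a sonar-style sensor that returns a short trace of distance readings. The feature names and thresholds here are illustrative assumptions, not the project's actual model, which learns these patterns rather than hard-coding them.

```python
# Hypothetical sketch: inferring intent from proximity readings alone.
# Threshold values are illustrative assumptions, not learned parameters.
from statistics import mean

def range_rate(distances, dt=0.1):
    """Average change in distance per second across a trace of readings."""
    diffs = [b - a for a, b in zip(distances, distances[1:])]
    return mean(diffs) / dt if diffs else 0.0

def classify_intent(distances, approach_thresh=-0.3):
    """Label a proximity trace as 'approaching' or 'passing'.

    A steadily shrinking distance (strongly negative range rate, minimum
    at the end of the trace) suggests approach; a dip-then-rise pattern
    (minimum in the middle) suggests the person is passing by.
    """
    rate = range_rate(distances)
    dipped_then_rose = min(distances) not in (distances[0], distances[-1])
    if rate < approach_thresh and not dipped_then_rose:
        return "approaching"
    return "passing"
```

The point of the real system is that a learned model replaces these hand-written rules, picking up subtler cues (hesitation, blocking) from the same camera-free signals.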
Designed a Python-based automation framework to simulate user interactions and system workflows, inspired by high-traffic streaming platforms. Built logging and telemetry pipelines to capture performance metrics, error states, and system behavior during automated test runs. Visualized system performance data using Matplotlib to identify bottlenecks and reliability issues.
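The telemetry idea can be sketched in a few lines: wrap each simulated workflow step in a timing decorator, log its duration, and mine the collected records for bottlenecks. Step names like `login` and the helper `slowest_step` are assumptions for this example, not the framework's actual API.

```python
# Minimal sketch of a timing/telemetry pipeline for simulated workflows.
# Step names and helpers are illustrative, not the framework's real API.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("telemetry")

metrics = []  # (step_name, seconds) records collected per run

def timed(step_name):
    """Decorator that logs and records each step's wall-clock duration."""
    def wrap(fn):
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                elapsed = time.perf_counter() - start
                metrics.append((step_name, elapsed))
                log.info("%s took %.4fs", step_name, elapsed)
        return inner
    return wrap

@timed("login")
def login():
    time.sleep(0.01)  # stand-in for a simulated user action

@timed("stream_start")
def stream_start():
    time.sleep(0.03)  # stand-in for a slower workflow step

def slowest_step():
    """Identify the bottleneck among all recorded steps."""
    return max(metrics, key=lambda m: m[1])[0]
```

In the full framework, the same records feed a Matplotlib step-duration chart, which is how bottlenecks and reliability issues show up visually.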
Leadership and community engagement
Working with Robot Operating System 2 to develop autonomous systems and collaborative robotics projects
Developing cloud-integrated robotics solutions using AWS infrastructure and services
Founded and lead the CS Association, building a community for computer science students
Organizing professional development events and empowering women in technology (@waynewomenintech)
Leading fundraising initiatives and fostering cultural exchange within the community (@waynestateisa)
Interested in neurotech, AI in healthcare, neural prosthetics, or robotics? Let's connect.