SIDE PROJECT • 2026

Building a personal AI voice chat widget with my cloned voice

TIMELINE

2 weeks

ROLE

Sole Designer

TEAM

Cursor

Claude Opus 4.6

OVERVIEW

I added a voice widget so visitors could feel my presence in my portfolio.

I wanted my portfolio to feel different. I wanted it to actually talk back. So I built a voice AI widget: visitors click a little pill on the page, ask me anything by voice or text, and hear my answer spoken back to them in my actual cloned voice. Not a chatbot. More like... calling me. That's how it's designed to feel. It sounds like a lot. And it kind of is. But it also just worked, which still surprises me a little.

IDEA

Voice is more personal and human than a chat interface.

HOW IT WORKS

There's a Framer frontend, a Vercel backend, and four APIs all passing data in sequence.

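The write-up doesn't name the four APIs, so here's a rough sketch of how one voice turn could flow through the chain on the Vercel side. Every function, type, and stub below is a hypothetical stand-in for the data flow, not the actual implementation.

```typescript
// Hypothetical sketch of one voice turn: audio in, spoken reply out.
// Each stage is injected so the real services (not named in this
// write-up) could be swapped in; the stubs exist only to show the flow.

type Stage<In, Out> = (input: In) => Promise<Out>;

interface VoicePipeline {
  transcribe: Stage<ArrayBuffer, string>; // speech-to-text
  respond: Stage<string, string>;         // LLM answer, in "my" persona
  synthesize: Stage<string, ArrayBuffer>; // text-to-speech, cloned voice
}

async function handleVoiceTurn(
  pipeline: VoicePipeline,
  audioIn: ArrayBuffer,
): Promise<{ transcript: string; reply: string; audioOut: ArrayBuffer }> {
  // The "passing data in sequence" from the section above:
  const transcript = await pipeline.transcribe(audioIn);
  const reply = await pipeline.respond(transcript);
  const audioOut = await pipeline.synthesize(reply);
  return { transcript, reply, audioOut };
}

// Stubbed stages so the flow can be exercised without any API keys.
const demoPipeline: VoicePipeline = {
  transcribe: async () => "what did you build?",
  respond: async (q) => `You asked: ${q}`,
  synthesize: async (text) =>
    new TextEncoder().encode(text).buffer as ArrayBuffer,
};
```

Keeping each stage behind an interface like this is one way to swap providers without touching the relay logic.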

PROCESS

I designed every state in Figma, annotated them, and then built and refined the widget in Cursor.

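The write-up says every state was designed in Figma but doesn't list them, so the states below are assumed ones for a widget like this. A small transition map is one way to keep the built widget honest to the designed states, so the UI can never jump, say, straight from idle to speaking.

```typescript
// Hypothetical widget states; the actual set designed in Figma
// isn't listed in the write-up.
type WidgetState = "idle" | "listening" | "thinking" | "speaking" | "error";

// Allowed transitions between states.
const transitions: Record<WidgetState, WidgetState[]> = {
  idle: ["listening"],
  listening: ["thinking", "idle", "error"],
  thinking: ["speaking", "error"],
  speaking: ["idle", "error"],
  error: ["idle"],
};

// Move to a target state, rejecting anything not designed for.
function next(current: WidgetState, target: WidgetState): WidgetState {
  if (!transitions[current].includes(target)) {
    throw new Error(`invalid transition ${current} -> ${target}`);
  }
  return target;
}
```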

CONSTRAINTS

Navigating the inaccuracy of my voice clone

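The write-up doesn't say how the clone's inaccuracy was handled. One common mitigation, sketched here as an assumption rather than the actual fix, is sanitizing the LLM's text before it reaches text-to-speech, since cloned voices tend to stumble over markdown punctuation, code, and raw URLs.

```typescript
// Assumed mitigation, not confirmed by the write-up: strip the things
// a cloned voice reads badly before sending text to TTS.
function prepareForSpeech(text: string): string {
  return text
    .replace(/```[\s\S]*?```/g, " ")         // drop code blocks entirely
    .replace(/[*_#>`]/g, "")                 // strip markdown punctuation
    .replace(/\[([^\]]+)\]\([^)]*\)/g, "$1") // keep link text, drop the URL
    .replace(/\s+/g, " ")                    // collapse whitespace/newlines
    .trim();
}
```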

REFLECTION

Cloning yourself means you have to actually know yourself first

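One way the "know yourself first" work could take concrete shape, purely a sketch with hypothetical names, is as a structured persona that gets flattened into the LLM's system prompt: the facts the clone may claim, and the lines it shouldn't cross.

```typescript
// Hypothetical shape for the self-knowledge the clone runs on.
interface Persona {
  name: string;
  role: string;
  facts: string[];      // things the clone is allowed to claim about me
  boundaries: string[]; // topics it should deflect
}

// Flatten the persona into a system prompt for the LLM.
function buildSystemPrompt(p: Persona): string {
  return [
    `You are ${p.name}, a ${p.role}, speaking in the first person.`,
    `Facts about you: ${p.facts.join("; ")}.`,
    `If asked about anything outside these facts, or about: ` +
      `${p.boundaries.join("; ")}, say you'd rather chat about it in person.`,
  ].join("\n");
}

const me: Persona = {
  name: "Madhurima",
  role: "designer",
  facts: ["built a voice AI widget for my portfolio"],
  boundaries: ["anything private"],
};
```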


Madhurima
AI voice chat