A voice-interactive web UI built with Next.js and ElevenLabs that enables real-time conversations with an ElevenLabs agent. Users can start and stop a session and see whether the agent is speaking or listening through animated button feedback.
- 🔗 GitHub: github.com/aimaster-dev/call-with-ai-agent
- 🌐 Live Demo: call-with-ai-agent.vercel.app
- 🎙️ Voice input using browser microphone access
- 🧠 Agent session control using ElevenLabs' SDK
- 💬 Live status updates for speaking vs listening
- ✨ Animated visual feedback via Tailwind CSS
- 🔐 Environment-based agent ID management
```
.
├── public/
├── src/
│   └── app/
│       ├── components/
│       │   └── conversation.tsx   # Core conversation logic and UI
│       ├── globals.css
│       ├── layout.tsx
│       └── page.tsx               # Root page using <Conversation />
├── .gitignore
├── next.config.ts
├── package.json
├── tsconfig.json
└── README.md
```
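For reference, here is a minimal sketch of how `conversation.tsx` might wire the UI to the ElevenLabs SDK. The hook options, `startSession`/`endSession`, `status`, and `isSpeaking` follow the documented `@11labs/react` API; the component structure and Tailwind classes are illustrative assumptions, not the project's exact code.

```tsx
'use client';

import { useCallback } from 'react';
import { useConversation } from '@11labs/react';

// Illustrative sketch only -- the real conversation.tsx may differ.
export function Conversation() {
  const conversation = useConversation({
    onConnect: () => console.log('Connected'),
    onDisconnect: () => console.log('Disconnected'),
    onError: (error) => console.error('Error:', error),
  });

  const startConversation = useCallback(async () => {
    // Request microphone access before opening the session.
    await navigator.mediaDevices.getUserMedia({ audio: true });
    await conversation.startSession({
      agentId: process.env.NEXT_PUBLIC_AGENT_ID!, // set in .env.local
    });
  }, [conversation]);

  const stopConversation = useCallback(async () => {
    await conversation.endSession();
  }, [conversation]);

  // Tailwind's animate-pulse supplies the "speaking" visual feedback.
  const speakingClass = conversation.isSpeaking ? 'animate-pulse' : '';

  return (
    <div className="flex flex-col items-center gap-4">
      <button
        onClick={startConversation}
        disabled={conversation.status === 'connected'}
        className={speakingClass}
      >
        Start Conversation
      </button>
      <button
        onClick={stopConversation}
        disabled={conversation.status !== 'connected'}
      >
        Stop Conversation
      </button>
      <p>
        {conversation.status === 'connected'
          ? conversation.isSpeaking
            ? 'Agent is speaking'
            : 'Agent is listening'
          : 'Disconnected'}
      </p>
    </div>
  );
}
```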
Clone the repository and install dependencies:

```bash
git clone https://github.com/aimaster-dev/call-with-ai-agent.git
cd call-with-ai-agent
npm install
```
Create a `.env.local` file at the project root:

```env
NEXT_PUBLIC_AGENT_ID=your_elevenlabs_agent_id_here
```
✅ The variable name must begin with `NEXT_PUBLIC_` to be available on the client side.
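Next.js inlines `NEXT_PUBLIC_`-prefixed variables into the client bundle at build time, so the component can read the ID directly. A small, hypothetical guard (not necessarily present in this repo) makes a missing value fail loudly:

```ts
// Read the agent ID that Next.js inlined at build time.
const agentId = process.env.NEXT_PUBLIC_AGENT_ID;
if (!agentId) {
  throw new Error('NEXT_PUBLIC_AGENT_ID is missing; add it to .env.local');
}
```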
Start the development server:

```bash
npm run dev
```

Open http://localhost:3000 to view the app in your browser.
This project is deployed on Vercel. To configure the agent ID there:
- Go to your project on Vercel
- Navigate to Settings → Environment Variables
- Add a new variable:
  - Name: `NEXT_PUBLIC_AGENT_ID`
  - Value: your agent ID
  - Environment: All (Production, Preview, Development)
- Redeploy the project.
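Alternatively, the same can be done from the terminal with the Vercel CLI, assuming it is installed and the project is linked (a sketch; the CLI prompts for each value):

```bash
# Add the variable to each environment; the CLI prompts for the value.
vercel env add NEXT_PUBLIC_AGENT_ID production
vercel env add NEXT_PUBLIC_AGENT_ID preview
vercel env add NEXT_PUBLIC_AGENT_ID development

# Redeploy so the new value is baked into the client bundle.
vercel --prod
```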
- Next.js 15+
- TypeScript
- Tailwind CSS
- @11labs/react
- Vercel (for hosting)
ElevenLabs — Realistic voice AI for conversational experiences.
MIT — free to use, modify, and distribute.