
A collection of Swift voice components.
A SwiftUI view that visualizes audio levels and agent state as an animated Orb. Pass `AudioTrack` instances for real-time visualization, an optional `agentState` to reflect status, and two `Color` values to customize the Orb.
- `inputTrack`: Optional `AudioTrack` used to visualize microphone/input levels.
- `outputTrack`: Optional `AudioTrack` used to visualize agent/output levels.
- `agentState`: `AgentState` controlling visual states (e.g. `.listening`, `.thinking`, `.speaking`).
- `colors`: Tuple of two `Color` values to customize the orb gradient.
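As a rough sketch, the parameters above might be wired up like this. The exact `OrbVisualizer` initializer may differ; `ConversationOrb`, the gradient colors, and the frame size are illustrative:

```swift
import SwiftUI
import ElevenLabsComponents

// Illustrative wrapper view; in a real app the tracks and agent state
// would come from an active ElevenLabs conversation session.
struct ConversationOrb: View {
    let inputTrack: AudioTrack?   // microphone/input levels
    let outputTrack: AudioTrack?  // agent/output levels
    let agentState: AgentState    // .listening, .thinking, .speaking

    var body: some View {
        OrbVisualizer(
            inputTrack: inputTrack,
            outputTrack: outputTrack,
            agentState: agentState,
            colors: (.purple, .blue) // any two gradient colors
        )
        .frame(width: 120, height: 120) // size chosen for illustration
    }
}
```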
Conversational AI docs and guides: https://elevenlabs.io/docs/conversational-ai/overview
See our example app to learn how to build cross-platform Apple voice experiences.
You can add ElevenLabsComponents to your project using Swift Package Manager.
In Xcode:
- Go to File > Add Packages...
- Enter the repository URL: `https://github.com/elevenlabs/components-swift`
- Select the `main` branch or a version, and add the `ElevenLabsComponents` library to your target.
Or add to your `Package.swift`:
```swift
dependencies: [
    .package(url: "https://github.com/elevenlabs/components-swift.git", from: "0.1.3")
]
```
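For context, a full manifest might look like the sketch below. `MyVoiceApp` and the platform versions are placeholders; the product name `ElevenLabsComponents` and the package URL come from the steps above:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyVoiceApp",
    // Platform versions here are illustrative, not the package's stated minimums.
    platforms: [.iOS(.v16), .macOS(.v13)],
    dependencies: [
        .package(url: "https://github.com/elevenlabs/components-swift.git", from: "0.1.3")
    ],
    targets: [
        .target(
            name: "MyVoiceApp",
            dependencies: [
                // Package name is derived from the repository URL.
                .product(name: "ElevenLabsComponents", package: "components-swift")
            ]
        )
    ]
)
```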
This package provides a set of SwiftUI components for building real-time voice experiences with ElevenLabs Conversational AI, including an `OrbVisualizer`.
This project extends the LiveKit components-swift codebase under the same permissive license, with modifications tailored for ElevenLabs voice experiences. We are grateful for their foundational work, which has enabled further innovation in this space.