I bet everyone's used ChatGPT or Gemini by now. Have you noticed how you interact with them in voice mode? Those pulsating motions when you speak and when they respond?
It's amazing real-time animation. It uses the audio signal's amplitude and frequency data to drive the highs and lows of that pulse effect.
I've seen this incredible sphere from Google's portfolio—the sphere feels real and like it's talking to me. I wondered: how is it even possible? Is it because of the computing power we now have on our chips?
Animation by Google Portfolio, Deepseek and ChatGPT
The Real-Time Revolution
You know how smooth those AI animations look? That smoothness is actually tricky to achieve. How animation usually works is you need software like After Effects or DaVinci Resolve to lay a canvas for your animation. Then you attach all your assets and animate using characteristics like scale, transparency, position, and color. To make it move, you stamp the times where you want those characteristics to change (the keyframes), and the software fills in the frames between them; animators call this inbetweening.
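The keyframe workflow above can be sketched in a few lines. This is a simplified illustration, not how any particular tool implements it: each keyframe stamps a time and a value, and the "inbetween" frames are linearly interpolated.

```javascript
// Keyframes stamp a time and a value for an animated property (here, a radius).
const keyframes = [
  { time: 0, radius: 50 },
  { time: 1, radius: 100 },
  { time: 2, radius: 50 },
];

// Return the interpolated value at an arbitrary time t ("inbetweening").
function valueAt(keyframes, t) {
  if (t <= keyframes[0].time) return keyframes[0].radius;
  const last = keyframes[keyframes.length - 1];
  if (t >= last.time) return last.radius;
  for (let i = 0; i < keyframes.length - 1; i++) {
    const a = keyframes[i];
    const b = keyframes[i + 1];
    if (t >= a.time && t <= b.time) {
      const p = (t - a.time) / (b.time - a.time); // progress between the two keyframes
      return a.radius + (b.radius - a.radius) * p;
    }
  }
}

console.log(valueAt(keyframes, 0.5)); // 75, an "inbetween" frame
```

The key point: every frame is planned before playback. The AI interfaces below do the opposite.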
But these AI interfaces skip all that pre-planning. They're generating animation in real-time, responding instantly to your voice without any pre-rendered frames. I keep wondering how it's done. I believe coding and art are behind this, working harmoniously together.
My Experiment
This got me thinking, so I tried it myself to see how it's done. Let's borrow the idea from what ChatGPT is doing and use the user's voice frequency to drive our animation. I created a circle using HTML and animated the circle's radius with the user's voice frequency data.
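Here's a minimal sketch of that experiment. It assumes a `<canvas id="viz">` in the page (the id and the radius numbers are my placeholders). The Web Audio `AnalyserNode` exposes the microphone's frequency data, and we re-read it every animation frame instead of planning keyframes ahead of time:

```javascript
// Map the average loudness of the frequency bins (each 0–255) to a circle radius.
function radiusFrom(frequencyData, baseRadius, maxBoost) {
  const avg = frequencyData.reduce((sum, v) => sum + v, 0) / frequencyData.length;
  return baseRadius + (avg / 255) * maxBoost;
}

async function start() {
  // Ask for the microphone and feed it into an analyser node.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const audioCtx = new AudioContext();
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 256;
  audioCtx.createMediaStreamSource(stream).connect(analyser);

  const canvas = document.getElementById("viz");
  const draw = canvas.getContext("2d");
  const data = new Uint8Array(analyser.frequencyBinCount);

  (function frame() {
    analyser.getByteFrequencyData(data); // sample the mic right now
    const r = radiusFrom(data, 40, 60);  // louder voice → bigger circle
    draw.clearRect(0, 0, canvas.width, canvas.height);
    draw.beginPath();
    draw.arc(canvas.width / 2, canvas.height / 2, r, 0, Math.PI * 2);
    draw.fill();
    requestAnimationFrame(frame);        // re-render every frame, no pre-rendered frames
  })();
}

// Only start in a browser; the pure radiusFrom helper works anywhere.
if (typeof window !== "undefined") start();
```

No timeline, no keyframes: the animation is computed from live data, sixty times a second.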
Let's push it a little further. Let's transcribe the user's voice and echo it on our sphere. Using the Web Speech API, which most browsers support to some degree, let's see if it works.
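A sketch of the transcription step, assuming the circle from before plus a `<p id="caption">` element under it (that id is my placeholder). Chrome exposes the recognizer as `webkitSpeechRecognition`; support varies across browsers:

```javascript
// Stream speech-to-text from the microphone and hand each transcript chunk to a callback.
function startTranscription(onText) {
  const Recognition = window.SpeechRecognition || window.webkitSpeechRecognition;
  const recognizer = new Recognition();
  recognizer.continuous = true;     // keep listening instead of stopping after one phrase
  recognizer.interimResults = true; // stream partial results to keep the delay low
  recognizer.onresult = (event) => {
    const latest = event.results[event.results.length - 1];
    onText(latest[0].transcript);   // best-guess transcript of the newest chunk
  };
  recognizer.start();
}

if (typeof window !== "undefined") {
  startTranscription((text) => {
    // Echo the user's words under the pulsing sphere.
    document.getElementById("caption").textContent = text;
  });
}
```

Interim results are what make it feel live; waiting for final results adds a noticeable pause.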
It worked! There's a slight delay, but it works, and I didn't need any JS library to achieve this interactive animation.
But Here's Where It Gets Interesting
It's amazing what the modern web can do. Imagine if we connect this with a reasoning AI LLM like OpenAI. It interacts, and through the animation, you can feel that it's listening to you and can reason with you. You could program it to sense the user's emotion and change color—if it senses you're sad, it changes to blue; if you're happy, it changes to yellow or red.
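That emotion-to-color idea could be wired up as a simple lookup once something (say, an LLM classifying the conversation) supplies an emotion label. The labels and hex colors here are purely illustrative:

```javascript
// Hypothetical mapping from a detected emotion label to the sphere's color.
const emotionColors = {
  sad: "#3b82f6",     // blue
  happy: "#facc15",   // yellow
  excited: "#ef4444", // red
};

// Fall back to a neutral white when the emotion is unknown.
function colorFor(emotion) {
  return emotionColors[emotion] || "#ffffff";
}

console.log(colorFor("sad")); // "#3b82f6"
```

The hard part isn't the mapping; it's getting a reliable emotion signal to feed it.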
Speaking of real-world applications, I've experienced an interesting use case on Deepgram's website. When you go there, there's an AI agent or chatbot waiting for you like a receptionist that can answer all your questions about their services, pricing, features.
Why This Matters
As a multidisciplinary creative, this blend of design and technology is exactly what interests me. We're not just building smarter AI—we're making it feel intuitive and emotionally resonant. These pulsating spheres aren't just pretty animations; they're changing how we expect to interact with technology.
I believe that's how it'll work in 2-3 years—there'll be an AI receptionist on every website you browse. You can immediately ask your questions by interacting with them. No need to read all those FAQs or wait for customer service for simple questions.
But this shift means web designers need to start thinking beyond static interfaces. We're moving from clicking and typing to speaking and feeling. The question isn't whether every website will have an AI receptionist—it's whether designers are ready to create experiences that feel genuinely human.
Photo by Google DeepMind from Pexels