Background
Emotions is an interactive web-based experience that animates emotional expressions using voice recognition. When a user says a word like "happy," "sad," "angry," "scared," or "neutral," the site recognizes the input through the Web Speech API and responds by animating a character to match the emotion. I illustrated each expression by hand in Krita, exported the drawings as transparent PNG files, and sequenced the frames in JavaScript to create short, expressive animations.
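The flow described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the frame filenames, the `face` element id, and the frame rate are all assumptions made for the example.

```javascript
// Hypothetical frame sequences per emotion (illustrative filenames,
// not the project's real assets).
const FRAMES = {
  happy:   ["happy_01.png", "happy_02.png", "happy_03.png"],
  sad:     ["sad_01.png", "sad_02.png", "sad_03.png"],
  angry:   ["angry_01.png", "angry_02.png", "angry_03.png"],
  scared:  ["scared_01.png", "scared_02.png", "scared_03.png"],
  neutral: ["neutral_01.png", "neutral_02.png", "neutral_03.png"],
};

// Pick the first recognized emotion word out of a speech transcript.
function matchEmotion(transcript) {
  const words = transcript.toLowerCase().split(/\s+/);
  return words.find((w) => Object.hasOwn(FRAMES, w)) ?? null;
}

// Cycle a frame sequence through a single <img> element (browser only).
function playFrames(frames, imgEl, fps = 8) {
  let i = 0;
  const timer = setInterval(() => {
    imgEl.src = frames[i];
    i += 1;
    if (i === frames.length) clearInterval(timer);
  }, 1000 / fps);
}

// Wire up the Web Speech API (guarded so the matching logic above can
// also run outside a browser). Chrome exposes the prefixed constructor.
if (typeof window !== "undefined" && "webkitSpeechRecognition" in window) {
  const recognition = new webkitSpeechRecognition();
  recognition.onresult = (event) => {
    const transcript = event.results[0][0].transcript;
    const emotion = matchEmotion(transcript);
    if (emotion) playFrames(FRAMES[emotion], document.getElementById("face"));
  };
  recognition.start();
}
```

Keeping the word-matching step separate from the browser wiring means a transcript like "I feel so happy today" still maps to a single emotion key, even when the recognizer returns more than the bare word.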
As I built the website, which began as an experiment with browser-based voice recognition and hand-drawn animation, I noticed something: the emotion visuals I created would not always read the same way to others. What looked like anger to me might come across as determination or even playfulness to someone else. This gap in perception became central to the work. It made me question how emotional expressions are interpreted, and how much that interpretation depends on culture, context, and personal experience. The project became less about conveying emotion "accurately" and more about revealing the space between expression and perception.
To invite others into this reflection, I added a follow-up prompt asking users what the emotion means to them, and why. I was curious about how people perceive the emotion words and my animations, and what drives their perception. User responses are recorded through the Google Sheets API and made publicly visible, forming a small archive of emotional interpretation.
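One common pattern for recording responses from a client-side site is to shape each answer into a row and POST it to a small endpoint (such as a Google Apps Script web app) that appends it to the sheet. The sketch below follows that pattern; the column order, field names, and endpoint shape are assumptions for illustration, not the project's actual setup.

```javascript
// Shape a user response into a sheet row.
// The column order (timestamp, emotion, interpretation, reason) is an
// assumption for this example.
function buildResponseRow(emotion, interpretation, reason) {
  return [new Date().toISOString(), emotion, interpretation, reason];
}

// Hypothetical submission to a web endpoint that appends the row to
// the publicly visible sheet.
async function submitResponse(endpointUrl, row) {
  const res = await fetch(endpointUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ row }),
  });
  return res.ok;
}
```

Appending whole rows keeps the archive append-only, so earlier interpretations stay visible alongside newer ones.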
By blending voice input, hand-drawn expression, and communal storytelling, Emotions explores what it means to name a feeling, and what gets translated, lost, or transformed in the process.