Love It Or Hate It: AI-Generated Content Elicits Emotional Responses In Human Users (Pilot Study)
1. Introduction

The assertion that artificial intelligence (AI)-generated content lacks the capacity to evoke human emotions, owing to the absence of a “soul” or human intent, persists in artistic and scientific discourse. However, the Third Field, a scientific framework recognizing emotions as measurable energy states within the body-mind-spirit triad, posits that emotional responses are energy shifts, quantifiable via the Modern Energy Chart in conjunction with the Subjective Units of Experience (SUE) Scale (-10 to +10) (Hartmann, 2009). This pilot study tests whether AI-composed music can elicit emotional and physical responses, serving as a proof of concept for broader AI-generated content (e.g., stories, pictures, videos, chats). Using participants familiar with the SUE Scale, we compared pre- and post-exposure SUE scores and physical sensations to establish a foundational understanding of AI’s impact on human energy systems, with implications for all AI content types. This research offers a simple demonstration that emotions are key to understanding human-AI interactions across media, particularly as these interactions become increasingly prevalent and emotionally significant.

2. Methods

Participants

This pilot study included 42 international participants recruited from individuals who downloaded the free *rEvolution* resource (Hartmann, 2023), selected for their familiarity with the SUE Scale (-10 to +10). Participants were invited via email, yielding a diverse sample across continents, ages, and Modern Energy training levels, with responses collected over a 12-day period (February 27 to March 10, 2025).

Design

A pre-post design was implemented using Google Forms. Participants rated their SUE Scale score before and after listening to an AI-generated song, reported physical sensations (yes/maybe/no, plus body locations), and provided qualitative feedback. A second section captured demographics: continent, gender, age range, Modern Energy education, and profession.

Stimulus

The stimulus was “EMO Track 1,” an AI-generated song produced via riffusion.com and delivered as an embedded video. Its style, not selected for valence, was intended to test raw emotional impact rather than preference, providing a controlled test case for AI content’s emotional potential.

Procedure

Participants received an email invitation: “Join Our AI Music Research (5 Minutes).” Instructions directed them to: (1) note their SUE Scale number to begin the sequence; (2) listen to “EMO Track 1” at mid-volume with eyes closed; (3) note the locations of any physical sensations; (4) note their post-song SUE number; and (5) add reflections on the experience. Responses were collected automatically in Google Forms, exported to Google Sheets for analysis, and completed within a 5-minute timeframe at participants’ convenience.

3. Results

In this pilot study, 42 participants responded to “EMO Track 1,” an AI-generated song, demonstrating measurable emotional and physical effects. The mean Subjective Units of Experience (SUE) score increased from 1.76 (SD = 3.22) before listening to 3.98 (SD = 3.73) after, an average increase of 2.22 points (Table 1). A paired t-test confirmed this shift was statistically significant (t(41) = 3.08, p < 0.005), indicating that AI-generated music elicits emotional responses.
Of the 42 participants, 24 (57.1%) reported an increase in SUE score (e.g., +9, from -4 to +5), 7 (16.7%) reported a decrease (e.g., -13, from +6 to -7), and 11 (26.2%) reported no change. Physical sensations were reported by 33 participants (78.6%), with 22 (52.4%) noting definite sensations and 11 (26.2%) reporting possible sensations. The most common sensation locations were the stomach (17 participants, 51.5% of those with sensations), chest (15, 45.5%), and head (13, 39.4%), supporting the Third Field’s model of emotions as energy movements within the body. Other locations, such as the neck (7, 21.2%) and spine (2, 6.1%), were less frequent (Table 1).

Demographically, participants were predominantly female (73.8%, n = 31), aged 55–72 (50.0%, n = 21) or 37–54 (33.3%, n = 14), and from Europe (59.5%, n = 25). Most (90.5%, n = 38) had some Modern Energy training, with 40.5% (n = 17) holding advanced qualifications (e.g., EMO Master).

Qualitative feedback revealed diverse emotional responses: positive comments included feeling “calm,” “uplifted,” or “happy” (e.g., “Felt calm and relaxed throughout”), while negative responses cited the song’s “shouty” style or “corny” lyrics as annoying or depressing (e.g., “Jaw tightened, huge annoyance”). These findings indicate that AI-generated music provokes a spectrum of emotional and physical responses, consistent with the Third Field framework.
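To make the analysis reported above concrete, the following is a minimal Python sketch of the kind of computation involved: descriptive statistics, a paired t-test on pre- versus post-listening SUE scores, change categories, and sensation-location percentages. The score arrays and variable names are illustrative assumptions, not the actual study responses.

```python
# Minimal sketch, assuming hypothetical pre/post SUE scores (not the study data).
import numpy as np
from scipy import stats

# Hypothetical pre- and post-listening SUE scores for a handful of participants;
# the real data were collected via Google Forms and exported to Google Sheets.
pre = np.array([-4, 0, 2, 6, 1, 3, -2, 5, 0, 4])
post = np.array([5, 2, 2, -7, 4, 6, 1, 5, 3, 8])

# Descriptive statistics (the paper reports means of 1.76 and 3.98, SDs of 3.22 and 3.73).
print(f"Pre:  mean={pre.mean():.2f}, SD={pre.std(ddof=1):.2f}")
print(f"Post: mean={post.mean():.2f}, SD={post.std(ddof=1):.2f}")

# Paired t-test on the same participants' pre/post scores
# (the paper reports t(41) = 3.08, p < 0.005 for N = 42).
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"Paired t-test: t({len(pre) - 1}) = {t_stat:.2f}, p = {p_value:.3f}")

# Change categories: increase, decrease, or no change in SUE score.
diff = post - pre
print(f"Increase: {np.sum(diff > 0)}, Decrease: {np.sum(diff < 0)}, No change: {np.sum(diff == 0)}")

# Sensation locations as percentages of the 33 participants reporting any sensation
# (counts taken from the Results section).
sensation_counts = {"stomach": 17, "chest": 15, "head": 13, "neck": 7, "spine": 2}
n_with_sensations = 33
for location, count in sensation_counts.items():
    print(f"{location}: {count} ({100 * count / n_with_sensations:.1f}%)")
```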
4. Discussion

This pilot study provides initial evidence that AI-generated music, specifically “EMO Track 1” created through riffusion.com, elicits significant emotional and physical responses in humans, as measured by the SUE Scale and self-reported sensations. The significant SUE shift (+2.22, p < 0.005) and high prevalence of physical sensations (78.6%) challenge the assertion that AI lacks the capacity to evoke human emotions, demonstrating its ability to engage the energy body as defined by the Third Field (Hartmann, 2023a). These findings serve as a proof of concept, suggesting that the emotional impact extends beyond music to all AI-generated content, including stories, pictures, videos, and chats, where similar energy shifts and physical responses (e.g., stomach, chest, and head sensations) are likely.

The bidirectional nature of the SUE shifts (upward and downward) underscores AI content’s capacity to provoke a full spectrum of emotional responses, not merely negative ones, aligning with the Third Field’s recognition of emotions as real-time, dynamic energy states. The predominance of the stomach, chest, and head as sensation locations supports the Third Field’s hypothesis that emotions manifest as physical sensations, offering a measurable framework distinct from materialistic, reductionist body-mind models. This has profound implications for understanding human-AI interactions across media, particularly as reliance on AI-generated content grows, suggesting that emotional responses are fundamental to these relationships, driven by dynamic energy shifts rather than human intellect.

Limitations include the small sample size (N = 42), which restricts generalizability. Future research should expand the sample through alternative recruitment strategies, such as social media, to enhance representativeness and explore AI content’s effects across diverse populations, genres, and media types (e.g., AI-generated stories and videos). Comparative studies with human-generated content could further elucidate AI’s unique emotional impact and validate the practical applications of the Third Field’s energy model across contexts. This study provides a simple example of integrating Third Field principles into broader scientific discourse, highlighting energy as a critical component of emotional experience with AI content. It underscores the value of recognizing energy systems, within the body-mind-spirit triad, as a useful conceptual framework for working with emotional states, capable of advancing understanding of human-AI interactions across all content forms.

5. Conclusion

The AI Music EMO Experiment demonstrates that AI-generated music can elicit significant emotional and physical responses, as evidenced by SUE Scale shifts (p < 0.005) and physical sensations in 78.6% of participants. These findings, as a proof of concept, suggest that AI-generated content (encompassing music, stories, pictures, videos, and chats) provokes measurable energy shifts, challenging skepticism about its emotional capacity. This pilot lays the groundwork for future research, advocating for the Third Field’s recognition as a rigorous science within the body-mind-spirit triad, with broad implications for human-AI engagement and the emotional dynamics of AI content interactions.

6. Acknowledgments

We thank the 42 participants for their contributions, the GoE Third Field Research Team and the peer reviewers for their support, and riffusion.com for providing the AI-generated stimulus.
7. References

Hartmann, S. (2025). The Third Field Manifesto: Love and Logic. The Energist, 28(1), 14308. https://doi.org/10.5281/zenodo.15081761

Hartmann, S. (2025). The Modern Energy Chart and SUE Scale: Foundational Instruments of Third Field Science. https://doi.org/10.5281/zenodo.1501101