Can Your Body’s Response to Music Predict Hit Songs? A New AI Study Claims It Can

A new study suggests AI can analyze cardiac activity to predict whether a song will be a hit before it’s released. But some hit-song scientists are skeptical


Can a machine predict the song of the summer? Can it weed out forgettable flops? If so, such a technology could reduce music production costs, curate public playlists and even render judges on television talent shows obsolete—but after decades of “hit song science” research, predicting a successful song is still more of an art than a science.

Now researchers at Claremont Graduate University in California say they’ve found a way to use artificial intelligence to analyze listeners’ physiological signals and predict the next chart-topping bop. The team tracked participants’ heart activity as they listened to music. The scientists used an algorithm to convert the data into what they say is a proxy for neural activity. A machine-learning model trained on the data was then able to determine whether a song was a hit or a flop with 97 percent accuracy. The findings were published in Frontiers in Artificial Intelligence.
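
For readers who want a concrete picture of that final classification step, the minimal sketch below trains a generic hit-or-flop classifier on per-song physiological summaries. Everything in it is an assumption for illustration—the placeholder features, the labels and the choice of a random-forest model—since the paper’s actual pipeline is not described in reproducible detail.

```python
# Illustrative sketch only: a generic binary hit/flop classifier on
# per-song physiological summaries. Feature names, labels and model
# choice are placeholders, not the study's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per song, columns summarizing
# listeners' cardiac-derived responses (e.g., means, variability).
X = rng.normal(size=(24, 4))        # 24 songs, 4 summary features
y = np.array([1, 0] * 12)           # 1 = hit, 0 = flop (placeholder labels)

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```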

Other scientists studying the use of AI to predict hit songs aren’t ready to declare victory yet. “The study could be groundbreaking but only if it’s replicated and generalizable. There are many biases that can influence a machine-learning experiment, especially one that attempts to predict human preferences,” says Hoda Khalil, a data scientist at Carleton University in Ontario, who was not affiliated with the study.


Traditionally, music industry experts looking to predict the next hit have relied on large databases to analyze the lyrical and acoustic aspects of hit songs, such as tempo, explicitness and danceability. But this method has performed only slightly better than a coin toss. For example, Khalil and her colleagues have analyzed data from more than 600,000 songs and found no significant correlations between various acoustic features and a tune’s commercial popularity.
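
A back-of-the-envelope version of that kind of feature screen might look like the sketch below, which correlates a few acoustic descriptors with a popularity score. The column names and toy values are hypothetical, not drawn from the actual 600,000-song dataset.

```python
# Toy version of a feature-vs-popularity correlation screen. Column
# names and values are hypothetical, not from the 600,000-song dataset.
import pandas as pd

songs = pd.DataFrame({
    "tempo":        [118, 96, 140, 104, 125],
    "danceability": [0.81, 0.55, 0.72, 0.43, 0.66],
    "explicit":     [1, 0, 1, 0, 0],
    "popularity":   [88, 34, 61, 12, 70],
})

# Pearson correlation of each acoustic feature with popularity.
print(songs.corr()["popularity"].drop("popularity"))
```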

Rather than focusing on a song’s qualities, the Claremont team sought to explore how humans respond to music. “The connection seemed almost too simple. Songs are designed to create an emotional experience for people, and those emotions come from the brain,” says Paul Zak, a neuroeconomist at Claremont Graduate University and senior author of the new study.

Previous attempts to use brain scans to predict hit songs had limited success. A 2011 study using functional magnetic resonance imaging, which tracks blood flow in the brain, identified 90 percent of commercial flops but only 30 percent of hits. Zak’s team took a different approach. Instead of directly measuring brain responses, the researchers equipped 33 participants with wearable cardiac sensors that monitor changes in blood flow, much as smartwatches and fitness trackers detect heart rate.

Participants listened to 24 songs ranging from the megahit “Dance Monkey” by Tones and I to the commercial flop “Dekario (Pain)” by NLE Choppa. Their cardiac data were then fed through a commercial platform from Immersion Neuroscience, a company Zak co-founded. He says the platform algorithmically converts cardiac activity into a combined metric of attention and emotional resonance known as immersion (the details of this process are not outlined in the study). An AI model trained on these immersion signals predicted hit songs with high accuracy, the researchers reported. In contrast, participants’ rankings of how much they enjoyed a song did not reflect its public popularity.
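
Because the paper does not disclose how cardiac activity is turned into immersion, any reconstruction is guesswork. The sketch below substitutes a standard heart-rate-variability summary (RMSSD) computed from synthetic inter-beat intervals, purely to show the general shape of such a signal-to-feature step; it is not Immersion Neuroscience’s algorithm.

```python
# Stand-in only: the study does not disclose the immersion algorithm,
# so this uses RMSSD, a standard heart-rate-variability summary, on
# synthetic inter-beat intervals to show the general shape of the step.
import numpy as np

rng = np.random.default_rng(1)
ibi_ms = 800 + 50 * rng.standard_normal(300)  # synthetic inter-beat intervals (ms)

def rmssd(ibi: np.ndarray) -> float:
    """Root mean square of successive differences between beats."""
    return float(np.sqrt(np.mean(np.diff(ibi) ** 2)))

print(f"RMSSD: {rmssd(ibi_ms):.1f} ms")
```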

Zak—who currently serves as Immersion Neuroscience’s chief immersion officer—says there is a rationale for using cardiac data, which can be easily tracked through wearable devices, as a proxy for neural response. He explains that a robust emotional response triggers the brain to synthesize the “feel-good” neurochemical oxytocin, intensifying activity in the vagus nerve, which connects the brain, gut and heart.

Not everyone is convinced. “The study hinges on the neurophysiological measure of immersion, but this measure needs further scientific validation,” says Stefan Koelsch, a neuroscientist at the University of Bergen in Norway and guest researcher at the Max Planck Institute for Human Cognitive and Brain Sciences in Germany. Koelsch also notes that although the study cited several papers to support the validity of immersion as a measure of brain activity, not all were published in peer-reviewed journals.

Koelsch is also skeptical that machine-learning models can capture the nuances that make a song a hit. In a 2019 study, he and his colleagues initially found a relation between the predictability of a song’s chord progression and listeners’ emotional response, but they have since been unable to replicate those findings. “It’s very difficult to find reliable indicators for even the crudest differences between pleasant and unpleasant music, let alone for the subtle differences that make a nice musical piece become a hit,” he says. As of publication time, Zak had not responded to requests for comment on criticisms of his recent study.

If this new model’s results are replicated, the approach might hold immense commercial potential. To Zak, its primary utility lies in efficiently sorting through the vast library of existing songs. “As wearable devices become cheaper and more common, this technology can passively monitor your brain activity and recommend music, movies or TV shows based on those data,” Zak says. “Who wouldn’t want that?”

Zak envisions an opt-in service with data anonymized and shared when users sign a consent form. But Khalil points out that this opt-in approach can still fail to safeguard users. “Many users just accept the terms and conditions without even reading them,” Khalil says. “That opens the door for data to be unintentionally shared and abused.”

One's favorite songs may seem like innocuous data, but they offer a window into one's moods and habits. And if these details are coupled with data on brain activity, consumers may be forced to consider how much information they're willing to relinquish for the perfect playlist.

Lucy Tu is a freelance writer and a Rhodes Scholar studying reproductive medicine and law. She was a 2023 AAAS Mass Media Fellow at Scientific American.

This article was originally published with the title “Flop or Bop?” in Scientific American Magazine Vol. 329 No. 2 (September 2023), p. 14
doi:10.1038/scientificamerican0923-14b