CULTURE

A Robot Has A Record Deal: How Artificial Intelligence Could Save Music

Warner Music Group just signed a deal for 20 records made by an algorithm.

FACT Magazine

Despite movies like The Matrix trying to warn us against the dangers of artificial intelligence, it appears humankind is walking eyes-wide-open into an AI-induced apocalypse.

Look around: signs that malevolent technology is already among us are everywhere. Deep down, you know Siri resents you; you can hear it in her tone when you ask her to place your saved Domino's order. Your smart TV didn't glitch; it deleted your taped episodes of The Bachelor out of spite. Alexa didn't mishear you; she's just tired of your shit.

Now, you can add music to the ever-growing list of things AI will inevitably use against the human race when our iPhones enslave us. For the first time ever, an algorithm has landed a record deal. According to Music Business Worldwide, "A stress-reducing sound app called Endel has become the first-ever algorithm to sign a deal with a major label, Warner Music Group, and will be releasing 20 albums throughout the year."

This doesn't exactly mean Plankton's wife, Karen, will be the next big American pop star, as Endel is in the business of "relaxing soundscapes," not mainstream music. The app uses a variety of factors such as time of day, location, heart rate, and weather to create custom sound frequencies to relax the listener. While the existential dread we feel when listening to music written by a robot is anything but relaxing, we're sure some people can look past the spookiness to achieve a moment of big-business-manufactured zen.


Still, if algorithms can play you exactly what you want to hear when it comes to a mixture of rainforest sounds and white noise, what's stopping them from creating the next perfect dance hit? Nothing. In fact, it's already happening. While this is the first time an algorithm has ever had its very own record deal, producers have long used algorithmic technology in music production. And maybe it's not quite as scary as it sounds.

According to The Verge, this technology works by "using deep learning networks, a type of AI that's reliant on analyzing large amounts of data. You feed the software tons of source material, from dance hits to disco classics, which it then analyzes to find patterns. It picks up on things like chords, tempo, length, and how notes relate to one another, learning from all the input so it can write its own melodies. There are differences between platforms: some deliver MIDI while others deliver audio. Some learn purely by examining data, while others rely on hard-coded rules based on musical theory to guide their output."
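The learn-then-generate loop The Verge describes can be sketched in miniature. The real systems use deep learning networks; the first-order Markov chain below is a deliberately simplified, hypothetical stand-in that shows the same idea: analyze source material for note-to-note patterns, then sample a new melody from those patterns.

```python
import random
from collections import defaultdict

def learn_transitions(melodies):
    """Analyze 'source material': count how often each note follows another."""
    counts = defaultdict(lambda: defaultdict(int))
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            counts[a][b] += 1
    return counts

def generate(counts, start, length, rng):
    """Write a new melody by sampling from the learned transition counts."""
    melody = [start]
    for _ in range(length - 1):
        options = counts[melody[-1]]
        if not options:  # no pattern learned for this note; stop early
            break
        notes = list(options)
        weights = [options[n] for n in notes]
        melody.append(rng.choices(notes, weights=weights)[0])
    return melody

# Toy corpus standing in for "dance hits to disco classics"
corpus = [["C", "E", "G", "C"], ["C", "E", "G", "E"]]
model = learn_transitions(corpus)
new_melody = generate(model, "C", 8, random.Random(1))
```

Every adjacent pair in `new_melody` is a pattern seen in the corpus, which is exactly why such output mimics its training data's style (and why the "MIDI vs. audio" and "data vs. hard-coded rules" distinctions in the quote matter for real systems).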

If you listen to the songs that have experimentally been produced by AI, it quickly becomes apparent that something is missing. "Daddy's Car," an AI-made pop song created under the supervision of songwriter and human man Benoît Carré, was produced with Sony's Flow Machines software, a program that analyzes a database of some 13,000 tracks from around the world to create music.

Daddy's Car: a song composed by Artificial Intelligence - in the style of the Beatles (via YouTube)

Carré taught the software to use the music of The Beatles as a template, and while that influence is apparent, something is hauntingly off. It's tempting to say that the song is missing a vital humanity: a warmth, intention, and spontaneity that the human mind connects to subconsciously when experiencing a piece of art. But in all likelihood, the problem with "Daddy's Car" is simple: it's just not a very good song.

As Hang Chu, a computer science Ph.D. student at the University of Toronto, explained, making music with AI is different from, for example, generating images with AI (something that's been done successfully and convincingly for a long time). "Music is not something where if you throw enough data at it and hope the algorithm can figure it out, it will work," he said. Considering the number of factors that go into a piece of music (tempo, tone, melody, harmony, and timing), this makes sense. Even though these factors are quantifiable, it's a shot in the dark to know what combination of them is going to create a good song. After all, if making good music were an easy math problem, everyone would do it.

Musical AI still needs a person to operate it and, ultimately, to shape computer-generated output into a song that a person would think is worth listening to. There is still human artistry in AI music technology, just a new and different kind. While most people react with anger and discomfort when confronted with the idea of computer-made music — immediately jumping to arguments that music is at the very core of what makes us people and should, therefore, be made by people — isn't it true that the technology of music has always been evolving? Minstrels didn't have electric guitars and George Martin didn't have Pro Tools; how is AI anything but a new man-made instrument?

John Smith, a fellow at IBM's AI research center, argues that AI-made music may actually help to take music to places it could never get otherwise. He says that we need to widen our ideas of "good" or "bad" music or art when it's created by AI. "Part of the joy of AI is that it doesn't think like humans do, and so it comes up with concepts and ideas we would never think of. That can be helpful for pushing art into new realms," Smith said, but added, "The computer can start to do more and more of the groundwork and prep work and even suggest different ideas, but that leap of creative thought, that spark of imagination, still has to come from a human."


Not only that, but your taste in music is already probably heavily influenced by computers. Algorithms created by streaming platforms determine what kind of songs they think listeners are going to like, and then push these songs to listeners, inevitably boosting the popularity of certain kinds of music, which serves to influence musicians into copying this sound. People readily accept this kind of computer interference — interference that is arguably more damaging to the autonomy of the music listener than AI music.

So, though music generation software hasn't quite learned its trade well enough to create something convincingly human, inevitably, it will. The Luxembourg company AIVA (Artificial Intelligence Virtual Artist) created an AI-generated score for a video game called Pixelfield using a slightly different method than Sony's, taking into account that the specific combination of different musical aspects, not each aspect on its own, is what makes a quality piece of music: "We asked, 'what are the building blocks to create an entire song?'" said Pierre Barreau, the company's CEO. "If you consider just melody, you can generate that. Then based on that melody, you can make another model that creates instrumental accompaniment for that melody. If you break it down, it becomes substantially easier."
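Barreau's decomposition can be illustrated with a minimal sketch: one block produces a melody, and a second block derives an accompaniment from it. The scale, the chord table, and the random-walk melody are all illustrative assumptions of this sketch, not AIVA's actual models.

```python
import random

C_MAJOR = ["C", "D", "E", "F", "G", "A", "B"]

# Hypothetical lookup: for each melody note, one triad that contains it.
TRIADS = {
    "C": ("C", "E", "G"), "D": ("G", "B", "D"), "E": ("C", "E", "G"),
    "F": ("F", "A", "C"), "G": ("C", "E", "G"), "A": ("F", "A", "C"),
    "B": ("G", "B", "D"),
}

def make_melody(length, rng):
    """Block 1: a random walk over the scale stands in for a melody model."""
    i = 0
    melody = []
    for _ in range(length):
        melody.append(C_MAJOR[i % 7])
        i += rng.choice([-1, 1, 2])  # step down, up, or skip up
    return melody

def harmonize(melody):
    """Block 2: accompaniment generated from the melody, one chord per note."""
    return [TRIADS[note] for note in melody]

melody = make_melody(8, random.Random(0))
accompaniment = harmonize(melody)
```

The point of the structure is the dependency: the accompaniment block takes the melody block's output as its input, so each sub-problem stays small — "if you break it down, it becomes substantially easier."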

It's naive to believe that as new systems like AIVA's are invented, the technology inevitably improves, and AI-made music becomes something you hear more of, the results will inherently lack humanity. After all, it's human-created software, run by a human, using building blocks made of patterns and aspects of songs produced by people; how can the resulting whole not be imbued with the same humanity?


Brooke Ivey Johnson is a Brooklyn based writer, playwright, and human woman. To read more of her work visit her blog or follow her on Twitter @BrookeIJohnson.


POP⚡DUST | Read More...


Is Artificial Intelligence Saving or Destroying Music?

AI can now write and record songs incredibly well. But is there something about music that is distinctly human, something that an AI cannot replicate?

What makes great music great? Is it technical mastery, emotional payoff, or some other inexplicable x factor, something that could never be replicated by an algorithm?

The answer might be the key to placing judgment on music created by artificial intelligence. If AI can make music that achieves or supersedes human-level mastery, then we humans could be left in the dust as supercomputers build on and refine their own work more expertly than we ever could.

So we're left with the question: is there something about music that is distinctly, inextricably human?

Nick Cave thinks so. In an open letter to a fan, he argued that when we listen to a song, we're actually listening to many songs at once—songs of a person's past, of their identity, struggles, and shortcomings—and AI could never replicate the imperfections and histories which form the bases of music's power.

Nick Cave with fans (Image via NME.com)

When we listen to music, "what we are actually listening to is human limitation and the audacity to transcend it," wrote Cave. "Artificial Intelligence, for all its unlimited potential, simply doesn't have this capacity. How could it? And this is the essence of transcendence. If we have limitless potential then what is there to transcend?"

A fair argument. But it's easy to imagine that AI could learn to replicate flaws, that it could texture its compositions with bits of hoarseness and the little trips and stumbles that define some of our most beloved recordings, making us feel connected to the musician on the other side.

Though AI is a relatively new phenomenon, technological disruption in music is far from unprecedented. Music has been connected to technology since the phonograph was invented in 1877, making recordings replicable on a massive scale. Before that, all music was played live and listening was an exclusively communal experience. Now streaming services let you listen to any song, anywhere.

AI began to creep onto the scene about a century after the phonograph revolutionized the recording industry. In 1995, David Bowie became one of the first mainstream musicians to use AI when he helped create the Verbasizer, a computer application that took quotes from various literary sources, cut them up, and reordered them into verses—sort of like your average TwitterBot. Bowie used the resulting lyrics on his album Outside. (Flash forward to 2019, and Bowie is the star of an augmented reality app that allows you to project a 3D exhibition experience into your bedroom.)

How David Bowie used 'cut ups' to create lyrics - BBC News (via YouTube)

Today, intelligent recording software allows people of all musical skill levels to create far beyond their abilities, and digital workstations allow for endless refinement. Most programs let users re-record any section, adjust any note via pitch modulators, restructure rhythms, and program whole orchestras—all at a computer's keyboard. Logic's chord-trigger function lets you map letter keys to complex chords, stringing them together into perpetually mistake-free progressions.

Today there are dozens of AI music composition programs on the market, and some have partnered successfully with humans. In 2016, Sony Music commissioned the songwriter Benoît Carré to write a song with the AI software Flow Machines. Carré penned the lyrics and arranged the song, but the software wrote the music and melodies, and the result was "Daddy's Car," a disconcertingly catchy, Beatles-esque tune that could easily pass as man-made.

Daddy's Car: a song composed by Artificial Intelligence - in the style of the Beatles (via YouTube)

Carré is far from the only artist to collaborate with artificial intelligence. In 2018, Taryn Southern, a former American Idol contestant, released her debut album I Am AI, a collaboration with the software Amper. "I imagine in 20 years, 'coding' songs will be commonplace," Southern told VideoInk. "It's still incredibly early for AI, but I could see artists using machine learning for all kinds of applications: to mix and master their songs, to help them identify unique chord progressions, alter instrumentation to change style, determine more interesting melody structures based on a musician's given sound and style preferences, even gauge their audience's emotional response to a song. The sky is the limit."

If you agree with Southern, then AI is simply leveling the playing field, making music more accessible to people with different skills and opportunities. AI mastering services are saving artists thousands of dollars on an expensive and time-consuming aspect of post-production. And of course, now people who can't play any instruments can suddenly compose and record.


On the other hand, these programs spare people the work of actually learning an instrument, letting them attain the satisfaction of creation without the dedication it used to take to get there—potentially perpetuating the social media era's trend of instant gratification that some argue is putting us all constantly on edge.

But progress will not be stopped, and scientists are making AI more virtuosic every minute. Douglas Eck, the creator of NSynth—a project that utilizes deep learning to dissect and combine sounds to create completely new ones—views AI's advancing musical aptitude as progress, just like any other invention. "We're making the next film camera," Eck told the New York Times. "We're making the next electric guitar."


Investors seem to agree. AI compositional software has amassed millions in funding, and some AI-human collaborations are even edging toward mainstream success. In 2017, the producer Baauer collaborated with the musician Lil Miquela—who also happens to be an artificial Instagram influencer, and whose synthetic vocals and CGI-generated appearance garnered her a Times Square billboard ad. And now Amazon's Alexa has a new feature called Deep Music, which automatically weaves audio samples together to match whatever 'vibe' it detects in the surrounding atmosphere—as if Alexa weren't already listening closely enough.

What does she know? (Image via inquisitr.com)

But none of this answers the central question: can an AI ever make music as well as a human? Or does it lack something fundamental, some mysterious quality?

According to posthumanists like Claire L. Evans, there shouldn't have to be an either/or, a man-versus-machine binary. "It's almost exquisitely myopic to judge posthuman music on its ability to 'pass' as human," she wrote on Motherboard. Instead, she argues, AI might help us access unprecedented levels of mastery. Combine the technical skills of an AI with the emotional experiences of a human and you might just have a new kind of hit, one that could achieve a cosmic level of brilliance.


After all, music, science, and the cosmos have always been interconnected. Superstring theory, for instance, proposes that the universe's fundamental constituents are tiny vibrating strings, whose oscillations resemble the compressions and rarefactions that form sound waves. And each individual sound wave contains harmonics, which align to form spiral formations when transcribed onto staff lines. In what could be an uncanny coincidence (or proof of divine providence), these formations resemble the architecture of shells or the spirals of galaxies.
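The checkable core of the passage above is the harmonic series itself: a single tone carries overtones at integer multiples of its fundamental frequency. A quick sketch (the choice of A2 = 110 Hz is just an example):

```python
def harmonic_series(fundamental_hz, count):
    """Return the first `count` harmonics of a tone: n times the fundamental."""
    return [n * fundamental_hz for n in range(1, count + 1)]

# The first few harmonics of A2 (110 Hz) land on familiar musical intervals:
# 220 Hz is the octave above, 330 Hz a perfect fifth above that,
# and 440 Hz (concert-pitch A) sits two octaves above the fundamental.
overtones = harmonic_series(110, 4)  # [110, 220, 330, 440]
```

It's this simple integer structure that makes harmonics a "mathematical system" in the first place.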


Essentially, musical harmonics are mathematical systems reflected by (or reflecting) the structure of the universe. It's not hard to imagine a scenario where AI could develop an understanding of harmonics that's far more advanced than anything we could come up with on our own, one that accesses secrets of the universe we couldn't imagine.

And yet there's still something unknowable involved in artistic creation, something that makes certain pieces resonate while others falter. Music has always been a combination of technical skill, inspiration, science, and faith; it seems that, ultimately, none of these can exist without the others.


Eden Arielle Gordon is a writer and musician from New York. Follow her on Twitter at @edenarielmusic.

