Google Play Music, Google’s native music streaming service, is trash. Somehow Google, the information giant, has failed to distill its mountain of search data into what makes a good music app and actually build one. My top two annoyances—my top two provable annoyances—are the following:
- You can’t download individual songs.
- Searching for songs, artists, albums—anything—opens a new window that blocks the way back to your music library. The only way back to the main screen is to keep pressing the back button until you land there. So unless you want to find yourself mashing the back button, you return to the main screen after every search whether you need to or not; otherwise you stack up a queue of search windows, one piled on top of another. It’s awful.
But my top annoyance is due to what I suspect is “A.I.-driven software that learns from how you listen and adapts to your habits”—you know, marketing-speak for predictive software. Here’s how I suspect Google thinks its algorithm is making the app better.
Take your entire music collection. Before you hit shuffle for the first time, every song has an equal chance of playing. As soon as the first song plays, Google thinks, “Okay, you like this song because you’ve listened to at least 10 seconds (or whatever their criterion is) without skipping it. So we’ll put a +1 on this song’s preference.” Later on, you hit shuffle and hear another variety of songs, and one of them plays again. You don’t skip it. Google thinks, “Okay, +1 again.” Google’s A.I. thinks you like the song because you haven’t skipped it, you haven’t deleted it to reset its play data, and you listened to enough of it for its play time to register.
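The suspected logic above can be sketched in a few lines. To be clear, this is my own toy reconstruction under the post’s assumptions, not Google’s actual code; the threshold, function names, and weighting scheme are all invented for illustration.

```python
# Toy sketch of the suspected "+1 preference" logic: a play counts toward a
# song's score only if it ran at least 10 seconds and wasn't skipped, and
# higher-scored songs then get more weight in shuffle. All names invented.

MIN_LISTEN_SECONDS = 10  # the post's "at least 10 seconds (or whatever)"

def update_preference(scores, song_id, seconds_played, skipped):
    """Bump a song's score by +1 when a play 'counts'."""
    if not skipped and seconds_played >= MIN_LISTEN_SECONDS:
        scores[song_id] = scores.get(song_id, 0) + 1
    return scores

def shuffle_weights(scores, library):
    """Every song starts equal (weight 1); each +1 tilts shuffle toward it."""
    return {song: 1 + scores.get(song, 0) for song in library}

library = ["song_a", "song_b", "song_c"]
scores = {}
scores = update_preference(scores, "song_a", 45, skipped=False)  # counts
scores = update_preference(scores, "song_a", 30, skipped=False)  # counts again
scores = update_preference(scores, "song_b", 4, skipped=True)    # ignored
print(shuffle_weights(scores, library))  # song_a now dominates the shuffle
```

Note the flaw the next paragraph describes: nothing in this loop can ever conclude “played often, so rest it for a while”; the score only ratchets upward.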
The A.I. doesn’t understand that we, as human beings, put value on deprivation. We only have enjoyment because we are deprived of enjoyment. Enjoyment is scarce, so we revel and bask in it when we experience it. There are certain songs that bring us enjoyment, and so we pick them out of our vast collection of accumulated music and play them at our convenience. But we don’t play them at every opportunity, and we don’t want to listen to them during every music session. We’d get sick of them. But A.I. doesn’t understand that, and for the moment, programmers and app developers haven’t found a way to make A.I. understand that.
Everyone who has searched for anything online has experienced the following: targeted ads. Maybe you’re bored and look up a car or toothpaste or an overcoat, and before long every website shows you a Hyundai Elantra, or Crest toothpaste, or the latest offering from overstock.com. For the A.I., the reasoning is simple: you looked something up because you were interested in buying it. Maybe we were, or maybe we were just bored. The targeted-ad algorithm would benefit from a toggle akin to incognito mode in the Chrome browser. If, say, pressing the X key could signal “I’m searching out of boredom, please don’t track this” instead of “I’m searching because I genuinely intend to buy something soon,” then the A.I. wouldn’t be so simplistic in its calculation. For now, though, it is exactly that simplistic.
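The wished-for toggle is easy to picture in code. This is a hypothetical sketch of the feature the paragraph proposes; no real browser or ad network exposes it, and the class and parameter names are made up.

```python
# Hypothetical "boredom toggle": a per-search flag that keeps an idle query
# out of the ad-targeting profile entirely. Purely illustrative.

class SearchTracker:
    def __init__(self):
        self.profile = []  # queries that feed the shopping profile

    def search(self, query, just_bored=False):
        """Return results; record the query only if it signals buying intent."""
        if not just_bored:  # the imagined X-key toggle from the paragraph above
            self.profile.append(query)
        return f"results for {query!r}"

tracker = SearchTracker()
tracker.search("hyundai elantra", just_bored=True)  # idle browsing: untracked
tracker.search("crest toothpaste")                  # assumed intent: tracked
print(tracker.profile)  # ['crest toothpaste']
```

The point of the sketch is how small the missing piece is: one bit of user-supplied context per search would break the “searched it, therefore wants to buy it” inference.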
A.I. thinks that you want to buy things. It can’t identify abstract concepts like happiness or sadness, but it can identify purchasable goods and services that might sustain that happiness or mitigate that sadness, like cruises or care centers—at their current rates, of course. This isn’t a fundamental problem of A.I.; it’s the fault of the motives of the companies creating it. Companies think you want to buy things, and so they latch on to anything they can to ensure that whatever you have even a remote interest in comes from their store or is their product. A.I. is a construct created not to make your individual experience with your devices more unique, but to make your preferences, tendencies, and actions translatable into a readily identifiable shopping profile. You like product X? Then you might like product Y. Your friends bought service W. Have you considered Z’s services? A.I. is currently built to take what you like, what your friends like, and what you previously liked, and predict not what you might like next, but what you might like to buy next.
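The “you like X, so you might buy Y” logic boils down to counting co-occurrences. Here is a deliberately crude sketch of that idea—item-to-item recommendation from shared shopping carts—again with invented names, standing in for whatever any real storefront actually runs.

```python
# Toy co-purchase recommender: suggest items that appear alongside yours in
# other shoppers' carts, ranked by how often they co-occur. Illustrative only.

from collections import Counter

def recommend(my_items, other_carts, top_n=2):
    """Rank items I don't own by co-occurrence with items I do."""
    counts = Counter()
    mine = set(my_items)
    for cart in other_carts:
        if mine & set(cart):            # this cart overlaps with my purchases
            for item in cart:
                if item not in mine:    # only suggest what I don't have yet
                    counts[item] += 1
    return [item for item, _ in counts.most_common(top_n)]

carts = [
    ["chair", "lamp"],
    ["chair", "lamp", "rug"],
    ["rug", "sofa"],
]
print(recommend(["chair"], carts))  # ['lamp', 'rug']: lamp co-occurs twice
```

Notice what the sketch optimizes for: it never asks whether you *enjoy* chairs, only which purchase is statistically adjacent to the last one.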
Google Play Music is a subscription service. Yes, you can buy songs, but as with Spotify, you’re really paying for access to the entire library. And because Play Music’s A.I., or algorithm, is so binary in its calculations—if x, then y; if not x, then not y—it ends up exposing the workings of a system of categorization that we, as human beings, recognize as deeply flawed. We are complicated and complex. A.I. is merely complex. Maybe it just needs more information so that companies can improve its operation. But if the companies’ motives don’t change—and why would they?—then more information only deepens the complexity of our shopping profiles without accounting for the complications behind our motivations. You don’t just like high-backed chairs. You like blue high-backed chairs. These are on sale.