Luke Shadwell – 31st March 2023
Ever since the idea of artificial intelligence (AI) first touched down on a Silicon Valley boardroom whiteboard decades ago, the narrative on what it is, what it's capable of, and how it will shape our lives has been harnessed by marketers and product managers from start-ups, big-tech companies, consultancies, and anyone who sees value in touting its benefits.
It didn't take long for relatively basic systems to be lumped in with machine learning, neural nets and complex learning models; seemingly anything that wasn't static content was suddenly worthy of the 'AI' title. For most of my own life, despite AI being at heart a precise piece of engineering terminology, the people deciding when and where it should be used have not been engineers. In the public consciousness the term has lost nearly all true meaning.
Then came the product managers in charge of large content repositories, like Spotify, Facebook and YouTube. They all wanted in on this magical, not technological, phenomenon of AI. No, our product is not a tool, not a repository of information; it's a recommendation algorithm, a program that works alongside the user to tell them what they should want. The idea that AI could magically surface what the user truly wanted, better than the user themselves, came to dominate these products. Once you're in on the AI idea, it becomes your sole focus. Who cares what the user wants or thinks? What matters are the KPIs we've decided on, and creating the right systems that supposedly tell us the user is having a good experience.

Take Spotify’s shuffle algorithm for example:
2014: The algorithm was introduced. It was truly randomised: it took your playlist and served you songs from it at random. As described.
2017: Supposed complaints from users about songs playing over and over again (which happens with true randomness) prompted changes to make the algorithm feel 'more' random (in reality, less random) by reducing the chance that users would hear the same song repeatedly.
2020: Seemingly unprompted, Spotify makes a further change, one that deviates from even the description of what the button does and is borderline misleading: it introduces 'personalisation' to its shuffle algorithm. Shuffle is now based on the user's listening habits.
2022: Spotify introduces 'Supermix', a shuffle that adds songs that weren't even in your playlist. To reduce backlash and make this seem like a positive change, they frame it as giving the user more freedom and being more transparent about their algorithm. They're so kind that they won't just force you to use this 'shuffle' button that is absolutely not a shuffle; they bury in the UI a way to turn it off and go back to their old shuffle, which is still personalised, and still not a real shuffle.
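The difference between the original behaviour and the 2017 change can be sketched in a few lines. This is not Spotify's actual code, just a minimal illustration of the two approaches the timeline describes: independent random picks (which naturally produce repeats) versus shuffling the whole playlist and playing it through before reshuffling (no repeats within a cycle, which listeners perceive as 'more' random).

```python
import random

playlist = ["song_a", "song_b", "song_c", "song_d"]

def true_random(playlist, plays):
    """Pick a song independently each time: repeats are likely,
    the behaviour the original shuffle reportedly had."""
    return [random.choice(playlist) for _ in range(plays)]

def cycle_shuffle(playlist, plays):
    """Shuffle the playlist and play it all the way through before
    reshuffling: no song repeats until every song has played once."""
    out = []
    while len(out) < plays:
        cycle = playlist[:]
        random.shuffle(cycle)  # in-place Fisher-Yates shuffle
        out.extend(cycle)
    return out[:plays]
```

With a four-song playlist and eight plays, `cycle_shuffle` is guaranteed to play each song exactly twice, while `true_random` will frequently repeat a song back-to-back.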
In Spotify's case, there's a common theme in the development of these features: everything is purposefully obscured. The idea is clearly not to give users more power to decide what they want to do; it's to make that decision for them. There never was, and still isn't, an option to roll back to the original shuffle, or to customise it. You're stuck with it, because their KPIs tell them it's performing well.
At its heart it's a power trip. Product people have decided that they know better than you and every other user what you want to do, how to do it, and what power you should have. It's blatantly anti-user freedom and autonomy.
Next is the very well known case of Google. In its early days as a search engine, its purpose was simple: show users webpages containing the content they are searching for, and use backlinks to build a kind of trust score for how reliable those websites are. This, just like Spotify's original shuffle, gave the power to users. I as the user know exactly how this tool works, what it's doing, and how I can use it to get what I want. If it doesn't deliver first time, I can keep trying: different queries, digging deeper into the results, whatever.
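The backlinks-as-trust-score idea described above can be sketched in miniature. This is a toy model with made-up domain names, not Google's actual algorithm: each inbound link counts as one vote of confidence, and keyword-matching pages are ordered by vote count.

```python
from collections import Counter

# Hypothetical link graph: page -> pages it links out to.
links = {
    "blog.example":  ["docs.example", "news.example"],
    "forum.example": ["docs.example"],
    "news.example":  ["docs.example", "forum.example"],
}

def backlink_scores(links):
    """Count inbound links per page: a crude trust score in which
    every backlink is one vote of confidence in the target."""
    scores = Counter()
    for targets in links.values():
        for target in targets:
            scores[target] += 1
    return scores

def rank(pages_matching_query, links):
    """Order the pages that match a query by backlink count,
    most-trusted first. Missing pages score zero."""
    scores = backlink_scores(links)
    return sorted(pages_matching_query, key=lambda p: scores[p], reverse=True)
```

The appeal for the user is exactly what the paragraph describes: the ranking rule is legible. You can predict what the tool will do, and work with it rather than around it.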
Setting aside the introduction of PageRank and AdWords, which I think are insignificant here, let's take a look at Google's timeline too:
2011: Google introduces the Panda algorithm, which penalises websites with low-quality content.
2012: Penguin algorithm, penalising spam and manipulative link-building.
2013: A key point: Google introduces the Hummingbird algorithm, designed to 'better understand the meaning of search queries', in order to increase relevance to what the user is really looking for: not what they asked for or wrote, but what they supposedly want deep down without knowing it.
2015: RankBrain, a machine-learning system used to help rank search results.
2019: BERT, natural language processing (AI) that again helps interpret the 'meaning' of search queries.
2020: A raw engineering and data change introducing Core Web Vitals, essentially metrics on how fast and stable a page is to load and interact with.
Now, what happened over this time? Almost universally, users of Google will tell you that search results have got worse: less relevant, less useful, harder to find what you want. Many have taken to just putting 'reddit' before or after a search query to get to actual answers on a topic, rather than a whole page of the same useless SEO-chasing spam that Google's algorithms have decided is best for the user, with no input from the user themselves.
Where is the user autonomy in these tools after these changes? It's not there. Google is better than some, offering search operators like filetype restrictions and exact string matches, but your power over what it does is still massively lacking.
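Those operators are about the only levers left to a user. A few of Google's documented ones, for illustration:

```text
"error code 0x80070057"          quotes force an exact-phrase match
site:reddit.com docker compose   restrict results to one site
filetype:pdf annual report       only return PDF files
jaguar -car                      minus excludes a term
```

Note that even these are hints to the ranking system, not commands; Google reserves the right to ignore or reinterpret them.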
Google uses the same obscurity in its changes as Spotify, and is arguably the worst offender when it comes to hiding results behind the 'machine learning' or 'AI' excuse: even they don't know how it works, so its useless responses aren't their fault. This came up publicly with the outcry over YouTube recommending inappropriate and violent content in so-called 'rabbit holes', where the algorithm would continuously recommend a user the same content.
“The major finding is, some of those feelings people had expressed of not being in control — that was actually borne out in the data. By and large, a lot of unwanted videos do slip through.”
Forbes
Still, through these controversies, YouTube and Google constantly insist that they don't know how things work, and that they're going to make changes to improve things, with no explanation of what those changes actually are and no ability for users to see or alter them.
In 2019, during the heat of this backlash, YouTube published an article, 'Continuing our work to improve recommendations on YouTube', in which they say:
“we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways”
There's no information on what this actually means, how it will be done, or what effect it could have on other content accidentally caught up in it. They do say:
“As always, people can still access all videos that comply with our Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results”
'When relevant': what does that mean? Do I get any information or, god forbid, any say in what that is? You're not telling me I'll be able to find videos via search; you're telling me that even your search algorithm, the one thing that supposedly allows access to all this content, is also curated. It too is restricted to what YouTube thinks I want to see, not what I've instructed it to do or what I actually want.
OpenAI’s approach with ChatGPT has destroyed this entire notion, and thrown Silicon Valley into disarray.
In late November 2022, ChatGPT was released. It's very clearly AI-centred as a piece of software; it is literally an AI chatbot. It blew up, everyone is using it, and it became the fastest-growing consumer application ever released. If we knew nothing about it and went off how big tech generally treats AI, how would we expect it to work?
If product managers at Google or Spotify had got their hands on it, my view is that it would've been a completely different story. You can already see this with Google's late entry to the party, despite their claims that they've been way ahead on AI for years. Bard, their chatbot, doesn't even allow you to generate code yet. Not only that, you have to apply to be able to use it. It's purposefully restricted. It's much less responsive to requests to alter its responses; it doesn't want to do what someone has decided it shouldn't. Once again, someone has decided that they, or an algorithm they've written, know better than users what users want, and this is in a piece of software released to fight ChatGPT.
In contrast, ChatGPT does whatever you ask, with the exception of frantic and overboard efforts to prevent it saying anything inappropriate or offensive. Are users overwhelmed by this? Is it too much for them? Do they tune out and close the software? No. Of course they don't. Because, as has been clear to anyone who's actually used a piece of software, giving power to users, when properly considered, is much better than taking it away. Closing up the box of potential and possibility limits the people who have true inspiration to create something great. They're motivated. They'll do whatever it takes; just give them the tools they need.
This is a true wake-up call, and the companies that don't take notice will fail. The age of making choices for users in general, and especially around tools as powerful as AI, must end. Elon Musk, for all his faults and mistakes with Twitter, has already vowed to make the algorithm public and to introduce a recommendation-algorithm marketplace. My view is that this is going to be more successful than anyone could've imagined. The people looking for powerful search and recommendation tools are not the researchers and competitors you think they are; they are you and me. Everyday users. Everyone wants to be a power user at heart: let them. Your software is a tool. It's not an experience, it's not a show, it's a tool.