Issue 12, Volume 18
My pitch for why we need a better dialogue about data rights and algorithm regulation.
Netflix’s new documentary ‘The Social Dilemma’ is a critique of the immense power Facebook and Google (“the platforms”) hold over the lives of their users, and it has generated a polarised public response. In the film’s wake, many have deleted their social media accounts or changed their notification settings to reduce the distraction the platforms present. Others, including Facebook itself, have labelled the film ‘sensationalist’, ‘distorted’ and lacking a nuanced understanding of its technology. Facebook has also pointed to its self-regulatory efforts to curb the issues presented in ‘The Social Dilemma’.
Personally, I think deleting your social media account is an overreaction. The benefits of these platforms are enormous, especially during Covid, and account deletion is not what the film advocates anyway. However, I’ve been surprised at the level of apathy, and even resistance to accepting many of the issues raised by ‘The Social Dilemma’, amongst friends who have seen it. I get the sense some consider it akin to a far-fetched Black Mirror episode, or perhaps a new genre of barely-realist horror doco – I’ll leave the naming to you.
Granted, the film is dramatic. But I wouldn’t mistake a touch of artistic licence for a lack of substance. If you haven’t seen it, ‘The Social Dilemma’ powerfully illustrates how the harvesting of our data leads to addictive platform use, manipulative sales techniques and the dissemination of hyperbole and misinformation rather than balanced journalism (surely I’m not the only one triggered by that fa** news phrase). Closer to home, just last year the ACCC released a sobering report with 26 recommendations to government in response to many of these issues facing consumers – the vast majority of which have not been adopted.
Why, then, when talking with friends who have seen ‘The Social Dilemma’, do I often feel like a much less cool and charismatic Greta Thunberg speaking to world leaders about the urgency of reducing greenhouse gas emissions to mitigate climate change?
What I’ve found is that, without close attention, it can be quite difficult to put your finger on exactly how Google and Facebook differ from earlier technologies and their failings – television and its adverts, or the political and confirmation biases of print media.
A common reason for underweighting the significance of the film is a seeming misapprehension of the quantity of data being extracted from platform users. Of course, this extraction is the quid pro quo of the advertising model. Per the film: “if you’re not paying for the product, you are the product”.
The ACCC’s Digital Platforms Inquiry found that consumers seriously underestimate the amount of data being mined from their internet usage. And when they are made fully aware, individuals very often feel violated. Moreover, the superior bargaining power of the platforms means privacy protections are presented on an unequal playing field: bundled consents, take-it-or-leave-it terms and generally illusory protections, despite Facebook’s claims to have “strong privacy protections in place”. Nothing illustrates this more than the Cambridge Analytica scandal surrounding the 2016 US election, in which the company boasted of holding over 5,000 data points on most American voters – data that was used for personalised messaging prior to the election.
Of course, data alone is harmless; it is what can be done with it that is chilling. This seems to be another point of misconception. While most people I’ve spoken to understand that the risk here isn’t a Big Brother-esque engineer in Silicon Valley sitting behind a computer tracking your questionable affinity for obscure subreddits, there does seem to be an underestimation of the predictive – and thus manipulative – power of AI as a tool for influencing users’ consumer decisions and political persuasions. As a friend put it to me, “what’s wrong with getting ads for fishing rods and travel destinations instead of something irrelevant to me, like the baby formula that keeps cropping up on my news feed?”.
Ostensibly, this is a fair question. However, this conception of big-data marketing as a list of recommendations – as merely a presentation of choices – misses what is, in my opinion, the most insidious aspect of platform marketing, and I think the point ‘The Social Dilemma’ tries to make. Marketing does not target only the moments just prior to a purchase decision; any Marketing 101 student will tell you the consumer is susceptible at earlier points in the purchasing process. These earlier stages come before you start weighing up a purchase and include need recognition and perhaps even need creation. Here, consumers are rarely aware of the marketing efforts directed at them. Combine the increasingly powerful capabilities of machine learning and predictive analytics with subliminal messaging at these early stages and you’re left with a pretty bleak outlook for the future of the consumer and, by extension, democratic autonomy.
To use an overly simplistic example, there is no reason why algorithms programmed to maximise the ultimate purchase are not learning which buttons to push for an individual (based on data collected about that person’s mood, location, political orientation…) to effect a purchase decision two, three or ten years down the track. Here I say: watch out mate, that baby formula may not be a mistake.
A key point in ‘The Social Dilemma’ is that what the advertising model really bargains away is not your data in and of itself; it’s your decisional autonomy. And I’m sure you’d agree that this crosses the line from marketing persuasion into marketing manipulation. Alarmingly, the same principles apply in a political-democratic setting.
‘The Social Dilemma’ also illustrates how hyperbole and misinformation travel six times faster on the platforms than more deliberative, more balanced journalism. While it may not be good for our perceptions of the world, humans can’t seem to stop looking at the metaphorical car accident on the side of the highway. Given that 70% of videos watched on YouTube come from recommendations – which in turn are driven by algorithms rewarding the videos that attract the most attention and curiosity – it’s not hard to see how a growing international group of people believe Donald Trump is the messiah saving the world from a cabal of satanic paedophiles. You need to get out from under your rock if you’re yet to discover the joys of QAnon.
At the risk of sounding too much like a doomsayer, I’ll get off my virtual milk crate on the corner of Flinders and Swanston and cut myself off there.
Clearly, we need a broader discussion about the kinds of data we are willing to forgo to the tech giants. As Jaron Lanier puts it well in the film, there’s something wrong about needing a third party to pay to manipulate us just so we can communicate with each other.
At least from the slice of reality the algorithms are feeding me, it seems we need more dialogue about how we regulate the tech giants. Some really fascinating ideas have been proposed, including data rights, algorithmic moderators, independent misinformation watchdogs and even a move away from the advertising business model entirely.
What does it look like on your end?
Sam Lucas is a first year JD student.
The Social Dilemma, Netflix (2020)