THE ENDGAME OF THE SILICON VALLEY BUSINESS MODEL

“YOU DON’T HAVE TO BE ABUSED TO USE THESE SERVICES”

DAVID CARROLL

David Carroll won the battle, but he still hasn’t been able to get his data back – a paradox that points to the power structure of the digital economy. “It is very similar to climate change – people feel an individualised responsibility to fix something that is a collective action problem. The harm occurs collectively, not individually – that is one of the problems with the way my story has over-emphasised the individual at the expense of articulating the collective harm. It over-complicates the duty of the individual to solve these problems – as though your consumer choices are going to save the world. No! As an individual, you cannot do that. It is the same with privacy and data abuse and children and digital media: you, as a parent, can do what you can for your kid, but you can’t solve these problems within your family.”


The Silicon Valley-driven culture shift

One example of this is how Silicon Valley has had a major impact on global social, political and cultural shifts – changing not only our democratic concepts of political systems but also behaviours, beliefs, and even dreams. Carroll mentions how young people increasingly see becoming an influencer as a career path, even the most desirable career path. “However, the issue is that you essentially become an advertiser. It really is the terrifying endgame of the Silicon Valley business model – that everyone becomes an advertiser, selling themselves.” He quotes Douglas Rushkoff (“Generation Like”), who predicted that young people would no longer have any concept of being “a sell-out”, and that the vision that everything is for sale – and whatever can be monetised should be monetised – would prevail. “As soon as you get to the point when a generation grows up and can’t even understand the idea of being a sell-out – then we’ve really seen the culture shift. And that is what worries me so much about the different threads of parental anxiety, whether it be that the kids are just spending too much time on screens, or that they’re falling down disinformation rabbit holes, or that they are adopting this idea that they have to monetise their lives in order to succeed in society. That one is the scariest one to me. And that is the one that the platforms and the UXes and the business models engender in the most insidious way. The idea that children are the greenfield in the market fits perfectly with the marketer mindset.”

The cult of innovation

In recent years, Carroll has been preoccupied with the clash between national and transnational structures and cultures within technological development: “In the US, the understanding of privacy comes from private property and the so-called ‘right to be left alone’. But also from the anti-regulatory political machine, a sort of cult of innovation – all of these things are in play to explain why we don’t have a privacy right and regime in the US. It comes from a different kind of philosophy, especially compared to the EU, which has the right to data protection in its charter and which comes from a different understanding of privacy as an expression of human dignity rather than the right to be left alone. And if we look at how the US has passed very narrow, specifically-tailored privacy laws compared to the EU’s horizontal, general and broad ones – that derives from the EU’s charter and the reality that the EU is a relatively new political architecture. It reflects the realisation, 25 years ago, that data protection needed to be a fundamental human right – whereas that hasn’t dawned on Americans yet. Part of it is just that we have taken a flawed approach to regulation.”

In his research and teaching at The New School in New York, Carroll is interested in the philosophical and cultural dimensions of how national legislators handle the transnational tech industry’s approach to, for example, privacy: “I co-teach it with a faculty colleague, Melanie Crean, who is herself an artist with a social practice. I sort of take it from the industry and policy side and she takes it from a philosophical and critical, art-practice side. We’ve organised the class around four dominant global models, so we look at the US, the EU, India and China – each has its own philosophy of privacy and relationship to the state and the firm and civil rights, human rights, etc. And how the technologies are implementing those national or transnational frameworks. It helps us navigate why things are the way they are – so, in the case of the EU contrasted against the US, it provides a rich place to figure out the tension in Brussels being the default regulator of Silicon Valley.”

The territoriality of data

For Carroll, China represents a third model – one demarcated by a separate Internet and an industry that exists in parallel to the rest of the world but is increasingly connected to it via popular platforms, such as TikTok, and products, such as Huawei’s. Despite this isolation, Chinese tech companies have managed to occupy a powerful global position. India, for instance, has long tried to limit Chinese influence, and in June 2020, the government banned 59 Chinese-made mobile apps. In recent years, India has generally focused on data protection and, as early as 2000, implemented its Information Technology Act, a law comparable in intent to the European Union’s General Data Protection Regulation: “India is interesting with the adoption of the Information Technology Act and the problems that it has caused – even with its good intentions, and with interesting supreme court decisions there about the right to privacy.” Carroll explains: “So, each place gives us a model to understand and to show the existing frameworks that are dominant and how they compete against each other – what the rivalries are between their approaches and how they influence each other. They are not totally separate; they coexist and influence each other.”

These four models signify different cultural ideas: “In the US, the idea is that a university is a privacy-regulated industry because of the Family Educational Rights and Privacy Act (FERPA), which protects student privacy in a very particular sort of way. The same when you go to your doctor or the hospital – you have privacy rights. And the COPPA law gives minors under 13 privacy rights. So, these are very narrow situations in which the US is protecting privacy, and that contrasts with the idea of data protection. It’s an expression of the political and cultural values of not wanting to overregulate – of being so scared of overregulating that we create these very narrow regulations. They are protective in certain contexts and they do have positive effects on society, but we do not deal with the problem of data ignoring these boundaries – it ignores the boundaries of situations, of geography, etc.”

In this case, the Cambridge Analytica story is really useful because it is a narrative that expresses the boundary between the US’s and the EU’s philosophies. “It is an example of the territoriality of data – that is, where data is processed matters to the jurisdiction. And it also shows how, because of that territoriality, US citizens ended up with a right that they wouldn’t normally have – so, it is a weird way of showing our own failure to protect ourselves in ways that our allies have.”

You don’t have to be abused to use these services

According to Carroll, whether it is the California privacy laws shaking things up or further enforcement of the GDPR, it is going to have a positive effect on reducing data collection “and also inform people of their rights – and sort of change the attitude that ... you don’t have to be abused to use these services. The whole premise of that is: we abuse you and you get to use these things. And we are seeing a recalibration of that in positive ways. So, in terms of that, I’m feeling optimistic. Especially against the insidious nature of the business model – things are happening fast. I think, in a way, the Cambridge Analytica story is a narrative that sits on top of these other developments – they are disconnected but they are showing influence. I feel optimistic, seeing these things happening.”

Data is known as the new oil, and, according to Carroll, the hunt for data is gradually permeating most industries: “Whereas before the attention economy was subservient to other economies, we have crossed the threshold at which the only economy that matters is the attention economy. That is part of what drives people to see it as a career path – society has privileged this kind of merchandising over other kinds. And all the other kinds are at its mercy – meaning all business is conducted through attention merchandising. So, it’s hard to unravel, but it is the idea of marketising everything – that everything is an algorithmically-driven marketplace, driven on the frontend by UX and on the backend by data pools and algorithms doing high-speed trading. It is really financialisation – attention is on the top level, and below that is financialisation. So, it’s basically the Wall Street model, pervasive throughout Silicon Valley.”

Awareness of code as a mediating force

The awareness of how academia must engage with the business world, and how the business world needs to be challenged by academia, is a paradox that has caught Carroll’s attention. “I think it is actually a central problem in academia’s approach to new technology, such as Big Data and AI, that the different research fields exist in research bubbles, competing for funding instead of looking for multidisciplinary approaches. Because you cannot isolate the issues that we are now dealing with – you need to look at them from a more transdisciplinary perspective,” David Carroll states. “And, I think, coming from an art and design institution, the positionality of the designer in particular – this 360-degree view – matters: the designer is a transdisciplinary character in the play of business, and so we are attuned to seeing things from different perspectives. Being able to see how different disciplines interconnect to make sense of it all is what it means to be a designer. In particular with tech, and the sort of ethical and political issues around tech, it’s the role of the designer as the person who is building these infrastructures.”

However, the designer’s perspective is not sufficient in a tech industry that does not yet have a moral code. Many design choices are made without an ethical framework on which to base them. According to David Carroll, a crucial question is: “how can designers become aware of what they are doing – because some of the manipulation is right on the surface; it is right in the UI (User Interface)”.

He describes the methods he used in revealing the Cambridge Analytica scandal as art-based critical thinking within the UX design process: “Art is about asking better questions and design is about trying to answer those – and that is a good argument for why they need to coexist. In many ways, the artworks are being critical through the questioning that they put forward, and on the design side, we try to imbue the work with criticality as well and to critique solutionism as an impulse. It’s like a reflex of questioning the assumptions that the designer puts into things, which may not prove out. So, a lot of it is about having the designer approach a problem with the right kind of positionality, to avoid some of the mistakes that would probably be made otherwise, especially with engineering solutionism – we try to have designers who can do what an engineer can do but don’t take the approach of the engineer. The engineering school is about ‘how do I figure this out?’ – whether or not it should be done is irrelevant. The designer is more inquisitive about the implications of why things should be a certain way. So, we try to encourage that. That’s how we can create more mindful outcomes – but it’s not easy.”

He talks about the 1000 Cell Phones project, which exposes the invisible conversation that constantly occurs between the networked devices we carry throughout our nomadic urban daily lives. As Carroll puts it: “Making the invisible visible is what art can do better than most things. As software designers, we need to understand the tools we use. You have to understand how code is a mediating force – the critical and ethical perspectives are always built into the curriculum and pedagogy of the programme – so, it is out of that foundation that we understand many of the art projects, art practices and our form of research: researching and interrogating the ideas in ways that traditional research can’t achieve. And, in some ways, it makes them more accessible through narratives, even if they are critical design narratives. It helps us unpack some of the challenging aspects of the technology that are difficult to illuminate otherwise. So, it’s very common for most of the students’ works to end up being like this. And because the students are engaged in both computation and digital fabrication, they naturally see everything as raw material, so it does help. The small amount of exhibited artwork that I’ve done has followed the same kind of idea – how do you take an infrastructure of technology and make an art piece out of it, to then illuminate the issues around it?”

This interview has been lightly edited for clarity and length.

Screenshot and trailer from the Netflix documentary The Great Hack.
