Disrupting the Disruptors: Technology, Politics, and Back-End Morality
Freddy Foks
Eric Giannella recently argued in the BJS that Silicon Valley’s faith in progress has led to an ‘amorality problem’: We are being sold an overly simplistic world of rational progress. But a more fundamental issue is at stake: we don’t know what we are being sold at all.
In a recent article in the BJS, Eric Giannella argued that Silicon Valley’s “amorality problem arises from the blind faith many place in progress.” However, the problem with Silicon Valley is not just that we are being sold an overly simplistic world of rational progress; a much more fundamental issue is that we don’t know what we are being sold at all. If we’re going to enrich our moral intuitions, then we’ll have to get a better sense of what those intuitions might be about in the first place. We will have to look behind the scenes, at the back end of things.
Different kinds of ‘good’
Giannella’s argument goes something like this: the word ‘good’ can mean different things in different contexts. If I were going to buy some piece of tech—a new rifle, let’s say—I would be buying it to meet some need. Imagine that I’ve chosen this rifle because I reckon that it’s a better gun than the one I currently own. I have based this judgment on a couple of its attributes: its smaller size and its higher rate of fire. These qualities represent ‘progress’ on a number of levels. Yet this would not mean that selling it to me would be ‘a good thing.’ Giannella applies this kind of argument to technology in general and effectively punctures the complacency of a discourse that equates ‘moral good’ with progress understood simply as increased efficiency. The gun’s uses, intended and unintended, raise a host of moral dilemmas. What is efficiency for and who benefits: whose progress and which rationality?
A similar case was made by the eighteenth-century economist and moral philosopher Adam Smith when he wrote, “it seems impossible that the approbation of virtue should be a sentiment of the same kind with that by which we approve of a convenient and well-contrived building; or that we should have no other reason for praising a man than that for which we commend a chest of drawers.” Recognizing the distinctions between ‘good guns,’ ‘good people,’ and ‘good policies’ doesn’t solve our moral quandaries. Giannella provides a nuanced analysis of these distinctions in relation to Silicon Valley. Placing an emphasis on cultivating our “moral intuitions,” he wants us to avoid the ‘short-circuiting’ of inevitably messy ethical discussions in order to make us think more seriously about the kind of social world we want to create. A similar point was made in the classic paper ‘Do Artifacts Have Politics?’ by Langdon Winner.[1] However, Winner’s and Giannella’s arguments share a problem rooted in their common view of ‘tech.’ The ‘artefacts’ of the new age are not simply tools. Silicon Valley sells information technology. Information can be used for many different things.
Our capacity for judging whether a thing is good or not clearly depends, in part, on how we think the thing is going to be used and what its human consequences will be. On this point Giannella’s argument is useful. But Silicon Valley’s “amorality problem” lies as much in its handling of information as in its talk of progress. How is it that we have so little information about our information economy? Who holds this information, what is it, and what is it used for? Giannella inadvertently points to the importance of these questions in the examples of successful regulation he cites. Brian Mayer was lambasted for his app once people realized what it was doing. The company 23andMe was censured by the FDA, and ordered to stop marketing its health-screening service in the United States, once the consequences of its services were realized. Both cases demonstrate that critiques of technology can be leveled once information is available about what the service is and when we can begin to see how it might be used. Knowing what a technology will be used for is premised on the ability to know, or guess, what the technology is. Giannella cites examples of genetic-screening services. Langdon Winner discussed tomato-picking machines and highway overpasses. Yet if we keep treating connective media as ontologically identical to non-connective, or minimally connective, commodities, we will continue to have poor grounds for moral reasoning, which no deflation of ‘progress’ discourse can begin to remedy.
Marx’s cherry tree, or two kinds of moral reasoning
On this count, I want to distinguish between two kinds of moral reasoning. I call these two modes of enquiry the moral front end and the moral back end. Attending to both front and back ends might give us a better way to grapple with Silicon Valley’s amorality problem. We engage with the moral front end of a thing or activity when we discuss the morality of its uses on the basis of immediately available information. The moral back end is only available once we have additional information about what something is, how it came into being, and what might be hidden within it. Both front and back ends make moral intuitions available.
This distinction is similar to a point made by Karl Marx in The German Ideology. Marx’s criticism of Ludwig Feuerbach’s epistemology concerned what we might today call ‘social ontology.’ In a key passage he raises the question of what philosophical quandaries might be generated by perceiving a cherry tree in the garden. To Feuerbach, cast as a naive empiricist, problems of perception may be posed in the language of ‘essences’ and ‘representations.’ The empiricist may also be concerned about the tree’s uses: how we might chop it up, distribute it, or make something from it. The moral dilemmas here are all about surfaces and uses.
To Marx, these epistemological questions are only partly useful. He wants us to focus on history and economics: “The cherry-tree, like almost all fruit-trees, was, as is well known, only a few centuries ago transplanted by commerce into our zone, and therefore only by this action of a definite society in a definite age it has become ‘sensuous certainty’.” Knowing what something is involves knowing how it came into being. To be sure, we can discuss how to use the tree’s trunk and branches after cutting it down without being able to answer why it came to be planted in the back garden in the first place. Something can be available for discussion without our having full knowledge of what it is. However, Marx’s point suggests that knowing about something’s human origins, even something as seemingly ‘natural’ as a cherry tree, makes better discussion and richer moral intuitions available.
A different example might bring this home more clearly. Let’s say that I want to buy a new pair of sneakers. There seem to be no moral front-end problems here, unlike in the case of the rifle above. I will use the sneakers to play sports, look great, and appear fashionable. However, a different kind of moral claim may be lodged against the shoes. I am about to buy the pair when a friend stops me and says that the firm, whose trademark is swooshed along the side of the shoe, is well known for employing desperately poor workers in near-sweatshop conditions. What I thought I was buying was a pair of shoes to go jogging in, but what I seem to be buying now is a part in the exploitation of workers. My response to the sudden appearance of this moral back end can go a number of ways: I could choose to buy another pair of sneakers, I could organize boycotts of products made in sweatshops, or I could argue that buying the shoes means that the workers gain some share in their value as a result of trickle-down economics. Whatever. The point is that my friend’s intervention enabled me to see, as with Marx’s cherry tree, that what I thought of as a clear decision about objects and utility—running comfortably, looking good, showing off—involved a significant and hitherto hidden moral hazard. More information allows us to generate richer moral intuitions. With this in mind, what kind of answers might be generated by the questions: ‘What is an iPhone?’ or ‘What is a social network?’
The distinction between moral front ends and moral back ends is the difference between imagination and knowledge, between surfaces and origins. An immediate criticism can be lodged against my division between front and back ends. There is, strictly, no clear-cut distinction here that holds in all cases. For instance, I may be eminently well informed on a specific subject, commodity, or process. In this case there may be an open door between front-end and back-end morality, with very little distinction drawn between how something comes into being and the large number of immediately available imaginative moral situations that a technology, product, or policy might present to me for comment and debate. If I am an expert on a thing—its production, exchange, and uses—there may be a large range of possible moral intuitions I can immediately experience without recourse to additional information. Marx may have had much to say about the cherry tree in his garden without needing to be told anything about the economic history of southern Germany—he was an expert. His moral intuitions had been sufficiently cultivated by a certain way of thinking about the world to recognize in the object of analysis a number of moral quandaries relating to proletarianization, labor, capital, and politics. On the other hand, I may be exceptionally imaginative and able to construct multiple possible alternate social worlds on the basis of a small smattering of facts and a great deal of ethical creativity—I may be a science fiction writer.
In both limit cases, however—the expert and the creative savant—attending to back-end morality is part of the process: a frame of mind which we know from experience can lead to many new moral intuitions. There is always the possibility that a seemingly clear ethical situation may be concealing a back door, hidden until new knowledge emerges or is introduced into the public domain. Then more back ends—hitherto unseen and unimagined—can be delineated and brought to light to provoke new moral intuitions. This was Marx’s critique of Feuerbach’s epistemology. This is why Edward Snowden felt compelled to reveal the structure of America’s national security state. We need an account of social forces, intentions, and economic processes before we can understand an artifact like a cherry tree. When information about what a thing might be and how it came into being is not readily available, and even more so when that information is deliberately made unavailable, the passage between front- and back-end reasoning becomes more closed off and the architecture of moral reasoning more baroque. If we attend only to surfaces we might be left with shallow discussions.
Silicon Valley, technology and back-end morality
Anyone who has done a job search in tech, or who talks to friends in the sector, will have heard of roles like: “software engineer for new youth-focused website. Front-end new media, back-end data capture, market metrics.” Front-end disruption, back-end consolidation. The Janus face of new communications technology has long been recognized. In an essay published in 1953, the sociologist Karl Mannheim argued that communications networks—roads, mail systems, and printing presses, for example—have an ineluctably centripetal effect, drawing power in toward the center. The feudal state, Mannheim thought, was so limited and dispersed, despite the violence and megalomaniac intent of its ruling class, because of the terrible roads, slow water transport, and impoverished communications technologies of the Middle Ages. Rule is only exercised over constituencies that can be brought into being through some kind of representative and communicative function.
Power, Mannheim wrote, has an inherent “tendenc[y] towards concentration.”[2] Twentieth-century worriers about telephone and radio and television knew that the increase of easy leisure meant the dispersal of a certain kind of political communication on the soapbox and in the public hall. What will it mean for our democracies when advertising companies own our voting records; when where we go, what we read, and who we meet are the property of bodies with opaque interests, or interests as banal as trying to sell us new televisions and telephones and radios?
For the generation of values we rely on separate spheres of exchange and labor, of work and leisure. For the protection of our liberties we rely on hard-fought distinctions between power, politics, and money. How will the recurrent scandals that beset public institutions corrupted by profit look in contrast to the quickly arriving world in which profit-seeking organizations, aided and abetted by the most coercive arms of state power, capture the grounds of potential for democratic citizenship itself? Is this the new dawn of the age of Silicon Valley? Is this its “amorality problem”?
Giannella’s article presumes that ‘tech’ is something obvious, something that has multiple uses, and that we simply have to know how to use it to build a better world. But an iPhone is not a tool in the way that a pair of shoes or a gun is a tool. Its functions are dispersed beyond the uses to which its users put it. Its uses extend beyond its surface. While Marx pointed out that the cherry tree was, ontologically, minimally and historically connective—connecting constituencies of traders and workers in historical relations—information technology is maximally and simultaneously connective—connecting multiple constituencies at the same time.[3] For instance, I may use my phone to make calls to friends, but the phone’s creators may simultaneously be using my phone calls to build a picture of my patterns of consumption, or of my sexual or political persuasions—conceived as patterns of consumption—or for any number of other uses. This state of affairs is itself the result of the concretization of layers of various, sometimes contradictory, historical and political conjunctures: the laying of undersea cables, orbiting satellites, and engagements in grand strategy. Individual items of ‘technology’ nest in webs of connective tissue. A smartphone has a multitude of moral back ends.
To pull back from the global to the local, a particularly jarring example of contemporary ontological mystification is the iPhone flashlight app developed by a company called iHandy. The front end was all about utility for the user. The back end was all about utility for the data gatherers. What looked like a helpful tool with an obvious function for the end user—a flashlight on your phone—was also a helpful tool with an utterly different function for the producer. There was a firewall between these uses and ends, or, to put it better perhaps, a one-way mirror. The app’s designers could see through the back end into the front, but the users could not. Only with additional information and journalistic reporting could users make informed choices about the app. When tools like flashlights turn out to have surveillance technology hidden inside them, we should recognize that we have entered a strange new age, an age characterized by almost limitless information stored out of sight.
We are beginning to get a better sense of the politics and economics of Silicon Valley. What is becoming increasingly clear is that nothing is as it seems. This is because when we buy Silicon Valley’s goods we do not simply buy things, but services. In Michel Callon’s terms, they ‘translate’ between constituencies of agencies simultaneously.[4] This makes them different from Marx’s cherry tree, in which relations are crystallized in history. History is now and America—to adapt T.S. Eliot. The things we use, wear, and manipulate are connective devices. They are not widgets. They are not like other machines. They are multimedia nodes, whose uses we do not own, masquerading as useful things we are in control of. Once embedded in the network, a phone or computer’s functions are never wholly ours. This means that we are, at least minimally, enmeshed in the intentions, purposes, and motives of a set of opaque agencies. It seems unlikely that ‘progress’ is the right way to describe this world—it might be, but it might not.
The indeterminacy of our media age means that the back ends of many of the things we confront are closed and locked and unavailable for discussion. In fact, we have very little idea about what kinds of back ends might exist at all. How are moral intuitions meant to be provoked in this case? When confronting these new technologies we are not in clear possession of our ethical faculties.
Criticizing the discourse of ‘progress’ is a helpful start, but understanding why we could ever have come to believe such a description of the social reality of ‘tech’ is surely the real problem.
As Evgeny Morozov has recently pointed out, many works of tech criticism, Giannella’s included, contain their own idea of historical development—a story not of progress but of decline. In Giannella’s case it is Max Weber’s ideas about modernity and bureaucratization, and the Frankfurt School’s notions of Enlightenment and decay, that he uses to argue against the slick techno-speak of Silicon Valley. The role of the expert, the Silicon Valley venture capitalist or the critical social scientist, is to understand these historical processes and exploit them for rhetorical capital. Both rely on a meta-history, to use Hayden White’s term, that presumes some process or rationality at work in history itself. Yet if we really face a world of endless back ends, neither Max Weber nor Silicon Valley’s brand of apolitical libertarian pap will help us much. We see a cherry tree in the garden and have no account of how or why it got there. As the political theorists Quentin Skinner and Philip Pettit have persuasively argued, there is a name for the political class that is beholden to such hidden, arbitrary power: slaves.[5]
Political ontologies
To back-pedal a bit from the dire conclusion of the previous section, Pettit and Skinner’s republican theory of liberty concerns the arbitrary subjection of persons to sovereign rule. I may be arbitrarily subjected to a domineering parent or partner, and thus seemingly enslaved, but my political status as a free citizen trumps this state of affairs and renders my subjection illegal rather than a question of political ontology, i.e., the kind of citizen I am. Yet if the arbitrary opacity of intent begins to blur into the grounds of democratic citizenship itself—e.g., free association, the secret ballot, and free speech—then we may need to reconsider the basis of our political statuses.
The convergence of state power and individualized technology is not new. Legal instruments have always mediated these relationships. The spaces for political contestation, economic production, and communal distribution have always been sites of debate, liable to provoke moral sentiments—often anger—and have led to the contracting of multiple parties within the settlements of law and custom. Putting in place robust legislation for data protection was considered a commonsense response to the intuition that data are sensitive. The past few years have seen this moral intuition repeatedly overridden on the grounds that we should simply trust tech companies. Putting trust in power, in concentrations of power, and in the agencies of powerful bodies is antithetical to democratic, and certainly to liberal, citizenship. Liberals, historically speaking, have been intent on busting trusts.
There are signs that political communities are beginning to wake up to the reorganizations of liberty, law, and economy that are being swiftly secreted through the fiber-optic cables of the world. Recent discussions in the European Parliament about Google’s virtual monopoly on web search are a beginning. The British Liberal Democrats have proposed the creation of a digital bill of rights in their election manifesto. Despite this progress, we have yet to get a good conceptual grip on the production of the new media that seek to provide information, money, and jobs in relationships that subvert, minimize, or ignore traditional legal jurisdictions. In Langdon Winner’s terms, we are in a state of considerable technological “flexibility.” Yet it is not clear that biometric data, data-mining, and online voting have the same ontological status as the highways, casting-moulds, and agricultural machinery he discussed. It seems that the technology that is so flexible today is the technology of governance itself.
In his recent inaugural lecture “Political Theory and Real Politics in the Age of the Internet,” the Professor of Politics David Runciman said that we are waiting for the Thomas Hobbes of the Internet age. This is a rather big ask. It would be nice for our age to produce a thinker of Hobbes’s stature to grapple with the politics of the new total media age; in the meantime we need to begin to think of ways to legislate for, and disrupt, the disruptors so that, as Giannella urges, the new technology is bent to the people’s will rather than the people to the tech giants’.
A healthy dose of skepticism about ‘progress’ is a beginning. But it only gets us so far. Positing other meta-narratives about ‘modernity,’ ‘hyper-modernity,’ or ‘rationalization’ in response can only ever be partially helpful in understanding our world. Weber’s ‘Puritan Ethic’ may tell us something about the mentalité of tech-workers but it tells us little about their power. Replying to meta-historical accounts of rationalization or modernization, like Giannella’s, by arguing that we need to attend to back ends does not necessarily mean falling into fears of the Loch Ness monster “lurking beneath the surface,” as Andrea Denhoed put it last year in The New Yorker. Back ends can always be pried open. Unlocked and cleared out, however, they may prove to be hiding rather politically embarrassing piles of paperwork and mess; structures of power that might look imperial rather than national, economic rather than political, confused rather than rational.[6] Opening them up to inspection would require significant judicial or political will.
If Silicon Valley has an amorality problem it lies in its self-belief that it is inaugurating a politics of democracy while keeping the sinews of its power hidden from view behind empty words and the rhetoric of not being evil. This is not a problem of progress. Moral intuitions have become separated from the messy world of action in which moral practice must be brought to bear. The expert must adopt the role of my friend in the sportswear store. Rather than pose more meta-histories, we need more facts. Back ends need to be opened up so that we can begin informed public discussions. Whether we will then see connective technologies as things to be heavily regulated, like guns, or minimally regulated, like sneakers, can be debated. For the time being, the kinds of moral intuitions provoked by assessing the back ends of the Valley’s corporations are impossible to ascertain. We have so little information that we are left guessing. As Dave Eggers’s book sales suggest, this is good for the science fiction trade. It is not good for progressive politics.
References and Footnotes
- Winner, Langdon. 1980. “Do Artifacts Have Politics?” Daedalus 109(1):121–136.
- Mannheim, Karl. 1953. “Planned Society and the Problem of Human Personality.” In Essays on Sociology and Social Psychology, edited by Paul Kecskemeti. London: Routledge & Kegan Paul. P. 258.
- I am indebted here to the discussion of ‘representational media’ and ‘connective media’ in David Trotter’s Literature in the First Media Age (Harvard University Press, 2013), pp. 7–8.
- Callon, Michel. 1986. “Some Elements of a Sociology of Translation: Domestication of the Scallops and the Fishermen of St Brieuc Bay.” In Power, Action and Belief: A New Sociology of Knowledge?, edited by J. Law. Routledge.
- Pettit, Philip. 2002. “Keeping Republican Freedom Simple: On a Difference with Quentin Skinner.” Political Theory 30(3):339–356.
- For Edward Snowden’s own weighing of these claims (imperial, economic-cum-political, and confused), see Laura Poitras’s documentary Citizenfour (2014).