10/27/15

Finn Brunton and Helen Nissenbaum - To the toolkit of privacy-protecting techniques and projects, they propose adding obfuscation: the deliberate use of ambiguous, confusing, or misleading information to interfere with surveillance and data collection projects.


Finn Brunton and Helen Nissenbaum, Obfuscation: A User's Guide for Privacy and Protest, The MIT Press, 2015.


 
With Obfuscation, Finn Brunton and Helen Nissenbaum mean to start a revolution. They are calling us not to the barricades but to our computers, offering us ways to fight today's pervasive digital surveillance - the collection of our data by governments, corporations, advertisers, and hackers. To the toolkit of privacy-protecting techniques and projects, they propose adding obfuscation: the deliberate use of ambiguous, confusing, or misleading information to interfere with surveillance and data collection projects. Brunton and Nissenbaum provide tools and a rationale for evasion, noncompliance, refusal, even sabotage -- especially for average users, those of us not in a position to opt out or exert control over data about ourselves. Obfuscation will teach users to push back, software developers to keep their user data safe, and policy makers to gather data without misusing it.

Brunton and Nissenbaum present a guide to the forms and formats that obfuscation has taken and explain how to craft its implementation to suit the goal and the adversary. They describe a series of historical and contemporary examples, including radar chaff deployed by World War II pilots, Twitter bots that hobbled the social media strategy of popular protest movements, and software that can camouflage users' search queries and stymie online advertising. They go on to consider obfuscation in more general terms, discussing why obfuscation is necessary, whether it is justified, how it works, and how it can be integrated with other privacy practices and technologies.

By mapping out obfuscation tools, practices, and goals, Brunton and Nissenbaum provide a valuable framework for understanding how people seek to achieve privacy and control in a data-soaked world. This important book is essential for anyone trying to understand why people resist and challenge tech norms, including policymakers, engineers, and users of technology - danah boyd


Obfuscation is an intelligently written handbook for subversives. I found the historical examples fascinating and the ethical discussion thought-provoking. - Lorrie Faith Cranor


This book presents a fascinating collection of examples of decoys, camouflage, and information hiding from the human and animal worlds, with a discussion of how such techniques can be used in applications from privacy online through search optimization to propaganda and deception. It leads to discussion of informational justice, and the extent to which camouflage can perhaps help people hide in plain sight online. - Ross Anderson




When downloading an app or approving a software update, I now usually hesitate for a moment to consider something the comedian John Oliver said early this summer: a software company could include the entire text of Mein Kampf in the user agreement and people would still click the “agree” button.
“Hesitate” is the wrong word for something that happens in a fraction of a second. It’s not as if I ever scrolled back through to make sure that, say, Microsoft is not declaring that it owns the copyright to everything written in OneNote or Word. The fine print goes on for miles, and anyway, a user agreement is typically an all-or-nothing proposition. Clicking “agree” is less a matter of trust than of resignation.
But then, that’s true about far more of life in the contemporary digital surround than most of us would ever want to consider. Every time you buy something online, place a cell phone call, send or receive a text message or email, or use a search engine (to make the list no longer nor more embarrassing than that), it is with a likelihood, verging on certainty, that the activity has been logged somewhere -- with varying degrees of detail and in ways that might render the information traceable directly back to you. The motives for gathering this data are diverse; so are the companies and agencies making use of it. An online bookseller tracks sales of The Anarchist Cookbook in order to remind customers that they might also want a copy of The Minimanual of the Urban Guerrilla, while the National Security Agency will presumably track the purchase with an eye to making correlations of a different sort.
At some level we all know such things are happening, probably without thinking about it any more often than strictly necessary. Harder to grasp is the sheer quantity and variety of the data we generate throughout the day -- much of it trivial, but providing, in aggregate, an unusually detailed map of what we do, who we know and what’s on our minds. Some sites and applications have “privacy settings,” of course, which affect the totality of the digital environment about as much as a thermostat does the weather.
To be a full-fledged participant in 21st-century society means existing perpetually in a state of information asymmetry, in the sense described by Finn Brunton and Helen Nissenbaum in Obfuscation: A User’s Guide for Privacy and Protest (MIT Press). You don’t have to like it, but you do have to live with it. The authors (who teach media culture and communications at New York University, where they are assistant professor and professor, respectively) use the term “obfuscation” to identify various means of leveling the playing field, but first it’s necessary to get a handle on information asymmetry itself.
For one thing, it is distinct from the economic concept of asymmetrical information. The latter applies to “a situation in which one party in a transaction has more or superior information compared to another.” (So I find it explained on a number of websites ranging from the scholarly to the very sketchy indeed.) The informed party has an advantage, however temporary; the best the uninformed can do is to end up poorer but wiser.
By contrast, what Brunton and Nissenbaum call information asymmetry is something much more entrenched, persistent and particular to life in the era of Big Data. It occurs, they explain, “when data about us are collected in circumstances we may not understand, for purposes we may not understand, and are used in ways we may not understand.” It has an economic aspect, but the implications of information asymmetry are much broader.
“Our data will be shared, bought, sold, analyzed and applied, all of which will have consequences for our lives,” the authors write. “Will you get a loan, or an apartment, for which you applied? How much of an insurance risk or a credit risk are you? What guides the advertising you receive? How do so many companies and services know that you’re pregnant, or struggling with an addiction, or planning to change jobs? Why do different cohorts, different populations and different neighborhoods receive different allocations of resources? Are you going to be, as the sinister phrase of our current moment of data-driven antiterrorism has it, ‘on a list’?”
Furthermore (and here Brunton and Nissenbaum’s calm, sober manner can just barely keep things from looking like one of Philip K. Dick’s dystopian novels), we have no way to anticipate the possible future uses of the galaxies of personal data accumulating by the terabyte per millisecond. The recent series Mr. Robot imagined a hacker revolution in which all the information related to personal debt was encrypted so thoroughly that no creditor would ever have access to it again. Short of that happening, obfuscation may be the most practical response to an asymmetry that’s only bound to deepen with time.
A more appealing word for it will probably catch on at some point, but for now “obfuscation” names a range of techniques and principles created to make personal data harder to collect, less revealing and more difficult to analyze. The crudest forms involve deception -- providing false information when signing up with a social media site, for example. A more involved and prank-like approach would be to generate a flood of “personal” information, some of it true and some of it expressing one’s sense of humor, as with the guy who loaded up his Facebook profile with so many jobs, marriages, relocations, interests and so on that the interest-targeting algorithms must have had nervous breakdowns.
There are programs that will click through on every advertisement that appears as you browse a site (without, of course, bothering you with the details) or enter search engine terms on topics that you have no interest in, thereby clouding your real searches in a fog of irrelevancies.
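For the sake of concreteness, here is a minimal sketch of that query-noise idea in Python. It is not the code of any of the programs Brunton and Nissenbaum discuss; the search endpoint, the topic list, and the timing below are placeholder assumptions, meant only to show how little machinery the basic technique requires.

    # decoy_queries.py -- illustrative sketch of query obfuscation: periodically
    # issue searches on topics the user does not care about, so that genuine
    # queries are buried in automated noise. The endpoint and topics are
    # placeholders, not any real tool's configuration.

    import random
    import time
    import urllib.parse
    import urllib.request

    SEARCH_URL = "https://search.example.com/?q="   # placeholder search endpoint
    DECOY_TOPICS = [
        "vintage accordion repair", "19th century canal engineering",
        "competitive marble racing", "history of the semicolon",
        "llama grooming techniques", "regional lichen identification",
    ]

    def send_decoy_query():
        """Pick an irrelevant topic at random and issue it as a search request."""
        query = random.choice(DECOY_TOPICS)
        url = SEARCH_URL + urllib.parse.quote(query)
        try:
            urllib.request.urlopen(url, timeout=10).read()
        except OSError:
            pass  # decoy traffic is best-effort; a failed request costs nothing

    if __name__ == "__main__":
        while True:
            send_decoy_query()
            # randomize the interval so the noise looks less machine-generated
            time.sleep(random.uniform(30, 300))

Real tools of this kind are considerably more careful -- drawing decoy topics from live sources and pacing requests to resemble human browsing -- but the underlying move is the same: bury the signal in plausible noise.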
The cumulative effect would be to pollute the data enough to make tracking and scrutiny more difficult, if not impossible. Obfuscation raises a host of ethical and political issues (in fact the authors devote most of their book to encouraging potential obfuscators to think about them) as well as any number of questions about how effective the strategy might be. We’ll come back to this stimulating and possibly disruptive little volume in weeks to come, since the issues it engages appear in other new and recent titles. In the meantime, here is a link to an earlier column on a book by one of the co-authors that still strikes me as very interesting and, alas, all too pertinent. - Scott McLemee


Obfuscation: how leaving a trail of confusion can beat online surveillance


An interview with Obfuscation co-author Finn Brunton about ...

