Monthly Archives: February 2014


Attitudes towards “Spiny CACTOS”


It is one thing to ask people whether they want to control the appropriate flow of their disclosures (or of others’ disclosures about them) on Online Social Networks (OSNs); it is another to ask who they think should be responsible for ensuring the appropriate flow of this information. In the first part of a small study conducted last summer at CMU, which Ero Balsa will present next week at USec 2014, participants were asked both questions. The objective of the study was to find out whether users feel they should be responsible for taking extra measures to avoid the privacy problems that Cryptographic Access Control Tools for Online Social Networks (CACTOS) hope to mitigate: namely, problems resulting from the disclosure of all user data (including private messages) by default to the OSN provider, and from the delegation of privacy-setting enforcement to OSN providers. In other words, are the privacy concerns of CACTOS developers aligned with those of users, and, if so, who do users think should be responsible for mitigating these problems?


Somewhat unsurprisingly, the study participants said they want full control over determining the disclosure and appropriate flow of their public and private messages on OSNs, but that Facebook (the OSN used in the study) should share responsibility for making sure their privacy is respected. For example, despite identifying Facebook’s increasingly permissive privacy settings as a problem, the participants thought it was their own responsibility to configure their privacy settings correctly. However, they found it to be the responsibility of the OSN to make sure that those settings are effective. When it came to undesirable disclosures about them by other OSN users, some participants expected the OSN to ensure removal. And while they were aware that Facebook had given advertisers access to their profiles and had tracked their activities across the web, many participants agreed that it is the responsibility of the OSN to make sure that their disclosures do not suddenly pop up in the databases of third parties.

In the second part of the study, participants were offered a cryptographically powered tool called “Scramble!” that would allow its users to exercise “strict and highly granular controls” over who sees a disclosure on the OSN, and that prevents the OSN from accessing the content of the disclosure. The tool had the usual usability and complexity problems that security tools have, but all in all the participants thought it was useful and could provide them with the desired granularity of control. However, for a variety of reasons, they found it too cumbersome to use a CACTOS like Scramble! to ensure that their information flows appropriately. Even though the participants complained about unpredictable privacy settings and the OSN provider’s data usage practices, they thought that using a cryptographic tool to eliminate these problems was a disproportionate measure: such heavy-duty tools were deemed appropriate for “others” who have secrets. They also did not want to establish trust in yet another entity, in this case the CACTOS provider, and shied away from sharing their data with it, even though Scramble! would not be “seeing” their disclosures in clear text. Some even suggested that they would trust the tool if Facebook certified it. Many participants also agreed that if a disclosure could put them at risk, they could send it as a private message (which they assumed would be kept confidential by the OSN) or, most strikingly, they could just remain silent.
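For readers wondering what a CACTOS does under the hood, the sketch below illustrates the general idea: encrypt a post once with a content key, wrap that key for each intended recipient, and hand only the resulting opaque blob to the OSN. This is a minimal illustration in Python using the `cryptography` package; the function names and the choice of RSA-OAEP with AES-GCM are my assumptions for the example, not a description of Scramble!’s actual implementation.

```python
# A minimal sketch of the CACTOS idea (illustrative, NOT Scramble!'s
# actual design): hybrid encryption so that only chosen recipients,
# and not the OSN provider, can read a post.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_post(plaintext: bytes, recipient_public_keys):
    """Encrypt the post once, then wrap the content key per recipient."""
    content_key = AESGCM.generate_key(bit_length=128)
    nonce = os.urandom(12)
    ciphertext = AESGCM(content_key).encrypt(nonce, plaintext, None)
    wrapped_keys = [pk.encrypt(content_key, OAEP)
                    for pk in recipient_public_keys]
    # Only this blob is posted to the OSN: the provider stores it but
    # cannot recover the plaintext or the content key.
    return {"nonce": nonce, "ciphertext": ciphertext,
            "wrapped_keys": wrapped_keys}

def decrypt_post(blob, private_key, index):
    """A recipient unwraps their copy of the content key and decrypts."""
    content_key = private_key.decrypt(blob["wrapped_keys"][index], OAEP)
    return AESGCM(content_key).decrypt(blob["nonce"],
                                       blob["ciphertext"], None)

# Example: a post readable by exactly two friends.
friends = [rsa.generate_private_key(public_exponent=65537, key_size=2048)
           for _ in range(2)]
blob = encrypt_post(b"only for you two", [f.public_key() for f in friends])
assert decrypt_post(blob, friends[0], 0) == b"only for you two"
```

The design choice worth noting is that the content is encrypted only once, so adding a recipient costs one key-wrapping operation; it also means the OSN degrades into a storage and distribution channel for ciphertext, which is precisely the shift in responsibility the study probes.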


At this point it is reasonable to ask: why shift the focus of a study from usability to responsibility? The idea of the study was developed within the SPION project, where the responsibilization of users with respect to protecting their privacy is one of the main themes. The argument is that information systems that mediate communications in a way that also collects massive amounts of personal information may be prone to externalizing some of the associated risks onto the users. This can easily happen under the label of “privacy”, which can be leveraged to put the individual at the center of responsibility. Privacy protection can thus become a way of burdening users with the risks externalized by those systems, and an apparatus for disciplining them. The objective of the SPION project is therefore to critically assess the ways in which privacy technologies may intensify the responsibilization of OSN users, and to explore whether they can be designed to shift responsibilities back to those providing the OSN services.

This idea of “responsibilization” is borrowed from David Garland’s article titled “The Limits of the Sovereign State: Strategies of Crime Control in Contemporary Society” and was applied in the domain of “identity management systems” by David Barnard-Wills and Debi Ashenden in their article titled “Public Sector Engagement with Online Identity Management”. Responsibilization (a terrible word to pronounce) is a complex concept in studies of governmentality to which I can do no justice here, so I will stick to the basic definition above.

Researchers who write about the topic note that people may have various reactions to responsibilization, including rejecting it and pointing back to the institutions that caused the problem in the first place. For example, OSN privacy policies will often say that it is the responsibility of the users to avoid undesirable information flows, e.g., by watching over themselves and their privacy settings. Yet, as I mentioned earlier, OSNs often change the semantics of privacy settings, which makes this a very slippery responsibility to place on users’ shoulders. Nevertheless, the participants of our study seemed to have internalized that message: they mainly thought that they should be responsible for what they post and how they use the available functionality. Still, although our study was small and limited, the participants also appeared to push back on this configuration of responsibilities. Scramble! functioned here as an artefact through which they could imagine a different way of controlling information flows and express their needs: for most participants it was too cumbersome to make up for the unreliability of the OSN by enforcing appropriate information flows through Scramble!. Instead, echoing the first part of the study, they held that it should be the responsibility of the OSN to get privacy settings right and to not share their information with third parties.

There are many limitations to this study: it is small, and it concerns a single OSN and a single CACTOS. Further, if for many people “technology” is a scary thing, then “encryption” is likely to give them nightmares. Surely, mentioning that Scramble! was based on “encryption” primed the users in a certain way and influenced their responses. “Responsibility” itself is just as loaded a concept as privacy, control or encryption. Shanto Iyengar has shown in his paper titled “Framing Responsibility for Political Issues: The Case of Poverty” that framing has an important impact on whom people will see as responsible for political issues. Exactly how this may also apply to the framing of privacy and responsibilization is a great question for future research.

Finally, it seems that many participants of our study preferred to censor their speech or control their actions rather than deploy tools to protect their privacy. This is a troubling matter. Most computer science research on privacy seeks to provide techniques and tools that are expected to support users in their everyday negotiation of privacy, e.g., CACTOS, anonymous communication tools, adblockers, identity management systems or privacy nudges. As computer scientists, we may have become too comfortable with a world-view in which “privacy protecting machines” can protect users, or aid them in protecting themselves, from “privacy intrusive machines”. In doing so, we may have overestimated the part “the users” actually want to play in this challenging game between machines. I hope that, by providing some insight into users’ attitudes towards responsibilization in OSNs as well as towards CACTOS, this paper helps us think about where users may or may not want to enter this game.

Ero Balsa (KU Leuven), Laura Brandimarte (Carnegie Mellon University), Alessandro Acquisti (Carnegie Mellon University), Claudia Diaz (KU Leuven), and Seda Gürses (New York University), “Spiny CACTOS: OSN users attitudes and perceptions towards cryptographic access control tools”, USec ’14, San Diego.

Data Shadows: Anonymity and Digital Networks at apexart

Next week I will be participating in an evening event titled “Data Shadows: Anonymity and Digital Networks”, put together by Alexander Benenson as part of the public program accompanying the exhibition Private Matters, organized by Ceren Erdem, Jamie Schwartz and Lisa Hayes Williams. I plan to present the next episode of the series “A Failed Coup Attempt with Folk Songs”, of which I have found traces of part II, part III, and part V. The task now is to figure out which part number the episode at apexart will be, where I will have the pleasure of sharing the room with, among others, Finn Brunton and John Menick. Most of my talk will be a reflection on some of the thoughts in the article titled “The Spectre of Anonymity”, against the backdrop of the revelations about the NSA surveillance programs.

The event will take place on the 27th of February, 2014 at 7pm at apexart.

Symposium on Obfuscation

My first encounters with the concept of obfuscation go back to discussions that the privacy research group at COSIC/ESAT (KUL) had about TrackMeNot in 2012. Back then we were discussing the efficacy of the protections offered by TrackMeNot when faced with a “learning” machine. Little did I know that one day I would have close encounters with the creators of TrackMeNot: Helen Nissenbaum, Vincent Toubiana and Daniel Howe. All three will be at the Symposium on Obfuscation, which takes place next week at NYU. The line-up of speakers includes Susan Stryker, Nick Montfort, Laura Kurgan, Claudia Diaz, Günes Acar, Finn Brunton, Hanna Rose Shell and Joseph Turow, as well as Rachel Greenstadt representing her research group that developed “Anonymouth”, Daniel Howe, the creator of “Ad Nauseam”, and Rachel Law, the maker of “Vortex”. You can find out more about the event here.
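For readers unfamiliar with TrackMeNot, the toy sketch below illustrates the obfuscation idea it builds on: interleave a user’s genuine search queries with machine-generated decoys so that a profiler cannot easily tell which queries reflect real interests. Everything in the sketch (the decoy pool, the names, the mixing strategy) is an illustrative assumption, not TrackMeNot’s actual implementation.

```python
# A toy sketch of query obfuscation in the spirit of TrackMeNot
# (illustrative only, NOT its actual implementation).
import random

# Hypothetical pool of innocuous decoy queries.
DECOY_POOL = ["weather tomorrow", "pasta recipe", "train schedule",
              "movie times", "local news", "translate hello"]

def obfuscated_stream(real_queries, decoys_per_real=3):
    """Yield (kind, query) pairs, hiding each real query inside a
    shuffled batch of decoys so all emitted queries look alike."""
    for query in real_queries:
        batch = [("decoy", random.choice(DECOY_POOL))
                 for _ in range(decoys_per_real)]
        batch.append(("real", query))
        random.shuffle(batch)
        yield from batch

# Example: one genuine query hidden among three decoys.
for kind, q in obfuscated_stream(["symposium on obfuscation nyu"]):
    print(kind, q)
```

The weakness we debated back then is visible even in this sketch: if the decoy pool is small, or stylistically different from the user’s real queries, a classifier trained on enough traffic can filter the decoys back out, which is exactly the “learning machine” question.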

February 7, 2014. Category: news.