Great news: the Call for Papers for the fourth iteration of the International Workshop on Privacy Engineering (IWPE) is out! This year’s program seeks to highlight the challenges that the widespread adoption of machine learning and artificial intelligence technologies poses to privacy. One motivation for this focus stems from the goals and provisions of the European General Data Protection Regulation (GDPR), including the requirements for privacy and data protection by design, for providing notices and information about the logic of automated decision-making, and the emphasis on privacy management and accountability structures in organizations that process personal data. Interpreting and operationalizing these requirements for systems that employ machine learning and artificial intelligence technologies is a daunting task, and we hope to attract papers on the topic from researchers, civil society and industry.
This year we decided to co-locate IWPE with the European IEEE S&P, which will take place in London between the 24th and 26th of April. With this, we hope to establish a tradition in the coming years of moving the workshop between the US and the EU (for now).
Workshops are the product of all the dedicated researchers who agree to serve on our PC, as well as of the hard work of the organizers of the conferences where we co-locate our workshop. We are delighted to once again have a fantastic and interdisciplinary PC. Great effort also goes into establishing a new workshop. For this, I would like to thank Jose M. del Alamo (Universidad Politécnica de Madrid), who has done the heavy lifting of putting together our workshop for the last four years. Special thanks also go out to our current program co-chairs Anupam Datta (Carnegie Mellon University), Aleksandra Korolova (University of Southern California) and Deirdre K. Mulligan (UC Berkeley); our industry chair Nina Taft (Google); our mentoring and local chair Jose Such (King’s College London); and our publicity chair Arunesh Sinha (University of Michigan). We look forward to seeing you at IWPE’18.
It is hard to stay afloat when two people who have inspired you in life pass away within days of each other. I owe it to these two troublemakers to thank them for their great work and for the paths they have opened to many of us.
Today the news came that we lost Özgür Uçkan. Özgür was a digital rights activist, as well as a professor, philosopher, artist, economist, and one of the founding members of Alternatif Bilisim, an association based in Turkey working on digital rights and freedoms. I have had the fortune of meeting a number of polymaths in my life, but few of them sustained an equal passion for working with people and for their intellectual endeavors the way Özgür did. The picture below, from an anti-censorship protest in Istanbul, which Ismail Hakki Polat used in his eulogy, says it all.
Özgür, in the brown t-shirt, is standing tall and proud, and most probably having some good fun at the front line. Most importantly, he is surrounded by peers and by some of the many young people he inspired, many of whom continue to be part of the struggle for digital rights and freedoms in Turkey. Within a year of the time that picture was taken, the same networks would organize large protests that would come to attract 60,000 people in over 30 cities within and outside of Turkey. People have argued that this series of actions was among the stepping stones that led to the Gezi Park protests. After all, ruptures like Gezi are often the product of widely felt frustration as well as the accumulation of years of organizing. From where I stand, Özgür Uçkan belonged to the group of people who understood what it takes to create a collective vision, and then to organize and mobilize people around it. He worked relentlessly to capture the spirit of our times, to resist infringements upon our fundamental freedoms, and to do so in a way that inspired action and change.
There is another detail in that same picture which brings me to Caspar Bowden, the other person who passed away this week. Next to Özgür Uçkan stands Yaman Akdeniz, yet another important academic, activist, and free-speech advocate. Caspar Bowden was the first person to mention Yaman’s name and work to me. Yaman Akdeniz and Caspar Bowden went way back. Here is a chapter in a book that the two wrote together in 1999, titled “Cryptography and Democracy: Dilemmas for Freedom”. The piece was written during Caspar’s time at the Foundation for Information Policy Research. While Yaman Akdeniz moved on to fighting government censorship as his prime area of activity, Caspar Bowden moved to Microsoft, where he would later become the Chief Privacy Adviser. I met him during this time and was surprised by his commitment to promoting Privacy Enhancing Technologies given the title he was holding. Throughout the years, I witnessed how he leveraged all the powers and connections he had to push forward technical architectures and designs that would serve to protect privacy. He would encourage those of us working on such systems to continue our line of work, while also pulling us into rooms with policy makers and parliamentarians so that we could demonstrate the powers of encryption and distributed computation in the service of protecting privacy. When he parted ways with Microsoft and returned to his advocacy work, I saw him at first struggle with the legacy of his association with the company. But this being Caspar, he stood his ground and pushed through every channel possible to make known to the public what Edward Snowden’s revelations about the NSA and GCHQ surveillance programs would eventually confirm.
Today, the loss of Özgür Uçkan and Caspar Bowden feels like two hard punches. Tomorrow, I can imagine gaining courage from the many inspiring memories we have of them and dreaming up futures informed by the principles they held true. As one wise community activist from NYC once said, “they rolled the ball over to us, it is now our turn to keep it rolling”.
For a collection of videos of interventions by and about Özgür Uçkan see Erkan Saka’s compilation.
For a sweet farewell to Caspar Bowden, see Malavika Jayaram’s post.
And, here is a video of Caspar’s talk at 31C3 which will allow you to enjoy his talk _and_ his infamous slides.
The Women and Surveillance Initiative just announced a workshop on Machine Learning. Please consider joining us!
Touching correlations: A hands-on Workshop on Machine Learning
Organized by the Women and Surveillance Initiative, NYC
18-19 July 2015
Location: Data and Society Offices
Are you interested in how computers use algorithms to learn from data? Curious what kinds of things machine learning can be used for? Want to understand and discuss the culture of machine learning? Then join us for a participatory workshop!
Networked machines amassing large databases and running powerful machine learning algorithms touch all aspects of our lives, and yet they mainly remain a black box. These systems are increasingly used for face recognition, targeted advertisement, predicting consumer behavior, medical predictions, social network analysis, and financial predictions, and yet sometimes even the experts are not able to explain why they work, say what it means that they “work”, or comprehend the work these systems do in social settings. At this workshop, we will try to open the black box of “Machine Learning” and discuss what actually goes into making these kinds of predictions. After a primer on the basic concepts and procedures, we’ll do some hands-on experiments looking at real-world datasets and collectively discuss the different elements that make up machine learning.
Our objective is to explore machine learning from the perspective, experience and expertise of the participants. No prior knowledge of mathematics or algorithms is required, nor should having such expertise hold you back from participating. We are in the process of preparing a short reading/video list that can be used before or after the workshop for further exploration. We also recommend installing Weka, an open-source machine learning toolkit, on a device that you bring along. We hope that throughout the workshop we can experiment with and make sense of the practice of machine learning based on our everyday experiences.
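To give a flavor of the kind of hands-on experiment we have in mind, here is a tiny sketch (in Python rather than Weka, and with invented data) of one of the simplest learning procedures, a nearest-neighbor classifier: it predicts a label for a new point by copying the label of the most similar example it has already seen.

```python
import math

# Invented toy training data: (height_cm, weight_kg) -> group label.
training = [
    ((150, 50), "A"),
    ((160, 55), "A"),
    ((180, 80), "B"),
    ((190, 90), "B"),
]

def distance(p, q):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def predict(point):
    """1-nearest-neighbor: return the label of the closest training example."""
    _, nearest_label = min(
        training, key=lambda example: distance(example[0], point)
    )
    return nearest_label

print(predict((185, 82)))  # prints B: closest to the (180, 80) example
```

Much of the workshop discussion starts exactly here: what counts as “similar”, who chose the training examples, and what it means to say that the prediction “works”.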
If you have a background in machine learning and would like to help us make this workshop happen, please get in touch with us before the 13th of July.
The workshop will take place on the 18th and 19th of July from 10am to 4pm at the premises of Data and Society. Those interested in participating should register by the 13th of July by sending an email to email@example.com.
Participation in the workshop is free of charge. We will provide some drinks and snacks, and would appreciate a donation of up to $10 from participants. Participation is limited to 20 people.
Touching Correlations is organized by the Women and Surveillance Initiative based in New York City. The workshop is open to all past, present and future women or anyone who feels like they have a place in a women’s community.
Weka Data Mining Software: http://www.cs.waikato.ac.nz/ml/weka/
Data and Society: http://www.datasociety.net
Our plan for a panel on the implications of the disclosed NSA and GCHQ surveillance programs for PETs researchers is materializing. The panel will take place on the 17th of July in Amsterdam at the PETs Symposium. We expect to have a lively discussion with Susan Landau, Wendy Seltzer, Stephanie Hankey, Nadia Heninger and George Danezis. In fact, thanks to a blog post on “The Dawn of Cyber-Colonialism”, it may be more accurate to say that George has already kicked off the discussion.
Great thanks go out to the program committee, who have supported the idea from the first minute, and to the general chair Hinde ten Berge, Jaap Henk Hoepman from the PI.lab, and NWO for their material support.
PETs Post-Snowden: Implications of the revelations of the NSA and GCHQ Surveillance Programs for the PETs community
Despite the entertainment value of program names like “egotistical giraffe”, “onion breath” and “moth monster”, the revelations about the NSA and GCHQ surveillance programs are more than troubling. Specifically, BullRun (attacks on crypto) and the egotistical series (attacks on Tor) pose challenges to the PETs community and the solutions they work on. This panel will focus on some of these challenges, discuss their implications for PETs researchers and practitioners, and explore ways forward.
According to some, the revelations show that law and policy have failed to protect citizens around the globe from surveillance. It falls, among others, upon the shoulders of the PETs community to build technical solutions that are resilient to “mass surveillance” practices. But while Edward Snowden announced that “crypto still works”, intelligence agencies will continue to find ways to work around it. So others have argued that technology is far from a complete answer and that working with policy and law is more necessary than ever. If so, the challenges here range from finding ways to convince policy makers that weakening the Internet for surveillance is not acceptable to actually regulating “good” security and “bad” surveillance practices.
Both positions are troubled by motions to prevent companies from applying secure designs that may be seen as obstructing law enforcement agencies from conducting investigations. Further, governments around the globe are likely to consider implementing “back doors” as well as utilizing zero-day exploits as a way to guarantee law enforcement and intelligence access. These aggressive policies raise questions about where PETs can and should live, and how to guarantee that their design remains robust, e.g., by keeping the implementation open to scrutiny.
Simultaneously with the revelations, cybersecurity for critical infrastructures has gathered force. Governments around the globe now bring intelligence agencies, standards bodies, contractors as well as academic researchers around tables in order to align technical security issues with national security interests. Cybersecurity funding abounds, affecting research trajectories as well as what gets done. How are PETs researchers and practitioners to manage these increasingly politicized demands along national lines?
Finally, people in their everyday lives navigate the implications of the revelations about the surveillance programs as much as engineers and researchers do. Prominent security engineers have favored prioritizing measures against mass surveillance over measures against targeted surveillance. How “targeted” end users may be impacted by the prioritization of protections against “mass surveillance” is unclear. And indeed, the distinction itself may not be as clear-cut as some of its proponents suggest. In other words, the issues raised here pose the question of how we can ensure that user interests remain a continuous part of the PETs community’s priorities.
How can we live together? This is the very simple and fun (if challenging) question that the participants of the Osthang Architecture Summer School will be asking this summer in Darmstadt, Germany. The program will come to a close with a nine-day(!) public forum titled “thinking together”, curated by Berno Odo Polzer. As Berno writes:
“«Thinking Together» is focused on rethinking future modes of living together in a pluricentric world, so it is a transdisciplinary platform for political imagination: ‘political’ because it is concerned with the way in which we organize the spaces, practices and lives that we share, locally as well as globally – ‘imagination’ because it is aimed at forming new ideas and imaginaries about how to do so.”
As part of “thinking together”, Femke Snelting, Miriyam Aouragh and I will be organizing an afternoon with the title “Let’s First Get Things Done!”. In other words, how can we resist the divisions of labor between “activists” and “techies” that occur in those sneaky moments of moving forward?
Our experience is that the politics, values and practices of those activists who heavily use networked technology for their struggles, and of techno-activists who struggle for progressive and alternative technologies, do not always concur. Loyal to the utopia of a globally functioning interwebs, techno-activists usually organize around universal values: information must be “free”, secure, “privacy-preserving”, accessible, etc. In comparison, those who bring their political struggles to the interwebs may express political differences across a broader spectrum, situated in local and/or global contexts. Furthermore, “pragmatic decisions” made under time pressure and a lack of resources often mean that these struggles integrate themselves into proprietary and conservative technical infrastructures. In the process, many organizational matters are delegated to techies or to technological platforms. Imagining our futures together, we may want to radically reconfigure these divisions of labor. But how? Where do we start? Well, it seems we will start in Darmstadt.
But we are not the only ones asking these questions. This June, members of some of the most successful alternative projects met at Backbone 409 in Calafou with the objective “to build infrastructures for a free Internet from an anti-capitalist point of view: autonomous servers, open networks, online services, platforms, open hardware, free software, etc.”. I am looking forward to hearing back from that meeting.
In August, Interference will take place in Amsterdam and also raise similar questions with respect to the politics of technology and infrastructures of politics. The organizers write:
“Interference is not a hacker conference. From a threat to the so-called national security, hacking has become an instrument for reinforcing the status quo. Fed up with yet another recuperation, the aim is to re/contextualize hacking as a conflictual praxis and release it from its technofetishist boundaries. Bypassing the cultural filters, Interference wants to take the technical expertise of the hacking scene out of its isolation to place it within the broader perspective of the societal structures it shapes and is part of.”
And surely, these discussions will show up at the TransHackFeminist Camp organized in collaboration with Calafou and the eclectic tech carnival people, and also at HOPE. It also seems that the topic has found interest among academics. See the call for papers for the next issue of the FibreCulture Journal titled: “Entanglements: activism and technology”.
Thanks to all these events, this will be a summer of collaboration and labor. I cannot wait to see what thoughts and actions we return with for the Autumn.
Let’s first get things done: on division of labor and practices of delegation in times of mediated politics and politicized technologies
4th of August, 2014
Osthang, Darmstadt, Germany
Be it getting out the call for the next demonstration on some “cloud service”, or developing a progressive tech project in the name of an imagined user community, scarcity of resources and distribution of expertise make shortcuts inevitable. But do they really?
The current distance between those who organise their activism around developing “technical infrastructures” and those who bring their struggles to these infrastructures is remarkable. The paradoxical consequences can be baffling: (radical) activists organize and sustain themselves using “free” technical services provided by Fortune 500 companies. At the same time, “alternative tech practices”, like the Free Software community, are sustained by a select (visionary and male) few, proposing crypto with nine lives as the minimum infrastructure for any political undertaking.
The naturalization of this division of labor can be recognized in statements about activists having better things to do than tinker with code or hardware, or in technological projects that locate their politics solely in the technology and infrastructures, as if these were outside of the social and political domain. What may seem like a pragmatic solution actually reiterates faultlines of race, gender, age and class. Through the convenient delegation of “tech matters” to the techies or to commercial services, collectives may experience a shift in their priorities and a reframing of their activist culture through technological decisions. The latter, however, are typically not open to broader political discussion and contestation. Such separation also gets in the way of actively considering how changes in our political realities are entangled with shifts in technological infrastructures.
We want to use this day to resist the reflex of “first getting things done” in order to start a long-term collaboration that brings together those of us with backgrounds in the politics of society and the politics of technology.
It is one thing to ask people whether they want to control the appropriate flow of their disclosures (or of disclosures by others about them) on Online Social Networks (OSNs); it is another to ask who they think should be responsible for ensuring the appropriate flow of this information. In the first part of a small study conducted last summer at CMU, which Ero Balsa will present next week at USec 2014, participants were asked these two questions. The objective of the study was to find out whether users feel that they should be responsible for taking extra measures to avoid the privacy problems that Cryptographic Access Control Tools for Online Social Networks (CACTOS) hope to mitigate: namely, privacy problems resulting from the disclosure of all user data (including private messages) by default to the OSN provider, and from the delegation of privacy setting enforcement to OSN providers. In other words, are the privacy concerns of the developers of CACTOS aligned with those of the users, and, if so, who do the users think should be responsible for mitigating these privacy problems?
Somewhat unsurprisingly, the study participants said they want full control over determining the disclosure and appropriate flow of their public and private messages on OSNs, but that Facebook (the OSN used in the study) should share responsibility for making sure that their privacy is respected. For example, despite identifying increasingly permissive privacy settings in Facebook as a problem, the participants thought that it is their responsibility to configure their privacy settings correctly. However, they found that it is the responsibility of the OSN to make sure that privacy settings are effective. When it came to undesirable disclosures about a person by other OSN users, some participants expected that the OSN should ensure removal. And while they were aware that Facebook had provided advertisers access to their profiles and tracked their activities across the web, many participants agreed that it is the responsibility of the OSN to make sure that their disclosures do not all of a sudden pop up in the databases of third parties.
In the second part of the study, participants were offered a cryptographically powered tool called “Scramble!” that would allow its users to have “strict and highly granular controls” over who sees a disclosure on the OSN, and that prevents the OSN from accessing the content of the disclosure. The tool had the usual problems that security tools have with usability and complexity, but all in all the participants thought the tool was useful and could provide them with a desired granularity of control. However, for a variety of reasons, they found it too cumbersome to use a CACTOS like Scramble! to ensure that their information flows appropriately. Even though the participants complained about the indeterministic privacy settings and the OSN provider’s data usage practices, they thought that using a cryptographic tool to eliminate these problems was a disproportionate measure — such heavy-duty tools were found appropriate for “others” who have secrets. They also did not want to establish trust towards yet another entity, in this case the CACTOS provider. They shied away from sharing their data with the CACTOS provider, although Scramble! would not be “seeing” their disclosures in clear text. Some even suggested that they would trust the tool if Facebook certified it. Many participants also agreed that if a disclosure were jeopardizing, they could send it as a private message (which they assumed would be kept confidential by the OSN), or, most strikingly, they could just remain silent.
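For readers unfamiliar with CACTOS, the underlying idea (encrypting a disclosure so that only chosen friends, and not the OSN provider, can read it) can be sketched as follows. This is emphatically not Scramble!’s actual design: the hand-rolled keystream and the pre-shared friend keys below are insecure toy assumptions, made only to keep the sketch self-contained, whereas real tools rely on vetted public-key cryptography.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Expand a key into a pseudo-random byte stream (toy construction, NOT secure)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor(data: bytes, key: bytes) -> bytes:
    """XOR data with a keystream derived from key; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# Hypothetical pre-shared keys, one per friend (a toy stand-in for public keys).
friend_keys = {"alice": secrets.token_bytes(32), "bob": secrets.token_bytes(32)}

def post(message: bytes, recipients: list) -> dict:
    """Encrypt a disclosure so only the chosen recipients can read it;
    the OSN would store only this ciphertext blob."""
    post_key = secrets.token_bytes(32)  # fresh key for this post
    return {
        "body": xor(message, post_key),
        # The post key is wrapped separately for each chosen recipient.
        "keys": {r: xor(post_key, friend_keys[r]) for r in recipients},
    }

def read(blob: dict, who: str) -> bytes:
    """A recipient unwraps the post key with their own key, then decrypts the body."""
    post_key = xor(blob["keys"][who], friend_keys[who])
    return xor(blob["body"], post_key)

blob = post(b"meet at the park at 5", ["alice"])
assert read(blob, "alice") == b"meet at the park at 5"
assert "bob" not in blob["keys"]  # bob was not chosen, so he gets no wrapped key
```

The participants’ hesitation becomes easier to see against this structure: even in a sketch this small, someone has to manage and distribute keys, so trust shifts from the OSN to whoever provides the tool.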
At this point it is reasonable to ask why we shifted the focus of the study from usability to responsibility. The idea of the study was developed within the SPION project, where the responsibilization of users with respect to protecting their privacy is one of the main themes. The argument is that information systems that mediate communications in a way that also collects massive amounts of personal information may be prone to externalizing some of the risks associated with these systems onto the users. This can easily happen under the label “privacy”, which can be leveraged to put the individual at the center of responsibility. Hence, privacy protection can become a way of burdening the users with the risks externalized by those systems and an apparatus for disciplining them. The objective of the SPION project is hence to critically assess the ways in which privacy technologies may intensify the responsibilization of OSN users, and to explore whether they can be designed to shift responsibilities back to those providing the OSN services.
This idea of “responsibilization” is borrowed from David Garland’s article titled “The Limits of the Sovereign State: Strategies of Crime Control in Contemporary Society”, and was applied in the domain of “identity management systems” by David Barnard-Wills and Debi Ashenden in their article titled “Public Sector Engagement with Online Identity Management”. Responsibilization (a terrible word to pronounce) is a complex concept in studies of governmentality to which I can do no justice here, so I will stick to the basic definition above.
The researchers who write about the topic note that people may have various reactions to responsibilization, including rejecting it and pointing back to the institutions that cause the problem in the first place. For example, OSN privacy policies will often say it is the responsibility of the users to avoid undesirable information flows, e.g., by watching over themselves and their privacy settings. Given that, as I mentioned earlier, OSNs often change the semantics of privacy settings, this is a very slippery responsibility to put on users’ shoulders. Nevertheless, the participants of our study seemed to have internalized that message. They mainly thought that they should be responsible for what they post and how they use functionality. Yet, although our study was small and limited, it seemed that the participants also pushed back on this configuration of responsibilities. Scramble! here functioned as an artefact through which they could imagine a different way of controlling information flows and express their needs: for most participants it was too cumbersome to make up for the unreliability of the OSN by enforcing appropriate information flows through Scramble!. Instead, following from the first part of the study, it should be the responsibility of the OSN to get privacy settings right and not to share their information with third parties.
There are many limitations to this small-scale study. It is small, and it is about one OSN and one CACTOS. Further, if for many people “technology” is a scary thing, then “encryption” is likely to give them nightmares. Surely, mentioning that Scramble! was based on “encryption” primed the users in a certain way and influenced their responses. “Responsibility” itself is just as loaded a concept as privacy, control or encryption. Shanto Iyengar has shown in his paper titled “Framing Responsibility for Political Issues: The Case of Poverty” that framing has an important impact on whom people will see as responsible for political issues. Exactly how this may also apply to the framing of privacy and responsibilization is a great question for future research.
Finally, it seems that many participants of our study preferred to censor their speech or control their actions over deploying tools to protect their privacy. This is a troubling matter. Most computer science research on privacy looks to provide techniques and tools that are expected to support users in their everyday negotiation of privacy, e.g., CACTOS, anonymous communication tools, adblockers, identity management systems or privacy nudges. As computer scientists, we may have become too comfortable with a world-view in which “privacy protecting machines” can protect users or aid them in protecting themselves from “privacy intrusive machines”. In doing so, we may have overestimated the part “the users” may actually want to play in this challenging game between machines. I hope that, by providing some insight into the attitudes of users towards responsibilization in OSNs as well as towards CACTOS, this paper serves as a way to think about where users may or may not want to enter this game.
Ero Balsa (KU Leuven), Laura Brandimarte (Carnegie Mellon University), Alessandro Acquisti (Carnegie Mellon University), Claudia Diaz (KU Leuven), Seda Gürses (New York University), “Spiny CACTOS: OSN users attitudes and perceptions towards cryptographic access control tools”, USec’14, San Diego.
Next week I will be participating in an evening event titled “Data Shadows: Anonymity and Digital Networks” put together by Alexander Benenson as part of the public program in conjunction with the exhibition Private Matters, organized by Ceren Erdem, Jamie Schwartz and Lisa Hayes Williams. I plan to present the next episode of the series “A Failed Coup Attempt with Folk Songs”, of which I have found traces of part II, part III, and part V. Now the task is to figure out which part number it will be when I present at apexart, where I will have the pleasure of sharing the room with, among others, Finn Brunton and John Menick. Most of my talk will be a reflection on some of the thoughts in the article titled “The Spectre of Anonymity” against the backdrop of the revelations about the NSA surveillance programs.
The event will take place on the 27th of February, 2014 at 7pm at apexart.
My first encounters with the concept of obfuscation go back to discussions that the privacy research group at COSIC/ESAT (KUL) had about TrackMeNot in 2012. Back then we were discussing the efficacy of the possible protections offered by TrackMeNot when faced with a “learning” machine. Little did I know that one day I would have close encounters with the creators of TrackMeNot: Helen Nissenbaum, Vincent Toubiana and Daniel Howe. All three will be at the Symposium on Obfuscation, which takes place next week at NYU. The line-up of speakers includes Susan Stryker, Nick Montfort, Laura Kurgan, Claudia Diaz, Günes Acar, Finn Brunton, Hanna Rose Shell and Joseph Turow, as well as Rachel Greenstadt representing her research group that developed “Anonymouth”, Daniel Howe, the creator of “Ad Nauseam”, and Rachel Law, the maker of “Vortex”. You can find out more about the event here.