Electronic voting: how logic can help
Abstract: Electronic voting should offer at least the same guarantees as traditional paper-based voting systems. To achieve this, electronic voting protocols make use of cryptographic primitives, as in the more traditional cases of authentication and key exchange protocols. All these protocols are notoriously difficult to design, and flaws may be found years after their first release. Formal models, such as process algebras, Horn clauses, or constraint systems, have been successfully applied to automatically analyze traditional protocols and discover flaws. Electronic voting protocols, however, significantly increase the difficulty of the analysis task. Indeed, they involve, for example, new and sophisticated cryptographic primitives, new dedicated security properties, and new execution structures. After an introduction to electronic voting, we will describe the current techniques for analyzing e-voting protocols and review the key challenges towards fully automated verification.
Max Planck Institute for Software Systems, Germany
Privacy and Fairness Concerns with PII-based Targeted Advertising on Social Media
Abstract: All popular social media sites like Facebook, Twitter, and Pinterest are funded by advertising, and the detailed user data that these sites collect makes them attractive platforms for advertisers. Historically, these advertising platforms allowed advertisers to target users with certain attributes, but not to target users directly. Recently, most advertising platforms have begun allowing advertisers to target users directly by uploading the personal information of the users they wish to advertise to (e.g., names, email addresses, and phone numbers). Such targeting is referred to as custom audience targeting.
In this talk, I will discuss numerous privacy and fairness concerns that arise with such custom audience targeting on the Facebook ad platform. I will show how custom audience targeting would allow malicious advertisers to leverage existing public records (e.g., voter records) for discriminatory advertising (i.e., excluding people of a certain race), and how this type of discrimination is significantly more difficult for Facebook to detect automatically. We also find that custom audiences can be abused by malicious advertisers to learn hundreds of demographic, behavioral, and interest attributes of a Facebook user, even with only limited knowledge of that user's PII, such as an email address or phone number. Finally, we find that users generally have no control over the data about them that is used to create custom audiences. Overall, our results indicate that advertising platforms need to more carefully consider the privacy and fairness concerns that arise out of custom audience targeting.
University of Waterloo, Canada
Building Secure Applications Using Intel’s SGX
Abstract: Intel’s Software Guard Extensions (SGX) are a new technology for building applications that are shielded from the administrator and the system software. In principle, they allow the equivalent of encrypted computation, comparable to homomorphic encryption. While it has been demonstrated that entire applications can be run in SGX-protected memory, from a security point of view this is not advisable due to the many side channels and software vulnerabilities. In this talk I will show how to implement an application-specific secure program (a database index) and how to compile application-independent secure programs for Intel’s SGX. This will highlight programming principles for building secure applications using SGX.