COMP 7860 Security and Privacy





Announcements


o   06/09: Course outline and Lecture 1 are now available.




General Information


o   Instructor: Noman Mohammed

o   Email: (Please mention COMP 7860 in the subject)

o   Lecture time and location: T/TH 11:30 AM - 12:45 PM in E2-360

o   Office location: EITC E2-417

o   Office hours: T/TH 10:00 AM - 11:15 AM, or by appointment


Course Description


The objective of this course is to train students in the secure collection, protection, and dissemination of data for analysis. The course covers the legal, ethical, and technical aspects of information privacy. The main topics include conceptions and legal foundations of information privacy, security primitives, privacy models, and anonymization algorithms for diverse data-sharing scenarios. Students will explore cutting-edge privacy solutions and learn how to apply privacy technologies to real-life applications.




Prerequisites


o   COMP 4580 (Computer Security) is recommended.




Textbook


o   There is no required textbook. Reading material for the course will consist mainly of research articles, in conjunction with sections from a selection of books.




Evaluation


  Paper presentation: 15%

  Participation: 15%

  Assignments (mostly reviews of papers): 20%

  Project (50% total):

  Proposal: 5%

  Progress report: 10%

  Final report: 25%

  Project presentation: 10%


Course Format


The course will consist of a combination of lectures by the instructor, student-led discussions of research papers, and occasional guest lectures. The course will primarily consist of reading, reviewing, and presenting research papers (the exact number per student will depend on the total enrollment). Two papers will be assigned for each class period, selected from the reading list. All students are expected to read both papers before class and to submit a review (comments, discussion questions, etc.) for one paper of their choice by the evening before class.


Administrative Policy


  • Students are expected to attend every class. Some material may only be covered in class and not made available in the course notes/website. Students are expected to read the assigned materials and to actively participate in class discussions.
  • In the event of extraordinary circumstances beyond the University's control, the content and/or evaluation scheme in this course is subject to change.










Schedule


Sep 7





Sep 12



      Helen Nissenbaum. Privacy in Context. Stanford University Press, 2009. Review

      TED talk: Why Privacy Matters



Sep 14


k-anonymity and anonymization techniques

      L. Sweeney. k-anonymity: a model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 10(5):557-570, 2002.

      N. Mohammed, B. C. M. Fung, P. C. K. Hung, and C. Lee. Anonymizing healthcare data: a case study on the blood transfusion service. In Proceedings of the 15th ACM International Conference on Knowledge Discovery and Data Mining (SIGKDD), pages 1285-1294, Paris, France, June 2009.


      Sample final project
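
As a concrete companion to these readings, here is a minimal sketch (with hypothetical attribute names and a toy table) of the property defined in Sweeney's paper: a table is k-anonymous when every combination of quasi-identifier values is shared by at least k records.

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """Check k-anonymity: every combination of quasi-identifier
    values must appear in at least k rows of the table."""
    counts = Counter(tuple(row[a] for a in quasi_identifiers) for row in rows)
    return all(c >= k for c in counts.values())

# Toy table whose quasi-identifiers have already been generalized
# (age ranges, truncated ZIP codes); attribute names are illustrative.
table = [
    {"age": "20-30", "zip": "902**", "disease": "flu"},
    {"age": "20-30", "zip": "902**", "disease": "cold"},
    {"age": "30-40", "zip": "913**", "disease": "flu"},
    {"age": "30-40", "zip": "913**", "disease": "asthma"},
]

print(is_k_anonymous(table, ["age", "zip"], 2))  # True: each QI group has 2 rows
print(is_k_anonymous(table, ["age", "zip"], 3))  # False: no group reaches size 3
```

The anonymization algorithms covered later in the course (Incognito, Mondrian) search for generalizations that make this check pass while distorting the data as little as possible.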





Sep 19




Algorithms for k-anonymity

      Proposal due [1-2 pages]

      K. LeFevre, D. J. DeWitt, and R. Ramakrishnan. Incognito: Efficient full domain k-anonymity. In Proceedings of the ACM SIGMOD International Conference on Management of Data (SIGMOD), pages 49–60, 2005.

      K. LeFevre, D. J. DeWitt, and R. Ramakrishnan. Mondrian multidimensional k-anonymity. In Proceedings of the 22nd International Conference on Data Engineering (ICDE), 2006.








Sep 21

Methodology for managing re-identification risk

      F. Dankar and K. El Emam. A method for evaluating marketer re-identification risk. In Proceedings of the 2010 EDBT/ICDT Workshops, 2010: 28.

      K. Benitez, and B. Malin. Evaluating re-identification risks with respect to the HIPAA privacy rule. Journal of the American Medical Informatics Association, 17(2): 169-177, 2010.







Sep 26

Beyond k-anonymity

      A. Machanavajjhala, D. Kifer, J. Gehrke, and M. Venkitasubramaniam. l-diversity: Privacy beyond k-anonymity. In ICDE 2006.

      N. Li, T. Li, and S. Venkatasubramanian. t-Closeness: Privacy Beyond k-Anonymity and l-Diversity. In ICDE 2007.






Sep 28

Anonymization of set-valued data

      M. Terrovitis, N. Mamoulis, and P. Kalnis. Privacy-preserving anonymization of set-valued data. Proceedings of the VLDB Endowment, 1(1):115–125, 2008.

      G. Loukides, A. Gkoulalas-Divanis, and B. Malin. Anonymization of electronic medical records for validating genome-wide association studies. In Proceedings of the National Academy of Sciences of the United States of America, 107(17), 2010.






Oct 3

Attacks on privacy models


      D. Kifer. Attacks on privacy and de Finetti's theorem. In SIGMOD 2009.

      R. C. W. Wong, A. W. C. Fu, K. Wang, and J. Pei. Minimality attack in privacy preserving data publishing. In VLDB, 2007.






Oct 10


Differential Privacy

      C. Dwork. A firm foundation for private data analysis. Commun. ACM, 54(1):86–95, 2011.
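
As background for the Dwork survey, a minimal sketch (the helper name is illustrative) of the Laplace mechanism for an epsilon-differentially-private count: a counting query changes by at most 1 between neighboring datasets (sensitivity 1), so adding noise drawn from Laplace(1/epsilon) to the true count suffices.

```python
import random

def dp_count(records, predicate, epsilon):
    """Release a count under epsilon-differential privacy via the
    Laplace mechanism. A counting query has sensitivity 1, so
    Laplace noise with scale 1/epsilon suffices."""
    true_count = sum(1 for r in records if predicate(r))
    # The difference of two independent Exp(epsilon) draws is
    # distributed as Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

ages = [23, 35, 41, 29, 52, 37]
noisy = dp_count(ages, lambda a: a >= 30, epsilon=0.5)
print(noisy)  # the true count (4) perturbed by Laplace noise of scale 2
```

Smaller epsilon means stronger privacy but larger expected noise; the Clifton-Tassa and Dankar-El Emam papers in the next session discuss how this trade-off plays out against syntactic models and in health data.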


Oct 12

Application of differential privacy

      C. Clifton and T. Tassa. On syntactic anonymity and differential privacy. Transactions on Data Privacy. Volume 6 Issue 2, August 2013.

      F. Dankar and K. El Emam. The application of differential privacy to health data. In Proceedings of the EDBT/ICDT Workshops, pages 158-166, 2012.







Oct 17

Model inversion attacks


      M. Fredrikson, E. Lantz, S. Jha, S. Lin, D. Page, T. Ristenpart. Privacy in pharmacogenetics: an end-to-end case study of personalized warfarin dosing. In Proceedings of the USENIX Security Symposium, 2014.

      M. Fredrikson, S. Jha, T. Ristenpart. Model Inversion Attacks that Exploit Confidence Information and Basic Countermeasures. ACM Conference on Computer and Communications Security (CCS), 2015.






Oct 19

Basic Crypto I

      Lecture note


Oct 24

Basic Crypto II



Oct 26

Privacy in data management


      Femi Olumofin, Ian Goldberg. Privacy-preserving Queries over Relational Databases. 10th Privacy Enhancing Technologies Symposium, July 2010.

      R. A. Popa et al. CryptDB: protecting confidentiality with encrypted query processing. In SOSP 2011.






Oct 31

Secure multiparty computation I

      Progress report due (Submission format) [3 to 4 pages]

      W. Jiang and C. Clifton. A secure distributed framework for achieving k-anonymity. Very Large Data Bases Journal (VLDBJ), 15(4):316–333, November 2006.

      Wei Xie et al. SecureMA: protecting participant privacy in genetic association meta-analysis. Bioinformatics, 2014.






Nov 2

Secure multiparty computation II

      Murat Kantarcioglu, Wei Jiang, Ying Liu, and Bradley Malin. A Cryptographic Approach to Securely Share and Query Genomic Sequences. IEEE Transactions on Information Technology in Biomedicine, 2008.

      Y. Lindell and B. Pinkas. Privacy preserving data mining. Journal of Cryptology, 15(3):177–206, 2002.






Nov 7


Computation on encrypted data



Guest Lecturer


Nov 9



      Randomized requantization with local differential privacy. In ICASSP, 2016.

      Privacy Protection for Preventing Data Over-Collection in Smart City. IEEE Transactions on Computers, 2016.

Calvin and Kenny

Puja and Cheng

Nov 14




      The HybrEx Model for Confidentiality and Privacy in Cloud Computing. In HotCloud, 2011.

      Privacy Preserving Automatic Fall Detection for Elderly Using RGBD Cameras. In ICCHP, 2012.


Naqvi and Anurag


Gurnoor and Manmohit

Nov 16




      Privacy-Preserving Distributed Linear Regression on High-Dimensional Data. In PETS, 2017.

      Annotating longitudinal clinical narratives for de-identification. In Journal of Biomedical Informatics, 2015.

Toufique and Sabab


Nima and Fouad

Nov 21




      Nicolas Papernot, Martín Abadi, Úlfar Erlingsson, Ian Goodfellow, and Kunal Talwar. Semi-supervised Knowledge Transfer for Deep Learning from Private Training Data. In ICLR 2017.

      Privacy-CNH: A Framework to Detect Photo Privacy with Convolutional Neural Network using Hierarchical Features. In AAAI, 2016.


Linwei and Binglin



Shahab and Asif

Nov 23



      Distillation as a Defense to Adversarial Perturbations Against Deep Neural Networks. In S&P, 2016.

      Deep Learning with Differential Privacy. In CCS 2016.

Amirul and Reza



Nov 28


Project Presentation



Toufique and Sabab


Nov 30


Project Presentation

Linwei and Binglin


Puja and Cheng


Dec 5


Project Presentation

Gurnoor and Manmohit


Naqvi and Anurag


Amirul and Reza


Dec 7

Project Presentation



Nima and Fouad


Calvin and Kenny


Shahab and Asif

Dec 12


Final report due

      (Submission format)

      6 to 7 pages including references