"The Princeton Dialogues on AI and Ethics" Primary Case Studies Released

Monday, May 21, 2018

Princeton's University Center for Human Values (UCHV) and the Center for Information Technology Policy (CITP) are excited to announce the release of the first case studies of their joint research project: “The Princeton Dialogues on AI and Ethics.”

The aim of this project is to develop a set of intellectual reasoning tools to guide practitioners and policy makers, both current and future, in developing the ethical frameworks that will ultimately underpin their technical and legislative decisions.

Please find the case studies on the dedicated website: https://aiethics.princeton.edu/case-studies/

An accompanying blog post: https://freedom-to-tinker.com/2018/05/21/princeton-dialogues-of-ai-and-ethics-launching-case-studies/

And a retweetable tweet: https://twitter.com/PrincetonCITP/status/998572513981599752

Some background to the project:

The Princeton Dialogues on AI and Ethics has so far convened two invitation-only workshops, in October 2017 and March 2018, in which philosophers, political theorists, lawyers, and machine learning experts met to assess several real-world case studies that elucidate common ethical dilemmas in the field of AI. The aim of these workshops was to facilitate a collaborative learning experience that enabled participants to dive deeply into the ethical considerations that ought to guide decision-making at the engineering level and to highlight the broader social shifts these technologies may be bringing about.

In March 2018, we also hosted a public conference, titled “AI & Ethics,” where interested academics, policy makers, civil society advocates, and private sector representatives from diverse fields came to Princeton to discuss two topics related to the development and governance of AI: “International Dimensions of AI” and “AI and Its Democratic Frontiers”. This conference sought to use the ethics and engineering knowledge foundations developed through the initial case studies to inspire discussion of AI technology’s wider social effects.