PERSONHOOD CREDENTIALS

August 15th, 2024

In light of the rapid development and deployment of AI, decentralized tools that can distinguish between human and machine while maintaining user privacy and data sovereignty are more critical than ever.

We’re proud to have co-authored a paper, published today, entitled “Personhood credentials: Artificial intelligence and the value of privacy-preserving tools to distinguish who is real online,” with researchers from OpenAI, the Harvard Society of Fellows, Microsoft, the University of Oxford, Spruce ID, MIT, and many other organizations leading initiatives on digital identity.


Personhood credentials (PHCs) shift the burden of proof from what is fake, an ever-changing and contested concept, to who is real, a relatively clearly demarcated benchmark. Robust PHC systems are built on decades of cryptographic research and could provide IDs, unique to each individual, that protect user privacy at every stage. PHCs offer multiple benefits: they can signal who is human online, filter out AI-powered attacks, and distinguish between AI agents that are relatively benign (those authorized by a real person to act as their representative) and those acting on behalf of deceptive, malicious actors. Increasing the use of these systems will be necessary to respond to the threat AI poses to identity online.
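
To make the idea concrete, here is a minimal, illustrative sketch in Python of how a PHC-style flow might fit together. Everything in it is an assumption for illustration: the Issuer class, the HMAC-based tag standing in for a real blind signature or zero-knowledge proof, and the per-service pseudonym derivation are simplified stand-ins, not the scheme described in the paper.

# Minimal, hypothetical sketch of a personhood-credential (PHC) flow.
# The HMAC-based "signature" is a stand-in for a real blind signature or
# zero-knowledge proof; a real PHC system is designed so the issuer cannot
# link a credential to its later use at any service.

import hashlib
import hmac
import secrets


class Issuer:
    """Enrolls at most one credential per verified person (the personhood
    check itself is assumed to happen out of band)."""

    def __init__(self):
        self._key = secrets.token_bytes(32)   # issuer's secret signing key
        self._enrolled = set()                # prevents duplicate enrollment

    def issue(self, person_id: str) -> bytes:
        if person_id in self._enrolled:
            raise ValueError("credential already issued to this person")
        self._enrolled.add(person_id)
        # Stand-in for a blind signature: in a real scheme the issuer would
        # never see the credential value it is signing.
        credential = secrets.token_bytes(32)
        tag = hmac.new(self._key, credential, hashlib.sha256).digest()
        return credential + tag

    def is_valid(self, blob: bytes) -> bool:
        credential, tag = blob[:32], blob[32:]
        expected = hmac.new(self._key, credential, hashlib.sha256).digest()
        return hmac.compare_digest(expected, tag)


def service_pseudonym(credential_blob: bytes, service_name: str) -> str:
    """Derive a per-service pseudonym: stable within one service, but two
    services cannot link their pseudonyms to the same person."""
    return hashlib.sha256(credential_blob[:32] + service_name.encode()).hexdigest()


if __name__ == "__main__":
    issuer = Issuer()
    cred = issuer.issue("alice")          # one credential per real person
    assert issuer.is_valid(cred)

    # The same person presents different, unlinkable identifiers to each
    # service, so each can enforce "one account per human" and rate-limit
    # abuse without learning who she is.
    print(service_pseudonym(cred, "forum.example"))
    print(service_pseudonym(cred, "marketplace.example"))

The per-service pseudonym step is the key design choice this sketch illustrates: a service can bind limits to a single human while learning nothing about that human beyond "holds a valid credential."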

The solution to this AI digital-imposter problem cannot simply be government databases of citizens’ information. While governments must play a role, we need systems that are decentralized and that guarantee privacy and autonomy to users both online and offline. We have already seen many promising uses of blockchain technology in building decentralized digital identity systems. If you are working on research or platforms that could aid in this effort, we encourage you to reach out to us via info@thedrcenter.org.