Research
Leveraging Blockchain in a New Era of Antitrust
This report examines how blockchain technology can address modern antitrust challenges by fostering more transparent, competitive markets. It highlights the structural advantages blockchains can offer—such as decentralization, transparency, and data portability—and explores real-world examples where these features might reduce monopolistic behavior. The authors spotlight cases including a major mortgage software provider whose acquisitions risked consolidating the loan process, and an ad auction platform that benefited from opaque systems. By showing how blockchain-based frameworks could increase trust and lower switching costs, the report argues for a new era of antitrust enforcement—one where companies can innovate while still meeting competitive and regulatory requirements. It offers policymakers and businesses a glimpse of how decentralization may align with consumer protection and open market principles.
Designing Policy for a Flourishing Blockchain Industry
An exploration of how regulatory frameworks can support the growth of decentralized blockchain networks. The report outlines key decentralization criteria, emphasizing the importance of open, permissionless, and autonomous networks that enhance transparency, security, and user control. It warns against policies that conflate centralized and decentralized systems, which could stifle innovation and reinforce existing power structures. The report provides a framework to help policymakers craft regulations that foster a thriving blockchain ecosystem while ensuring compliance and consumer protection.
AI and Democracy’s Digital Identity Crisis
By understanding how identity attestations are positioned across the spectrum of decentralization, we can better grasp their costs and benefits. Improving these attestations and integrating them into our digital interactions will help protect democratic systems from AI-generated harm.
Toward Equitable Ownership and Governance in the Digital Public Sphere
We Have a Big (Tech) Problem
The harms of dominant technology platforms are manifold: the exploitation of user data, damage to the mental health and safety of minors, the explosion of misinformation, and negative effects on political institutions and behavior. Big Tech companies, and social media companies in particular, have therefore become objects of public scrutiny and criticism. However, internal company efforts and external bipartisan attempts to rein in these harms have largely failed.
Personhood credentials: Artificial intelligence and the value of privacy-preserving tools to distinguish who is real online
Anonymity is an important principle online. However, malicious actors have long used misleading identities to conduct fraud, spread disinformation, and carry out other deceptive schemes. With the advent of increasingly capable AI, bad actors can amplify the potential scale and effectiveness of their operations, intensifying the challenge of balancing anonymity and trustworthiness online. In this paper, we analyze the value of a new tool to address this challenge: "personhood credentials" (PHCs), digital credentials that empower users to demonstrate that they are real people, not AIs, to online services, without disclosing any personal information. Such credentials can be issued by a range of trusted institutions, governmental or otherwise. A PHC system, according to our definition, could be local or global, and does not need to be biometrics-based. Two trends in AI contribute to the urgency of the challenge: AI's increasing indistinguishability from people online (i.e., lifelike content and avatars, agentic activity), and AI's increasing scalability (i.e., cost-effectiveness, accessibility). Drawing on a long history of research into anonymous credentials and "proof-of-personhood" systems, personhood credentials give people a way to signal their trustworthiness on online platforms, and offer service providers new tools for reducing misuse by bad actors. In contrast, existing countermeasures to automated deception, such as CAPTCHAs, are inadequate against sophisticated AI, while stringent identity verification solutions are insufficiently private for many use cases. After surveying the benefits of personhood credentials, we also examine deployment risks and design challenges. We conclude with actionable next steps for policymakers, technologists, and standards bodies to consider in consultation with the public.
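To make the shape of such a system concrete, the sketch below models the three roles a PHC flow involves (an issuer, a holder, and an online service) in Python. It is only an illustration under simplifying assumptions, not the paper's proposed design: a shared-key tag stands in for the anonymous-credential cryptography a real deployment would need to keep presentations unlinkable, and all names are hypothetical.

```python
# Toy sketch of a personhood-credential (PHC) flow: an issuer verifies that a
# holder is a real person, then hands out a credential that carries no personal
# data; an online service later checks that credential. A shared-key HMAC tag
# stands in for real anonymous-credential cryptography, purely to show the roles.
import hmac
import hashlib
import secrets

ISSUER_KEY = secrets.token_bytes(32)  # secret held by the trusted issuer


def issue_credential(person_verified: bool) -> bytes | None:
    """Issuer: after confirming personhood out of band (e.g. an in-person check),
    sign a random credential identifier that reveals nothing about the holder."""
    if not person_verified:
        return None
    credential_id = secrets.token_bytes(16)
    tag = hmac.new(ISSUER_KEY, credential_id, hashlib.sha256).digest()
    return credential_id + tag


def verify_presentation(credential: bytes) -> bool:
    """Service-side check. In this toy it reuses the issuer's secret key, so the
    check would really be delegated to the issuer; a deployed PHC scheme would
    instead use publicly verifiable, unlinkable proofs."""
    credential_id, tag = credential[:16], credential[16:]
    expected = hmac.new(ISSUER_KEY, credential_id, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)


credential = issue_credential(person_verified=True)
print(verify_presentation(credential))  # True: personhood shown, nothing else disclosed
```

The design questions the paper surveys, such as unlinkability across services, issuer trust, and recovery from credential theft, are exactly the parts this toy omits.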
Open Problems in DAOs
Decentralized autonomous organizations (DAOs) are a new, rapidly growing class of organizations governed by smart contracts. Here we describe how researchers can contribute to the emerging science of DAOs and other digitally constituted organizations. From granular privacy primitives to mechanism designs to model laws, we identify high-impact problems in the DAO ecosystem where existing gaps might be tackled through a new data set or by applying tools and ideas from existing research fields such as political science, computer science, economics, law, and organizational science. Our recommendations encompass exciting research questions as well as promising business opportunities. We call on the wider research community to join the global effort to invent the next generation of organizations.
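As a concrete picture of the mechanism-design space the authors point to, the toy model below sketches one common DAO governance pattern, token-weighted voting with a quorum threshold, in Python. It is a hypothetical off-chain illustration, not code from the paper or from any particular DAO, and the class and parameter names are invented.

```python
# Toy model of token-weighted DAO governance: members propose, vote with weight
# equal to their governance-token balance, and a proposal passes only if turnout
# reaches a quorum and 'for' outweighs 'against'. Illustrative only.
from dataclasses import dataclass, field


@dataclass
class Proposal:
    description: str
    votes_for: int = 0
    votes_against: int = 0


@dataclass
class ToyDAO:
    balances: dict[str, int]                  # governance-token holdings per member
    quorum_fraction: float = 0.5              # share of total supply that must vote
    proposals: list[Proposal] = field(default_factory=list)
    voted: dict[int, set] = field(default_factory=dict)

    def propose(self, description: str) -> int:
        self.proposals.append(Proposal(description))
        proposal_id = len(self.proposals) - 1
        self.voted[proposal_id] = set()
        return proposal_id

    def vote(self, proposal_id: int, member: str, support: bool) -> None:
        if member in self.voted[proposal_id]:
            raise ValueError("member already voted")
        self.voted[proposal_id].add(member)
        weight = self.balances.get(member, 0)  # vote weight = token balance
        proposal = self.proposals[proposal_id]
        if support:
            proposal.votes_for += weight
        else:
            proposal.votes_against += weight

    def passes(self, proposal_id: int) -> bool:
        proposal = self.proposals[proposal_id]
        turnout = proposal.votes_for + proposal.votes_against
        quorum = self.quorum_fraction * sum(self.balances.values())
        return turnout >= quorum and proposal.votes_for > proposal.votes_against


dao = ToyDAO(balances={"alice": 60, "bob": 30, "carol": 10})
pid = dao.propose("Fund research on granular privacy primitives")
dao.vote(pid, "alice", True)
dao.vote(pid, "bob", False)
print(dao.passes(pid))  # True: 90 of 100 tokens voted and 60 > 30
```

Even this stripped-down model surfaces questions the report catalogs as open problems: plutocratic vote weighting, turnout relative to quorum, and the absence of privacy in how votes are recorded.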