Creating Digital Rule of Law

Alan Rodriguez
13 min read · Apr 15, 2021

Most people don’t know that the creators of the Internet also worked on technology to control our data with digital contracts. But all of those efforts ended abruptly in 2012.

This mosaic design is Copyright © 2011 Kaamar Ltd, based on an original photograph Statue of Liberty by pbutke under CC BY 2.0 license

Part 1 of 3 — Smart Data & Digital Rule of Law Series:

  1. Part 1: Creating Digital Rule of Law
  2. Part 2: Smart Data — A Brief Timeline of Intelligent Technology
  3. Part 3: Open Source Architecture for Smart Data and Digital Rights

This is the first part in a three-part series about data and digital civil rights. We’re focusing on how humanity can personalize the technology around us by tailoring the uses of our data, even after it’s been shared. This requires creating a standard that binds digital contracts to our increasingly mobile data and enforces them. Recent advancements in cryptographic techniques like Smart Contracts, Zero-Knowledge Proofs, and Secure Multi-Party Computation finally provide workable solutions to this problem.
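To make that concrete, here is a minimal sketch, assuming nothing beyond ordinary hashing and keyed signing, of what “binding a digital contract to data” could mean in code: the data’s fingerprint and a machine-readable usage contract are signed together, so the two cannot be separated or quietly altered. Every function and field name here is hypothetical and illustrative, not part of any existing standard; a real design would use public-key signatures and the richer policy machinery discussed later in this series.

```python
import hashlib
import hmac
import json

# Illustrative sketch only: a usage contract bound to a piece of data by
# hashing the data and signing (data_hash, contract) together. All names
# and fields here are hypothetical, not from any existing standard.

SIGNING_KEY = b"issuer-secret"  # stand-in for a real signing key

def bind_contract(data: bytes, contract: dict) -> dict:
    """Return an envelope that couples the data's hash to a usage contract."""
    data_hash = hashlib.sha256(data).hexdigest()
    payload = json.dumps({"data_hash": data_hash, "contract": contract},
                         sort_keys=True).encode()
    signature = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"data_hash": data_hash, "contract": contract, "signature": signature}

def verify_binding(data: bytes, envelope: dict) -> bool:
    """Check that the data and its contract have not been separated or altered."""
    payload = json.dumps({"data_hash": envelope["data_hash"],
                          "contract": envelope["contract"]},
                         sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hashlib.sha256(data).hexdigest() == envelope["data_hash"]
            and hmac.compare_digest(expected, envelope["signature"]))

record = b'{"email": "alice@example.com"}'
envelope = bind_contract(record, {"purpose": ["billing"], "retention_days": 30})
print(verify_binding(record, envelope))  # True
```

Nothing in this sketch stops a bad actor from simply ignoring the contract; the rest of this series is about the standards and cryptography needed to make such bindings enforceable rather than advisory.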

What is Rule of Law?

John Adams: “A republic is a government of laws, and not of men.”

John Adams’s words pointed toward a fundamental proposition at the core of America as a new nation: that nobody is above the law and the law applies equally to everyone. At a minimum, that’s the big idea every generation tries to achieve and sustain.

America’s founders had just emerged from the rule of King George III. They sought to establish a nation governed by the Rule of Law, carefully designed to replace the need for a monarchy to maintain order and peace, and to avoid the arbitrary and corruptible rule of a powerful few, often under the control of an authoritarian God/King/Ruler.

Digital & Data Lawlessness

After 30 years of commercial Internet development, we’ve created a global digital marketplace powered by capitalist free-market ideas but paradoxically paired with lawless data flows.

Early Internet pioneers understood the seriousness of this problem. Between 1997 and 2012, the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF) worked collaboratively through many interrelated efforts to create control architectures for our shared data. Despite this extended effort, we did not adopt any standard that coupled our data with digital contracts constraining its usage.

We can sign paper agreements, select terms-of-use checkboxes, and pass legislation, but none of these actions directly impacts or automatically alters any actual data transactions. Organizations must reactively audit and manually remediate an exponentially growing ocean of data for compliance, which is simply impossible.

Digital Capitalism Requires Digital Rule of Law

Creating and sustaining a market economy as a free people and free society requires upholding The Rule of Law. A digital market economy requires Digital Rule of Law.

The ‘Rule of Law’ is an indispensable foundation for a market economy, which provides an essential environment for the creation and preservation of wealth, economic security, and well-being, and the improvement of the quality of life. [2 Penn State Law]

Capitalism requires each generation to refine The Rule of Law within their evolving social and technological landscape. By all practical measures, we are failing to do this today. This failure is the underlying root cause of many of our most difficult technology and data problems.

Defining Digital Rule of Law

Digital Rule of Law extends the ideas of Rule of Law into our digital landscape.

Let’s explore what this might look like:

  1. Rule of Law is a set of principles, or ideals, for ensuring an orderly and just society. Where all laws and contracts apply to everyone equally, there are clear and fair processes for enforcing laws and contracts, there is an independent judiciary, and human rights are guaranteed for all. [13 American Bar Association]
  2. Digital Rule of Law is a set of principles, or ideals, for ensuring an orderly and just digital society. Where all laws and contracts apply to everyone equally and automatically, there are clear and fair automated processes for enforcing laws and contracts, there is an independent judiciary, and human digital rights are guaranteed for all.

We’re describing Algorithmic Law, which must be transparent, auditable, self-explanatory, unbiased, and ultimately highly trustworthy. We are specifically NOT describing AI-powered platforms that are opaque, unaccountable, unexplainable, biased, and largely untrusted.

Digital Rule of Law does not replace our existing legal system. It automates the compliance of our data with existing laws and contracts.

While regulatory rights like GDPR, CCPA, and CPRA are essential to preserve digital rights, regulations are simply not enough. We also need technology standards that protect our rights.

Unadopted Internet Standards Efforts

Most people believe we cannot control our data after it’s been shared. This is because Big-tech, which funds the academic institutions that run our most important standards bodies, interfered with standards groups working to bind digital contracts to data. Not because binding digital contracts to data “cannot be done”, but because Big-tech “doesn’t want it to be done”.

Our Big-tech companies are Data Monopolies, which were carefully designed through deliberate standards body manipulation.

The current Data Monopolies overwhelmingly fund our most important standards bodies, which allows them to police what gets funded, studied, and eventually adopted as global standards. The technology to control our data with digital contracts was deliberately not funded, studied, or adopted as a standard.

By 1997, Internet Standards Groups were already working on privacy and data control-related standards. Between 1997 and 2012, various academic institutions, in collaboration with the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF), worked through many interrelated efforts to create control architectures with digital contracts for our shared data.

Let’s review the four most important projects:

Platform for Privacy Preferences (1997 to 2006):

P3P was our first attempt to standardize privacy choice and consent. It would have enabled websites to express their privacy practices to users in a standard format that could be easily understood. P3P-enabled browsers would have allowed users to set predefined preferences and would have informed them when a site’s privacy practices deviated from those preferences. [3 P3P: The Platform for Privacy Preferences]
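As a rough illustration of the P3P idea, and emphatically not the real P3P vocabulary or its preference language, here is a small sketch of a browser comparing a site’s declared practices against a user’s predefined preferences and flagging the deviations; every field name is invented for brevity.

```python
# Simplified illustration of the P3P idea (not the actual P3P vocabulary):
# a browser compares a site's declared practices against the user's
# predefined preferences and warns on any deviation.

site_policy = {
    "purpose": {"current", "admin", "tailoring", "telemarketing"},
    "retention": "indefinitely",
    "shares_with_third_parties": True,
}

user_preferences = {
    "allowed_purposes": {"current", "admin"},
    "max_retention": "stated-purpose",
    "allow_third_party_sharing": False,
}

def check_policy(policy: dict, prefs: dict) -> list[str]:
    """Return the ways a site's declared practices deviate from the user's preferences."""
    warnings = []
    extra = policy["purpose"] - prefs["allowed_purposes"]
    if extra:
        warnings.append(f"data used for undesired purposes: {sorted(extra)}")
    if policy["retention"] != prefs["max_retention"]:
        warnings.append(f"retention '{policy['retention']}' exceeds preference")
    if policy["shares_with_third_parties"] and not prefs["allow_third_party_sharing"]:
        warnings.append("data shared with third parties")
    return warnings

for warning in check_policy(site_policy, user_preferences):
    print("P3P-style warning:", warning)
```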

Policy Aware Web (2005 to 2006):

As early as 2005, there were the beginnings of a workable solution. The Policy Aware Web (PAW) project would have let information owners publish declarative access policies, giving them greater control over information sharing through rule-based discretionary access control.

PAW combined data access policies that constrain data with a Proof-Capable Reasoner that can answer questions about data instead of allowing direct read access to it, meaning we could control the questions asked of our data. It was a promising idea that has since evolved into the Zero-Knowledge Proofs now used in many blockchain and privacy systems. [4 Policy Aware Web]
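Here is an illustrative sketch of that core move, using made-up record, rule, and question names rather than PAW’s actual languages: the data holder answers approved questions and returns a small proof trace, instead of granting read access to the underlying record.

```python
from datetime import date

# Illustrative sketch of the Policy Aware Web idea (hypothetical names, not
# the real PAW rule language): a reasoner answers approved questions about a
# record and returns a proof trace, never the raw data itself.

RECORD = {"name": "Alice", "birth_date": date(1990, 6, 1)}

APPROVED_QUESTIONS = {"is_over_18"}

def ask(question: str) -> dict:
    """Answer an approved question about the record without exposing the record."""
    if question not in APPROVED_QUESTIONS:
        return {"answer": None, "proof": f"question '{question}' not permitted by policy"}
    # Only one approved question in this toy example.
    age_days = (date.today() - RECORD["birth_date"]).days
    return {
        "answer": age_days >= 18 * 365,
        "proof": "rule: is_over_18 := (today - birth_date) >= 18 years; raw birth_date withheld",
    }

print(ask("is_over_18"))       # answer plus proof trace, no birth date disclosed
print(ask("read_birth_date"))  # refused by policy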

Transparent Accountable Datamining Initiative (2006 to 2012):

The TAMI Project attempted to create the technical, legal, and policy foundations for transparency and accountability in large-scale aggregation and inferencing across heterogeneous information systems. The project would have provided precise rule languages, or contracts, for expressing policy constraints, along with reasoning engines able to describe the results they produce. [5 Transparent Accountable Datamining Initiative]

TAMI added several exciting elements to the Policy Aware Web project:

  1. We see a repeat of the Proof-Capable Reasoner idea, enabling questions to be asked of data.
  2. We see policies generalized to include contracts and regulations with “The Law Cloud.”
  3. We see data from across the entire web brought within the scope of policy controls.
  4. We see a proof explanation for auditors and an intended-use reasoner for analysts, mirroring early GDPR thinking around intended-use enforcement (sketched below).
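A minimal sketch of what a TAMI-style intended-use check might look like, with a hypothetical policy format rather than TAMI’s actual rule languages: the reasoner evaluates a proposed use of data against every applicable policy and emits an explanation an auditor can read.

```python
# Illustrative sketch of a TAMI-style intended-use check (hypothetical rule
# format): evaluate a proposed use of data against all applicable policies
# and produce an auditor-readable explanation of the decision.

POLICIES = [
    {"id": "contract-42", "allows": {"billing", "fraud-detection"}},
    {"id": "gdpr-purpose-limitation", "allows": {"billing", "fraud-detection", "support"}},
]

def evaluate_use(proposed_use: str) -> dict:
    """Allow a use only if every applicable policy permits it, and explain why."""
    explanation = []
    allowed = True
    for policy in POLICIES:
        if proposed_use in policy["allows"]:
            explanation.append(f"{policy['id']}: permits '{proposed_use}'")
        else:
            allowed = False
            explanation.append(f"{policy['id']}: does NOT permit '{proposed_use}'")
    return {"use": proposed_use, "allowed": allowed, "explanation": explanation}

print(evaluate_use("billing"))      # permitted by every policy, with reasons
print(evaluate_use("advertising"))  # denied, with the policies that blocked it
```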

The EnCoRe Project — Ensuring Consent and Revocation (2009 to 2012):

In parallel, another group was tackling the technology standards to control data as it moves between organizations.

EnCoRe proposed an architecture where encrypted personal data, with a machine-readable policy attached, can only be decrypted and read by entities that abide by the policy rules. They explored a technique called “data tagging” or “sticky policies,” where a user’s personal information is labeled or tagged with instructions or preferences specifying how service providers should treat their data. This effort envisioned “sticky policies” automatically enforcing preferences as personal data flows across multiple parties. Here we see the first effort to control the uses of data as it moves: data must carry its own instructions, with some capacity to protect itself from inappropriate access and use across network and organizational boundaries. [6 The EnCoRe Project]
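To illustrate the sticky-policy idea, here is a small sketch in the spirit of EnCoRe rather than its actual architecture: the personal data travels encrypted with its policy attached, and the decryption key is released only for a declared purpose the policy permits. The Fernet cipher (from the third-party `cryptography` package) and all package fields are stand-ins chosen for brevity.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Illustrative "sticky policy" sketch in the spirit of EnCoRe (not its actual
# architecture): personal data travels encrypted with a machine-readable
# policy attached; the key is released only for purposes the policy permits.

def make_sticky_package(personal_data: bytes, policy: dict) -> tuple[dict, bytes]:
    """Encrypt the data and attach its policy; the key would live with a trust authority."""
    key = Fernet.generate_key()
    package = {"policy": policy, "ciphertext": Fernet(key).encrypt(personal_data)}
    return package, key  # in practice the key would be escrowed, not handed back

def request_access(package: dict, key: bytes, declared_purpose: str) -> bytes | None:
    """Release the plaintext only if the declared purpose is allowed by the sticky policy."""
    if declared_purpose not in package["policy"]["allowed_purposes"]:
        return None  # the policy travels with the data and keeps refusing downstream
    return Fernet(key).decrypt(package["ciphertext"])

package, key = make_sticky_package(
    b"alice@example.com", {"allowed_purposes": ["order-fulfilment"], "revocable": True}
)
print(request_access(package, key, "order-fulfilment"))  # b'alice@example.com'
print(request_access(package, key, "marketing"))         # None: use not permitted
```

A production system would also need key escrow, revocation, and some attestation that recipients actually honor the policy, which is exactly the ground EnCoRe explored.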

What’s The Deal With 2012?

By 2012, all of these standardization efforts had ended without being adopted, several at the same time.

A decade of focused work on multiple data control technologies by early Internet thought leaders, who predicted our future, explained the risks clearly at the time, and worked extensively to mitigate them, all ended without adoption by 2012.

Here are a few interesting contributing events:

  1. GDPR: The EU took decisive political action on January 25, 2012, releasing their highly anticipated draft proposal for the EU General Data Protection Regulation (GDPR). This effort shifted our focus from technologies and standards, as had been the case since 1997, to a regulatory remediation approach. [7 GDPR]
  2. Social Platforms: Facebook’s initial public offering came on May 17, 2012, at a share price of US$38. The company was valued at $104 billion, the largest valuation for a newly listed public company to that date. The IPO raised $16 billion, the third-largest in U.S. history. [8 Facebook]
  3. The AI Arms Race: By 2011, fueled by the convergence of “Big-Data” and “Cloud Computing,” it was well understood that the race for AI Supremacy would be won by the few organizations that successfully monopolize the most data over the most time.
  4. In 2017, The Economist published a report titled “The world’s most valuable resource is no longer oil, but data,” formally announcing the rise and near-total dominance of the modern Data Monopoly. [9 The Economist]

Perhaps the proposed solutions were immature, or perhaps commercial interests simply prevailed over human interests. Either way, commercial interests exercised control of key Internet Standards Groups to ensure profits were prioritized over transparency and accountability. By 2012, it was too late for standardization efforts to curb the emergent data abuses that would culminate by 2021 as a widely recognized existential threat to democracy and the very ideas of self-governance and free will. (Note: this was written a year before January 6th.)

The Consequences are Everywhere

Because we’re missing technology standards to control the uses of our data, Gartner predicts that by 2023 the individual activities of 40% of the global population will be tracked digitally to influence our beliefs and our behaviors. That’s more than 3 billion people being targeted for the gradual alteration of their beliefs and behaviors without their awareness or consent.

Gartner Research coined and popularized the phrase Internet of Behaviors (IoB) to capture the essence of this pervasive surveillance activity, concluding:

The Internet of behaviors (IoB) will challenge “what it means to be human in the digital world”. [10 Gartner]

This results in increasingly powerful Data Monopolies powered by unaccountable AIs. AIs fueled by massive amounts of user data. AIs that create addictive digital experiences, reality-altering media, pervasive surveillance, invisible bias, polarization and extremism, driven by oceans of user data iteratively optimized and eventually monetized into manipulated behaviors. Manipulation carefully crafted by the highest bidders. While largely invisible to the average individual, these horrors are simply the default price for free, convenient, and personalized digital experiences.

Our personal data is the blueprint of our individual reality, without which technology cannot interoperate with humanity nor humanity with technology. Our data are both the outputs that control the technology around us and the inputs for the slow manipulation of our beliefs and behaviors. Over time, our collective data also becomes the input for the future “guided manipulation” of entire societies across many generations.

As Aldous Huxley, author of ‘Brave New World’, pointed out in his 1962 speech at Berkeley, titled The Ultimate Revolution [11 Aldous Huxley]:

“…we are faced, I think, with the approach of what may be called the ultimate revolution, the final revolution, where a man can act directly on the mind-body of his fellows… we are in process of developing a whole series of techniques which will enable the controlling oligarchy to get people to love their servitude… this will be done not by using violence, but by creating a prison of the mind using the combined tactics of Pavlovian conditioning and propaganda on a population made more malleable… by direct electrical stimulation of the mind by machines.”

Google surprisingly presents this very idea as the core purpose for their organization, and for our collective personal data, in a leaked video intended only for Google employees, titled The Selfish Ledger. [12 Google]

Global Democracy at Risk

Without the foundation of Digital Rule of Law, we will remain paralyzed in the current data lawlessness paradigm. Director Shalini Kantayya’s new film ‘Coded Bias’ sheds light on the urgent threats that machine learning and intelligent technology pose to individual freedoms and democracy, concluding:

“Data rights are civil rights”. [14 Shalini Kantayya]

In a hearing before the Senate Judiciary Subcommittee on Privacy, Technology and the Law, Tristan Harris, a former industry executive who became a data ethicist and now runs the Center for Humane Technology, told the committee:

“No matter what steps Data Monopolies took, their core business would still depend on steering users into individual “rabbit holes of reality.” Their business model is to create a society that’s addicted, outraged, polarized, performative and disinformed.” [15 Tristan Harris]

Joan Donovan, the research director at the Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy, said:

“The cost of doing nothing is nothing short of democracy’s end.” [15 Joan Donovan]

If we fail to sort out data rights and digital civil rights, none of our other rights as free self-governing people will survive. The unstoppable transition into an age of merged and pervasive digital technologies will slowly and fundamentally alter what it means to be human. We will lose this final revolution and our free will as free-thinking people.

Controlling Data May Be The Only Solution

An international group of researchers warned of the potential risks of creating overly powerful and standalone software. Using a series of theoretical calculations, the scientists explored how artificial intelligence could be kept in check, and they concluded it was impossible, according to a study published in the Journal of Artificial Intelligence Research titled Superintelligence Cannot be Contained: Lessons from Computability Theory.

The scientists examined two ways to control artificial intelligence. The first was to design a “theoretical containment algorithm” to ensure that an artificial intelligence “cannot harm people under any circumstances.” The team concluded that no such algorithm could ever be created. [16 Journal of Artificial Intelligence Research]
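For readers curious about the shape of that result, here is a compressed, informal reconstruction of the kind of computability argument the paper’s title points to; this is a paraphrase of the classic halting-problem reduction, not the authors’ exact formalization.

```latex
% Informal reconstruction of a halting-problem-style containment argument.
\begin{itemize}
  \item Assume a containment procedure $\mathit{Contain}(R, D)$ exists that always
        halts and returns \textsc{true} exactly when running program $R$ on input $D$
        would harm humans.
  \item For any program $R$ and input $D$, construct a new program
        $\mathit{HarmIfHalts}_{R,D}$: ``simulate $R(D)$; if the simulation finishes,
        perform a harmful action.''
  \item Then $\mathit{Contain}(\mathit{HarmIfHalts}_{R,D}, \varnothing)$ returns
        \textsc{true} if and only if $R(D)$ halts.
  \item A total $\mathit{Contain}$ would therefore decide the halting problem, which
        Turing proved undecidable, so no such containment algorithm can exist.
\end{itemize}
```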

The other option was to isolate the AI from the Internet and other devices, limiting its contact and data interactions with the outside world. They concluded this would significantly reduce its ability to perform its designed functions, because it limits access to the data inputs the AI requires to train and to interoperate with humanity. Since a “theoretical containment algorithm” cannot be made, our only option is tailoring our connections to all technologies by isolating them from our data.

Accomplishing this requires a defendable, explicit, informed, consensual, eternal, and unlimited right to control the uses of our accumulated data.

What’s Next?

The next two articles explore programmatic themes in technology over the last two decades culminating with the idea of Smart Everything. This shows us a number of interesting repeatable themes that apply to Smart Data. Our final article proposes a high-level architecture within which we can begin to create Digital Rule of Law as an essential foundation for a free digital society.

Want to help?

  1. Join the Smart Data Ecosystem and help create Digital Rule of Law to preserve all human rights.
  2. Follow Data Freedom Foundation and Accesr on social media and follow me on Medium. Data Freedom Foundation is on LinkedIn, Twitter, Facebook and YouTube. Accesr is on LinkedIn, Twitter, Facebook and YouTube.
  3. Contact us to get involved — we have many open roles and stock options for early supporters.

Part 1 of 3 — Smart Data & Digital Rule of Law Series:

  1. Part 1: Creating Digital Rule of Law
  2. Part 2: Smart Data — A Brief Timeline of Intelligent Technology
  3. Part 3: Open Source Architecture for Smart Data and Digital Rights

Alan Rodriguez is an accomplished digital leader, startup founder, and patent author with a passion for innovation, strategy and emergent digital business models. He’s available to tailor an IP and digital strategy for a few select organizations.

alanrodriguez@accesr.com

References:

  1. Renew Democracy Initiative, Defining Democracy and Rule of Law https://rdi.org/defining-democracy/2020/5/22/defining-democracy-rule-of-law/
  2. Penn State Law, International Rule of Law and the Market Economy https://elibrary.law.psu.edu/cgi/viewcontent.cgi?article=1161&context=fac_works#:~:text=Law%20matters%20in%20economic%20development,of%20the%20quality%20of%20life.
  3. P3P: The Platform for Privacy Preferences https://www.w3.org/P3P/
  4. Policy Aware Web http://www.policyawareweb.org/
  5. Transparent Accountable Datamining Initiative http://dig.csail.mit.edu/TAMI/
  6. Privacy Enhancing Technologies, A Review of Tools and Techniques https://www.priv.gc.ca/en/opc-actions-and-decisions/research/explore-privacy-research/2017/pet_201711/
  7. General Data Protection Regulation (GDPR) https://en.wikipedia.org/wiki/General_Data_Protection_Regulation
  8. Facebook https://en.wikipedia.org/wiki/Facebook
  9. The Economist, The world’s most valuable resource is no longer oil, but data https://www.economist.com/leaders/2017/05/06/the-worlds-most-valuable-resource-is-no-longer-oil-but-data
  10. Gartner, Gartner Unveils Top Predictions for IT Organizations https://www.gartner.com/en/newsroom/press-releases/2019-22-10-gartner-unveils-top-predictions-for-it-organizations-and-users-in-2020-and-beyond
  11. Aldous Huxley, The Ultimate Revolution ‘Brave New World’ (Berkeley Speech 1962) https://www.youtube.com/watch?v=2WaUkZXKA30
  12. Accesr Commentary of Google’s Selfish Ledger Video, https://youtu.be/KINIJ-xYOQw
  13. American Bar Association, What is the Rule of Law? https://www.americanbar.org/groups/public_education/resources/rule-of-law/
  14. Shalini Kantayya, Director of ‘Coded Bias’ https://hai.stanford.edu/news/coded-bias-director-shalini-kantayya-solving-facial-recognitions-serious-flaws
  15. Regulate Social Media or Risk ‘Democracy’s End’ Researchers Tell Senators https://www.pennlive.com/nation-world/2021/04/regulate-social-media-more-or-risk-democracys-end-researchers-tell-senators.html
  16. Journal of Artificial Intelligence Research, Superintelligence Cannot be Contained: Lessons from Computability Theory https://jair.org/index.php/jair/article/view/12202
