Protecting the Right to Know and Legal Innovation

Topics: Future Law | Legal Analytics | Legal Tech

A new French law threatens to send the country ten steps backward in the continued movement to advance access to justice and the Right to Know.

Article 33 of the Justice Reform Act, made effective in March but only recently publicized, criminalizes the use of data analytics to assess and predict patterns in judicial decisions. The law also forbids any reference to judges’ identities in publicly available court data, stating that “[t]he identity data of magistrates and members of the judiciary cannot be reused with the purpose or effect of evaluating, analysing, comparing or predicting their actual or alleged professional practices.” Violation of the law is punishable by up to five years in prison.

The law marks a preposterous slide into the dark ages of legal innovation and access to justice – a remnant of a time when the law was sealed and concealed, and public officials hid their deeds and misdeeds from the public. While it is difficult to overstate the problems this law poses, there are ultimately three:

  • First and foremost, public accountability and judicial scrutiny are fundamental pillars of any justice system, and removing judges’ names from their opinions and from judicial analytics would invite less accountability and scrutiny, undermining the completeness of the appellate record.
  • Second, the French government’s justifications for the law simply don’t come close to outweighing the importance of furthering legal innovation and progress, which is impossible without unfettered access to court data (including the identity of judicial officials).
  • Finally, the criminal penalties the law imposes are unfounded and unduly harsh, and will chill continued innovation in the legal field.

Judicial names should not be deemed PII when contained in public records.

Judges are public officials and by virtue of their positions, they naturally cast themselves into the public eye. As such, they should expect their identities to be known and used. Unless a case is sealed, there is simply no reason to redact or remove a judge’s name from a record.

As privacy lawyer Lily Li recently wrote in a guest article for our blog, the availability of court data should not be governed by traditional privacy rules. “The right of access to court records ensures accountability in the legal system,” Li writes. “With this right of access, we can see the reasoning behind court opinions, understand the implications of new laws, and report and comment on biases and power imbalances within the courtroom. Without this right, we lose our ability to place checks on judicial power.” She goes on to say that “[a]t its core, the rule of law requires transparency to function, and for this transparency to exist, there must be some sacrifice of personal privacy.”

Including judges’ identities in predictive analytics is critical to legal innovation, and to fostering the accountability and transparency that are so desperately needed among legal officials. Through increased transparency, we can see the holes in our system and work to repair them, rather than continuing to shroud them in mystery.

An interest in promoting innovation should always prevail.

According to the French government and proponents of the law, Article 33 is intended to increase transparency among judicial officials by promoting easy public access to court decisions. It seems, though, that the law has the opposite effect. The policy motivations undergirding the law involve judicial officials’ desire for increased privacy: Judges no longer want attorneys to profile them based on their past decisions and use that information to forum shop.

Notwithstanding the fact that courts can and do use technology to randomize the assignment of judges to alleviate judge shopping, these justifications fall far short of their consequences. They don’t outweigh the importance of promoting legal innovation and progress, which depends on unfettered access to all court data – including the names of the officials who render landmark decisions that affect all citizens. Not to mention, casting a public spotlight on judges who make dispositive rulings, and subjecting those rulings to scrutiny, is a necessary part of the judicial process.

The beating heart of democratic societies is the freedom to openly discuss, debate, and scrutinize the law in a way that can catalyze positive change. We simply cannot do this if those who are granted power to administer justice are concealed behind a proverbial curtain like the Great and Powerful Oz.

French lawyers defending the law argue that consumers and tech companies can still employ predictive analytics while omitting the identity of the judge who rendered the decision. But is this really possible? Isn’t a judge’s identity critical to predicting case outcomes? Consider, for example, SCOTUS. The identities of our particular justices – and the various religious, political, and personal backgrounds that undeniably impact their rulings – are widely known. Within a Court that now holds a 5-4 conservative majority, we can generally sense how certain decisions (which become the ultimate Law of the Land) will be resolved. Moreover, we can generally predict how each individual justice will rule, given his or her track record in past decisions. Omitting a judge’s identity from predictive analytics essentially strips the analytics of any meaningful utility.
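To make the point concrete, here is a minimal sketch using entirely hypothetical case records (the judge names, case types, and outcomes are invented for illustration). A simple majority-vote predictor scores perfectly when it can condition on the judge, but degrades when the judge’s identity is stripped out:

```python
from collections import Counter, defaultdict

# Hypothetical case records: (judge, case_type, outcome).
# In this toy data, the outcome is driven by who the judge is.
cases = [
    ("Judge A", "contract", "plaintiff"),
    ("Judge A", "tort",     "plaintiff"),
    ("Judge A", "contract", "plaintiff"),
    ("Judge B", "contract", "defendant"),
    ("Judge B", "tort",     "defendant"),
    ("Judge B", "tort",     "defendant"),
]

def accuracy(feature_index):
    """Predict the majority outcome observed for each feature value,
    and score that predictor against the same records."""
    by_value = defaultdict(Counter)
    for record in cases:
        by_value[record[feature_index]][record[2]] += 1
    correct = 0
    for record in cases:
        prediction = by_value[record[feature_index]].most_common(1)[0][0]
        correct += prediction == record[2]
    return correct / len(cases)

print(accuracy(0))  # conditioning on the judge: 1.0
print(accuracy(1))  # conditioning on case type only: noticeably lower
```

The sketch is deliberately simplistic, but the dynamic scales: when rulings correlate strongly with the judge, deleting the judge feature removes most of the predictive signal.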

The criminal penalties are unfounded and will stifle innovation.

Simply stated, criminalizing the use of data analytics to assess and predict judicial decisions is preposterous. As proponents of open access to data, we vehemently oppose imposing criminal liability on members of the legal tech community who are dedicated to improving access to justice and developing the tools necessary to foster innovation in the legal profession. Fear of these penalties will frustrate the continued push toward making data freely accessible. Tech companies now face penalties for providing specific types of services to lawyers in France – and whether this law will take root and other EU nations will follow suit remains to be seen.

What’s even more disturbing is that France’s national bar association, the Conseil National des Barreaux (CNB), is also lobbying for the passage of a new law criminalizing attorney analytics along with judicial analytics. CNB recently adopted a resolution formalizing their opposition to attorney analytics, stating that, “It being specified that the identity data of judges and members of the court cannot be reused with the purpose or effect of assessing, analysing, comparing or predicting their actual or perceived professional practices, [we] demand that identical treatment be reserved for identity data of lawyers in the context of the dissemination of court decisions in open data.”

Considered in conjunction with the General Data Protection Regulation (GDPR), Article 33 of the Justice Reform Act and CNB’s formal resolution represent a downward spiral – a continuing trend across Europe of placing the privacy rights of individuals over the public’s right to know and access information. As we have argued previously, GDPR’s “right to be forgotten” is already acting as both sword and shield, allowing criminals and fraudsters to remove or obscure court records from the websites of legal technology companies without regard for the public’s right to know. In general, Europe has “thrown the baby out with the bathwater” when it comes to privacy, and Article 33’s practical result is not only to destroy accountability and transparency within the judiciary, but also to criminalize legal innovation.

Revitalizing the Access to Justice Movement

While the French law represents a disturbing step backwards, we at UniCourt are proud to work against this type of retrograde justice by continuing to increase access to justice and the public’s right to know, understand, and use the law. Our mission is to promote open, unfettered access to court data, and we’ve been fortunate to work with other legal tech pioneers to make this a reality across the United States. From our partnership with Justia, to joining other legal tech innovators as an Amicus to improve access to PACER data and protect the right to know and publish the law, UniCourt is committed to improving access to the law.

If you’re interested in learning more about UniCourt or joining our PACER Collective to make legal data more accessible, we’d love for you to contact us.