Artificially Intelligent

Any mimicry distinguishable from the original is insufficiently advanced.

These Legal Systems Do Not Exist


Inspired by Legal Systems Very Different From Ours, Because I Just Made Them Up

I

The people of Edent do not know if they are good. This is not unusual; many people do not know if they are good people or not. However, the people of Edent also believe in human nature, so it would be more accurate to say that they do not know whether humanity is good.

Luckily, the people of Edent are all evidential decision theorists. Instead of asking themselves “what should I do”, they act based on the question “what would I be happy to hear that I did?”. In answering this question, they reason: “if I did something bad, then that would mean human nature is bad. If human nature is bad, then Edent will soon collapse. I do not want this, so I will do something good.”

In this way, the society of Edent sustains itself. All are afraid of finding out human nature is bad, and so all act in ways that are good. No one harms another because they do not want to find out that human nature permits people to harm other people. No one steals from another because they do not want to learn that human nature permits theft. All contribute towards common goods because none want to find out that human nature would take from the commons without replacement. Edent collapsed 3 days after it made contact with the outside world.

II

The people of Kyoska believe in the idealized justice sometimes attributed to a blind philosopher king. They seek to create a perfectly just society, not by eliminating all conflict, but by constructing a perfectly just process of conflict resolution.

When they were still towns and villages, they conducted court proceedings over written communication, removing all but the necessary information, which they packed into standardized formats. A murder scene full of life, color, and bias would be stripped of all character, converted into standardized units, documents, and forms.

Where biases might remain in the leftover degrees of freedom, they sought to counteract them with an equal and opposite counter-bias. If the height of the alleged victim had to be revealed, the height of the alleged perpetrator would be fabricated to counteract it. If this was not possible, the effect size of the bias would be pinned down and the decision of the court adjusted accordingly.

The Kyoska have since acquired advanced technology. They do not yet have a blind philosopher king, but they are close. Now, when a person is alleged to have harmed another, they resolve the conflict by forming a philosopher together, pooling all memories, instincts, and intuitions from both people. This philosopher, although more knowledgeable than either of its parts, is ignorant of who it is, and thus blind. After the blind philosopher decides how the conflict should be resolved, memories of the decision are transmitted back to the original people.

III

The people of Metaculus do not have juries, but they do have superforecasters. If a citizen of Metaculus is alleged to have violated a law, a superforecaster is assigned to predict the result of a trial, conditional upon its existence. Each potential crime carries with it a threshold, decided by the state as a proxy for that law’s importance. If the forecaster’s prediction does not exceed this threshold of certainty, the case is given to further forecasters. An ensemble algorithm takes the individual predictions and adjusts the certainty accordingly. If six forecasters have been consulted and the certainty threshold has still not been reached, then additional investigation is conducted. Legend tells of a case 14 years ago that remained uncertain for so long that it went to trial.

Forecaster calibration is constantly monitored. Each forecaster is given a stream of real cases and training cases in proportion to their historical accuracy. In this way, cases are given preferentially to accurate forecasters. Metaculus has high standards; if a forecaster predicts poorly on a single training case, they are assigned zero real cases for a month. A poor prediction on a second case grants them a year of training. A third grants them a lifetime in prison.

One day, an official noted that the number of forecasts a case required could itself be predicted. On that day, the meta-forecaster was born. The jurisdiction of the meta-forecaster expanded rapidly. The police’s search for evidence was soon aided by meta-predictions about predictions conditional on search locations. Forecaster calibration monitoring was supplemented by meta-predictions about calibration. System improvements were hastened by meta-predictions about the number of forecasters that would be required.

Another day, an official noted that the meta-forecaster implied the possibility of a meta-meta-forecaster. Meta-meta-forecasting was much the same as meta-forecasting; infinite regress was avoided. However, the scope of the forecasting apparatus grew until every citizen was in its employ.

There are no longer citizens who break laws. The original purpose of the forecasting apparatus has been lost. All that remains of Metaculus is a recursive tangle that is perfect at predicting itself, predicting predictions of itself, predicting predictions of predictions of itself, …