
ARTIFICIAL INTELLIGENCE AND LEGAL REASONING

Antonio A. Martino
University of Pisa
martino@dsp.unipi.it

The AI world views the legal world as a field of application, like medicine or
banking. It is a socially, politically and economically important field, comparable only
with medicine, and yet it has produced very little of actual use. A possible explanation
is that the theoretical implications of the legal world make an interesting contribution
to AI but, at the same time, slow down the rapid development of products.

AI enables us to verify, non-empirically, many theories and lines of research that until
recently could only be the subject of thought and, at the same time, it obliges us to
revise, in a careful but necessarily ingenuous way (in the original meaning of the term),
entire theories in various disciplines, because the very way of conceiving them has been
profoundly changed.

What I shall argue in this article is that, from the theoretical point of view, lawyers
have something to say about some areas of AI such as natural language, problem
solving, reasoning, knowledge representation and inference engines, and that this
theoretical training is more a disadvantage than an advantage in this still embryonic
stage of the practical results of AI.

To put it another way, in application domains where there is less theoretical
reflection on their own knowledge, results come more quickly and are more spectacular.
In the law, results come later because of this theoretical reflection; the difference is
that they will be of greater importance.

If this theoretical importance is real - as I maintain in this article - then it will not be
innocuous for AI, just as the verification of many legal theories in AI will not be
innocuous for the law.

In the law, alongside the so-called experts, there is an intermediary category, the
general theorist, who is familiar with the ways of thinking of knowledge engineering,
who does not easily accept the solutions placed before him, who tests them and
therefore slows down practical applications. We should keep in mind that there is a
general theory of law, and legal philosophy is taught in law faculties, while there is no
general theory of medicine. Lawyers think permanently about their knowledge, and they
know that the subject of their studies is made up of words, which are universally
quantified and belong to a world that has been delimited ab initio.

To support what I have argued so far, I shall deal with some topics that may help
convince you: a) legal reasoning as a model of controllable deductive reasoning
(because its variables are defined) or of inductive, analogical reasoning; b) familiarity
with reasoning where new sentences (norms) are continually added and also subtracted
(derogation); c) revision of beliefs; d) classification; e) interpretation of sentences;
f) the use of norms and the possibility of applying logic to them.

1. Legal Reasoning

The basic characteristic of legal reasoning is that inferences start from data that
include norms of behaviour, that is, sets of conditions that imply (produce) a normative
consequence, understood as the deontic modalization of a kind of behaviour or as a
sanction.

Once the use of norms is granted, and with it the fact that the relevant variables
of the cognitive world are delimited a priori, legal reasoning is deductive reasoning
starting from universal quantifications.

The initial distinction made by lawyers concerns the uses of legal reasoning:
heuristic, when dealing with the process aimed at finding a decision, and justificatory,
when dealing with all the arguments used to support a decision.

This clearly divides the study of how (and why) we make certain decisions from the
normative grounds for those decisions. As MacCormick argues, "to decide is not to
infer". However, it may be said that belief in certain normative premises, together with
certain fact situations, forces us, in a certain sense, to act in a certain way.

We know reasonably little about "how to make decisions" but quite a bit about the
grounds for decisions, and these may be reconstructed as deductive inferences. And
given that deductive inferences can be carried out by computational algorithms, we are
able to reconstruct legal reasoning through AI processes.
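
Purely as an illustration of this last point - nothing here is taken from actual
legislation, and every predicate name is invented - the following Python sketch shows
how the grounds for a decision can be reconstructed as a deductive, forward-chaining
inference over norms expressed as condition-consequence rules:

# A minimal sketch: norms as condition -> consequence rules, and the grounds
# for a decision reconstructed by forward chaining. All predicates are invented.

norms = [
    ({"contract_signed", "goods_delivered"}, "obligation_to_pay"),
    ({"obligation_to_pay", "payment_missing"}, "creditor_may_sue"),
]

def derive(facts, norms):
    """Return every consequence deducible from the facts and the norms."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, consequence in norms:
            if conditions <= derived and consequence not in derived:
                derived.add(consequence)
                changed = True
    return derived

case = {"contract_signed", "goods_delivered", "payment_missing"}
print(derive(case, norms))   # includes 'obligation_to_pay' and 'creditor_may_sue'

The point of the sketch is only that, once norms and facts are represented explicitly,
the justificatory side of legal reasoning can be run as an ordinary computation.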

But deductive reasoning is not the only kind. Analogical reasoning also appears in the
law: when does one case resemble another to the point of applying the same norms?
This should not fool us, however. Analogical reasoning allows two cases, or two types
of cases, to be compared precisely because there is a general model from which the
pertinent features, and no others, can be taken. All the difficulties and incompleteness
of induction, and likewise of analogy, appear in this kind of reasoning.

1.2 General Propositions as Premises

The protagonists of the story of knowledge representation are universal judgments.

The theory of the syllogism is, therefore, a theory of the logical consequences of
universal judgments and their negations, which are particular judgments.

Aristotle believed that there is no science of the singular and all those belonging
to Scholasticism thought that knowledge is knowledge of universal judgments. This
idea is still held today and the philosophy of science deals mainly with scientific laws.
But what are scientific laws? All modern authors identify them as universal
quantifications of conditional sentences.
The fact that all universals are analyzed as conditionals is the reason why the
conditional has been stressed, while emphasis has not been placed on quantification,
which remains tacit: this is universality.

What must we add if we wish to move from this situation into the legal domain?

It is a very natural thing for lawyers to work with universal judgments and, up to
this point, there is no difference from reasoning tout court. The difference lies in the
fact that legal reasoning must make reference to norms and that the norms are also
general. In fact, the generality of norms is a necessary condition of legislation, even in
the common law world, where the notion of precedent exists, given that a precedent
operates as such only when it can be raised to the level of generality.

Legal norms are conditional universal judgments in which a number of conditions
are expressed that, if verified, produce a legal consequence.
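
Schematically, and purely by way of illustration, such a norm can be written as a
universally quantified conditional whose consequent carries a deontic operator:

$$\forall x\,\bigl(C_1(x)\land C_2(x)\land\dots\land C_n(x)\rightarrow O\,B(x)\bigr)$$

where $C_1,\dots,C_n$ are the conditions, $B$ the regulated behaviour and $O$ the
deontic modality (obligation here; prohibition and permission are handled analogously).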

The inductive/deductive pair is perhaps somewhat daring as a way of representing the
richness of two of the greatest Western legal traditions, but it gives a strong idea of
what lies behind them.

Whenever lawyers and those in the legal profession think of a partial legal
system, however small, they think of it as a self-sufficient set with full capacity for
legally regulating any act or situation that belongs to that domain. Furthermore, they
always think of it as a consistent set or a set that can be reconstructed in a consistent
fashion.

All this brings the legal world extraordinarily close to the formal theories of sets
common to logic and computer science and enables us to make calculations and to
apply the laws of formal theories to the legal world.

Lawyers, however, are fully aware that the completeness of a system is an
accepted but illusory assumption. Anticipating Gödel's theorem, they know that in
order to complete a system we must appeal to something of a higher order, as G.
Jellinek says.

There is always a tendency in legal reasoning to maximize generality and
objectivity and, therefore, to considerably favour the idea of formalization. If we add
the independence of the notion of logical derivability from the notion of truth, it is easy
to see how Frege's metalinguistic sign can lead legal arguments into a strict calculus,
valid for logic and usable by a computer.

2. Monotonicity

Lawyers are accustomed to reasoning in terms of adding and eliminating norms
from their systems and to asking themselves what to do with the consequences that were
valid in the previous normative system.

Lawyers have always known that the consequences of a legal order may change
either because the norms have changed, or because the assumptions lying behind the
norms, which give them sense, have changed. Thus we can say that the French Civil
Code has undergone few amendments since World War II, yet French judges derive
very different consequences from it than their colleagues of the last century did.
The explicit premises have not changed, but the implicit premises have.
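
A hedged sketch of this non-monotonic behaviour, with the same toy forward-chaining
function used earlier (rule and predicate names are invented): removing a norm, as a
derogation does, changes the set of derivable consequences, which is exactly what a
monotonic logic never allows.

# Sketch: the same facts yield different consequences once a norm is
# derogated (removed). Rule and predicate names are invented.

def derive(facts, norms):
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, consequence in norms:
            if conditions <= derived and consequence not in derived:
                derived.add(consequence)
                changed = True
    return derived

old_system = [
    ({"import_of_goods"}, "customs_duty_due"),
    ({"customs_duty_due"}, "payment_enforceable"),
]
# Derogation: the duty-imposing norm is repealed.
new_system = [n for n in old_system if n[1] != "customs_duty_due"]

facts = {"import_of_goods"}
print(derive(facts, old_system))  # duty and enforceability are derivable
print(derive(facts, new_system))  # previously valid consequences are lost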

In this field there has been a considerable amount of research into reasoning carried
out under conditions of uncertainty, where it is impossible to know all the factual
conditions. David Ross, reacting to the strict conditions of the categorical imperative
(your action is valid only when it can be raised to a universal condition), proposed the
more mundane prima facie judgments, namely those that hold as long as the factual
conditions we know about do not change.

3. Terms and Classification

Lawyers are the first to understand the importance of the use of terms and their
relevance in classifying the universe.

The problem of classification is not limited to the legal world but belongs to the
empirical use of language or, in other words, to the old question of the relationship
between words and things.

The semantic rules of a natural language are usually metalinguistic but, in order to
avoid creating serious problems for the interpreter, lawyers try to make them as explicit
as possible, just as happens in the computer world.

Every so often some of the new authors on AI applied to the law make the same
mistake, even if from a more sophisticated point of view. It is possible to look at Hart's
classical example about no through traffic for vehicles and naively answer his question
about whether a scooter is a vehicle or not.

The initial naivety lies in assuming that, because every term has a nucleus of plain
meaning and then a shadowy zone, until we arrive at what the term clearly does not
mean in ordinary language, case law can be divided neatly into clear cases and
difficult cases.

A Latin saying maintains in claris non fit interpretatio, but lawyers know that this
can never be relied on: a norm is to be interpreted in a context, a context in a set, that
set in a tradition, and so on. It can never be said that anything is in claris.

The second naivety lies in believing that some methods of interpretation are more
suitable than others. Lawyers have proposed literal and teleological interpretations,
interpretations based on the best interests of the parties, functional interpretations and
even the functional failure to obey the law.

4. The Logic of Norms

This is perhaps the clearest and most exemplary issue: lawyers are familiar with
a type of reasoning that includes norms and puts the logic of ethics to the test.
Norms are expressed through sentences which do not have truth-values, though
lawyers, politicians, moralists and ordinary people discuss the mutual incompatibility
of certain norms; they deduce consequences from norms or maintain that such and such
a norm entails another norm.
In the well-known Handbook of Robotics, the three laws of robotics are enunciated: 1.
a robot may not harm a human being, nor may it allow, through its omission, a human
being to come to harm; 2. a robot must obey the orders given to it by human beings,
provided such orders do not violate the first law; 3. a robot must protect its own
existence, provided its self-defence does not conflict with the first or the second law.
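
As a purely illustrative sketch (nothing of the kind appears in the Handbook, and the
encoding is invented here), the mutual reference among the three laws can be modelled
as a priority ordering over obligations, where a lower-numbered law defeats a
conflicting higher-numbered one:

# Illustrative only: the three laws as prioritized obligations.
# An action is chosen so that the highest-priority law it violates
# is as low-ranking as possible (or none at all).

LAWS = {1: "do_not_harm_human", 2: "obey_human_order", 3: "protect_own_existence"}

def choose(action_effects):
    """action_effects maps each action to the set of law numbers it would violate."""
    def worst_violation(action):
        violated = action_effects[action]
        # Law 1 outranks law 2, which outranks law 3; violating nothing is best.
        return min(violated) if violated else float("inf")
    return max(action_effects, key=worst_violation)

# A robot ordered to do something that would injure a person:
effects = {
    "carry_out_order": {1},   # would violate the first law
    "refuse_order": {2},      # violates only the second law
}
print(choose(effects))        # -> 'refuse_order': law 1 prevails over law 2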
The enunciation of the laws is such that there is recursivity among them, because each
makes explicit reference to the others (in the manner of Łukasiewicz's definition of
sentences). The laws presuppose the possibility of instantiation in robot A, B or C.
All the enunciated laws are logical laws; how, however, can one apply logical laws to
sentences which do not have truth-values?
In 1937 Jørgensen expressed these difficulties through a dilemma: since norms are
neither true nor false, either 1. there are no logical relations between norms and all
attempts in that direction are vain, or 2. there exists a logic of norms, but then logic
goes beyond the realm of truth.
Along one line, that of Aristotle, Kant, the Wittgenstein of the Tractatus and Prior,
people maintain the traditional theory: it is possible to talk in terms of logical
consequence only starting from the truth of the components; this line ends up with
Tarski's concept of semantic consequence.
Another line of thought, which starts from Plato and runs through the Wittgenstein of
the Philosophical Investigations to Belnap, holds that the notion of consequence can be
drawn from a set (a context) and that only in a context do the components make sense.
The sense of the logical operators is then characterized through their use with
reference to a concept of consequence which may be expressed through a set of axioms,
independently of the notion of truth or falsity (something which Tarski had already
done before enunciating the notion of semantic consequence).
Then, since it is possible to reconstruct syntactically the notion of logical consequence
and, starting from it, the logical connectives and operators, a logic of norms is possible;
it is possible to characterize an operator such as Obligatory in a way very similar to
other unary operators, though all this goes beyond the normative world and touches
the problem of the foundations of logic tout court.
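For instance, the standard minimal deontic system KD - offered here only as an example
of such an axiomatic, truth-independent characterization, not as the calculus discussed
by Alchourrón and Martino - governs the operator O entirely by axiom schemes and a rule:

$$O(A\rightarrow B)\rightarrow(O A\rightarrow O B) \quad\text{(K)}$$
$$O A\rightarrow\neg O\neg A \quad\text{(D)}$$
$$\text{from }\vdash A\text{ infer }\vdash O A \quad\text{(necessitation)}$$

Nothing in these schemes requires the formulas under O to be evaluated as true or false;
derivability is fixed by the axioms alone.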
We can understand, for instance, that intuitionistic negation and classical negation
possess different properties and are therefore not two theories about negation, but two
different, non-competing operators with a different sense.
The most important point about what has been discussed here for computer
scientists is that we have illustrated a form of presentation of deontic logic which can
be constructed by using the same criteria as any other kind of sequential logic.
Furthermore, the deontic logic represented is decidable.
The more abstract and syntactical the logical rules are, the easier it is to computerize
them.
The philosophical prejudice of semantic priority occurs also in the computer world.
Even where work is merely being done with sentences within a context of derivation,
many people still think that they are actually "using" true and false values (35).
We believe that an abstract reformulation of the central notions of logic, instead of the
Tarski-type semantic approach, cannot help but encourage progress in informatics, since
semantic notions cannot be transmitted to the computer, only their syntactical
correlates. If compatibility, consistency and the other logical notions, as well as the
operators, are introduced in a completely syntactical way, automated calculation will
benefit by broadening its horizons.
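
As a toy illustration of what a purely syntactical treatment makes possible (the
representation of norms is invented here and far cruder than anything discussed above),
a consistency check can be run without ever assigning truth-values:

# Toy syntactic consistency check over norms written as (operator, action) pairs,
# with "O" for obligatory and "F" for forbidden. No truth-values are used:
# the check only inspects the syntactic form of the norms.

def inconsistent(norms):
    obligatory = {action for op, action in norms if op == "O"}
    forbidden = {action for op, action in norms if op == "F"}
    return sorted(obligatory & forbidden)

system = [("O", "file_tax_return"), ("F", "file_tax_return"), ("O", "pay_duty")]
print(inconsistent(system))   # -> ['file_tax_return']: obligatory and forbidden at once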
If what we propose is correct, it will, of course, result in new fields of logical calculi
opening up and in the possibility of establishing criteria of calculability and
automated decision-making, for example in deontic logic. All this should be carefully
taken into consideration by computer scientists, as it aims at widening the area of
application of their work, given that everything that is computable and decidable
should - broadly speaking - be able to be computerized.
5. Why the Concept of “Legislative Inflation” is Misleading

At the Symposium Rationality in Legislation it would be better, in the name of
rationality, to look at words (or syntagms, as in this case). The expression "legislative
inflation", one of the Symposium subjects widely accepted by the authors, is deceptive
and misleading. It derives from economics, which is the first reason why it is popular,
and it is very easy to understand: the ills that today afflict legislation have been caused
by its indiscriminate growth. But it is misleading. The worst illness of legislation in our
days is the absence of certainty, an absence that does not (or does not only) derive from
the presence of too many laws. Uncertainty in legislation is caused by a phenomenon
common to all formal systems: formal systems must be consistent, namely without
contradictions, complete, that is to say without gaps, and economical, that is, without
redundancies.
A legislative system consists of all the laws that have been created (1) minus those
that have been abrogated. It might seem easy to obtain the law actually in force by
removing from every body of legislation the laws that have been explicitly abrogated,
but this is not enough!
We read laws (2), independently of the formula "all norms that conflict with this
one are abrogated", in this way: a later law derogates from an earlier one and a
special law derogates from a general one. This is the problem of repeals by implication
(implicit derogations).
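
A minimal sketch of how such repeals by implication might be detected mechanically,
assuming each law carries a date and a marker of speciality and that conflicting pairs
have already been identified (every field and value below is invented for the example):

# Sketch: applying lex posterior and lex specialis to flag implicitly
# derogated laws. The data model is invented for illustration only.

laws = [
    {"id": "L1", "year": 1980, "special": False},
    {"id": "L2", "year": 1995, "special": False},
    {"id": "L3", "year": 1990, "special": True},
]
# Pairs of laws whose provisions conflict (the hard part, here assumed given).
conflicts = [("L1", "L2"), ("L1", "L3")]

def implicitly_derogated(laws, conflicts):
    by_id = {law["id"]: law for law in laws}
    out = set()
    for a, b in conflicts:
        la, lb = by_id[a], by_id[b]
        if la["special"] != lb["special"]:
            # lex specialis: the special law prevails over the general one
            out.add(a if lb["special"] else b)
        elif la["year"] != lb["year"]:
            # lex posterior: the later law prevails over the earlier one
            out.add(a if lb["year"] > la["year"] else b)
    return out

print(implicitly_derogated(laws, conflicts))   # -> {'L1'}: derogated by L2 and by L3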
If we do not remove the repeals by implication (implicit derogations), we do not
know which laws are in force, that is to say, the law is uncertain. Besides, in this field
quantity does not matter. For example, the Italian Region of Puglia has produced only
about 1000 laws, precisely 1037, but legal uncertainty remains because it is not known
which laws are in force, since it is not known which laws conflict. For this reason, the
Region issued a regional decree (n° 1 of 22 January 1998) in order to reorganize its
legislation. In this way it was possible to identify 95 expressly abrogated laws, while
190 laws could no longer be applied because their object was exhausted. There
remained 752 laws in force; of these, a further 126 can be challenged because they
conflict with rules in force, and 91 can be abrogated once they have been given a
definitive interpretation.
Many years ago I called this situation of formal uncertainty of the system "legislative
pollution", borrowing the expression from environmental science: a system becomes
"polluted" when it grows too much (inflation), and especially when it cannot remove its
dregs, namely the repeals by implication.
I also indicated how to remedy this situation using computer programs (3).
Together with Carlos Alchourrón we demonstrated that logic does not need the concepts
of truth and falsity (4); therefore we can apply all these operations to norms with
modern technologies.
1 All political systems have formal rules for establishing when a law has been created: sanction,
promulgation, publication.
2 From the Romans: Lex posterior derogat priori; lex specialis derogat generali.
3 A. A. Martino, "La contaminacion legislativa", Buenos Aires, 1973; A. A. Martino & J. Vanossi,
"Remedios a la contaminacion legislativa", propuesta al congreso italo-argentino de filosofia del derecho,
1975; A. A. Martino, "La contaminacion legislativa", Anuario de Sociologia y Psicologia Juridicas,
Barcelona, 1977, pp. 47-63; A. A. Martino, "La progettazione legislativa nell'ordinamento inquinato",
Studi parlamentari e di politica costituzionale, anno X, n. 38, Roma, 1977, pp. 1-21; "Why an
automated analysis of legislation?", in Computing Power and Legal Reasoning, edited by Charles
Walter, St. Paul, West Publishing Company, 1985, pp. 413-466, ISBN 0-314-95570-4.
4 "Aiuto computazionale al legislatore", Firenze 1979; A. A. Martino "Software for the legislator" in A.
Pizzorusso (editor) " Law in the making. A comparative survey." Heidelbrg, Springer Verlag, 1987.
A. A. Martino "Legal expert systems" en T.D. Campbell, R.C.L. Moffat, S. Sato, C. Varga (editors)
Archiv für Rechts- und Sozialphilosophie, Beiheft 39, 1991, Franz Steiner Verlag, Stuttgart. "
Expert Systems in Law" A. A. Martino, (editor), . North Holland, Amsterdam, New York, Oxford,
1992
5 C. E. Alchourrón & A. A. Martino, "Logic Without Truth", Ratio Juris, Basil Blackwell, vol. 3, n° 1,
March 1990, pp. 46-67.
