Declarative vs. Imperative Programming
*** From Robert Harper, 19 Apr 2013 ***
I would point out that even Prolog can only be understood fully in
operational terms, e.g. the "cut" operator !, which controls the proof
search procedure.
*** From Vladimir Lifschitz, 20 Apr 2013 ***
Yes, and consequently Prolog is not fully declarative. But there are
other flavors of logic programming, for instance, answer set programming
(ASP, see http://en.wikipedia.org/wiki/Answer_set_programming). ASP
does not include the cut operator. On the other hand, it incorporates
constructs not allowed in Prolog, such as choice rules. It does not have
an operational semantics. In fact, different implementations use very
different algorithms, but they produce the same result for the same program.
The programmer doesn't need to know which implementation is going to be
used. ASP is fully declarative, like functional programming.
Here is how I would characterize the difference between declarative and
imperative programming. A program in an imperative language describes an
algorithm. A program in a declarative language describes a specification.
*** From Rishiyur Nikhil, 22 Apr 2013 ***
But one man's specification is another man's algorithm. Even in Haskell,
one writes a sorting program typically by choosing a particular
algorithm (heap sort, quick sort, ...). Sure, these can be considered
specifications of sorting, but they are hardly the most abstract spec
for sorting.
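Nikhil's contrast -- an algorithm is also a very concrete specification, and the most abstract spec looks nothing like it -- can be sketched in Python. The predicate and function names below are illustrative, not from the thread:

```python
from collections import Counter

def is_sorting_of(xs, ys):
    """Declarative spec: ys is a permutation of xs and ys is ordered.
    It says WHAT counts as a correct answer, not how to find one."""
    return Counter(xs) == Counter(ys) and all(a <= b for a, b in zip(ys, ys[1:]))

def insertion_sort(xs):
    """One particular algorithm among many that satisfy the spec above."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)          # place x at its ordered position
    return out
```

`insertion_sort` commits to a strategy (and a running time); `is_sorting_of` commits to nothing but the result, which is exactly the gap Nikhil is pointing at.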
*** From Vladimir Lifschitz, 22 Apr 2013 ***
The way I see it, the difference between declarative and imperative
programs is as clear-cut as the difference between declarative and
imperative sentences in natural languages: "I like it" vs. "Sit down".
A declarative sentence can be true or false; an imperative sentence
can be executed or not executed.
Here is how sorting is done in answer set programming (I learned this
from Roland Kaminski):
order(X,Z)      % X is the predecessor of Z
    :- p(X;Z),  % if both X and Z belong to p,
       X < Z,   % X is less than Z,
       ...
*** From Uday Reddy, 22 Apr 2013 ***
> The way I see it, the difference between declarative and imperative
> programs is as clear-cut as the difference between declarative and
> imperative sentences in natural languages: "I like it" vs. "Sit down".
> A declarative sentence can be true or false; an imperative sentence
> can be executed or not executed.
Indeed, "Sit down" is an imperative sentence. "To sit down" is a noun. "It
is nice to sit down" is a declarative sentence. In an imperative
programming language, you will find all three. So, to me, nothing is
clear-cut about the distinctions that people make!
> A declarative program may look similar to a procedural program if they
> use similar data structures. Still, the difference is clear: a (truly)
> declarative program describes what is counted as a solution; a procedural
> program tells us which operations to perform.
Indeed, high-level programming languages that can automate a lot of the
programming work are always brilliant. However, I don't see what this has
to do with the declarative-procedural spectrum. There are high-level design
tools using UML notations that also achieve a high degree of automation in
program generation, and a good proportion of those notations are imperative.
As I mentioned previously, the highest level programming systems that we
might imagine for applications like stock-market trading systems or on-board
flight control systems might indeed be imperative.
*** From Mark Janssen, 22 Apr 2013 ***
> Similarly, a C program is a specification for a hundred different
> machine code algorithms that manage memory, register allocation etc.
A C program is not a specification in any sense (if it's even
appropriate to call C source text a "program", prior to compilation
into an executable). Only the theorist who has never gotten "close to
*** From Vladimir Lifschitz, 22 Apr 2013 ***
Nikhil writes,
> a C program is a specification for a hundred different machine code
> algorithms that manage memory, register allocation etc.
According to Mark,
> A C program is not a specification in any sense
They obviously use the same word, "specification," in different ways.
Mark talks about formally specifying the computational problem to be
solved (and so do I). Of course a C program is not a specification in
this sense. Algebraic and differential equations are specifications, as
well as pure LISP programs and ASP programs. Nikhil and Uday talk about
specifying a class of computational procedures that differ from each other
by low-level details such as register allocation. That usage indeed makes
the notions of specification and implementation relative. What is a
specification will depend then on how large the classes are, what level of
detail we are willing to disregard.
*** From Mark Janssen, 22 Apr 2013 ***
Yes, I think the key "articulation" point for that word/concept (of
"specification") is the distinction where one can say that they are
*commanding* the computer (i.e. imperatively) vs. merely *telling* the
computer something ("procedurally") -- a subtle but significant
difference, but the latter always has a layer of
specification/definition where there is a translation from a
higher-level domain to another, lower-level domain. The key point for
me (within the realm of physical hardware and Turing Machines), is
that there is a point where this "ladder" of going from high-level to
lower-level "bottoms out" and there is no more room for interpretation
-- the electrical signals which are governed by the laws of physics.
*** From Will Cook, 22 Apr 2013 ***
I have been thinking about the use of the words "declarative" and "imperative"
for some time. This is my understanding of how they are commonly used in
computer science today:
Declarative: describing "what" is to be computed rather than "how" to compute
the result/behavior.
Imperative: a description of a computation that involves implicit effects,
usually mutable state and input/output.
As others have pointed out, our usage originally came from the distinction
between imperative and declarative statements in natural languages. However,
I think that our usage has diverged significantly from this origin, so that
the words no longer form a dichotomy.
For example, one might argue that a finite state machine is both declarative
and imperative in the computer science senses. I think Uday made a similar
suggestion. There is also Wadler's classic paper on "How to declare an
imperative".
Another hint that "declarative" and "imperative" are not antonyms is that
the definitions don't have any significant words in common. The antonym of
"imperative" in common usage is "pure functional". I don't know of a
widely-used word that acts as an antonym for "declarative", although
"operational" might work. It may be that "imperative" has a connotation of
"step-by-step", which hints at it being the opposite of "declarative", but
this connotation is fairly weak at this point. If we wanted to we could try
to force declarative and imperative to be antonyms, possibly by
redefining what we mean by imperative, but I'm not sure that would be an
improvement.
I agree with those who say that "declarative" is a spectrum. For example,
some people say that Haskell is a declarative language, but in my view
Haskell programs are very much about *how* to compute a result. It is true
that many details about how are left out (memory management, order of
operations, etc). But if you compare a Haskell program with a logical
specification (pre/post conditions), they are quite different. Thus while
I would say Haskell is more declarative than many other programming
languages, Haskell is not a declarative language in the strongest sense of
the word. Haskell programs are not specifications; they are computations,
in the sense that they say how to compute an answer.
Here is a quick shot at a spectrum between "how" and "what". Each level has
a quick summary of the "how" that is involved, and it also includes all the
"hows" listed below them. I suspect that many of you might disagree with the
placement or absence of various languages, so I cannot claim that this list
is definitive.
*** More "How" ***
  How the machine works:
    Assembly
  How memory is managed:
    C, C++, etc.
  Order of operations:
    Java, Smalltalk, Python, Ruby, etc.
  How data flows (with issues like nontermination and cut):
    Haskell, Prolog, Lambda Calculus (in various forms)
  ---- split between Programming and Specification Languages ----
  Restricted specification languages:
    BNF, SQL, Excel, Statecharts
  Logical specification languages:
    VDM, Z, B, Alloy
*** More "What" ***
The idea that a specification language (by definition) cannot be executed is
widely held but false. I consider BNF to be a simple counter example. BNF is
clearly a specification language, and it clearly has efficient execution
(parsing) strategies.
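Cook's BNF counterexample can be made concrete: a grammar is a specification, yet the standard recursive-descent reading of it executes directly as a parser. The sketch below (grammar and function names are mine, Python used only for illustration) evaluates sums while parsing:

```python
# Grammar, in BNF:
#   expr ::= term { "+" term }
#   term ::= DIGIT | "(" expr ")"
# Each nonterminal becomes one function; the grammar IS the program.

def parse_expr(s, i=0):
    """Parse an expr starting at position i; return (value, next position)."""
    val, i = parse_term(s, i)
    while i < len(s) and s[i] == "+":
        rhs, i = parse_term(s, i + 1)
        val += rhs
    return val, i

def parse_term(s, i):
    if s[i] == "(":
        val, i = parse_expr(s, i + 1)
        assert s[i] == ")", "expected ')'"
        return val, i + 1
    assert s[i].isdigit(), "expected digit"
    return int(s[i]), i + 1
```

For example, `parse_expr("(1+2)+3")` consumes the whole string and yields 6: the specification of the language doubles as an efficient execution strategy for it.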
*** From Adam Smith, 22 Apr 2013 ***
Vladimir's way of distinguishing declarative vs imperative programs is indeed
clear-cut, but let me offer an explanation of why it's not going to convince
every programmer or language designer. The key idea is that languages of
one type are usually rich enough to support an embedded language of
the other type (or possibly many layers like this), and the experience of
the programmer/designer may so strongly focus on the embedded languages
that the properties of the outer language become largely inconsequential.
Concretely, if you come across a large collection of declarative statements
that are semantically about imperative commands, you start to simply read
through the declarative aspects and see the imperative program as the real
one. Here are a bunch of declarative statements (that are simply true):
~ The first step of the algorithm is to set a variable called "maxlen" to
"0".
~ The second step is to enact a for-loop over the variable "j" from "0"
to "2".
~ The body of this loop is the action of setting "maxlen" to the maximum of
the value of "maxlen" and the length of input value "j".
The following example is from an ongoing project that uses answer
set programming to reason about the different possible execution paths
through imperative code. At the surface level, it's just one giant
declarative AnsProlog fact that states that "add(....) is a procedure". For
all other purposes (including capturing an idea in code and debugging its
representation on the basis of its example executions), it's a pile of
quite imperative code.
procedure(
  add(
    set(maxlen, 0),
    for(j, 0, 2, do(
      set(maxlen, max(maxlen, input(len(j)))))),
    set(c, 0),
    for(i, 0, maxlen, do(
      set(k, 0),
      for(j, 0, 2, do(
        if(lt(i, input(len(j))), then(
          set(k, plus(k, input(n(j, i)))))))),
      if(lt(0, c), then(
        set(k, plus(k, c)))),
      set(c, 0),
      if(lt(9, k), then(
        set(c, hi(k)))),
      set(output, digit(i, lo(k))))),
    if(lt(0, c), then(
      set(output, digit(i, c)),
      die(special))),
    die(end_of_procedure))).
This specification of a concrete algorithm is interpreted by a compiler (or
is it just a specification for a compiler?) that transforms it into another
specification cast in terms of assembly-like primitives:
http://pastie.org/7698516 (following one particular compilation strategy
amongst many). Finally, it is executed (in a sense) by an abstract
processor with a program counter, registers, and application-specific ALU
pathways. The result, in terms of the answer sets of the overall answer set
program, describes the set of all branching pathways that could be explored
by the algorithm on the basis of unseen (nondeterministically chosen)
inputs.
At one level, I've only ever entered facts and rules that I trust the
solver will consider as a collection (but in no particular order). At
another level, I've built a procedural pipeline: compile, assemble,
generate processor, generate program inputs, execute program on processor,
summarize observed pathways. At yet another level, I've written a program
in a (made up) language that legitimately has its own operational semantics.
In traditional Prolog, every rule you write can be read as simply a
statement of a fact in the :-/2 predicate (clause/2). As a programmer,
however, you might read your rules as imperative lists of execution steps for
backwards chaining -- because that's what the :-/2 rules most clearly seem
to be talking about. Meanwhile, once you get into making meta-interpreters
(quite common, I believe), you quickly go back to treating instances of
:-/2 as declarative statements of the structure of some data, a
specification of a case in some formal structure.
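Smith's dual reading of :-/2 -- execute the rules, or inspect them as data -- can be mimicked in a few lines of Python. This is an illustrative sketch only (ground atoms, no unification; all names are mine):

```python
# Each rule is just data: (head, [body atoms]). Atoms are ground strings.
rules = [
    ("mortal(socrates)", ["human(socrates)"]),
    ("human(socrates)", []),          # a fact: empty body
]

def prove(goal):
    """Procedural reading: backward chaining over the rules."""
    return any(head == goal and all(prove(b) for b in body)
               for head, body in rules)

def heads():
    """Declarative/meta reading: the same rules as inspectable structure,
    as a meta-interpreter would see them."""
    return [head for head, _ in rules]
```

The list `rules` never changes; only our stance toward it does, which is the point about levels.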
At every level, whether the language is imperative or declarative is quite
clear-cut; what's not at all clear is which level is the relevant one to
talk about.
*** From Marc Denecker, 26 Apr 2013 ***
Here is another conception of the notion of "declarative" logic and its
difference from imperative languages.
This chair has three legs.
It is nice to sit in it.
All students attended the lecture this morning.
Each lecture takes place in exactly one room.
...
They are declarative sentences, pieces of information. They are not
commands, not procedures, not descriptions of problems.
In this view, a theory in a declarative logic is a bag of
information. It is not a command, not a program. It is not
even a representation of a computational problem.
Consequently, a theory does not "do" anything. It cannot be executed.
It has no solutions. It provides information, and that is it.
That does not mean that it cannot be useful. In the first place, it can
be used by human experts to communicate information about a domain,
in the same way as currently a UML domain specification is used. It is
like a UML domain specification, but in a formal logic. In the second
place, this information can be used to solve problems, see below.
In this conception, logic is a clear and precise formal language to
express information. The role of model theory is to give a formal
account of the information content of expressions of the declarative
language.
There is no inherent form of inference to such a logic. Therefore,
theories in such a logic are not representations of problems.
However, the information in a theory can be used to solve
computational problems or perform tasks, by applying suitable forms of
inference. In fact, it may be possible to use the same
theory, the same information, to solve many types of problems, by
application of DIFFERENT forms of inference.
For example, a theory of a scheduling domain can be used for the task of
checking the correctness of a given schedule (by model checking),
but also of generating schedules (by model generation).
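The scheduling example can be caricatured in Python: one constraint plays the role of the theory, and two different procedures apply two different forms of inference to it. A brute-force sketch under invented names -- real KBS inference is of course far more sophisticated:

```python
from itertools import product

# The "theory": what counts as a correct schedule.
# Two lectures may not share the same (room, slot) pair.
def ok(schedule):                     # schedule: lecture -> (room, slot)
    slots = list(schedule.values())
    return len(slots) == len(set(slots))

# Inference 1, model checking: test a given schedule against the theory.
# (Just call ok(schedule) directly.)

# Inference 2, model generation: search for any schedule the theory admits.
def generate(lectures, rooms, slots):
    for assignment in product(product(rooms, slots), repeat=len(lectures)):
        candidate = dict(zip(lectures, assignment))
        if ok(candidate):
            return candidate
    return None                       # theory admits no model
```

The same `ok` is reused unchanged by both tasks, which is the multi-inferential reuse being described.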
This conception of logic differs from the view in almost every
area of computational logic. Computational "declarative" logics such as
logic programming (e.g. prolog), functional programming languages (e.g.
Haskell), query languages (e.g. SQL), deductive database
languages, Answer set programming, abductive logic programming,
description logics, temporal logics, constraint logic programming,
"deductive" logic, ...: they have in common that the language is
associated to a unique "inherent" form of inference. They are
"uni-inferential". This has advantages and disadvantages. In any case,
it invites human experts to view a theory as a description of a
computational problem (to be solved by applying the inherent form of
inference).
In some cases, like in Prolog or Haskell, when not only the type of
inference but also its implementation is specified together with the
logic, a (unique) operational semantics is imposed on the language. From
then on, theories can be viewed as procedural programs. E.g., under
SLDNF resolution, Prolog "theories" can be understood as a sort of
procedural program (Kowalski's procedural interpretation) with
uncommon procedural features such as backtracking and unification.
We see that by imposing a unique form of inference and, next, by
imposing an implementation of that inference, "declarative" languages
gradually shift in the direction of procedural languages. I think that
this is the reason why the notion of "declarative language" and the
difference with imperative languages has become so blurred.
On the other hand, if no inherent form of inference is attached to the
logic, a theory cannot be viewed as a problem and certainly not as a
program. It is a specification, a bag of information and not more than that.
This does not mean that declarative logic as defined above is not useful
for computational purposes. Quite on the contrary. The above conception
of logic suggests to build Knowledge Base Systems: multi-inference
engines that support different forms of inference on the same theories
(the "knowledge base"). So, perhaps surprisingly, logics without
inherent form of inference are actually "multi-inferential". This view
of logic and of computing with logic was called the "Knowledge Base
System paradigm" by me and Joost Vennekens in a position paper at
ICLP2008.
The research project of my group is to build a KBS system called IDP,
for a "declarative" language in the above sense. This language (FO(.))
is a rich extension of classical logic. An example of the knowledge
base paradigm in operation can be seen in a prototype interactive
configuration tool built with IDP and available at
http://dtai.cs.kuleuven.be/krr/software (See KB-System demo)
This tool supports 5 different forms of inference on the same theory to
support different functionalities for the user: model checking,
propagation, model generation/expansion, model generation with
optimisation, explanation. In this application, the computations to be
performed are not computationally hard, but what counts is the
principle. This system displays a unique form of reuse of the
theory that is excluded in principle in procedural languages and
uni-inferential declarative programming paradigms. Here the conceptual
difference between a declarative theory and a program is clear-cut.
About imperative and procedural languages
A program in an imperative language can be viewed as a (compound)
command to the computer but it also contains a precise description of
computer processes (the operational semantics). As such, I do not think
there is an important difference between imperative and procedural
languages.
It also means that C, C++, Java programs,... declaratively describe
something: namely these computer processes. In this respect, these
languages can be seen as process logics. So, for me C is indeed a
declarative logic.
However, it is obviously a very different logic than say classical
logic. One aspect is related to "universality": C++ can only describe a
very restricted class of (computer executable) processes. Classical
logic can be used (although not so conveniently)
to describe such processes as well (e.g., using some
methodology for temporal knowledge representation such as situation or
event calculus).
But classical logic can also be used to describe other domains,
temporal or non-temporal. E.g., to describe the theory of groups, or
the theory of correct university programs. C++ cannot be used for that,
simply because what is described in such a theory are not computer
processes. So C++ is a domain specific logic with a very narrow domain.
Another difference is that C++, seen as a declarative process logic,
is (currently) associated with a unique form of inference: "execution
inference" of the computer process.
*** From Adam Smith, 26 Apr 2013 ***
I generally agree. In this email I want to point out a common pathway for
uni-inferential languages to be used as multi-inferential languages. The
hallmark is having a "programmable X" feature.
In Haskell, the interaction between the Monad typeclass and the do-notation
allows the phenomenon called "programmable semicolons". Haskell's
do-notation commits to a particular linked structuring of computations
without defining any preferred semantics. Only once a suitable
implementation for two operations called "bind" and "return" is supplied
(selected on the basis of types) can we look at a block of code and say
what will happen operationally. By providing different combinators,
programmers can set up wildly different semantics for a single block of
code.
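The "programmable semicolon" phenomenon can be imitated outside Haskell. In the Python sketch below (helper names invented for illustration), the same chained program means different things under two different bind operations:

```python
# One chaining scheme, parameterized by the "semicolon" (bind).
def chain(x, *fs, bind):
    for f in fs:
        x = bind(x, f)        # the meaning of "then" is supplied here
    return x

# Maybe-style bind: stop the whole chain at the first None.
def maybe_bind(x, f):
    return None if x is None else f(x)

# List-style bind: apply f to every element and flatten the results.
def list_bind(xs, f):
    return [y for x in xs for y in f(x)]
```

With `maybe_bind`, `chain(4, lambda n: n - 2, lambda n: 10 // n if n else None, bind=maybe_bind)` computes 5, while starting from 2 short-circuits to None; with `list_bind` the same `chain` becomes nondeterministic choice. The block's structure commits to nothing until bind is chosen, which is Haskell's do-notation trick in miniature.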
In Prolog, a variety of language features and programming practices allow
for a kind of "programmable implication". This not only allows the
inference algorithm to be altered (from depth-first to breadth-first, for
example) but also altering the mode of inference (from deductive to
abductive, for example). Applied to an extreme, the alternate inference
type can be taken seriously enough to consider its pairing with the
standard syntax an entirely different language. This is how we originally
got Erlang (which I still can't help but read as a pile of Prolog terms in
the same way that someone not familiar with Ant can't help but read
build.xml files as a bunch of uninterpreted markup).
In AnsProlog, program reification allows for "programmable optimization" (
http://www.cs.uni-potsdam.de/wv/metasp/). The situation is a little better
labeled "programmable model-theoretic satisfiability" because you get to
define non-standard criteria for accepting or rejecting models. The Lua
integration in the Potassco tools allows for a kind of "programmable
uninterpreted functions" (er, actually interpreted functions) that I've
used quite a bit in multi-inference projects.
To take a uni-inferential language and extend it in the multi-inferential
direction, it's enough to make some of the core structures programmable in
a useful sense. This is just one strategy, of course -- flexing the
inference type associated with the language from the inside. Model checking
tools, by contrast, come at this from the outside.
*** From Uday Reddy, 26 Apr 2013 ***
Marc Denecker writes:
> There is no inherent form of inference to such a logic. Therefore,
> theories in such a logic are not representations of problems.
>
> However, the information in a theory can be used to solve
> computational problems or perform tasks, by applying suitable forms of
> inference. ...
>
> It also means that C, C++, Java programs,... declaratively describe
> something: namely these computer processes. In this respect, these
> languages can be seen as process logics. So, for me C is indeed a
> declarative logic.
>
> However, it is obviously a very different logic than say classical
> logic. One aspect is related to "universality": C++ can only describe a
> very restricted class of (computer executable) processes. Classical
> logic can be used (although not so conveniently)
> to describe such processes as well (e.g., using some
> methodology for temporal knowledge representation such as situation or
> event calculus).
>
> Another difference is that C++, seen as a declarative process logic,
> is (currently) associated with a unique form of inference: "execution
> inference" of the computer process.
Thank you for your long and thoughtful post. I am in agreement with much of
what you say. But some key points of difference remain.
First of all, there is no law that says that declarative languages need to
be executable. However, there is a law to the effect that programming
languages need to have a declarative reading. That is so that we can reason
about the programs written in them. ("Declarative" and "procedural" were
the terms coined in the logic programming community. The corresponding
terms in programming language community are "denotational semantics" and
"operational semantics".)
The attractiveness of logic programming, when it was first launched, was that
it had a ready-made declarative reading, which was expected to make a big
difference for programming. Unfortunately, Prolog also had a bad procedural
reading that people struggled with. So, in practice, Prolog programmers
spent almost all of their effort on the procedural meaning, and the
declarative meaning went out of the window. In the end, Prolog was a good
dream, but a bad reality.
Coming to classical logic per se, I believe it is ill-fit for describing
computer programs or "processes" as you call them. First of all, classical
logic doesn't have any well-developed notion of time or dynamics, and it has
a nasty existential quantifier which assumes that all things exist once and
for all. In computer systems, new things come into being all the time, well
beyond the time of the original development or deployment. We know how to
write computer programs that deal with that. Classical logic doesn't. (LEJ
Brouwer made this criticism a long time ago in the context of mathematics.
There it might have been just a philosophical problem. But in Computer
Science, it is a *practical* problem.)
I believe logicians of philosophical inclination are prone to be enamored
with what they have and lose sight of what they don't have. For a good part
of two millennia, they kept debating Aristotelian syllogisms, without
realizing that classical logic was yet to be discovered. Finally, it was
left to the mathematicians to formulate classical logic. The logicians of
today are similarly enamored with classical logic without much of an
understanding of what it lacks. We would be ill-advised to listen to them.
Or, we would be stuck for another two millennia, God forbid.
[By the way, the Wikipedia page on Classical Logic is in a pitiful state. I
hope somebody will pay attention to it.]
Brilliant new things are happening in Logic.
- Mathematicians have formulated Toposes (a generalization of Kripke
models), which give us a great new variety of models for intuitionistic
logic. There are deep mathematical facts buried in them, and more continue
to be discovered. Toposes and intuitionistic logic are appropriate for modeling
computer programs, which live in a growing dynamic world rather than a
static one.
- Girard has formulated Linear Logic, which broadens our idea of what kind
of "things" a logic can talk about. David Pym and Peter O'Hearn invented
Bunched Implication Logic, extending Linear Logic with a beautiful
model-theoretic basis. These logics applied to imperative programming
(which go by the name of "Separation Logic") are revolutionizing the
development of technology for imperative programs.
It is time to leave behind the classical logic. In fact, we should have
done it a long time ago.
*** From Marc Denecker, 3 May 2013 ***
"It is time to leave behind the classical logic. In fact, we should
have done it a long time ago."
To me, that sounds like a total and unconditional rejection.
You mention "(classical logic) is ill-fit for describing computer
programs or processes" and your references to scientific progress are
all in the context of modelling (the executions of) computer programs
("processes"). I am not an expert in logics for modelling programs and
processes, and it was not my intention with my previous email nor with
this one to defend classical logic for this purpose.
But there are plenty of applications of logic outside modelling computer
programs.
I certainly have reservations about FO. I would agree that FO's syntax
needs to be improved, and that FO is not expressive enough and needs to be
extended. For example, I think you have a good point that classical
logic's existential quantifier is not well suited for expressing
"dynamic creation" of objects, and that such an operator might somehow
be added.
However, I would not throw away the standard
existential quantifier; in many applications it is
exactly the one that we need. In my opinion, that is the case with all
the connectives and quantifiers of FO (\land,\lor,\neg,\forall,\exists):
they are all fundamentally important information composition operators
and their semantics in FO is essentially correct.
If your rejection is as total as it sounded, you will disagree with
that. Let me give you a potential argument for your case, which would
really pull me over to your side.
Consider the following information.
A   If in a semester no student registered for a course, then
this course does not take place in that semester.
In class we represent it in FO as:
B   ! c ! s : Semester(s) & Course(c) & ~ ? st: Registered(st,c,s)
        => ~ TakesPlace(c,s)
or as
B'  ! c ! s : Semester(s) & Course(c) & TakesPlace(c,s)
        => ? st: Registered(st,c,s)
(! is shorthand for forall, ? for exists, ~ for not)
I point my students to the precision of B and B' as representations of
A, and to the correctness of B's FO connectives and quantifiers to
capture exactly the information that was to be expressed. Examples like
this are a dime a dozen. I argue to them that standard conjunction,
disjunction, (objective) negation, universal
and existential quantifiers are fundamentally important information
composition operators and their FO model semantics is the
right one. We see computational applications of such sentences using our
KBS system as a didactic tool, for querying in the context of a
database, for searching course plannings in the context of scheduling,
and for some other sorts of applications. Note that the sentence
contains the existential quantifier. But I think it is a clear case
where the standard FO existential quantifier is appropriate, and not
your "dynamic creation" one.
If my strong claims above are
right, then I would argue that FO is indeed a base language (in the
sense that \land, \lor, \neg, \forall, \exists are base composition
operators, and FO's semantics for them is correct!).
On the other hand, here is a very convincing way to show me that I am
wrong: it suffices to show me ONE database in which the informal
proposition A is true and the formal sentence B is false or vice versa.
*** From Uday Reddy, 3 May 2013 ***
Marc Denecker writes:
> "It is time to leave behind the classical logic. In fact, we should
> have done it a long time ago."
>
> To me, that sounds like a total and unconditional rejection.
No, what I meant is that the classical logic represents a stage in the
development of logic. It cannot be taken as the final answer. In fact, we
cannot accept that we have a final answer until the entire natural language
has been formalized, which might take a very very long time indeed! (The
view I take, following Quine, is that logic is a regimentation of natural
language. We can perfectly well circumscribe various regimens for various
purposes.)
I am entirely happy with the characterization of logical connectives as
"information composition" operators. But we can only accept it as a good,
but vague, intuition. We do not know what this "information" is. Neither
do we know what the information is about. So, in order to claim that
classical logic is a canonical information composition calculus, somebody
would need to formalize those notions.
Even though Vladimir has omitted the word "programming" in titling this
subthread, the discussion has been about "declarative" and "imperative" as
paradigms of programming. So, I would rather not divorce myself from
programming concerns in discussing these issues.
*** From Mark Janssen, 3 May 2013 ***
>> To me, that sounds like a total and unconditional rejection.
>
> No, what I meant is that the classical logic represents a stage in the
> development of logic. It cannot be taken as the final answer. In fact, we
> cannot accept that we have a final answer until the entire natural language
> has been formalized, which might take a very very long time indeed! (The
> view I take, following Quine, is that logic is a regimentation of natural
> language. We can perfectly well circumscribe various regimens for various
> purposes.)
But if we're going to be in the Computer Science department, can we
get away from the idea of "logic as a regimentation of natural
language" (which is fine for the Philosophy department) and move to
the idea of logic as equations of Binary Arithmetic and Boolean
Algebra?
*** From Luis Caires, 4 May 2013 ***
While what (I think) you call boolean logic may be useful for explaining
(basic) computing operations, and provide an adequate foundation for
(basic) hardware design, it just does not scale up to higher levels of
abstraction, as needed to talk and reason about complex computing
systems.
It really takes quite a bit to build up from bit-level operations as
actually performed by the hardware up to concepts such as "ADTs",
"algorithms", "objects", "functions", "modules", "types", "abstract
machines" - ideas such as "code as data", "atomicity", "fairness",
"levels of interpretation" - and all the great stuff that belong to
the world of (both practical and theoretical) computer science. To be
able to talk about all this we need much more than basic "binary
arithmetic and boolean algebra".
For our purposes, it is perhaps more convenient to think of "symbolic
logic" as just a the convenient language to describe properties and
formally reason about objects in some domain.
Currently, no single unified logic is available; we need several.
Different logics describe different kinds of properties of CS objects,
useful for different purposes, all based, we hope, on a common trunk
of deep principles.
For example, I guess every programmer appreciates the usefulness of
types in programming languages. But a type system is nothing more than
a certain kind of symbolic logic. It does not calculate with
concrete data values or bits, but instead uses logic to deduce and
reason about the type of things around, allowing the compiler to
prove, at compile time, without executing the generated code, that a
program satisfies a set of properties. Namely that, when actually
executed, it will not suffer from basic runtime errors, such as
invalid operations on data, etc.
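To make this concrete, here is a minimal sketch of a type checker for a toy expression language. The language, its tuple-based AST encoding, and all names here are invented for illustration only; the point is that the checker deduces the type of an expression by logical rules alone, without ever computing a value.

```python
def typecheck(expr):
    """Deduce 'int' or 'bool' for a well-typed expression, else raise TypeError.
    Expressions: ('lit', value) | ('add', e1, e2) | ('if', cond, then, else)."""
    tag = expr[0]
    if tag == 'lit':
        return 'bool' if isinstance(expr[1], bool) else 'int'
    if tag == 'add':
        if typecheck(expr[1]) == 'int' and typecheck(expr[2]) == 'int':
            return 'int'
        raise TypeError("'+' requires two int operands")
    if tag == 'if':
        if typecheck(expr[1]) != 'bool':
            raise TypeError("condition must be bool")
        t_then, t_else = typecheck(expr[2]), typecheck(expr[3])
        if t_then != t_else:
            raise TypeError("branches must have the same type")
        return t_then
    raise TypeError("unknown expression")

# Accepted: the checker proves (if True then 1+2 else 0) yields an int,
# without ever evaluating it.
ok = ('if', ('lit', True), ('add', ('lit', 1), ('lit', 2)), ('lit', 0))
assert typecheck(ok) == 'int'

# Rejected at "compile time": 1 + True is an invalid operation on data.
try:
    typecheck(('add', ('lit', 1), ('lit', True)))
except TypeError as e:
    print('rejected:', e)
```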
This is just an example. Each particular logic provides reasoning
rules, allowing us to know how properties expressible in the logic
relate, and whether the objects the logic talks about satisfy a property
or not. In some cases, algorithms can be provided to check whether an
object satisfies a given property (automated verification).
These basic concepts are essential for computer science, as no
artifact (processors, algorithms, programming languages, compilers,
etc) can be precisely and fully understood without reference to the
various properties it should satisfy. If you want to check that a
given code piece really returns a sorted vector, you cannot do this
conveniently just with (what I think you call) boolean logic (even if some
simple verification problems can be "compiled" down to boolean
logic). As another example, some of us may teach (separation) logic to
students to empower them with solid techniques for checking the
absence of races in concurrent java programs. As yet another example,
we (with colleagues) have recently discovered how to use (linear)
logic to reason about the safety of session protocols in distributed
systems.
So there is this idea that (symbolic) logic is actually the "calculus
of computer science" (see
e.g. http://www.cs.rice.edu/~vardi/logic/). While symbolic logic has
roots in philosophy, I guess it is fair to recognize that it is being
now developed mostly in connection with (theoretical) computer science
and math, often driven by the deep relation between logic and
computation. So I would really encourage you to look into all this.
Logical concepts may be also perhaps useful to discuss other issues in
this thread such as "specs versus programs", and "declarative versus
imperative". Let me copy some remarks by Hoare (ACM, 2009):
"So I believe there is now a better scope than ever for pure research
in computer science. The research must be motivated by curiosity about
the fundamental principles of computer programming, and the desire to
answer the basic questions common to all branches of science: what
does this program do; how does it work; why does it work; and what is
the evidence for believing the answers to all these questions? We know
in principle how to answer them. It is the specifications that
describe what a program does; it is assertions and other internal
interface contracts between component modules that explain how it
works; it is programming language semantics that explains why it
works; and it is mathematical and logical proof, nowadays constructed
and checked by computer, that ensures mutual consistency of
specifications, interfaces, programs, and their implementations."
So I guess this view of logic actually belongs to the CSD ...
*** From Tadeusz Litak, 4 May 2013 ***
The original point made by Uday re classical logic:
> Coming to classical logic per se, I believe it is ill-fit for describing
> computer programs or "processes"
is certainly worthy of attention. But it does not seem to imply the conclusion
of that mail:
> It is time to leave behind the classical logic. In fact, we should have
> done it a long time ago.
(even if it wasn't intended, it does indeed sound "like a total and
unconditional rejection"... such things happen in the fervor of a
discussion :-)
"Logical pluralism" is a position rather well-established in the philosophy
of logic. I would think that in the context of Computer Science, it is even
more tempting.
[incidentally and perhaps contrary to established views, even Brouwer himself
could perhaps be seen as one of the first logical pluralists. While he very
strongly rejected Fregean-Russellian logicism in *foundations of mathematics*,
he has always held the work of Boole and the whole algebraic tradition in
logic in high regard... But this is an aside]
It might even happen to be Uday's own position, if I understand correctly the
remark that "we can perfectly well circumscribe various regimens for various
purposes." Most of my email will elaborate on this.
I would simply say that whenever one wants, needs or has to think of all
propositional formulas (also those possibly involving implication, and also
those involving fusion, "tensor product" or what have you) as *rewritable to a
conjunctive-disjunctive normal form without loss of information*, then the
underlying domain logic is essentially classical.
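The "rewritable to normal form without loss of information" criterion can be illustrated mechanically: classically, a formula is fully determined by its set of models, so implication can be eliminated in favour of negation and disjunction (a rewriting that intuitionistic logic refuses). A quick sanity check in Python; the helper names are my own:

```python
from itertools import product

def models(f, n=2):
    """Truth assignments (tuples of booleans) satisfying formula f."""
    return {v for v in product([False, True], repeat=n) if f(*v)}

implies  = lambda p, q: q if p else True     # p -> q
cnf_form = lambda p, q: (not p) or q         # ~p | q
neg_form = lambda p, q: not (p and not q)    # ~(p & ~q)

# All three rewritings carry exactly the same classical information.
assert models(implies) == models(cnf_form) == models(neg_form)
print(sorted(models(implies)))
```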
It is hard to imagine whole areas of Theoretical CS if rewriting formulas to
CNF or proofs by contradiction/contraposition/excluded middle are suddenly
deemed outdated and/or illegal... I mean not only and not even primarily
logic programming, but also finite model theory, complexity theory,
ontologies/description logics or the whole PODS/VLDB community...
[actually, as a curious aside, the logic of database theorists, while
certainly not constructive, is not fully classical either. They dread the
top constant and unrestricted negation, preferring instead relative
complement. This has to do with assumptions such as "closed world", "active
domain" and the demand that queries are "domain independent". In short,
their logic is rather that of Boolean rings without identity, which---funnily
enough---also happen to be the setting of Stone's original work. It is just
contemporary and ahistorical misrepresentation to say that Stone was working
with "Boolean algebras". But this, again, is an aside...]
And even in the context of Curry-Howard correspondence, classical logic is a
legitimate setting to discuss languages with control operators, first-class
continuations, static catch/throw a la Scheme etc. There is so much stunningly
beautiful work in that community that deserves to be better known...
But, equally obviously, not all the programming languages have such constructs.
Furthermore, as linear logicians (already mentioned by Uday) will be happy to
tell you, there are contexts where even the intuitionistic notion of implication
(so also the one of topos-theorists or proof-assistants, for example) is way
too coarse-grained. Particularly when one wants, needs or has to be
resource-aware. Also, the recent work of Wadler, Pfenning and other authors
suggests that Curry-Howard correspondence for concurrency will have to do with
linear rather than intuitionistic logic.
[And as substructural logicians will be happy to tell you, there are contexts
where even linear logicians may seem coarse-grained, thick-skinned,
corner-cutting brutes. :-) But this, yet again, is an aside.]
But where I most likely would part ways with Uday is when he claims (if I
understand correctly) that we are approaching or even should approach "a
final answer" of any kind. To me, searching for one logic valid in all
CS-relevant contexts seems a rather misguided enterprise. Especially or at
least when we talk about logic understood as a formal inference system.
What we perhaps need is more introductory logic courses---and also handbooks
and monographs---for budding CS undergraduates and graduates (and perhaps
also some postgraduates) which would make them understand the subtlety and
complexity of the picture. And the benefits and costs of adopting specific
inference rules.
Proof-assistant based courses seem to go in just the right direction. I am
teaching right now one based on that excellent "Software Foundations" material
of Benjamin Pierce et al. I think it changes and sharpens not only the
thinking of students, but also that of the teacher himself (or herself :-).
But even this only goes so far---after all, the underlying logic is
essentially intuitionistic... on the other hand, any weaker one could quickly
become a nightmare for actually discussing things as demanding as semantics
of programming languages (with bangs and exclamation marks in every second
lemma... :-)
To conclude, a few minor points:
> In fact, we cannot accept that we have a final answer until the entire natural language has been formalized
We'll wait for this only a little longer than for the invention of perpetuum
mobile and heat death of the universe... :-)
And which "natural language" are we talking about? Sometimes I think the
only reason why, e.g., Chomsky ever came up with the idea of "universal
grammar" was that he did not speak too many languages in the first place
(although Hebrew seems reasonably distant from English)...
> (The view I take, following Quine, is that logic is a regimentation of
> natural language.
Same objection as above, and this is just to begin with.
[The only redeeming features of Quine were that he wrote well and had a
certain logical culture. As a philosopher, in my opinion, he had a rather
destructive influence on development of logic, particularly in philosophy
departments, even if nowhere near as disastrous as the neopositivists or
the majority of "analytic philosophers". But this is just one more aside...]
> We can perfectly well circumscribe various regimens for various purposes.
As said above, I'm perfectly in agreement with this statement.
> I am entirely happy with the characterization of logical connectives as
> "information composition" operators. But we can only accept it as a good,
> but vague, intuition. We do not know what this "information" is. Neither
> do we know what the information is about. So, in order to claim that
> classical logic is a canonical information composition calculus, somebody
> would need to formalize those notions.
I think I can agree with every word here. Perhaps the difference then is not
so big...
I guess then that "leaving classical logic behind" meant rather "stop
presenting it to students as the only, final and >>real<< formalism for
Computer Scientists, everything else being a marginal pathology, if mentioned
at all"... and if this was indeed intended by this remark, I would have a
hard time disagreeing.
*** From Uday Reddy, 5 May 2013 ***
Tadeusz Litak writes:
>>It is time to leave behind the classical logic. In fact, we should have
>>done it a long time ago.
>
> (even if it wasn't intended, it does indeed sound "like a total and
> unconditional rejection"... such things happen in the fervor of a
> discussion :-)
Having thought about why it sounds like a "total and unconditional
rejection", I believe the difference is in the perspective of what logic is
about.
Logic consists of the principles of "reasoned discourse", as per Aristotle.
Our reasoned discourse happens in natural language, which is a humongous
ocean. We may never be able to understand fully all the principles of logic
that are there. But it is clear that the logic that we do understand (all
the known logics put together) represents only a minuscule proportion of the
vast ocean of "logic" that is employed in reasoned discourse. So, it seems
to me that a great deal of humility is warranted in talking about "logic" in
general.
In contrast, people that wax lyrical about classical logic seem to have the
presumption that classical logic has it all covered. They seem to think that
it represents the sum total of all reasonable principles of reasoned
discourse (even if they are willing to admit modal logics of one kind or
another as being reasonable *extensions* of classical logic). Hence,
anybody that talks about alternative logics is seen to be mounting an attack
on the classical logic, denying the supreme position of classical logic as
the one true logic.
We, the non-believers, of course deny that classical logic is supreme in any
sense. However, that is not an attack on classical logic itself. It is
just an attack on the *presumption* that classical logic is supreme.
All that we can say about classical logic is that it seems to be the
canonical logic for the present-day mathematics. Given that mathematics is
a very conservative discipline, with the bar of entry for new ideas set very
high, it has an abundance of depth but not so much breadth. Thus, a
canonical logic for mathematics in no way represents a canonical logic for
all of human thought.
In particular, in a young and dynamic discipline like Computer Science,
which has none of the mathematical conservatism, we should be free to
explore all possible logics and invent new ones. In fact, devising logics
is very much our main business. We should be wary of any presumptions about
"the canonical logic" of any kind.
*** From Sergei Soloviev, 5 May 2013 ***
To add a bit in support to Uday's remark about "presumption of
supremacy" of classical logic:
In fact, it is well known that classical logic can be embedded in
intuitionistic logic using, for example, the negative interpretation (classical
"exists" becomes \not\forall\not etc.) From the point of view of
provability, the interpretation of a classical theorem is provable
intuitionistically if and only if the original theorem was provable
classically. So, what is wrong, rationally, if instead of a respectable
"exists" we say "it is not true that for all x it does not hold that..."?
Clearly this is "bad publicity", a less "convincing" way to put it -
bad for "supremacy" - but it has nothing to do with scientific
rationality itself. (Constructive mathematics is just more subtle
concerning existence.)
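For reference, the clauses of this negative (Goedel-Gentzen) translation N,
in the same ASCII notation as above (one standard formulation; details vary
slightly between presentations):

    P^N             = \not\not P                  (P atomic)
    (A & B)^N       = A^N & B^N
    (A v B)^N       = \not(\not A^N & \not B^N)
    (A -> B)^N      = A^N -> B^N
    (\forall x A)^N = \forall x A^N
    (\exists x A)^N = \not\forall x \not A^N

and indeed A is provable classically if and only if A^N is provable
intuitionistically.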
*** From Marc Denecker, 13 May 2013 ***
> Marc Denecker writes:
>
>> "It is time to leave behind the classical logic. In fact, we should
>> have done it a long time ago."
>>
>> To me, that sounds like a total and unconditional rejection.
>
> No, what I meant is that the classical logic represents a stage in the
> development of logic. It cannot be taken as the final answer. In fact, we
> cannot accept that we have a final answer until the entire natural language
> has been formalized, which might take a very very long time indeed! (The
> view I take, following Quine, is that logic is a regimentation of natural
> language. We can perfectly well circumscribe various regimens for various
> purposes.)
>
> I am entirely happy with the characterization of logical connectives as
> "information composition" operators.
We then agree on this point. This view goes back to Frege and
even to Leibniz, right?
But do we agree that the standard logical connectives of FO correctly
implement an important set of basic information composition operators -
that is, conjunction, disjunction, negation, and the quantifiers?
If that is the case, then FO's connectives
will have to be present in other languages as well. We agree that these
operators are not enough, but the more universal languages that the
scientific community should strive for, in your and my opinion, will
have to contain FO's information composition operators (modulo syntactic
sugar) and hence be extensions of classical logic.
I'm confused about your position on this right now. On the one hand, you
do not (unconditionally) reject FO; on the other hand, in your reaction
to my first email, you seemed to react against the fact that I propose
FO as a base language. If you agree that FO's connectives correctly
implement a set of basic information composition operators, then you
should agree with me on that claim. But if you think FO's connectives
are not correct or not basic, then please consider the challenge in my
previous email (or see below).
> But we can only accept it as a good,
> but vague, intuition. We do not know what this "information" is. Neither
> do we know what the information is about. So, in order to claim that
> classical logic is a canonical information composition calculus, somebody
> would need to formalize those notions.
I'm sure Tarski would disagree with you. I think this would be his answer.
The notion of "information" that you claim is vague, is actually
formalized in model semantics in a very precise way, albeit slightly
implicitly.
The information content of a logic expression/theory T is formalized by
the class of its models (i.e., the structures in which T is true).
Models are formal representations of possible states of affairs,
non-models are formal representations of impossible states of affairs.
This class is characteristic of the information expressed in T.
This is obvious e.g., when we look at the notion of logical equivalence:
two expressions T, T' are equivalent iff the classes of their models are
identical.
The logical connectives and quantifiers correspond to
operators on these classes of models. Their definition is implicit
in the definition of the satisfaction relation. E.g., the class of
models of a conjunction is the intersection of the classes of models
of the conjuncts.
From this formal concept (the class of models), we can derive formal
concepts of entailment, validity, and equivalence. All this is
mathematical and precise.
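In the finite propositional case this correspondence is easy to make executable: represent a model by the set of atoms it makes true, and a formula by its class of models; conjunction then composes information by intersection. A small sketch, with all names my own:

```python
from itertools import combinations

ATOMS = ('p', 'q', 'r')

def structures():
    """All models over ATOMS, each a frozenset of the atoms it makes true."""
    return [frozenset(c) for n in range(len(ATOMS) + 1)
            for c in combinations(ATOMS, n)]

def models(formula):
    """The class of models of a formula (given as a predicate on models)."""
    return {m for m in structures() if formula(m)}

A    = lambda m: 'p' in m                   # p
B    = lambda m: 'q' in m or 'r' not in m   # q v ~r
conj = lambda m: A(m) and B(m)              # p & (q v ~r)

# The model class of a conjunction is the intersection of the conjuncts'.
assert models(conj) == models(A) & models(B)

# Logical equivalence = identical classes of models (here, a De Morgan pair).
B2 = lambda m: not ('q' not in m and 'r' in m)   # ~(~q & r)
assert models(B) == models(B2)
```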
This formal notion of "information" needs to be connected to the
informal notion of "information" which is the interpretation that we,
human experts, give to a logical expression.
Logic does not know and is not supposed to know what the informal
information (in a logic sentence) is about because this
depends on the meaning that we, human
experts, give to the uninterpreted non-logical symbols.
But once a precise intended interpretation is given,
then I would say that the informal information content of a logic
sentence is precise as well.
For example, if we interpret the nonlogical symbols Semester/1,
Course/1, Registered/3, TakesPlace/2 in the way the names suggest, then
the intuitive information content of
  B:  ! c ! s : Semester(s) & Course(c) & ~ ? st : Registered(st,c,s)
          => ~ TakesPlace(c,s)
is perfectly clear and is given by the equally precise informal sentence
  A:  if in a semester no student registered for a course, then
      this course does not take place in that semester.
If anything is vague or imprecise, then I guess it should be easy to
come up with e.g., an example of a database representing a situation
where B and A disagree or at least where it is not clear whether A or B
is true or not. This was the challenge in my previous email.
I think informal language sentence A is precise, at least in any context
where the underlying relations (semester, course, etc.) are precise,
and I don't think one can find a database where A and B would disagree.
I would love to see a counterexample.
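Taking up this challenge mechanically: one can evaluate B over any small finite database and compare it against a direct implementation of the informal reading A. A sketch with an invented database instance; all names and data are hypothetical:

```python
# A toy database instance (invented for illustration).
semesters  = {'s1', 's2'}
courses    = {'logic', 'databases'}
registered = {('ann', 'logic', 's1')}    # Registered(st, c, s)
takesplace = {('logic', 's1')}           # TakesPlace(c, s)

def sentence_B():
    """! c ! s : Semester(s) & Course(c) & ~ ? st : Registered(st,c,s)
                 => ~ TakesPlace(c,s)"""
    students = {st for (st, _, _) in registered}
    return all(
        (c, s) not in takesplace
        for c in courses for s in semesters
        if not any((st, c, s) in registered for st in students)
    )

def sentence_A():
    """If in a semester no student registered for a course, then this
    course does not take place in that semester."""
    for s in semesters:
        for c in courses:
            nobody = not any(r[1] == c and r[2] == s for r in registered)
            if nobody and (c, s) in takesplace:
                return False
    return True

# On this database (and, one expects, on every database) A and B agree.
assert sentence_A() == sentence_B()
```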
> Even though Vladimir has omitted the word "programming" in titling this
> subthread, the discussion has been about "declarative" and "imperative" as
> paradigms of programming. So, I would rather not divorce myself from
> programming concerns in discussing these issues.
One can only view a logic theory as a program if one associates an
inherent form of inference to the logic. As I argued in my first email,
this is the first step in messing up the difference between declarative
logic and programming languages.
It is the association of a unique inherent form of inference to a logic that
blurs the distinction between declarative theories and procedural programs.