Informal minutes of IFIP Working Group 2.1
54th meeting, Blackheath, London
Monday 2000-04-03 to Friday 2000-04-07

Contents of this page:    Administrative matters  |  Technical presentations
(Last updated 14:00 12th May 2000.)


Organizational and administrative matters

List of attendees

Members: Observers:

Electronic communications (Fri 2000-04-07, 09:20)

In an effort to reduce the cost and time required for secretarial duties, all working group communication from now on will, as far as is reasonably possible, take place electronically, via email and the web. This includes invitations to meetings and distribution of minutes and address lists. Public information will be placed on the web; restricted information (such as discussions of membership decisions) will be distributed by email to the members.

In exceptional cases, the secretary can print out and post copies of electronic documents; but it is to be understood that this facility is for cases of need rather than preference.


Chairs of other TC2 working groups (Fri 2000-04-07, 09:25)

WG2.1 has made the chairs of other closely related TC2 working groups ex officio observers, and invited them to all meetings. However, this privilege has been neither used nor reciprocated, and besides, our information is quite out of date. The decision was made to remove these other chairs from the observers list, and simply let them know informally about upcoming meetings.

The secretary will maintain a list of the secretaries of closely-related groups, in order to keep them informed of the dates of our upcoming meetings and any other information of joint relevance.


Membership (Fri 2000-04-07, 09:30)

This discussion was for members only and is not recorded in these public minutes.


Meeting organization, and costs in particular (Fri 2000-04-07, 10:20)

Several members observed that this was a very expensive meeting to attend, and in future the group will suggest to organizers an upper bound on costs. This will form part of a simple handbook for organizers, to be updated from meeting to meeting and passed from organizer to organizer.


A book for Bob Paige (Fri 2000-04-07, 10:25)

Alberto Pettorossi reported on a discussion he had with Bob Paige at the Le Bischenberg meeting. This concerned the writing of a collection of articles explaining Bob's collaboration and interaction with group members and other researchers; the intention was to show his wife Nieba that Bob's work with the group and others was work among friends. Alberto would like to take up this suggestion now, after Bob's death. A possibility is to let the collection be the proceedings of a small conference organized around the theme of the impact of Paige's work. Suggestions are welcome as to people from whom to invite contributions; the secretary will examine past minutes to look for past observers to involve.


Next meetings (Fri 2000-04-07, 11:10)

Meeting 55
Cochabamba, Bolivia, organized by Doaitse Swierstra and his colleague Pablo Azero at Cochabamba. The week of 8th to 12th January 2001 was proposed. [It turns out that WG2.3 have also chosen that week for their next meeting, and it is likely that our meeting will now take place in the week of 15th to 19th January. JG]. There is the possibility of running some courses for local students and professionals alongside the meeting.

Meeting 56
Ameland, The Netherlands, organized by Johan Jeuring. The week of 10th to 14th September 2001 has been suggested, but we need to check for clashes with other meetings around that time (especially WG2.3 and ICFP).

Meeting 57
Perhaps Dagstuhl in Germany, organized by Helmut Partsch, in June 2002, in combination with a Working Conference.


Web site (Fri 2000-04-07, 11:55)

Robert Dewar suggested improving and expanding the group's web site, making it more visible to search engines and more attractive to casual readers. The secretary will add some keywords to the current web page, but does not otherwise volunteer to produce any new content; however, contributions from members are welcome, and anything sent to the secretary will be published gladly. Suggestions included: grabbing an easy domain name, like wg21.org; creating a repository of challenge problems; providing some items of historical interest (such as Bauer's history of WG2.1, and Lindsey's and Koster's history of Algol68 and a link to the text of the report).


Two new electronic journals (Fri 2000-04-07, 12:00)

Lambert Meertens proposed a free electronic journal Algorithmic Languages and Calculi, unofficially associated with the group. It would be like Jeuring's The Squiggolist, but formally refereed, which implies the existence of an editorial board (yet to be chosen). Articles will be limited to about ten pages, and the most important criterion will be clarity. They will be published as soon as they are available, so there will be no `issue numbers' (or equivalently, each `number' of a volume will consist of a single article).

Bernhard Möller mentioned JoRMiCS, the Journal on Relational Methods in Computer Science, which has just been launched. Further information can be found at the journal's web page at http://www1.informatik.unibw-muenchen.de/JoRMiCS/.


Formal resolution (Fri 2000-04-07, 12:05)

The members of WG2.1 and the observers present at the 54th meeting in Blackheath, London wish to express their gratitude to Tom Maibaum. Rather than being plagued - except perhaps by railway connections - we found that our longitude problem had been solved by hosting the meeting on the meridian, removing the need for accurate timekeepers, let alone for observing the stars.

In studying the problem of whether the banquet and excursion commute, a sunniness result was obtained -- There's something almost mathematically elegant about that. Although we could ask whether it is morally justifiable to work on applications of our theory, the meeting has shown that Mathematics is sense! That's what sense is! But in the end, in the end, remember, we have to be able to explain it all to Margrethe!



Technical presentations

Note: presentation summaries were prepared by the speakers.


Presentation: Hinze, Making generic programming more generic (Mon 2000-04-03, 11:05)

Modern functional programming languages such as Haskell 98 typically have a three level structure (ignoring the module system): we have values, types (imposing structure on the value level), and kinds (imposing structure on the type level). In `ordinary' programming we may define values depending on values (called functions) and types depending on types (called type constructors). Generic programming adds to this list the possibility of defining values depending on types (called polytypic values). In this talk we show that it is also useful to have types depending on kinds (called polykinded types). The ability to assign polykinded types to polytypic values greatly enhances genericity: a single polytypic definition, for instance, yields mapping functions for types of arbitrary kinds.
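The flavour of a single definition that specialises to many type constructors can be caricatured in a dynamically typed sketch (Python, so the kind structure is only simulated at run time; the function gmap and its dispatch on built-in constructors are our illustration, not Hinze's construction):

```python
# Illustrative sketch only: one generic definition of "map" that
# specialises to lists, tuples, dicts and plain values, echoing the
# idea that a single polytypic definition yields mapping functions
# for types of many different shapes.

def gmap(f, x):
    """Apply f to every element of x, whatever structure x has."""
    if isinstance(x, list):
        return [gmap(f, y) for y in x]
    if isinstance(x, tuple):
        return tuple(gmap(f, y) for y in x)
    if isinstance(x, dict):
        return {k: gmap(f, v) for k, v in x.items()}
    return f(x)  # base case: a plain value

# e.g. gmap(lambda n: n + 1, [1, (2, 3), {"a": 4}])
```

In Haskell, of course, the point of the talk is that the polykinded type of such a map is assigned statically, not discovered by run-time dispatch.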

The slides are available online, as is a paper, Polytypic values possess polykinded types (working document 789 BLA-1).


Presentation: Hutton, The generic approximation lemma (Mon 2000-04-03, 14:00)

The approximation lemma is a simplification of the well-known take lemma, and is used to prove properties of programs that produce lists of values. We show how the approximation lemma, unlike the take lemma, can naturally be generalised from lists to a large class of datatypes, and present a generic approximation lemma that is parametric in the datatype to which it applies. As a useful by-product, we find that generalising the approximation lemma in this way also simplifies its proof.
This is joint work with Jeremy Gibbons. A draft paper (working document 790 BLA-2) is available online.
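For lists, the approximation function at the heart of the lemma can be sketched as follows (a strict, finite-data Python rendering of the standard definition; the encoding of undefinedness as an exception is ours):

```python
# approx(n, xs) is undefined at n == 0, and otherwise copies the head
# and approximates the tail at n - 1.  The approximation lemma states
# that xs == ys exactly when approx(n, xs) == approx(n, ys) for all n.

class Bottom(Exception):
    """Stands in for the undefined value of the semantics."""

def approx(n, xs):
    if n == 0:
        raise Bottom("approx 0 is undefined")
    if not xs:
        return []
    return [xs[0]] + approx(n - 1, xs[1:])
```

Note that in a lazy language only the part of the result beyond depth n is undefined; in this strict sketch the whole call raises once depth 0 is reached, so the sketch is faithful only on lists shorter than n.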


Presentation: de Moor, Compiling image expressions (Mon 2000-04-03, 14:45)


Presentation: Paterson, Homogeneous functions (Mon 2000-04-03, 15:45)

Many circuit designs and parallel algorithms can be expressed using perfectly balanced binary trees. These may be defined as a nested (or non-regular) datatype, so that the balance constraint is enforced by the type system. However, many common manipulations of these trees cannot be expressed in this form. The solution is to define another nested type describing homogeneous (or depth-preserving) functions over these trees. All programming is then done in terms of homogeneous functions, without referring to powertrees. Program-forming operators include counterparts of maps, folds and unfolds, with associated fusion laws. By manipulating whole sequences of functions using these operators, we may concisely define many algorithms. A unique fixed-point property is used in proofs of properties of these algorithms.
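In Haskell the balance invariant is enforced by a nested type; in this untyped sketch (our encoding, not the speaker's) a perfect tree of depth d is simply nested pairs, and hmap shows what a homogeneous, i.e. depth-preserving, map looks like:

```python
# A perfect binary tree of depth 0 is a leaf value; a tree of depth
# d + 1 is a pair of depth-d trees.  hmap applies f to every leaf and
# returns a tree of the same depth, so it is homogeneous.

def hmap(f, d, t):
    """Depth-preserving map over a depth-d perfect tree."""
    if d == 0:
        return f(t)
    l, r = t
    return (hmap(f, d - 1, l), hmap(f, d - 1, r))
```

The nested-datatype version gains over this sketch precisely because the depth bookkeeping (the parameter d here) is carried by the type system rather than by the programmer.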

Presentation: Wada, My first programs (Tue 2000-04-04, 09:00)

The slides of my talk (working document 791 BLA-3) are available online, with errors corrected.

I wrote a simulator in Scheme, which is included below, though it is not well polished.

;;The simulator of the two tape controller.
;;
;;"LS" is represented by "L"; a tape pasted into a ring is indicated by "R" at
;;the end of the tape. "CR", "LF" and "SP" are represented by "C", "F" and "S".
;;
;;Place the prepared tape in tapea or tapeb by
;;(set! tapea <tape to be read from a>)
;;(set! tapeb <tape to be read from b>)
;;then set the initial device by
;;(set! dev 'a) or (set! dev 'b)
;;and start the computer by (copyrobot)
;;
;;The output comes as otape. For a later computation, place the output on the
;;appropriate reader by (set! tapea otape) or (set! tapeb otape), specify the
;;initial reader and start the machine.
;;
;;Two examples are attached at the end of this program.
;;

 ;;; Note: boolean flags are initialised to #f rather than '(), since '()
 ;;; counts as true in standard Scheme; character codes are compared with eqv?.
 (define acc 0) (define buf 0) (define mul 1) (define charbuf #f)
 (define indexa 0) (define indexb 0) (define sign #f) (define digitf #f)
 (define ch #f) (define tapea "") (define tapeb "") (define otape "")
 (define dev 'a)                ;;; default input device
 (define (comment x) '())

(define (copyrobot)
(call-with-current-continuation (lambda (return)
  (define (comp)
    (getchar) ;(tyo ch)
    (cond ((eqv? ch 43) (comment '/+) (cond (digitf (procacc))))
          ((eqv? ch 45) (comment '/-)
           (cond (digitf (procacc) (set! sign #t)) (else (set! sign (not sign)))))
          ((eqv? ch 120) (comment 'x) (set! mul (getbuf)) (set! digitf #f)
           (set! sign #f))
          ((and (<= 48 ch) (< ch 58)) (comment "digits")
           (set! digitf #t)
           (set! buf (+ (* buf 10) ch -48)))
          ((eqv? ch 97) (set! dev 'a))
          ((eqv? ch 98) (set! dev 'b))
          ((eqv? ch 99) (cond ((>= acc 0) (set! dev 'b))))
          ((eqv? ch 103) (comment "g") (getchar)
           (set! otape (string-append otape (string (integer->char ch)))))
          ((eqv? ch 44) (comment ",")
           (procacc) (set! otape (string-append otape (number->string acc)))
           (copy))
          ((eqv? ch 46) (comment ".") (procacc) (copy))
          ((eqv? ch 61) (comment "=")
           (procacc) (set! otape (string-append otape (number->string acc)))
           (set! acc 0) (copy))
          ((eqv? ch 76) (comment "L")))
    (comp))
 (define (copy)
    (getchar)
    (cond ((eqv? ch 76) (comment "L")
           (set! otape (string-append otape (string (integer->char ch))))
           (comp)))
    (set! otape (string-append otape (string (integer->char ch))))
    (copy))
 (define (getchar)
  (cond (charbuf (set! ch charbuf) (set! charbuf #f))
       ((eq? dev 'a) ;(display "a") (display indexa) (newline)
        (cond ((= (string-length tapea) indexa) (display otape) (newline) (return 0))
              (else (set! ch (char->integer (string-ref tapea indexa)))
                    (set! indexa (+ indexa 1))
                    (cond ((eqv? ch 82) (display otape) (newline)
                        (set! ch (char->integer (string-ref tapea 0)))
                        (set! indexa 1))))))
       ((eq? dev 'b) ;(display "b") (display indexb) (newline)
        (cond ((= (string-length tapeb) indexb) (display otape) (newline) (return 0))
              (else (set! ch (char->integer (string-ref tapeb indexb)))
                    (set! indexb (+ indexb 1))
                    (cond ((eqv? ch 82) (display otape) (newline)
                        (set! ch (char->integer (string-ref tapeb 0)))
                        (set! indexb 1))))))))
 (set! acc 0) (set! buf 0) (set! mul 1) (set! charbuf #f)
 (set! indexa 0) (set! indexb 0) (set! sign #f) (set! digitf #f)
 (set! otape "")
(comp))))


(define (getbuf)
 (let ((temp (if sign (- buf) buf))) (set! buf 0) temp))

(define (procacc)
 (set! acc (+ (* mul (getbuf)) acc)) (set! mul 1) (set! digitf #f) (set! sign #f))

(define (putchar)
 (set! charbuf ch))

;ex0 multiplication of (1 2) and (5 6)
;                      (3 4)     (7 8)

(define tapeb "1a2a3a4a5a6a7a8a")
(define tapea (string-append
 "Lb=........Lg.b=......."
 "Lg.g.b=......Lg.g.g.b=....."
 "Lg.g.g.g.b=....Lg.g.g.g.g.b=..."
 "Lg.g.g.g.g.g.b=..Lg.g.g.g.g.g.g.b=."
 "L.g.g.g.g.g.g.g.,xL.....g.g.g.,+"
 "L..g.g.g.g.g.g.,xL.......g.,=,S"
 "L.g.g.g.g.g.g.g.=xL......g.g.,+"
 "L..g.g.g.g.g.g.=xL........,=CF"
 "L...g.g.g.g.g.,xL.....g.g.g.=+"
 "L....g.g.g.g.,xL.......g.==,S"
 "L...g.g.g.g.g.=xL......g.g.=+"
 "L....g.g.g.g.=xL........==CF"))

(copyrobot)
(set! tapea otape) (copyrobot)
(set! tapea otape) (copyrobot)
(set! tapea otape) (copyrobot)
(set! tapea otape) (copyrobot)
(set! tapea otape) (copyrobot)
(set! tapea otape) (copyrobot)
(set! tapea otape) (copyrobot)
(set! tapea otape) (copyrobot)
(set! tapea otape) (copyrobot)

(newline)
(newline)

;ex1 table of 4x^2-40x+100 for x=0,1,2,...,10

(set! tapeb "0a")
(set! tapea ",abL-10+c11R")
(set! dev 'b)
(copyrobot)

(set! tapea "L4xab-40=xab+100=LR")
(set! tapeb otape)
(set! dev 'a)
(copyrobot)

(set! tapea otape)
(set! dev 'a)
(copyrobot)

Presentation: Jeuring, Type-indexed datatypes (Tue 2000-04-04, 09:40)

A polytypic function is a function that can be instantiated on many datatypes to obtain datatype specific functionality. Examples of polytypic functions are the functions that can be derived in Haskell, such as show, and (==). More advanced examples of polytypic functions are functions for pattern matching, unification, rewriting, and (structured) editing. For each of these problems, we not only have to define polytypic functionality, but also a type-indexed datatype: a datatype that is constructed in a generic way from an argument datatype. For example, in the case of matching we have to extend a datatype with a constructor used for representing variables. This talk discusses how to incorporate a construct for defining type-indexed datatypes in Generic Haskell, an extension of Haskell in which polytypic functions can be defined.
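The matching example can be made concrete in a small sketch (our Python encoding; the point of the talk is that Generic Haskell would construct the extended datatype generically from the original one, which untyped Python cannot show): terms are tagged tuples, and the corresponding pattern type is the same shape extended with one extra constructor for variables.

```python
# Object language: binary trees of ints, as ("leaf", n) or
# ("node", l, r).  Patterns are the same datatype extended with a
# ("var", name) constructor -- the "type-indexed datatype" for matching.

def match(pat, term, env=None):
    """Return a substitution instantiating pat to term, or None."""
    env = dict(env or {})
    if isinstance(pat, tuple) and pat[0] == "var":
        name = pat[1]
        if name in env and env[name] != term:
            return None        # variable already bound differently
        env[name] = term
        return env
    if not isinstance(pat, tuple):
        return env if pat == term else None   # atoms must be equal
    if not isinstance(term, tuple) or pat[0] != term[0] or len(pat) != len(term):
        return None            # constructor mismatch
    for p, t in zip(pat[1:], term[1:]):
        env = match(p, t, env)
        if env is None:
            return None
    return env
```

The generic construct would produce the Var-extended datatype, and this match function, once for every argument datatype rather than per datatype by hand.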

Presentation: de Moor, Pointwise relational programming (Tue 2000-04-04, 11:00)

The point-free relational calculus has been very successful as a language for discussing general programming principles. However, when it comes to specific applications, the calculus can be rather awkward to use: some things are more clearly and simply expressed using variables. The combination of variables and relational combinators such as converse and choice yields a kind of nondeterministic functional programming language. We give a semantics for such a language, and illustrate with an example application.
A paper (working document 792 BLA-4) is available online.
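A rough rendering of this nondeterministic flavour (our Python encoding, not the paper's semantics): a "relational program" maps an input to the list of its possible outputs, choice concatenates the outcomes, and the converse of a finite relation flips its pairs.

```python
# Nondeterministic functions as input -> list of possible outputs.

def choice(f, g):
    """Nondeterministic choice: all outcomes of f, then all of g."""
    return lambda x: f(x) + g(x)

def converse(pairs):
    """Converse of a finite relation given as a list of pairs."""
    return [(b, a) for (a, b) in pairs]

def rcompose(f, g):
    """Composition of nondeterministic functions: run g, then f."""
    return lambda x: [z for y in g(x) for z in f(y)]

# e.g. choice(lambda n: [n * n], lambda n: [-n]) maps 3 to [9, -3]
```

The paper's contribution is a semantics in which such combinators coexist with ordinary variables and pointwise definitions; the list encoding above only hints at the intended meaning.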


Presentation: Swierstra, Another program for build (Tue 2000-04-04, 12:20)


Presentation: Swierstra, Pretty-printing combinators (How to share a call) (Tue 2000-04-04, 14:10)


Presentation: Pardo, Towards merging recursion and comonads (Tue 2000-04-04, 15:05)

Monads are by now a well established tool for modelling effects in a functional setting. The categorical duals of monads, called comonads, are not so well-known and have been used almost entirely in the area of formal semantics. In recent years, however, there has been a growing interest in investigating meaningful applications of comonads in programming. This work is precisely a contribution in that direction.

In a previous work, we studied the interaction between monadic effects and different recursive operators on datatypes. Following a similar line of research, this work explores the combination between comonads and functions defined by structural recursion. After introducing general aspects of comonads, we focus our study on a particular comonad, namely, the product comonad. We show that using this comonad we can represent two common classes of recursive definitions: (i) functions that admit extra parameters that cannot be altered during the computation, and (ii) functions with accumulating parameters. Associated with each of these classes of functions we introduce the definition of a comonadic fold, a variant of the traditional fold operator that explicitly deals with parameters in terms of the product comonad. We present calculational laws for the new operators.
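A concrete sketch of the product comonad described above (encodings and names are ours): a comonadic value is a pair (p, a) of a fixed parameter p and a value a; extract discards the parameter, duplicate copies it, and cfold is a fold over lists whose step function can consult the parameter at every stage, illustrating case (i), a constant extra parameter.

```python
# The product comonad: values are pairs (p, a).

def extract(w):
    p, a = w
    return a

def duplicate(w):
    p, a = w
    return (p, (p, a))

def cfold(step, e, w):
    """Comonadic fold: step(p, x, acc) sees the parameter p throughout."""
    p, xs = w
    acc = e
    for x in reversed(xs):   # foldr-style: rightmost element first
        acc = step(p, x, acc)
    return acc

# e.g. cfold(lambda p, x, acc: p * x + acc, 0, (10, [1, 2, 3]))
# scales each element by the parameter before summing.
```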

A draft paper on this subject will soon be available from our web page.


Presentation: Lämmel, A transformational approach to grammar improvement (Tue 2000-04-04, 17:05)

We discuss the correction and completion of syntax definitions based on a transformational style. It turns out that such a style is essential to succeed in certain non-trivial grammar engineering projects. The common figures on the development of COBOL grammars, for example, are usually measured in man-years, whereas we were able to derive a COBOL grammar from a quite incomplete and incorrect IBM reference in about two weeks. The transformational calculus is based on just a few basic operators for renaming nonterminals, replacing EBNF expressions, and adding and rejecting rules. The actual transformation scripts are written in terms of higher-level operators which are derived from the basic operators using some combinators. The higher-level operators are rather restricted, since their definition is accompanied by constraints formalizing applicability conditions. It is an inherent property of the domain (of grammar improvement) that some of the transformation operators might not be semantics-preserving. However, the applicability conditions model certain other valuable requirements which are enforced by the implementation of the calculus. Further requirements might be enforced manually, e.g., using testing technology. The transformational approach facilitates reasoning about grammar improvement, and the underlying stepwise process becomes traceable.

The talk focuses on transformations for grammar improvement. What makes the corresponding calculus interesting is the way we deal with non-semantics-preserving transformations. It is also surprising that a rather simple transformational calculus can provide essential support for addressing such hard problems as COBOL grammars.
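The basic operators can be modelled on a toy grammar representation (ours; the real calculus operates on full EBNF with applicability conditions): a grammar is a dict from nonterminal to a list of productions, each production a list of symbols.

```python
# Toy versions of three basic operators of a grammar-transformation
# calculus: renaming a nonterminal, adding a rule (completion), and
# rejecting a rule (correction).  All operators return a new grammar.

def rename(g, old, new):
    """Rename nonterminal old to new throughout the grammar."""
    def sub(sym):
        return new if sym == old else sym
    return {sub(n): [[sub(s) for s in prod] for prod in prods]
            for n, prods in g.items()}

def add_rule(g, n, prod):
    """Add a production for nonterminal n."""
    g2 = {m: [list(p) for p in ps] for m, ps in g.items()}
    g2.setdefault(n, []).append(list(prod))
    return g2

def reject_rule(g, n, prod):
    """Remove a production for nonterminal n."""
    g2 = {m: [list(p) for p in ps] for m, ps in g.items()}
    g2[n] = [p for p in g2[n] if p != list(prod)]
    return g2
```

A transformation script is then just a composition of such steps, which is what makes the improvement process replayable and traceable.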

A related draft paper on semi-automatic grammar recovery is available online (HTML). The slides of the talk are available online (Postscript). The referenced reasonably complete and correct VS COBOL II grammar is available online (HTML). It is the first free COBOL grammar. It is being used by people in various projects, e.g., in GNU COBOL compiler projects as a reference.


Presentation: Bird, Maximum weightsum problems (Wed 2000-04-05, 09:00)

A copy of the notes (working document 795 BLA-7) is available online.


Presentation: Möller, Kleene-ing up semantics (Wed 2000-04-05, 10:10)

Kleene algebras provide a convenient and powerful algebraic axiomatisation of a complete lattice that is endowed with a sequential composition operation. The particular kind of Kleene algebras we are considering is equivalent to Boolean quantales [1]. Models include formal languages under concatenation, relations under standard composition, sets of graph paths under path concatenation and sets of streams under concatenation.

The least and greatest fixpoint operators of a complete lattice allow definitions of the finite and infinite iteration operators * and omega, resp.
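In the relational model, for instance, a* is the least fixpoint of x = 1 + a;x, reachable by finite iteration when the carrier is finite: the reflexive-transitive closure. A sketch (our encoding) over finite relations as sets of pairs:

```python
# Kleene star in the relational model, computed as the least fixpoint
# of F(x) = 1 + r;x by iterating F from the bottom element (the empty
# relation) until it stabilises.

def rcomp(r, s):
    """Relational composition r;s over finite sets of pairs."""
    return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

def star(r, carrier):
    """Reflexive-transitive closure of r over the given finite carrier."""
    unit = {(a, a) for a in carrier}   # the unit 1 of the algebra
    x = set()                          # bottom
    while True:
        nxt = unit | rcomp(r, x)       # one application of F
        if nxt == x:
            return x
        x = nxt
```

The omega operator would dually arise as a greatest fixpoint, which this finite iteration from the bottom cannot compute.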

Elements of Kleene algebras can be used, among others, as abstractions of the input-output semantics of nondeterministic programs or as models for the association of pointers with their target objects. In the first case, one seeks to distinguish the subclass of elements that correspond to deterministic programs. In the second case one is only interested in functional correspondences, since it does not make sense for a pointer to point to two different objects.

In [4] we discuss several candidate notions of determinacy and clarify their relationship. Some characterizations that are equivalent in the case where the underlying Kleene algebra is an (abstract) relation algebra are not equivalent for general Kleene algebras.

In relational semantics, the input-output semantics of a program is a relation on its set of states. We generalize this in considering elements of Kleene algebras as semantical values. In a nondeterministic context, the demonic semantics is calculated by considering the worst behavior of the program. In this paper, we concentrate on while loops. Calculating the semantics of a loop is difficult, but showing the correctness of any candidate abstraction is much easier. For deterministic programs, Mills has described a checking method known as the while statement verification rule. In [5] a corresponding programming theorem for nondeterministic iterative constructs is proposed, proved and applied to an example. This theorem can be considered as a generalization of the while statement verification rule to nondeterministic loops.

In standard Kleene algebra it is assumed that the composition operation is universally disjunctive in both arguments. This entails monotonicity and strictness w.r.t. the least element 0 that plays the role of the bottom element in denotational semantics. However, full strictness does not make sense when one wants to give an algebraic account of systems with lazy evaluation. Therefore in [6] we study a ``one-sided'' variant of KAs in which composition is strict in one argument only. This treatment fits well with systems such as the calculus of finite and infinite streams which is also used in J. Lukkien's operational semantics for the guarded command language [3] or R. Dijkstra's computation calculus [2].

There is some choice in what to postulate for the other argument. Whereas Lukkien and Dijkstra stipulate positive disjunctivity, we investigate how far one gets if only monotonicity is required. The reason is that we want to enable a connection to process algebra. There only one of the distributivity laws for composition over choice is postulated to preserve the temporal succession of choices.

[1]
C. Brown, D. Gurr: A representation theorem for quantales. Journal of Pure and Applied Algebra, 85:27-42 (1993)
[2]
R.M. Dijkstra: Computation calculus - bridging a formalization gap. In: J. Jeuring (ed.): Proc. MPC 1998. LNCS 1422, 1998, 151--174
[3]
J. Lukkien: An operational semantics for the guarded command language. In: R.S. Bird, C.C. Morgan, J.C.P. Woodcock (eds.): Mathematics of Program Construction. LNCS 669, 1993, 233--249
[4]
J. Desharnais, B. Möller: Characterizing determinacy in Kleene algebra. Institut für Informatik, Universität Augsburg, Report Nr. 2000-5. Available online.
[5]
J. Desharnais, B. Möller, F. Tchier: Kleene under a demonic star. Institut für Informatik, Universität Augsburg, Report Nr. 2000-3. Available online. Revised version in Proc. AMAST 2000 (to appear)
[6]
B. Möller: Lazy Kleene algebra. In preparation.

Presentation: Frias, A relational formalization of exhaustive search and backtracking (Wed 2000-04-05, 12:10)

In my talk I will present a relational calculus for program construction based on binary relations. One of the main features of this calculus is the possibility to specify algorithm design strategies within the calculus in a highly declarative and compositional way. I will present a formalization of nondeterministic exhaustive search and backtracking within this framework. Finally, the problem of finding a winning strategy in a generic game will be solved first by exhaustive search within the state space, and this solution will be improved to a backtracking algorithm solving the problem.
A paper A relational characterization of exhaustive search and backtracking (working document 796 BLA-8) is available online.
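The refinement step can be illustrated on a much smaller problem than game search (our example, not from the paper): subset-sum solved first by exhaustive search over the whole state space, then by a backtracking version that prunes partial solutions.

```python
# Exhaustive search: enumerate every subset, then filter.
def subsets_summing(xs, target):
    sols = []
    for mask in range(2 ** len(xs)):
        sub = [x for i, x in enumerate(xs) if mask >> i & 1]
        if sum(sub) == target:
            sols.append(sub)
    return sols

# Backtracking refinement: abandon any partial sum that already
# exceeds the target (assumes the xs are nonnegative).
def subsets_summing_bt(xs, target, partial=()):
    if sum(partial) > target:
        return []                          # prune this whole branch
    if not xs:
        return [list(partial)] if sum(partial) == target else []
    head, rest = xs[0], xs[1:]
    return (subsets_summing_bt(rest, target, partial + (head,))
            + subsets_summing_bt(rest, target, partial))
```

Both return the same solutions; the backtracking version merely avoids exploring states the pruning condition rules out, which is the shape of the improvement formalized relationally in the talk.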


Presentation: Swierstra, Attribute grammar combinators (Fighting T-REX) (Wed 2000-04-05, 15:00)


Presentation: Liu, Automatic accurate time-bound analysis for high-level languages (Wed 2000-04-05, 16:20)

Analysis of program running time is important for reactive systems, real-time systems, interactive environments, compiler optimizations, performance evaluation, and many other computer applications. For many applications, such as real-time systems, the ability to predict accurate time bounds automatically and efficiently is particularly important.

This talk describes a general approach for automatic and accurate time-bound analysis. The approach consists of transformations for building time-bound functions in the presence of partially known input structures, symbolic evaluation of the time-bound function based on input parameters, optimizations to make the overall analysis efficient, and measurements of primitive parameters, all at the source-language level. We have implemented this approach and performed a number of experiments for analyzing Scheme programs. The measured worst-case times are closely bounded by the calculated bounds.
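The flavour of the first transformation step can be suggested by a hand-made example (ours, much simplified): a source function paired with a time-bound function over a partially known input, here a list known only by its length, with the costs of primitives left as symbolic parameters to be measured separately.

```python
# The program being analysed: insertion into a sorted list.
def insert(x, xs):
    if not xs or x <= xs[0]:
        return [x] + xs
    return [xs[0]] + insert(x, xs[1:])

# Its worst-case time-bound function, derived by mirroring the
# recursion structure on the known length n; symbolically this
# evaluates to n * (c_cmp + c_cons) + c_cons.
def time_insert(n, c_cmp=1, c_cons=1):
    if n == 0:
        return c_cons
    return c_cmp + c_cons + time_insert(n - 1, c_cmp, c_cons)
```

The approach in the talk constructs such bound functions automatically, evaluates them symbolically in the input parameters, and instantiates the primitive costs from measurements.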

A number of interesting aspects need further study, including in particular, program transformations for improving the efficiency and accuracy of the analysis.

This is joint work with Gustavo Gomez. Papers on what I talked about can be found in my ftp directory; see for example our paper from LCTES98.


Presentation: Boiten, Guards, preconditions and three-valued logic (Wed 2000-04-05, 17:20)

Using binary relations as specifications of operations, several possibilities arise for interpreting the domain of such a relation, in particular guards and preconditions. The informal interpretations of these are non-exclusive, so it could be advantageous to allow both. Traditionally this is done by using pairs of a relation and a set, where the set denotes an explicit guard or precondition. We show that this information can also be embedded in a single three-valued logic predicate. This might lead to a more intuitive notion of refinement, which does not sometimes represent implementation freedom by true, and sometimes by false, but always or mostly by un(der)defined. However, three-valued logic is (by a counting argument) more expressive than pairs of a set and a relation. In order to represent all three-valued logic specifications, we might then use pairs of relations. Two refinement relations are defined for such pairs, one which reduces to the relation+precondition approach, and another which disentangles (required) non-determinism from implementation freedom. The latter is analogous to a treatment for LOTOS proposed by some of us earlier. The Z version of this talk is to appear at ZB 2000.
(This is joint work with Ralph Miarka and John Derrick.)
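One common choice of three-valued connectives is Kleene's strong tables, sketched below with U for un(der)defined (an illustration of the third truth value only; the talk's logic and its refinement relations may differ in details):

```python
# Kleene's strong three-valued connectives: U propagates unless the
# other argument already decides the result (F for and, T for or).

T, F, U = "T", "F", "U"

def and3(p, q):
    if p == F or q == F:
        return F
    if p == U or q == U:
        return U
    return T

def or3(p, q):
    if p == T or q == T:
        return T
    if p == U or q == U:
        return U
    return F

def not3(p):
    return {T: F, F: T, U: U}[p]
```

The counting argument in the summary is visible here: a three-valued predicate on a state space distinguishes three outcomes per point, which a single relation-plus-set pair cannot always encode, hence the move to pairs of relations.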


Presentation: Jeuring, A generic structure editor (Thu 2000-04-06, 09:20)

In this talk I try to construct a generic structure editor in Generic Haskell. An important part of the generic editor is a language of edit commands. In this talk I will mainly discuss the language of commands: how they are created, checked, and processed.

Presentation: Dimitrakos, Practical applications and related theory of a metalogical property: Identifying the role of interpolation in software engineering and formal methods (Thu 2000-04-06, 09:50)

Interpolation has become one of the standard properties (together with completeness and [semi]decidability) that logicians investigate when designing a formalism. In this talk, I will provide strong evidence that the presence of interpolants is not only cogent for scientific reasoning but also has important practical implications in formal approaches to system specification and program construction.

I will illustrate that interpolation in general, and uniform splitting interpolants in particular, play an important role in applications where formality and modularity are invoked. I will also explain why possessing a uniform presentation of potential splitting interpolants provides a compositional logical mechanism for information hiding and, in the presence of interpolation, facilitates:

  1. the orthogonality of presentation when sharing specification text following a ``one writer, many readers'' architecture in object-based specification languages such as B, Z and VDM ([11] discussing [2,12] and the contextual proof obligations generated by B-Toolkit [4])
  2. the promotion of data refinement by some schemata in the Z formal method, c.f. [11] discussing [13,20]
  3. the composability of a form of refinement for axiomatic specifications, c.f. [7]
  4. the correct instantiation of (multi)parametric axiomatic specifications of data types [9]
  5. the mixed-variance refinement of parameterisations which unfolds to a parameter relaxation and a body refinement ([9] discussing [18])
  6. unskolemization [17] and deductive program synthesis
  7. the correctness of various operations of module algebras ([15] discussing an earlier version of [6])

Ongoing work involves the use of interpolants for detecting (and ensuring the absence of) any information flow from a subsystem with high security to a component interface with low security [16], as well as their use in defining a notion of testing equivalence [1].

Finally, I will highlight pursued further work that involves a close collaboration between software engineers and logicians and aims to enhance calculi of existing formal methods with meta-logical properties that support modular specification and proof decomposition.

My intention is to emphasise examples related to the role of uniform interpolants in the proof obligations that are produced for validating some ``open'' and particularly most ``closed'' compositional structuring mechanisms used in the B-Method. (The terms ``open'' and ``closed'' refer to the open-closed duality principle [14]. ``Open'' means building large systems by extension, as when appending or amalgamating specifications, and ``closed'' means building an encapsulated component available for blind use elsewhere, e.g. when linking independently constructed code modules.)

The choice of the B-Method as an example of a useful application of interpolation has been made for two main reasons. First, it has been taken up by industry with considerable success (the safety-critical components of the first driverless metro in Paris were developed using the B-Method in all phases; c.f. the MÉTÉOR project [5]) and there are commercial tools to support it ([3] and [4]). Second, the uniform splitting interpolants appear without much disguise, almost in their common meta-forms.

Bibliography

[1]
M. Abadi and A. Gordon.  A calculus for cryptographic protocols: the spi calculus. Information and Computation, 1999.
[2]
J.R. Abrial. The B-Book: Assigning Programs to Meanings. Camb. Univ. Press, 1996.
[3]
Atelier B. Steria Mediterranee. S.A.V. Steria, BP 16000, 13791 Aix-en-Provence cedex3. France.
[4]
B-Toolkit. B-Core (UK) Limited, Kings Piece Harwell Oxon OX11 0PA, UK. URL: http://www.b-core.com.
[5]
P. Behm, P. Benoit, and J.M. Meynadier. METEOR: A Successful Application of B in a Large Project. In FM'99 - Formal Methods, volume I, pages 369-387, 1999.
[6]
J.A. Bergstra, J. Heering, and P. Klint. Module algebra. Journal of the ACM, volume 37, pages 335--372, 1990.
[7]
Th. Dimitrakos and T.S.E. Maibaum. Notes on refinement, interpolation and uniformity. Automated Software Engineering-ASE'97, IEEE&CS, 1997.
[8]
Theodosis Dimitrakos. Formal support for specification design and implementation. PhD thesis, Imperial College, March 1998.
[9]
Theodosis Dimitrakos. Parameterising (algebraic) specifications on diagrams. In Automated Software Engineering (ASE'98), pages 221--224. IEEE CS Press, 1998.
[10]
Theodosis Dimitrakos and Tom Maibaum. On a generalised modularisation theorem. Information Processing Letters, 1999. To appear.
[11]
Th. Dimitrakos, J.C. Bicarregui, B.M. Matthews, and T.S.E. Maibaum. Compositional structuring in B: a logical analysis of the static part. Submitted to ZB'2000.
[12]
Kevin Lano. The B Language and Method: A Guide to Practical Formal Development. Springer-Verlag, 1996.
[13]
P.J. Lupton. Promoting Forward Simulation. In Z User Workshop, pages 27--49. Springer-Verlag, Oxford, 1990.
[14]
B. Meyer. Object-Oriented Software Construction. Prentice-Hall, 1988.
[15]
P.H. Rodenburg and R.J. van Glabbeek. An Interpolation Theorem in Equational Logic. Report CS-R8838, Centre for Mathematics and Computer Science, Amsterdam, The Netherlands, 1988.
[16]
P.Y.A. Ryan and S.A. Schneider. Process algebra and non-interference. In Proceedings of the 12th Computer Security Foundations Workshop. IEEE CS Press, 1999.
[17]
D.R. Smith. Constructing specification morphisms. Journal of Symbolic Computation, 15(5-6):571--606, 1993.
[18]
Y.V. Srinivas. Refinement of parameterized algebraic specifications. In IFIP TC2 Working Conference on Algorithmic Languages and Calculi. Chapman & Hall, 1997.
[19]
W.M. Turski and T.S.E. Maibaum. The Specification of Computer Programs. Addison-Wesley, 1987.
[20]
J.C.P. Woodcock. Mathematics as a Management Tool: Proof Rules for Promotion. In CSR Sixth Annual Conference on Large Software Systems, Bristol, 1989.


Presentation: Pettorossi, Perfect model checking via unfold/fold transformations (Thu 2000-04-06, 11:50)

We show how program transformation rules and strategies may be used for proving the satisfiability of first order formulas in some classes of models. In particular, we propose a technique for showing that a closed first order formula phi holds in the perfect model M(P) of a logic program P with locally stratified negation.

For this purpose we consider a new version of the unfold/fold transformation rules, and we show that this version preserves the perfect model semantics. Our proof method, called the unfold/fold proof method, shows that M(P) models phi by: (i) introducing a new predicate symbol f and constructing a conjunction F(f,phi) of clauses such that M(P) models phi iff M(P and F(f,phi)) models f, and then (ii) transforming the program P and F(f,phi) into a new program of the form Q and f, for some conjunction Q of clauses.

We also present a strategy for applying our unfold/fold rules in a semi-automatic way. Our strategy may or may not terminate, depending on the initial program P and formula phi. However, we identify some classes of programs such that, for each closed formula phi, our strategy terminates either for phi or for not phi. Thus, our strategy yields a decision procedure for programs and formulas in the given classes.
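Steps (i) and (ii) can be illustrated on a toy propositional program. The sketch below is our own invention for these minutes, not the authors' system: it omits negation (and hence the perfect-model machinery) and only shows the flavour of introducing a new predicate f and unfolding its clauses until f becomes a fact.

```python
# Toy propositional unfold/fold sketch (invented example, no negation).
# A program maps each predicate to a list of alternative bodies (lists of atoms).
P = {
    "p": [["q", "r"]],   # p :- q, r.
    "q": [[]],           # q.          (a fact: empty body)
    "r": [["q"]],        # r :- q.
}

def unfold(bodies, program):
    """Unfold the first atom of each body using the clauses defining it."""
    result = []
    for body in bodies:
        if not body:                    # empty body: nothing left to prove
            result.append(body)
            continue
        atom, rest = body[0], body[1:]
        for defn in program.get(atom, []):
            result.append(defn + rest)  # replace the atom by one of its bodies
    return result

# Step (i): introduce a new predicate f defined by the formula to prove;
# here phi is just the atom p, so F(f, phi) is the single clause f :- p.
F = {"f": [["p"]]}
program = {**P, **F}

# Step (ii): repeatedly unfold the clauses for f; reaching the empty body
# means the transformed program contains f as a fact, i.e. M(P) models phi.
bodies = program["f"]
for _ in range(10):
    if [] in bodies:
        break
    bodies = unfold(bodies, program)

print([] in bodies)   # True: f has been turned into a fact
```

The fixed iteration bound stands in for the paper's termination analysis; on programs outside the identified classes such a loop need not converge.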

This is joint work with Maurizio Proietti. A paper (working document 793 BLA-5) is available online.


Presentation: Wile, Software architectures and program transformation (Fri 2000-04-07, 12:10)



Working documents

789 BLA-1
Ralf Hinze. Polytypic values possess polykinded types. To appear in Proceedings of the Fifth International Conference on Mathematics of Program Construction (MPC 2000), July 3-5, 2000.

790 BLA-2
Graham Hutton and Jeremy Gibbons. The Generic Approximation Lemma.

791 BLA-3
Eiiti Wada, My first programs. Also available as compressed postscript.

792 BLA-4
Oege de Moor and Jeremy Gibbons, Pointwise Relational Programming. To appear in Proceedings of Algebraic Methodology and Software Technology, Iowa, May 2000.

793 BLA-5
Alberto Pettorossi and Maurizio Proietti, Perfect model checking via unfold/fold transformations. To appear in Proceedings of First International Conference on Computational Logic, London, July 2000. Also available as compressed postscript.

794 BLA-6
Pardo

795 BLA-7
Richard Bird, Maximum weightsum problems. Also available as compressed pdf.

796 BLA-8
Marcelo Frias, Gabriel Baum and Nazareno Aguirre, A relational characterization of exhaustive search and backtracking. Also available as compressed postscript.



Jeremy Gibbons (email: Jeremy.Gibbons@comlab.ox.ac.uk) - May 2000