IFIP Working Group 2.1 -- Meeting #54 Details


Contents of this page:    Local information  |  Getting there  |  Proposed topics for discussion  |  Proposed talks  |  Royal Observatory
Minutes of the meeting are now available.


Local information

The next meeting of WG2.1 will be 3rd to 7th April 2000, in Blackheath, London, UK. The local organizer, Tom Maibaum, is in charge of local information and organizational details.

Getting there

  1. Go to Charing Cross train station in central London.

  2. Take a train to Blackheath. On Saturday, these run at 9 and 39 minutes past each hour. On Sunday, they run at 16 and 46 minutes past each hour. The journey time is about 20 minutes.

  3. When you arrive at Blackheath Station, go out of the station, cross the road directly in front, and turn left. Walk up the road (and up the hill), following the slow curve around to your right; about 250m up the road, on your right, is the Clarendon Hotel at 8-16 Montpelier Row, Blackheath, London SE3 0RW (tel: 0208 318 4321, fax: 0208 318 4378). You can look at a local map; the hotel is in the middle of the map that appears, and the station is just below it.

(If you are coming by Eurostar, when you arrive at Waterloo Station, walk across to Waterloo East Station. Trains from Charing Cross to Blackheath arrive (usually on platform A) 2 minutes after the times indicated above. Then follow instructions from step 3 above.)


Proposed topics for discussion

Generic programming (proposed by Ralf Hinze and Johan Jeuring)

Description   Generic programming is about making programs more adaptable by making them more general. Generic programs often embody non-traditional kinds of polymorphism; ordinary programs are obtained from them by suitably instantiating their parameters. In contrast with normal programs, the parameters of a generic program are often quite rich in structure. For example, they may be other programs, types, or kinds.
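By way of illustration (a sketch of ours, not part of the proposal): a fold parameterised by a functor capturing a datatype's shape is a simple generic program, and instantiating the functor parameter yields ordinary programs such as summing a list.

    -- A minimal Haskell sketch: one generic 'fold', instantiated at a type.
    newtype Fix f = In (f (Fix f))

    fold :: Functor f => (f a -> a) -> Fix f -> a
    fold alg (In x) = alg (fmap (fold alg) x)

    -- the shape of lists, and an ordinary program obtained by instantiation
    data ListF e r = Nil | Cons e r

    instance Functor (ListF e) where
      fmap _ Nil        = Nil
      fmap g (Cons e r) = Cons e (g r)

    sumList :: Fix (ListF Int) -> Int
    sumList = fold alg
      where alg Nil        = 0
            alg (Cons e r) = e + r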

Problems and challenges   The research area is still in its infancy. The following aspects seem worth investigating:

Background material  

Web Presence of the Working Group (proposed by Robert Dewar)

Description   An effective web presence is today a decisive factor in the initial impression an organisation makes. The former and present secretaries have done a laudable job of setting up and maintaining a web site, but as the Web expands, more effort is needed to stay highly visible, and the issues deserve thought and input from the whole Group.

Issues and opportunities  

Pointwise Relational Programming (proposed by Richard Bird, Oege de Moor and Jeremy Gibbons)

Description   The point-free relational calculus has been very successful as a language for discussing general programming principles. However, when it comes to specific applications, the calculus can be rather awkward to use: some things are more clearly and simply expressed using variables. The combination of variables and relational combinators such as converse and choice yields a kind of nondeterministic functional programming language.

In this discussion, we will outline the language, then give a feeling for its use by engaging the group in a collaborative derivation of an algorithm for building trees.
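As a rough functional model (a sketch of ours, not the authors' language): the list monad in Haskell captures nondeterminism, with choice as concatenation of alternatives, and with converse computed by searching a given domain.

    -- a relation modelled as a set-valued function
    type Rel a b = a -> [b]

    -- nondeterministic choice: offer the results of either relation
    choice :: Rel a b -> Rel a b -> Rel a b
    choice r s x = r x ++ s x

    -- converse, computed by brute force over a finite domain of inputs
    converse :: Eq b => [a] -> Rel a b -> Rel b a
    converse dom r y = [ x | x <- dom, y `elem` r x ]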

A paper is available.


Proposed talks

The following talks have been proposed and will be ready for presentation at the start of the meeting. The first few talks may be selected from this list.
The Generic Approximation Lemma (Graham Hutton)
The approximation lemma is a simplification of the well-known take lemma, and is used to prove properties of programs that produce lists of values. We show how the approximation lemma, unlike the take lemma, can naturally be generalised from lists to a large class of datatypes, and present a generic approximation lemma that is parametric in the datatype to which it applies. As a useful by-product, we find that generalising the approximation lemma in this way also simplifies its proof.
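For reference, the list version of the lemma is short; a Haskell sketch, following the standard definition:

    -- approx n cuts a list off at depth n; approx 0 is undefined (bottom)
    approx :: Integer -> [a] -> [a]
    approx n []     | n > 0 = []
    approx n (x:xs) | n > 0 = x : approx (n-1) xs

    -- the approximation lemma:
    --   xs = ys  if and only if  approx n xs = approx n ys for all n
    -- (the take lemma is similar, but 'take' returns [] rather than
    -- bottom when the depth runs out, which complicates generalisation)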
This is joint work with Jeremy Gibbons. A draft paper is available.

A Language-Theoretic Approach To Algorithms (Deepak Goyal)
An effective algorithm design language should be (1) wide-spectrum in nature, i.e. capable of expressing both abstract specifications and low-level implementations, and (2) `computationally transparent', i.e. facilitate accurate estimation of time and space requirements. The conflict between these requirements is exemplified by SETL, which is wide-spectrum but lacks computational transparency because of its reliance on hash-based data structures.

In the first part of the talk we will describe the design of a set-theoretic algorithm design language that is both wide-spectrum, and computationally transparent. The language is made computationally transparent through the use of a novel type system in which the data structure selection method is guided by type information, and the well-typedness of the program guarantees that all associative access operations in the program (e.g. set membership test) can be implemented in worst-case O(1) time. We give evidence of the effectiveness of our language by showing how it can be used to improve a difficult database query optimization algorithm from expected to worst-case linear time. In addition, a simpler explanation and a shorter proof of correctness are obtained.

In the second part of the talk we show how our data structure selection method can be combined with finite differencing and dominated convergence to form an effective transformational program design methodology. Finite differencing replaces costly repeated computations with cheaper incremental counterparts, and dominated convergence provides a generalized iteration scheme for computing fixed points. We will describe how our methodology helped simplify a complex linear-time model-checking algorithm for the alternation-free modal mu-calculus, and also led to the discovery of an O(N^3) time algorithm for computing intra-procedural may-alias information, vastly improving the existing O(N^5) time algorithm.
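To give the flavour of finite differencing (a toy example of ours, not the speaker's system): maintain a derived value incrementally rather than recomputing it.

    -- a collection of numbers paired with a cached derived value, its sum

    -- naive: recompute the O(n) sum from scratch after every insertion
    naiveInsert :: Int -> ([Int], Int) -> ([Int], Int)
    naiveInsert x (s, _) = (x:s, sum (x:s))

    -- differenced: update the cached sum incrementally, in O(1)
    incrInsert :: Int -> ([Int], Int) -> ([Int], Int)
    incrInsert x (s, total) = (x:s, total + x)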

This talk describes my research at New York University as a PhD student, under the supervision of Bob Paige.

The Role of Interpolants in Formal Software Developments: Theory and Practical Experience (Theodosis Dimitrakos)
Interpolation has become one of the standard properties (together with completeness and [semi]decidability) that logicians investigate when designing a formalism. In this talk, I will provide strong evidence that the presence of interpolants is not only significant for scientific reasoning but also has important practical implications for formal approaches to system specification and program construction.
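For orientation, the classical Craig interpolation property reads as follows (a standard statement, included here for reference):

    \vdash \varphi \to \psi
    \quad\Longrightarrow\quad
    \exists\,\theta \text{ in the common language of } \varphi, \psi :\;
    \vdash \varphi \to \theta \;\text{ and }\; \vdash \theta \to \psi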

I will illustrate that interpolation in general, and uniform splitting interpolants in particular, play an important role in applications where formality and modularity are invoked. I will also explain why possessing a uniform presentation of potential splitting interpolants provides a compositional logical mechanism for information hiding and, in the presence of interpolation, facilitates:

  1. the orthogonality of presentation when sharing specification text following a ``one writer, many readers'' architecture in object-based specification languages such as B, Z and VDM ([11] discussing [2,12] and the contextual proof obligations generated by the B-Toolkit [4])
  2. the promotion of data refinement by some schemata in the Z formal method, cf. [11] discussing [13,20]
  3. the composability of a form of refinement for axiomatic specifications, cf. [7]
  4. the correct instantiation of (multi)parametric axiomatic specifications of data types [9]
  5. the mixed-variance refinement of parameterisations, which unfolds to a parameter relaxation and a body refinement ([9] discussing [18])
  6. unskolemization [17] and deductive program synthesis
  7. the correctness of various operations of module algebras ([15] discussing an earlier version of [6])

Ongoing work involves the use of interpolants for detecting (and ensuring the absence of) information flow from a high-security subsystem to a low-security component interface [16], as well as their use in defining a notion of testing equivalence [1].

Finally, I will highlight further work being pursued, involving close collaboration between software engineers and logicians, which aims to enhance the calculi of existing formal methods with meta-logical properties that support modular specification and proof decomposition.

I intend to emphasise examples related to the role of uniform interpolants in the proof obligations produced for validating some ``open'' and particularly most ``closed'' compositional structuring mechanisms used in the B-Method. (The terms ``open'' and ``closed'' refer to the open-closed duality principle [14]: ``open'' means building large systems by extension, as when appending or amalgamating specifications, and ``closed'' means building an encapsulated component available for blind use elsewhere, e.g. when linking independently constructed code modules.)

The choice of the B-Method as an example of a useful application of interpolation has been made mainly for two reasons. First, it has been taken up by industry with considerable success (the safety-critical components of the first driverless metro in Paris were developed using the B-Method in all phases; cf. the MÉTÉOR project [5]), and there are commercial tools to support it ([3] and [4]). Second, the uniform splitting interpolants appear without much disguise, almost in their common meta-forms.

Bibliography

[1]
M. Abadi and A. Gordon. A calculus for cryptographic protocols: the spi calculus. Information and Computation, 1999.
[2]
J.R. Abrial. The B-Book: Assigning Programs to Meanings. Cambridge University Press, 1996.
[3]
Atelier B. Steria Méditerranée, S.A.V. Steria, BP 16000, 13791 Aix-en-Provence cedex 3, France.
[4]
B-Toolkit. B-Core (UK) Limited, Kings Piece, Harwell, Oxon OX11 0PA, UK. URL: http://www.b-core.com.
[5]
P. Behm, P. Benoit, and J.M. Meynadier. METEOR: A Successful Application of B in a Large Project. In FM'99 -- Formal Methods, volume I, pages 369--387, 1999.
[6]
J.A. Bergstra, J. Heering, and P. Klint. Module algebra. Journal of the ACM, volume 37, pages 335--372, 1990.
[7]
Th. Dimitrakos and T.S.E. Maibaum. Notes on refinement, interpolation and uniformity. In Automated Software Engineering (ASE'97), IEEE CS, 1997.
[8]
Theodosis Dimitrakos. Formal support for specification design and implementation. PhD thesis, Imperial College, March 1998.
[9]
Theodosis Dimitrakos. Parameterising (algebraic) specifications on diagrams. In Automated Software Engineering (ASE'98), IEEE CS, pages 221--224, 1998.
[10]
Theodosis Dimitrakos and Tom Maibaum. On a generalised modularisation theorem. Information Processing Letters, 1999. To appear.
[11]
Th. Dimitrakos, J.C. Bicarregui, B.M. Matthews, and T.S.E. Maibaum. Compositional structuring in B: a logical analysis of the static part. Submitted to ZB'2000.
[12]
Kevin Lano. The B Language and Method: A Guide to Practical Formal Development. Springer-Verlag, 1996.
[13]
P.J. Lupton. Promoting Forward Simulation. In Z User Workshop, pages 27--49. Springer-Verlag, Oxford, 1990.
[14]
B. Meyer. Object-Oriented Software Construction. Prentice-Hall, 1988.
[15]
P.H. Rodenburg and R.J. van Glabbeek. An Interpolation Theorem in Equational Logic. Report CS-R8838, Centre for Mathematics and Computer Science, Amsterdam, The Netherlands, 1988.
[16]
P.Y.A. Ryan and S.A. Schneider. Process algebra and non-interference. In Proceedings of the 12th Computer Security Foundations Workshop, IEEE CS, 1999.
[17]
D.R. Smith. Constructing specification morphisms. Journal of Symbolic Computation, 15(5-6):571--606, 1993.
[18]
Y.V. Srinivas. Refinement of parameterized algebraic specifications. In IFIP TC2 Working Conference on Algorithmic Languages and Calculi. Chapman & Hall, 1997.
[19]
W.M. Turski and T.S.E. Maibaum. The Specification of Computer Programs. Addison-Wesley, 1987.
[20]
J.C.P. Woodcock. Mathematics as a Management Tool: Proof Rules for Promotion. In CSR Sixth Annual Conference on Large Software Systems, Bristol, 1989.

An Aspect Oriented Approach to Attribute Grammars (Doaitse Swierstra and Oege de Moor)

(This talk could perhaps be part of the discussion on generic programming, as it concerns a general technique for modularising catamorphic programs.)

A Transformational Approach to Grammar Improvement (Ralf Lämmel)
The proposed talk is concerned with the correction, completion and disambiguation of syntax definitions in a transformational style. It turns out that such a style is essential for success in certain non-trivial grammar engineering projects. Published figures for the development of COBOL grammars, for example, are usually measured in man-years, whereas we were able to derive a COBOL grammar from a rather incomplete and incorrect IBM reference in about two weeks. The transformational calculus is based on just a few basic operators for renaming nonterminals, replacing EBNF expressions, and adding and rejecting rules. The actual transformation scripts are written in terms of higher-level operators, which are derived from the basic operators using some combinators. The higher-level operators are rather restricted, since their definitions are accompanied by yielders formalizing applicability conditions. It is an inherent property of the domain (of grammar improvement) that some of the transformation operators might not be semantics-preserving. However, the applicability conditions model certain other valuable requirements, which are enforced by the implementation of the calculus. Further requirements might be enforced manually, e.g. using testing technology. The transformational approach facilitates reasoning about grammar improvement, and the underlying stepwise process becomes traceable.
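To suggest the flavour of such operators (a hypothetical encoding of ours, not Lämmel's actual calculus): renaming a nonterminal, guarded by a yielder-style applicability condition.

    -- a grammar as plain BNF rules
    type Nt      = String
    data Symbol  = T String | N Nt deriving Eq   -- terminal / nonterminal
    type Rule    = (Nt, [Symbol])                -- lhs ::= rhs
    type Grammar = [Rule]

    -- rename a nonterminal everywhere; the applicability condition
    -- (the new name must be fresh) is checked before transforming
    rename :: Nt -> Nt -> Grammar -> Grammar
    rename old new g
      | any uses g = error "rename: new name not fresh"
      | otherwise  = [ (sub lhs, map sym rhs) | (lhs, rhs) <- g ]
      where
        uses (lhs, rhs) = lhs == new || N new `elem` rhs
        sym (N n) = N (sub n)
        sym t     = t
        sub n     = if n == old then new else n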

The talk focuses on transformations for grammar improvement. What makes the corresponding calculus interesting is the way we deal with non-semantics-preserving transformations. It is also surprising that a rather simple transformational calculus can provide essential support for problems as hard as COBOL grammars. To provide some context for this work, we review the parsing technology involved in parsing syntax diagrams, prototype parsing for the derived grammar, disambiguation of grammars, and testing technology for grammars.

Two draft papers, on grammar engineering and on parsing syntax diagrams, are available (joint work with Chris Verhoef). Two further draft papers, on the transformational calculus and on the testing technology used for grammar improvement, will be available soon. The resulting, reasonably complete and correct VS COBOL II grammar is available online. It is the first free COBOL grammar, and it is being used in various projects, e.g. as a reference in GNU COBOL compiler projects.

A Relational Formalization of Exhaustive Search and Backtracking (Marcelo Frias)
In my talk I will present a relational calculus for program construction based on binary relations. One of the main features of this calculus is the possibility of specifying algorithm design strategies within the calculus in a highly declarative and compositional way. I will present a formalization of nondeterministic exhaustive search and backtracking within this framework. Finally, the problem of finding a winning strategy in a generic game will be solved, first by exhaustive search of the state space; this solution will then be improved to a backtracking algorithm.
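A minimal functional analogue of the exhaustive-to-backtracking improvement (our example, in Haskell rather than the relational calculus), using the n-queens problem:

    import Data.List (delete)

    -- exhaustive search: generate every permutation, then test it
    queensNaive :: Int -> [[Int]]
    queensNaive n = [ qs | qs <- perms [1..n], safe qs ]
      where
        perms [] = [[]]
        perms xs = [ x:ys | x <- xs, ys <- perms (delete x xs) ]
        safe []     = True
        safe (q:qs) = ok q qs && safe qs

    -- backtracking: extend only consistent partial placements
    queensBT :: Int -> [[Int]]
    queensBT n = go n
      where
        go 0 = [[]]
        go k = [ q:qs | qs <- go (k-1), q <- [1..n], ok q qs ]

    -- a new queen q is consistent with the queens qs placed so far
    ok :: Int -> [Int] -> Bool
    ok q qs = and [ q /= c && abs (q - c) /= d | (d, c) <- zip [1..] qs ]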

Making Generic Programming More Generic (Ralf Hinze)
Modern functional programming languages such as Haskell 98 typically have a three-level structure (ignoring the module system): we have values, types (imposing structure on the value level), and kinds (imposing structure on the type level). In `ordinary' programming we may define values depending on values (called functions) and types depending on types (called type constructors). Generic programming adds to this list the possibility of defining values depending on types (called polytypic values). In this talk we show that it is also useful to have types depending on kinds (called polykinded types). The ability to assign polykinded types to polytypic values greatly enhances genericity: a single polytypic definition, for instance, yields mapping functions for types of arbitrary kinds.

(The slides are available online.)
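To see what a polykinded type buys (hand-written approximations of ours; the talk derives all of these from a single polytypic definition): without it, each kind needs its own mapping class.

    -- at kind * -> *, map takes one function argument
    class Functor1 f where
      map1 :: (a -> b) -> f a -> f b

    -- at kind * -> * -> *, map takes two
    class Functor2 f where
      map2 :: (a -> c) -> (b -> d) -> f a b -> f c d

    instance Functor1 [] where
      map1 = map

    instance Functor2 (,) where
      map2 f g (x, y) = (f x, g y)

    instance Functor2 Either where
      map2 f _ (Left x)  = Left (f x)
      map2 _ g (Right y) = Right (g y)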

Automatic accurate time-bound analysis for high-level languages (Annie Liu)
Analysis of program running time is important for reactive systems, real-time systems, interactive environments, compiler optimizations, performance evaluation, and many other computer applications. For many applications, such as real-time systems, the ability to predict accurate time bounds automatically and efficiently is particularly important.

This talk describes a general approach for automatic and accurate time-bound analysis. The approach consists of transformations for building time-bound functions in the presence of partially known input structures, symbolic evaluation of the time-bound function based on input parameters, optimizations to make the overall analysis efficient, and measurements of primitive parameters, all at the source-language level. We have implemented this approach and performed a number of experiments for analyzing Scheme programs. The measured worst-case times are closely bounded by the calculated bounds.
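To illustrate what a time-bound function looks like (a hand construction of ours, in Haskell rather than Scheme; the talk's point is that such bounds are built automatically):

    -- insertion into a sorted list
    insert :: Int -> [Int] -> [Int]
    insert x []     = [x]
    insert x (y:ys)
      | x <= y      = x : y : ys
      | otherwise   = y : insert x ys

    -- its worst-case time bound, counting comparisons, for input size n
    tInsert :: Int -> Int
    tInsert 0 = 0
    tInsert n = 1 + tInsert (n - 1)   -- i.e. tInsert n == n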

A number of interesting aspects need further study, including in particular, program transformations for improving the efficiency and accuracy of the analysis.

This is joint work with Gustavo Gomez.

Alberto Pettorossi
During the meeting I am willing to talk about model checking, and in particular about a transformational method for proving first-order properties of systems described by logic programs with negation. The method is quite powerful; indeed, it offers a decision procedure for a significant fragment of the weak monadic second-order theory of n successor functions.


Royal Observatory

The Royal Observatory Greenwich is the home of the Prime Meridian of the world. Was the Observatory built on top of the Meridian, or was the Meridian drawn underneath the Observatory?

The Observatory houses Britain's largest refracting telescope (28 inch).

The Observatory also has a collection of John Harrison's `marine timekeepers', which won him the coveted Longitude Prize, but only after incredible hassles inflicted by the Board of Longitude.


Jeremy Gibbons (email: Jeremy.Gibbons@comlab.ox.ac.uk) - March 2000