OXFORD UNIVERSITY COMPUTING LABORATORY

Minutes of Meetings of the Algebra of Programming Research Group in 1996

The schedule for Hilary Term 1996 was as follows:
Week 5: 16 Feb, Richard Miller, University of Oxford
Multi-dimensional arrays and the empty array.

Week 6: 23 Feb, Grant Malcolm, University of Oxford
How can programs be made less efficient?

Week 7: 1 March, Quentin Miller, University of Oxford
Graph Manipulation on BSP Computers.

Week 8: 8 March, Paul Rudin, University of Oxford
Graphical Calculus Completeness.

The schedule for Michaelmas Term 1996 was as follows:
Week 0: Oct 11, Ian Hayes, University of Queensland
Coercing Real-Time Refinement.

Week 1: Oct 18, Wim Feijen, Technical University of Eindhoven
The Derivation of Multi-Programs. (Lecture theatre).

Week 2: Oct 25, Richard McPhee, University of Oxford
Compositional Logic Programming.

Week 3: Nov 1, Bernard Sufrin, University of Oxford
Unification and Jape.

Week 4: Nov 8, Burghard von Karger, University of Kiel
Temporal and Sequential Algebra.

Week 5: Nov 15, Jifeng He, University of Oxford
Linking Theories in Probabilistic Programming.

Week 6: Nov 22, Oege de Moor, University of Oxford
A Fix for Combinator Parsers.

Week 7: Nov 29, Annabelle McIver, University of Oxford
Probability and Data Refinement.

Week 8: Dec 6, Carroll Morgan, University of Oxford
How Probability, Nondeterminism, and Information Hiding (might) Interact.

Week 9: Dec 13, Graham Hutton, University of Nottingham
Concurrency Theory for Functional Programmers.

Week 6: 23 Feb, Grant Malcolm, University of Oxford
How can programs be made less efficient?

Various programming calculi and methodologies offer some help to the software engineer in moving from complex, inefficient specifications or programs to complex, efficient ones. Algebraic properties of programs or data types are often very useful in this process; for example, properties of homomorphisms on lists can help simplify an inefficient program like ``maximum sum of segments'' to an efficient program like ``maximum segment sum.''
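
To make the example concrete, here is a minimal Haskell sketch of the two endpoints of that derivation (my illustration, not code from the talk): an executable specification that sums every segment, and the linear-time program obtained from it by exploiting algebraic properties of maximum, sum and segment formation.

    import Data.List (inits, tails)

    -- Specification: the maximum over all segments of their sums.
    -- Obviously correct, but cubic time.
    mssSpec :: [Int] -> Int
    mssSpec = maximum . map sum . segments
      where segments xs = [s | t <- tails xs, s <- inits t]

    -- Derived program: one linear pass; 'step' maintains the best sum
    -- of a segment ending here, discarding unprofitable prefixes.
    mssFast :: [Int] -> Int
    mssFast = maximum . scanl step 0
      where step acc x = max 0 (acc + x)

Both agree that, for example, the maximum segment sum of [1, 2, -5, 3] is 3 (the segment [1, 2]); the empty segment gives both functions a floor of 0.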

It is unclear how (or whether) such methodologies can be adapted to help in the development of distributed systems, especially if there are constraints on the end product -- for example, it might be required that the system (re)uses existing components. For distributed systems, the development process seems to run the other way around: from a potentially efficient description to one that may look inefficient but that can be given a distributed implementation.

I'll discuss, and give some examples of, recent work we (Goguen, Diaconescu, Cirstea, Veglioni, Mori, and Malcolm) have been doing on algebraic specification of distributed systems, suggest that it provides useful support for designing and prototyping such systems, with support also for reusing components, and then...make some woolly comments about algebraic properties and ask how you might go about making specifications less efficient (with a view to a distributed implementation). For example, could the relational calculus be used backwards?

Week 7: 1 Mar, Quentin Miller, University of Oxford
Graph Manipulation on BSP Computers

The Bulk-Synchronous Parallel (BSP) model of computing takes a global view of communication -- the cost of a single message transfer cannot be determined in isolation, but the cost of all transfers in the network in a given time interval can be predicted. Thus the greatest control of communication costs comes from patterns of communication which can be simply defined as actions on the whole network, such as broadcasts, reductions, and permutations of the sequence of processors.
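
For readers unfamiliar with the model, the standard BSP cost formula (due to Valiant, and assumed here rather than quoted from the talk) makes the point precise: the cost of one superstep is

    T_{\mathrm{superstep}} \;=\; w + g \cdot h + l

where $w$ is the maximum local computation performed by any processor, $h$ is the maximum number of words any processor sends or receives (an "$h$-relation"), $g$ is the machine's per-word communication cost, and $l$ is the cost of the barrier synchronisation. Only the aggregate $h$ appears, which is why the cost of an individual message cannot be determined in isolation.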

Exploiting such fixed patterns is a difficult goal in applications such as shared graph manipulation, where the distribution of data -- and hence the pattern of communication -- changes in an unpredictable manner. I would like to explore the extent to which fixed network-wide communications can be used to manipulate such amorphous data.

Week 8: 8 Mar, Paul Rudin, University of Oxford
Graphical Calculus Completeness

We introduce a graph-based calculus similar to the graphical calculus of Sharon Curtis and Gavin Lowe [1] and show that it is complete for the relational calculus. This is done by deriving an arbitrary member of an infinite axiomatisation of the relational calculus due to Roger Lyndon [2]. Possible generalisations to other calculi are discussed.

[1] Sharon Curtis and Gavin Lowe. A graphical calculus. In Proceedings of Mathematics of Program Construction, 1995.

[2] Roger Lyndon. The representation of relation algebras, II. Annals of Mathematics, 63(2), 1956.
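
To fix notation, here is a small Haskell model of the operators the relational calculus axiomatises (an illustration of the target calculus, not of the graphical calculus itself); the law checked at the end is one of the standard relation-algebra identities.

    import Data.List (nub, sort)

    -- A finite relation on a: a set (duplicate-free list) of pairs.
    type Rel a = [(a, a)]

    -- Composition: R ; S relates x to z when xRy and ySz for some y.
    comp :: Eq a => Rel a -> Rel a -> Rel a
    comp r s = nub [(x, z) | (x, y) <- r, (y', z) <- s, y == y']

    -- Converse: reverse every pair.
    conv :: Rel a -> Rel a
    conv = map (\(x, y) -> (y, x))

    -- The law (R ; S)~ = S~ ; R~, checked on finite relations.
    converseLaw :: Ord a => Rel a -> Rel a -> Bool
    converseLaw r s =
      sort (conv (comp r s)) == sort (comp (conv s) (conv r))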

Week 0: Oct 11, Ian Hayes, University of Queensland
Coercing Real-Time Refinement.

Our overall goal is to support the development of real-time programs from specifications via a process of stepwise refinement. One problem in developing a real-time program in a high-level programming language is that it is not possible to determine the detailed timing characteristics of the program until the compiler and target machine are taken into account. To overcome this problem the programming language can be augmented with directives specifying real-time constraints; it is a compile-time error if the compiler cannot guarantee that the generated code will meet them.

During the refinement process the timing directives are inserted into the program in order to ensure it meets the specification. In this way timing constraints can be separated out at the appropriate points in the refinement process. This allows the remainder of the refinement process to be similar to normal sequential refinement.
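
As a purely hypothetical illustration of such a directive (the syntax below is invented for this summary, not taken from Hayes's calculus), a refinement step might introduce

    \mathit{read}(sensor);\ \mathit{compute};\ \mathit{write}(actuator);\ \mathbf{deadline}\ \tau \le \tau_0 + 10\,\mathrm{ms}

asserting that every execution path reaching the directive completes within 10ms of the time $\tau_0$ at which the fragment was entered; a compiler unable to prove this of its generated code must reject the program.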

Week 1: Oct 18, Wim Feijen, Technical University of Eindhoven
The Derivation of Multi-Programs.

There is considerable evidence that quite a number of multiprograms can be formally derived from their specifications by means of just the theory of Owicki and Gries and the predicate calculus. The potential merit of this is that the technicalities involved are as simple as the handling of Hoare-triples. During the seminar we will present one or two examples showing the mode of derivation.
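
The central proof obligation of the Owicki-Gries theory is indeed just a family of Hoare triples. Beyond the usual local correctness of each component, every assertion $P$ in one component's proof outline must be shown interference-free with respect to every atomic statement $S$ (with precondition $pre(S)$) of every other component:

    \{P \wedge pre(S)\}\; S\; \{P\}

so a derivation can proceed by calculating, in the predicate calculus, assertions that make all such triples valid.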

Week 2: Oct 25, Richard McPhee, Oxford
Compositional logic programming.

Functional programmers have long shown the benefits of higher-order programming. It enables greater code reuse, encourages abstraction, and supports formal reasoning about programs. Over the years, some researchers have attempted to introduce higher-order programming into the logic programming community, but to say that it has not caught on would be quite an understatement. In this talk, we informally describe a compositional logic programming language that captures many of the important aspects of higher-order programming whilst retaining the simplicity of Prolog. We illustrate the expressive power of the language by programming combinator parsers, familiar from functional programming. We then identify those language features necessary to implement a resolution-based compositional logic programming language.
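
One well-known way to see the connection (my sketch in Haskell, not necessarily the formalism of the talk) is the list-of-successes view, in which predicates become nondeterministic transformers, and conjunction and disjunction become higher-order combinators, exactly as in combinator parsing:

    -- A predicate refines an "answer" into zero or more answers.
    type Pred a = a -> [a]

    -- Conjunction: run p, then q on each of p's results
    -- (Kleisli composition in the list monad).
    (&&&) :: Pred a -> Pred a -> Pred a
    (p &&& q) x = concatMap q (p x)

    -- Disjunction: try both clauses and collect all answers.
    (|||) :: Pred a -> Pred a -> Pred a
    (p ||| q) x = p x ++ q x

Because predicates are ordinary values, they can be passed to and returned from other predicates, which is precisely the higher-order style at issue.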

Week 3: Nov 1, Bernard Sufrin, Oxford
Unification and Jape.

As a generic proof tool Jape needs to be able to formalise logical rules whose descriptions use formal substitution: for example Leibniz's rule. Most generic tools support this by implementing some form of Higher Order Unification. But Higher Order Unification has a number of limitations: in particular, the fact that there is no ``best'' unifier. In this talk I will explain the approach the Jape designers took when implementing unification, how we overcame some of the limitations of H.O.U., and how -- at the moment we were congratulating ourselves on being innovators -- we discovered that we had rediscovered Huet's higher-order unification algorithm after all. The talk will be illustrated with several examples of partial proofs.
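
For background (this is a generic first-order unifier of my own, not Jape's algorithm, which must also handle substitution in rules), the core of unification fits in a few lines of Haskell:

    import Control.Monad (foldM)
    import qualified Data.Map as M

    data Term = Var String | App String [Term] deriving (Eq, Show)

    type Subst = M.Map String Term   -- triangular substitution

    -- Chase a variable through the substitution.
    walk :: Subst -> Term -> Term
    walk s t@(Var v) = maybe t (walk s) (M.lookup v s)
    walk _ t         = t

    -- Unify two terms, extending the substitution; Nothing on
    -- constructor clash or occurs-check failure.
    unify :: Term -> Term -> Subst -> Maybe Subst
    unify t1 t2 s = case (walk s t1, walk s t2) of
      (Var v, Var w) | v == w -> Just s
      (Var v, t)              -> bind v t s
      (t, Var v)              -> bind v t s
      (App f as, App g bs)
        | f == g && length as == length bs
          -> foldM (\s' (a, b) -> unify a b s') s (zip as bs)
      _                       -> Nothing

    bind :: String -> Term -> Subst -> Maybe Subst
    bind v t s | occurs v t s = Nothing
               | otherwise    = Just (M.insert v t s)

    occurs :: String -> Term -> Subst -> Bool
    occurs v t s = case walk s t of
      Var w    -> v == w
      App _ ts -> any (\u -> occurs v u s) ts

First-order unification is decidable and yields most general unifiers; it is exactly this ``best unifier'' property that fails when one moves to the higher-order case.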

Week 4: Nov 8, Burghard von Karger, Kiel
Temporal and Sequential Algebra.

Depending on your background, there are two ways of describing what temporal algebra is. Some might say that it is nothing but an algebraic clone of Manna-Pnueli type temporal logic. Others will recognize it as the theory of a Galois connection on a complete Boolean algebra. Temporal algebra introduces no new concepts; it is only interesting because it links two important areas of theoretical computer science. It will appeal to those in love with beautifully crafted calculations, but it does not help at all with model checking, decision procedures, completeness questions and the like.

Temporal algebra is built around the "next" and "previous" operators, whereas sequential algebra may be described as a theory of sequential composition. As such, sequential algebra is a natural candidate for modelling (imperative) programs. In fact, it is only a slight generalization of relation algebra in that it omits the converse operator.

Programs are also closely related to temporal logic. We shall give an algebraic account of this relationship by showing how temporal algebras may be constructed from sequential ones. If time permits, we shall consider both discrete and continuous time.
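
For concreteness, the Galois connection in question (stated here in generic form, as I read the abstract) is a pair of functions $f$ and $g$ on a complete Boolean algebra $B$ satisfying

    f(P) \sqsubseteq Q \;\iff\; P \sqsubseteq g(Q) \qquad \text{for all } P, Q \in B

with $f$ read as a "previous"-style operator and $g$ as a "next"-style operator; the familiar temporal laws then follow by calculation from this single equivalence.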

Week 5: Nov 15, Jifeng He, Oxford
Linking Theories in Probabilistic Programming.

A theory of programming is intended to help in the construction of programs that provably meet their specifications. It starts with a complete lattice of specifications, used as a domain for the semantics of the programming language. We have developed a theory of probabilistic programming based on the relational calculus through a series of stages; each stage concentrates on a different and smaller class of programs, defined by healthiness conditions of increasing strength. At each stage we show that the notations of the programming language preserve the healthiness properties of their operands, and that every theory preserves the definition of recursion.
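
In the usual unifying-theories reading (my gloss, not a quotation from the talk), a healthiness condition is a monotonic, idempotent function $H$ on the lattice of specifications, and the healthy specifications are exactly its fixed points:

    P \text{ is } H\text{-healthy} \;\iff\; P = H(P)

Each stage strengthens $H$, and the closure claim is that every programming operator preserves fixity: if $P = H(P)$ and $Q = H(Q)$ then $P \mathbin{op} Q = H(P \mathbin{op} Q)$, so programs built from healthy parts remain within the theory.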

Week 6: Nov 22, Oege de Moor, Oxford
A Fix for Combinator Parsers.

Combinator parsing is the parsing method of choice for functional programmers. The class of grammars handled by combinator parsers is large, but until now, it did not include grammars that exhibit left-recursion. Here we show how that deficiency may be eliminated by introducing a new combinator that takes least fixed points of parsers. The implementation of that combinator is reminiscent of Earley's algorithm. The original definition of the combinator is due to Phil Wadler; the proof that it works was found by myself and Paul Lickman; and last week I finally succeeded in finding an implementation of reasonable time complexity. The technique thus also eliminates the difficulty that combinator parsers often have exponential time complexity.
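
To see the deficiency being repaired, here is the standard list-of-successes setting in Haskell (a sketch; the new fixed-point combinator itself is the subject of the talk and is not reproduced here):

    -- A parser consumes a prefix of the input, in all possible ways.
    newtype Parser a = Parser { runParser :: String -> [(a, String)] }

    digit :: Parser Char
    digit = Parser (\s -> case s of
                      (c:cs) | c >= '0' && c <= '9' -> [(c, cs)]
                      _                             -> [])

    char :: Char -> Parser Char
    char c = Parser (\s -> case s of
                       (c':cs) | c' == c -> [(c, cs)]
                       _                 -> [])

    -- Alternation, sequencing, and mapping over results.
    (<+>) :: Parser a -> Parser a -> Parser a
    p <+> q = Parser (\s -> runParser p s ++ runParser q s)

    andThen :: Parser a -> Parser b -> Parser (a, b)
    andThen p q = Parser (\s ->
      [((a, b), s'') | (a, s') <- runParser p s, (b, s'') <- runParser q s'])

    pmap :: (a -> b) -> Parser a -> Parser b
    pmap f p = Parser (\s -> [(f a, s') | (a, s') <- runParser p s])

    toInt :: Char -> Int
    toInt c = fromEnum c - fromEnum '0'

    -- The left-recursive grammar  expr ::= expr '+' digit | digit
    -- diverges under this encoding: the first alternative re-enters
    -- expr before consuming any input.
    expr :: Parser Int
    expr = pmap (\((e, _), d) -> e + toInt d)
                (expr `andThen` char '+' `andThen` digit)
           <+> pmap toInt digit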

Week 7: Nov 29, Annabelle McIver, Oxford
Probability and data refinement: not as simple as it seems.

Relating an abstract data type to a concrete data type for probabilistic programs requires establishing a correspondence between distributions over the state space rather than between single states. Nevertheless, the standard technique of using a representation relation still applies, but there are some surprises. If the relation must encode a single abstract state as a distribution over the concrete variables, for example, then simple intuition alone fails even to identify corresponding data types. Even when data types do correspond, the distribution described by an abstract module may seem to have little to do with the distributions of the concrete module, and to understand the behaviour of such cases we must think in terms of conditional probabilities. As a by-product of this approach, we can interpret a representation relation as the most that an external observer can deduce about the local variables, and hence make a connection with security issues.

Week 8: Dec 6, Carroll Morgan, Oxford
How probability, nondeterminism and information hiding (might) interact.

One of the trickiest issues in probabilistic semantics seems to be the interaction of probabilistic and demonic choice: are the demons ignorant of all probabilistic outcomes (oblivious); are they aware of all past outcomes but none from the future (contemporary); or can the demons make choices now which exploit probabilistic choices still to come (prescient)? Orthogonal to those "temporal" distinctions is a "spatial" one --- is a contemporary demon aware of probabilistic choices made secretly, even if in the past? That spatial question is crucial to the effectiveness of data refinement and modularity, at least for contemporary demons (the most common kind).

I will outline a relational model of probability and nondeterminism in which demonic choice is contemporary but "local" --- that is, past probabilistic choices can be hidden from the demon if located within modules. The model is "quantum-like" in that visible state-components are "definite"; hidden components are distributions, and "observations" can collapse those distributions to a point.

That solves the problem of data refinement, and opens up the interesting possibility of quantifying the likelihood of information flow out of a module. At least, it -would- solve the problem if our model were simpler. If there's time, I'll sketch the details of the programming logic (wp-level) that goes with the relational model: it really should be simpler too.
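
A standard illustration of the spatial point (my example, not the talk's): consider

    (x := 0 \;\oplus_{1/2}\; x := 1)\ ;\ (y := 0 \;\sqcap\; y := 1)

where the probabilistic choice of $x$ is made inside a module and the demonic choice of $y$ is resolved against the postcondition $x \neq y$. A demon that can observe $x$ simply copies it, achieving $\Pr(x \neq y) = 0$; if $x$ is hidden within the module, either resolution yields $\Pr(x \neq y) = 1/2$. The hidden reading is the one the abstract argues data refinement requires.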

Week 9: Dec 13, Graham Hutton, Nottingham
Concurrency Theory for Functional Programmers.

In functional programming, there is considerable interest in the use of special-purpose operators such as fold and unfold to structure recursive programs, and in the use of the associated algebraic properties to reason about them. In this talk I will explain how, following recent categorical work, the same techniques can be used to structure and reason about the semantics of concurrent languages within a functional language. This approach gives functional programmers new insights into the semantics of concurrency, and suggests a number of interesting avenues for research.
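
A tiny Haskell sketch of the idea (an illustration under my own choice of representation): a labelled transition system is a coalgebra, a process is a possibly infinite synchronisation tree, and an unfold carries states of any such system to the processes they denote.

    -- A finitely branching LTS: each state yields labelled successors.
    type Coalg s l = s -> [(l, s)]

    -- Processes: possibly infinite trees of labelled transitions.
    newtype Proc l = Proc [(l, Proc l)]

    -- The unfold: the concurrency analogue of unfoldr on lists.
    unfoldLTS :: Coalg s l -> s -> Proc l
    unfoldLTS step s = Proc [(l, unfoldLTS step s') | (l, s') <- step s]

    -- Example: a one-place buffer alternating put and get.
    buffer :: Coalg Bool String
    buffer False = [("put", True)]
    buffer True  = [("get", False)]

    -- unfoldLTS buffer False denotes the process put.get.put.get...

Reasoning about processes built this way proceeds by the universal property of the unfold, in the same way that the algebraic laws of fold and unfold structure reasoning about list programs.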


Richard McPhee

Oege de Moor (oege@comlab.ox.ac.uk)
