The Geocal06 session consists of an intensive five-week series of lectures and workshops on logic, models and semantics of programming languages, the theory of concurrency, and related areas, organized by the GEOCAL group at the CIRM. One of the main goals of this session is to gather researchers working in different areas of theoretical computer science and let them interact.
The session is divided into two parts: the first two weeks are devoted to a winter school, the last three weeks to specialised workshops.
Click on the workshop links for the programme.
Daily time slots: 9h - 10h30, 11h - 12h30, 14h - 15h30, 16h - 17h30.
Lecture by John Baez
John Baez (Riverside, US) is a mathematical physicist specialising in quantum gravity and n-categories. We have given him carte blanche for a lecture on a topic of his choice.
Since the introduction of category theory, the old subject of universal algebra has diversified into a large collection of frameworks for describing algebraic structures. These include monads (formerly known as triples), the algebraic theories of Lawvere, the PROPs of Boardman and Vogt, and the operads of May, which are now widely used in topology and mathematical physics. We give an overview of these different frameworks, many of which are closely related, and explain how one can reason diagrammatically about algebraic structures defined using them.
Organizers: Éric Goubault, Yves Lafont
The objective of the course is to give a general overview of recent developments in "geometrical" approaches to the theory of computation, in particular to concurrent and distributed computation and to rewriting.
The course will start with a general introduction to homotopy and homology, first in the classical setting of topological spaces, then generalized through abstract homotopy theory and generalized homology theories. Two different applications will then be developed.
The first application concerns concurrency theory. Here a refined form of homotopy (dihomotopy) has to be designed that takes the direction of time into account. Methods similar to those for ordinary topological spaces can then be defined in order to find "invariants" under dihomotopy, i.e. a way to classify directed shapes up to (directed) deformation. These invariants can in turn be used for proofs or for the static analysis of algorithms and programs; we will show such applications in the course. Other applications, of more classical algebraic-topological invariants, will also be derived, in the field of fault-tolerant distributed systems.
The second application concerns rewriting theory. Since the pioneering work of Squier, we know how to use homology to characterize the algebraic objects presented by rewriting systems. Basically, words are seen as objects of dimension one, rewriting rules as objects of dimension two, confluence diagrams as objects of dimension three, and so on, providing resolutions of the algebraic objects they present. Extensions to term rewriting have recently been obtained along similar lines, using more general homology theories. Higher-dimensional rewriting has also gained wide interest, and its theory will be fully developed, in conjunction with the necessary n-categorical (or polygraphic) theory.
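The dimension count above can be made concrete on a toy presentation. The following sketch (an invented example, not taken from the course) treats words over {a, b} as 1-dimensional objects and the single rule ba -> ab as a 2-dimensional cell; different reduction orders of the same word reaching the same normal form is exactly the confluence data that lives in dimension three.

```python
# Sketch: the monoid presented by <a, b | ba -> ab> (the free
# commutative monoid on two generators). Words are 1-cells, the
# rule is a 2-cell; confluence of overlapping applications is the
# 3-dimensional data mentioned in the text.

RULES = {"ba": "ab"}  # the single rewriting rule (2-cell)

def rewrite_step(word):
    """Apply one leftmost rule occurrence; return None at normal form."""
    for lhs, rhs in RULES.items():
        i = word.find(lhs)
        if i != -1:
            return word[:i] + rhs + word[i + len(lhs):]
    return None

def normal_form(word):
    """Rewrite to normal form (terminates: each step moves a 'b' right)."""
    nxt = rewrite_step(word)
    while nxt is not None:
        word, nxt = nxt, rewrite_step(word)
    return word

# Confluence: any reduction of "bba" ends at the sorted word "abb".
assert normal_form("bba") == "abb"
assert normal_form("bab") == "abb"
```

The normal forms (all a's before all b's) are canonical representatives of the presented monoid's elements, which is the sense in which the rewriting system "presents" the algebraic object.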
Organizer: Patrick Baillot
This lecture will present logical approaches to ensure complexity bounds on lambda-calculus or functional programs, with a particular emphasis on polynomial time complexity.
This line of research can be traced back to work by Leivant on restrictions of system F, and by Girard, who showed that controlling the logical rules responsible for duplication in linear logic proofs makes it possible to tame the complexity of the associated programs: in this way he obtained logical characterizations of elementary complexity (the ELL system) and of polynomial-time complexity (the LLL system). Later, Lafont proposed another variant of linear logic also characterizing polynomial time. This course will also present the typed lambda-calculi that have been designed from these systems.
Another approach is based on restrictions of the primitive recursion scheme that lead to characterizations of the polynomial-time functions as well as of other complexity classes. This line was developed by Bellantoni and Cook and by Leivant and Marion. Later, Marion adapted these ideas to the setting of first-order rewriting systems and showed that new criteria, based on termination orderings and quasi-interpretations, provide flexible techniques to control the complexity of a large class of programs.
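The Bellantoni-Cook restriction can be illustrated in a few lines. In the sketch below (illustrative code, not from the cited papers; the keyword-argument convention for safe positions is my own), arguments are split into "normal" ones, which may drive recursion, and "safe" ones, which receive results of recursive calls but are never recursed on; respecting this discipline is what keeps the definable functions polynomial-time.

```python
# Safe-recursion sketch: f(normal args ; safe args).
# Normal arguments may control recursion; safe arguments may only
# be passed along or built upon, never recursed on.

def add(x, *, safe_y):
    """add(x ; y): recursion only on the normal argument x."""
    if x == 0:
        return safe_y
    return add(x - 1, safe_y=safe_y) + 1

def mul(x, y):
    """mul(x, y ;): recursion on x; the recursive result is fed to
    add only in a safe position, so the schema stays polynomial."""
    if x == 0:
        return 0
    return add(y, safe_y=mul(x - 1, y))

assert mul(3, 4) == 12
```

Recursing on a safe argument is precisely the move that would allow iterated doubling and hence exponential growth; the syntactic separation rules it out without any explicit resource bound.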
Finally Hofmann designed a type system for a functional language based on a fragment of Linear logic and ensuring that the typed programs are non-size-increasing and can be run in polynomial time.
This course will stress and discuss the issue of algorithmic expressivity: even if various systems characterize the polynomial-time functions, the languages obtained should be expressive enough to allow standard algorithms to be written.
Organizers: Thomas Ehrhard, Laurent Regnier
Two lectures giving two points of view on the proofs/programs correspondence, by Jean-Louis Krivine (PPS, Université Denis Diderot (Paris 7), CNRS, France) and Thierry Coquand (Chalmers, Göteborg University, Sweden).
This course will be an introduction to some recent developments in constructive algebra. This approach can be seen as an application of basic results in proof theory, the main one being the completeness of cut-free provability. It can also be seen as a partial realisation of Hilbert's program in abstract algebra, using the idea of "replacing" an infinite ideal object by a syntactical one: a logical theory which describes this object. It also has close connections with formal topology: cut elimination can be expressed as forcing/topological semantics over a formal space. We show how this approach, besides revealing the computational content implicit in some abstract arguments in algebra, can lead to new concepts and new simple proofs of classical theorems, but also to new results in algebra.
Organizers: Patrick Baillot, Jean-Yves Marion
This workshop will follow up on the course "Complexity and logic" and enlarge the discussion to other approaches and recent developments. The main issue is how to use logic and formal methods, such as types and rewriting systems, to provide languages for complexity-bounded computation, in particular for polynomial-time computing.
Implicit Computational Complexity aims at controlling complexity without referring to explicit bounds on time or memory, relying instead on logical or computational principles that entail complexity properties. Various approaches have been explored for this purpose, such as linear logic, restrictions on primitive recursion, rewriting systems, and types; they mainly rely on the functional programming paradigm.
The objectives are, on the one hand, to find natural implicit logical characterizations of functions in various complexity classes and, on the other hand, to design systems suitable for the static verification of program complexity. The latter goal in particular requires characterizations general enough to include commonly used algorithms.
This workshop will be open to contributions on various aspects of Implicit Computational Complexity, including (but not limited to) logical systems, linear logic, semantics, rewriting and termination orderings, types, extensions from the functional to the imperative or concurrent setting, and applications to automated verification.
Organizer: Paul Ruet
Biological networks are found at the core of all biological functions, from biochemical pathways to cell communication processes and gene regulation mechanisms. Their complexity calls for proper mathematical models and results in order to relate their structure to their dynamical properties.
In the last few years, progress has been made in particular in the mathematical study of genetic networks, and in modelling the structure of protein-level interactions via concurrent programming languages.
The workshop is intended to present and confront these lines of research, and to propose possible cross-fertilisations and extensions (dynamics of metabolism, intercellular communication, stochastic aspects, etc.). It will be an opportunity to bring together mathematicians, logicians and concurrency theorists, biologists and physicists interested in a structured approach to biological interactions.
Organizers: Marie-Renée Fleury, Myriam Quatrini, Jacqueline Vauzeilles
Linguists and logicians have tried to specify grammars as logical theories. Compared to traditional linguistics, the advantage of the logical approach is that one can prove important properties of the grammars, such as soundness, completeness and consistency.
J.-Y. Girard introduced linear logic in 1987 as a resource-sensitive logic. The basic feature of linear logic that is essential for linguistic applications is the absence of the contraction and weakening rules. But an unlimited commutativity rule is not acceptable from the linguistic point of view either, while disallowing commutativity altogether strongly restricts the power to generate natural languages.
One promising approach among the various extensions and generalizations of Lambek grammars is to use proof-net techniques inspired by linear logic.
This "Linguistics and logic" workshop aims at bringing together logicians, linguists and computer scientists to discuss applications of logic to linguistics.
Organizers: Olivier Laurent, Pierre-Louis Curien
Denotational semantics explores the meaning of proofs and programs as morphisms acting on abstract mathematical spaces. The first models were based on domains (partially ordered sets), such as Scott domains, dI-domains and coherence spaces, with continuous or stable morphisms; recent developments in this area have led to sophisticated accounts of sequentiality (strong stability, bi-stable morphisms).
Game models interpret proofs/programs as strategies in games, providing an operationally more informative interpretation of proofs, with remarkable mathematical achievements: full completeness theorems, the interpretation of stateful programming languages, a new approach to the type isomorphism problem, and the related theory of ludics, an interactive viewpoint on logic.
Newer models interpret proofs as continuous linear functions on vector spaces, with possible connections to quantum programming and operational interpretations of the Taylor formula. This workshop will be an opportunity to confront these various lines of research in this very active field.
Organizer: Philip J. Scott
Linear logic and proof nets, introduced by Girard in 1986, gave rise to new ways of thinking about computation and the geometry of proofs. For example, normalization (cut elimination) can be modelled in terms of travelling along paths in the nets (graphs) representing proofs; these travels are governed by algebraic laws on operators associated with the edges of proof nets. This led to Girard's dynamical semantics of proofs in terms of functional analysis, to a new semantics of algorithms, and to dynamical invariants for cut elimination. This is the Geometry of Interaction (GoI).
GoI has inspired several areas of research. On the one hand, on a concrete level, it has connections with optimal reduction and abstract machines, and continues to have ongoing connections to complexity theory and feasible logics. On the other hand, it has many connections with recent work on traced monoidal categories and categorical semantics. Other recent developments include new approaches based on von Neumann algebras, with possible connections to quantum mechanics. The goal of the workshop is to bring together students and experts from the different directions inspired by GoI, to gain new insights into the original works and to survey the newest research developments.
Organizers: Yves Lafont, Éric Goubault
This workshop will follow the topics of the lectures of the first week (ACTR), focusing on more recent developments and work in progress.
One of these new directions is the Homotopical Theory of Computation, as expressed in the "Minneapolis Program" of Métayer and Lafont, based on the work of Burroni. This theory is a unified framework for the homology and homotopy of computation, possibly including the foundations of directed homotopy.
Organizers: Daniel Hirschkoff, Étienne Lozes
The aim of this workshop is to bring together people interested in the interplay between research on algebraic models for mobile and distributed computing, where one can describe and reason about the spatial structure and interaction possibilities of processes, and recent developments in proof theory, notably in ludics, where similar mechanisms show up.
In models like the pi-calculus or mobile ambients, a process configuration is described as a collection of agents that can interact via appropriate mechanisms (named channels, locations, interaction capabilities). Structure-sensitive relations between processes have been proposed to reason about mobility and locality properties of a system. Among these, the so-called spatial logics extend existing modal logics to specify the actual structure of a process and its evolution along computation.
In logic, the central mechanism of cut elimination has recently been presented in a way that bears striking analogies with a concurrent scenario. In ludics, in particular, loci, which correspond to formula occurrences, are the places where cuts happen, in an interactive fashion. Similarly, denotational models based on games exploit the notion of synchronization trees generated by interacting players.
Actual connections between denotational semantics or logic and concurrency already exist, notably in the study of the proof theory of spatial logics and in process-algebra-based models of linear logic. The goal of this meeting is to investigate further the possible interactions between these areas, and to find out whether such analogies can lead to deep connections or, at least, whether methods and results on one side can provide inspiration for the other.
Organizers: Vincent Danos, Josée Desharnais, Prakash Panangaden
Probabilistic transition systems are used to model non-deterministic processes when one has a measure of the likelihood of the events involved in the evolution of the process. Following work by several people over the last 15 years, it is possible to give a neat algebraic definition of these systems within the wider theory of behaviour functors, and to develop a theory of when such systems are equivalent, or approximately so. This in turn gives means of evaluating properties of interest, e.g. properties relating to long-term average behaviour, on "compressed" forms of a given system.
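The equivalence in question can be sketched concretely. In the toy example below (invented states and probabilities; the partition-refinement scheme is the standard idea, not code from the workshop), two states are deemed equivalent when they assign the same total probability to each block of the current partition, and refining until stabilisation yields the coarsest such equivalence, i.e. the "compressed" system.

```python
# Sketch of probabilistic bisimulation via partition refinement.
# trans maps each state to its successor distribution.
trans = {
    "s0": {"s1": 0.5, "s2": 0.5},  # s0 splits its mass over two states
    "t0": {"t1": 1.0},             # t0 moves deterministically
    "s1": {}, "s2": {}, "t1": {},  # all three successors are deadlocked
}

def refine(partition):
    """Split blocks by (current block, probability-per-block signature)."""
    index = {s: i for i, block in enumerate(partition) for s in block}
    groups = {}
    for s, dist in trans.items():
        weights = {}
        for succ, p in dist.items():
            weights[index[succ]] = weights.get(index[succ], 0.0) + p
        key = (index[s], tuple(sorted(weights.items())))
        groups.setdefault(key, set()).add(s)
    return sorted(groups.values(), key=lambda b: sorted(b))

partition = [set(trans)]  # start with a single block containing everything
while (nxt := refine(partition)) != partition:
    partition = nxt

# s1, s2, t1 are all deadlocked, hence bisimilar; s0 and t0 both reach
# that class with total probability 1, hence bisimilar as well.
assert {"s1", "s2", "t1"} in partition
assert {"s0", "t0"} in partition
```

The two-block result is the compressed form: any property that only depends on class-to-class probabilities, such as long-run averages, can be evaluated on it instead of on the original five states.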
The workshop will develop various aspects of this theory including the necessary measure theoretic and algebraic foundations, as well as applications to modeling and model-checking.
Organizers: Dale Miller, Lutz Strassburger
Logic programming, in the sense of proof search in sequent calculus, has been used to describe the operational semantics of concurrent computational systems in at least two distinct fashions.
In the process-as-term approach, process combinators and relations between them (one-step transitions, bisimulation, etc.) are encoded as non-logical primitives, whose axiomatization is provided by structured operational semantics (SOS). Here, classical and intuitionistic logics are generally used in such axiomatizations. Recent challenges in this setting have been capturing both may and must properties of processes, uncovering declarative approaches to link mobility, and exploiting evident connections to model checking and game semantics.

In the process-as-formula approach, process combinators are encoded directly as logical connectives, and the operational semantics of processes is provided directly by an underlying logic, such as linear logic. Such encodings have so far been most successfully employed with asynchronous process calculi. Recent challenges here have been finding ways to exploit the meta-theory of linear logic to provide results about processes, and declarative approaches to sequentiality and security protocols.

This workshop will focus on examining various connections between concurrent computation and the proof-search paradigm, whether or not that connection falls neatly into the above-mentioned approaches.
General organisers: Thomas Ehrhard and Laurent Regnier
CIRM (Centre International de Rencontres Mathématiques)
FRUMAM (Fédération de Recherche des Unités de Mathématiques de Marseille)
CNRS (Centre National de la Recherche Scientifique)
Luminy (Faculté des sciences de Luminy)
ED-MIM (École doctorale de Mathématiques et Informatique de Marseille)
ParisNord (Université Paris Nord)
Génopole (La cité du gène et des biotechs)
GEOCAL (Géométrie du Calcul (projet ACI Nouvelles Interfaces des Mathématiques))
INVAL (Invariants algébriques des systèmes informatiques (Projet de l'Agence Nationale pour la Recherche))
APPSEM (Applied Semantics II (Thematic European Network))
IML (Institut de Mathématiques de Luminy)
LIPN (Laboratoire d'Informatique de Paris-Nord)
PPS (Laboratoire Preuves, Programmes, Systèmes)
Preregistration and registration for Geocal06 are now definitively closed.
Last modified: Wed, 01 Mar 2006 17:45:50 +0100