Theory of Computing

Prerequisites: Basic knowledge of set theory, logic, discrete mathematics, and programming languages, as well as proof techniques such as proof by contradiction and mathematical induction.

Motivation

The theory of computation has been strongly influenced by great logicians, in particular Gödel, Turing, Church and von Neumann. Even before the existence of digital computers, they wondered what it means to be computable: what can and cannot be computed. Thanks to their work we now know that we cannot write a C++ program that proves an arbitrary theorem of arithmetic, or a Python program that receives a PASCAL program as input and decides whether the input program eventually terminates. Nowadays we say that the problems of deciding whether a given arithmetic statement is true, and whether a given program terminates, are not computable. A fundamental aspect of the theory of computation is to determine which problems are computable.
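
To make the second example concrete, here is a minimal Python sketch of the classical diagonal argument. The function halts and its signature are hypothetical (the argument itself shows no such function can be written); this is an illustration, not part of the course material:

  def halts(p, x):
      """Hypothetical oracle: True iff program p halts on input x.
      The argument below shows no correct implementation can exist."""
      raise NotImplementedError  # placeholder only

  def paradox(p):
      if halts(p, p):       # if p would halt when run on itself...
          while True:       # ...then loop forever;
              pass
      return                # ...otherwise halt immediately.

  # paradox(paradox) halts if and only if halts(paradox, paradox) is False,
  # i.e., if and only if paradox(paradox) does not halt: a contradiction.
  # Hence no total, correct implementation of halts can exist.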

The theory of computation is traditionally conducted in various models of computation that have been proposed over the years (since the beginning of the last century), among others Turing machines, counter machines, finite-state automata, and the lambda-calculus. These models are abstractions of computers or computational systems and can be classified in terms of their computational expressive power, i.e., their computing limitations. For example, Turing machines (and the lambda-calculus) can express any computable function that you can write in C++ or your favorite programming language. In fact, the Church-Turing thesis states that a function on the natural numbers is computable in the informal sense (i.e., computable using an everlasting pencil and unlimited paper by some human who lives sufficiently long) if and only if it is computable by a Turing machine.
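
As a small illustration of the model (a sketch only; the tape encoding, state names, and the run_tm helper below are our own choices, not a standard from the course), the following Python fragment simulates a deterministic Turing machine that complements every bit of its input:

  def run_tm(transitions, tape, state="q0", accept="qA", blank="_"):
      cells = dict(enumerate(tape))   # sparse tape: position -> symbol
      head = 0
      while state != accept:
          read = cells.get(head, blank)
          state, write, move = transitions[(state, read)]
          cells[head] = write
          head += 1 if move == "R" else -1
      return "".join(cells[i] for i in sorted(cells))

  # Transition function: (state, read symbol) -> (next state, write, move)
  flip = {
      ("q0", "0"): ("q0", "1", "R"),
      ("q0", "1"): ("q0", "0", "R"),
      ("q0", "_"): ("qA", "_", "R"),   # halt upon reaching the first blank
  }
  print(run_tm(flip, "0110"))          # prints 1001_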

Another fundamental aspect of the theory of computation is the study of the complexity of computable problems in terms of computational resources: the time and space (memory) needed to solve a given class of problems. For example, graph coloring is a problem that can be solved in nondeterministic polynomial time. We do not yet know whether it can also be solved in deterministic polynomial time. In fact, if you can prove that graph coloring can or cannot be solved in deterministic polynomial time, you will receive a million dollars. This is because such a proof would answer a major unsolved question in computer science and one of the Millennium Prize Problems: the P = NP question. The question is whether the class of problems that can be solved in nondeterministic polynomial time (NP) is equal to the class of problems that can be solved in deterministic polynomial time (P). This is but one example of the challenging problems in the theory of computation.
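
The asymmetry behind the P = NP question can be shown in a few lines of Python (a sketch; the graph encoding and function name are our own): checking a proposed coloring, the certificate, takes polynomial time, whereas no polynomial-time method is known for finding one. Membership in NP means exactly that such succinct certificates can be verified efficiently.

  def verify_coloring(edges, coloring, k):
      """Polynomial-time check that 'coloring' is a proper k-coloring."""
      if any(c not in range(k) for c in coloring.values()):
          return False
      return all(coloring[u] != coloring[v] for u, v in edges)

  square = [(0, 1), (1, 2), (2, 3), (3, 0)]                    # a 4-cycle
  print(verify_coloring(square, {0: 0, 1: 1, 2: 0, 3: 1}, 2))  # True
  triangle = [(0, 1), (1, 2), (2, 0)]                          # a 3-cycle
  print(verify_coloring(triangle, {0: 0, 1: 1, 2: 0}, 2))      # False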

The theory of computation also deals with the semantics of computation, i.e., methods by which programs are endowed with precise meaning. There are at least two traditional approaches: operational and denotational semantics. The operational approach was pioneered by Plotkin in his work on Structural Operational Semantics (SOS). An operational semantics interprets a given program by means of transitions specifying its computational steps. The denotational approach, pioneered by Scott and Strachey, interprets programs by mapping them into mathematical objects (called denotations) such as sets or categories. The two approaches are typically related by a full-abstraction result, which states that two programs are operationally equivalent (i.e., they behave in the same way) if and only if they have the same denotation.
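
The distinction can be made concrete on a toy arithmetic language (our own example, not the formalism used in the course): the operational semantics rewrites an expression one step at a time, while the denotational semantics maps it directly to a number.

  from dataclasses import dataclass

  @dataclass
  class Num:
      n: int

  @dataclass
  class Add:
      left: object
      right: object

  def step(e):
      """Small-step operational semantics: one step of an Add expression."""
      if isinstance(e.left, Num) and isinstance(e.right, Num):
          return Num(e.left.n + e.right.n)     # Add(m, n) -> m + n
      if isinstance(e.left, Num):
          return Add(e.left, step(e.right))    # reduce the right operand
      return Add(step(e.left), e.right)        # reduce the left operand

  def denote(e):
      """Denotational semantics: the denotation of an expression is a number."""
      return e.n if isinstance(e, Num) else denote(e.left) + denote(e.right)

  e = Add(Num(1), Add(Num(2), Num(3)))
  while isinstance(e, Add):                    # run the transitions to a value
      e = step(e)
  print(e.n, denote(Add(Num(1), Add(Num(2), Num(3)))))   # 6 6

That both interpretations yield the same value on every program is the kind of agreement that adequacy and full-abstraction results make precise.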

Many modern computational systems consist of multiple processes computing concurrently and possibly interacting with one another. This covers a vast variety of systems which, thanks to technological advances such as the Internet and mobile computing, most people can nowadays easily relate to. Traditional mathematical models of (sequential) computation based on functions from inputs to outputs no longer apply: the crux is that concurrent computation, e.g., in a reactive system, is seldom expected to terminate and involves constant interaction with its environment. Concurrency Theory, pioneered by Hoare, Milner and Petri, among others, is a relatively new branch of the theory of computation dealing with concurrent computation.
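
A toy Python example (our own, in the spirit of Milner's process calculi) shows why input/output functions are too coarse for interaction: the processes a.(b + c) and a.b + a.c perform exactly the same sequences of actions, yet only the second can commit, after performing a, to a state where b is refused.

  # Labelled transition systems: state -> {action: [successor states]}
  P = {"p0": {"a": ["p1"]},
       "p1": {"b": ["p2"], "c": ["p3"]},
       "p2": {}, "p3": {}}                          # a.(b + c)
  Q = {"q0": {"a": ["q1", "q2"]},
       "q1": {"b": ["q3"]}, "q2": {"c": ["q4"]},
       "q3": {}, "q4": {}}                          # a.b + a.c

  def traces(lts, state):
      """All finite action sequences the process can perform."""
      result = {()}
      for action, targets in lts[state].items():
          for t in targets:
              result |= {(action,) + rest for rest in traces(lts, t)}
      return result

  print(traces(P, "p0") == traces(Q, "q0"))       # True: identical traces
  # Yet an 'a'-step of Q may reach q2, where the action 'b' is refused:
  print(all("b" in Q[s] for s in Q["q0"]["a"]))   # False: branching differs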

In this course we will study the fundamental developments mentioned above, which, roughly speaking, address the capabilities, meaning, and limitations of computation.

Description

This sixteen-week course is an introduction to the fundamental concepts of the theory of computation. It covers basic material from the major branches of the field: automata and language theory, computability, computational complexity, and semantics of computation. It also covers advanced topics from the theory of concurrent computation.

Goals

After successful completion of the course, the attendant will:

  • have a conceptual and theoretical understanding of the foundations of computation, in particular of the mathematical meaning of computation and the notions of computable problem and complexity class;
  • know the state-of-the-art frameworks for modern computational systems such as spatial and mobile computation;
  • be able to formulate their own abstractions and models of computational systems and reason in terms of them.

Contents

1. Background (1 session)

  • Proof Principles
  • Set Theory
  • Logic
  • Lattice and Order Theory

2. Automata and Computability (4 sessions)

  • Finite State Automata and Regular Grammars
  • Push Down Automata and Context-Free Grammars
  • Turing Machines and Unrestricted Grammars
  • Lambda-Calculus

3. Computational Complexity (4 sessions)

  • Complexity Classes
  • The Hierarchy Theorem
  • Reduction
  • NP-Completeness

4. Semantics of Computation (4 sessions)

  • Operational Semantics
  • Denotational Semantics
  • Axiomatic Semantics
  • Full Abstraction

5. Concurrency Theory (3 sessions)

  • Automata for concurrent behavior
  • Process Calculi
  • Computational Expressiveness of concurrency
  • Modern Concurrent Formalisms

How to study for this course

Attend the lectures; read the scientific literature (a list will be provided in class) as well as other relevant material; and solve all the assignments, including the class project, individually (unless explicitly stated otherwise).

Evaluation

  • Three exams, 20% each: the first on Computability and Complexity, the second on Semantics, and a final exam on Concurrency Theory.
  • Class project: 20%.
  • Assignments: 20%.

Bibliography

  • Dexter C. Kozen. Automata and Computability. Undergraduate Texts in Computer Science, Springer. ISBN-13: 978-0387949079.
  • Christos H. Papadimitriou. Computational Complexity. Addison-Wesley. ISBN-13: 978-0201530827.
  • Glynn Winskel. The Formal Semantics of Programming Languages. MIT Press. ISBN-13: 978-0262731034.
  • Robin Milner. Communicating and Mobile Systems: The Pi-Calculus. Cambridge University Press. ISBN-13: 978-0521658690.

 