STOC 2009

Program for the Valiant 60th Birthday Celebration

The talks will take place on Saturday, May 30, 2009, at the Hyatt Regency Bethesda, in the Embassy/Chesapeake Suite on the Conference Level of the hotel (one level below the main lobby). Abstracts for some of the talks are listed below.

 9:15 -  9:50 — Reception and Coffee — (Chesapeake Foyer)
 9:50 - 10:00 Opening Remarks
10:00 - 10:30 Stephen Cook
Pebbles and Branching Programs
10:30 - 11:00 Paul Valiant
Leveraging Collusion in Unrestricted Combinatorial Auctions
11:00 - 11:30 Michael Rabin
Completely Anonymous, Secure, Verifiable, and Secrecy Preserving Auctions
11:30 - 12:00 Vijay Vazirani
Combinatorial Algorithms for Convex Programs Capturing Market Equilibria and Nash Bargaining Solutions
12:00 -  1:30 — Lunch — (on your own)
 1:30 -  2:00 Rocco Servedio
A Quarter-Century of Efficient Learnability
 2:00 -  2:30 Michael Kearns
Computational Learning Theory and Beyond
 2:30 -  3:00 Vitaly Feldman
On the Learning Power of Evolution
 3:00 -  3:30 — Coffee break — (Chesapeake Foyer)
 3:30 -  4:00 Mike Paterson
Strassen Symmetries
 4:00 -  4:30 Jin-Yi Cai
Proving Dichotomy Theorems for Counting Problems
 4:30 -  5:00 Mark Jerrum
The Permanent: Algorithms
 5:00 -  5:30 Avi Wigderson
Valiant's Permanent Gift to Complexity Theory
 5:30 -  5:45 Closing Remarks


Titles and Abstracts

Pebbles and Branching Programs
Stephen Cook — University of Toronto


Leveraging Collusion in Unrestricted Combinatorial Auctions
Paul Valiant — Massachusetts Institute of Technology

Collusion in auctions can have devastating effects and is currently prosecuted as a criminal activity.

Contrary to this trend, and in order to leverage the knowledge of collusive players, we propose to legalize collusion in auctions and put forward a new type of mechanism in which collusive players are provided with special "collusive" bids, by which they can best advance their own goals and, in so doing, ours.

In unrestricted combinatorial auctions and in an adversarial collusive model, where classical mechanisms guarantee 0 social welfare and 0 revenue, ours guarantees that the sum of social welfare and revenue is always very high.

Achieving this result requires better modeling and understanding of the interplay between cooperative and non-cooperative perspectives in game theory, and the development of more refined solution concepts (which may be of independent interest).

Joint work with Jing Chen and Silvio Micali.


Completely Anonymous, Secure, Verifiable, and Secrecy Preserving Auctions
Michael O. Rabin — Harvard University and Google Research

We present a method for conducting any sealed-bid auction mechanism in a manner conforming to the following requirements. No information about bids is available before the auction's closing time. The values of bids are non-coercibly and permanently kept secret. The identities of the winners and the prices they pay remain secret. Still, the correctness of the auction's conduct and of its results is publicly verifiable. The availability of complete anonymity for winners and of complete, permanent secrecy of bids is an impediment to collusion among bidders against the seller. The work employs a recent practical method for zero-knowledge proofs (ZKPs).

Joint work with Yishay Mansour.


Combinatorial Algorithms for Convex Programs Capturing Market Equilibria and Nash Bargaining Solutions
Vijay V. Vazirani — Georgia Tech

Over the last 50 years, the primal-dual paradigm has had two highly successful "lives" — in combinatorial optimization and in approximation algorithms. In addition to yielding efficient and practically useful algorithms, it has also provided deep insights into the canonical combinatorial structure underlying the problems solved.

Via a most unexpected route, having to do with the solution of some fundamental problems from game theory and mathematical economics, a third life of this paradigm appears to be emerging: combinatorial algorithms for solving certain classes of convex programs.

Besides providing an in-depth perspective on this new development, I will introduce the fascinating problems that led to it and point out the challenges that lie ahead.


A Quarter-Century of Efficient Learnability
Rocco Servedio — Columbia University

Valiant's seminal 1984 paper "A Theory of the Learnable" provided an elegant framework for studying the computational aspects of learning from data. A fundamental goal within this framework, which is still actively pursued, is to understand which learning problems can be solved by computationally efficient algorithms and which cannot. In a series of works in the 1980s Valiant provided foundational results towards answering this important question. This talk will survey some of Valiant's contributions to these problems and look at where we are today.


Computational Learning Theory and Beyond
Michael Kearns — University of Pennsylvania

It has been 25 years since the publication of Les Valiant's landmark CACM paper "A Theory of the Learnable". Valiant's introduction of an algorithmic and complexity-theoretic model of learning — machine, human, or otherwise — generated great excitement, and led to the development of the field we now call computational learning theory, and the founding of the still-vibrant COLT conference.

It also ended up having tremendous impact on the practice of machine learning, in areas as diverse as boosting, semi-supervised learning, and reinforcement learning. In this talk I will briefly overview some of these broader impacts outside the theoretical computer science community.


On the Learning Power of Evolution
Vitaly Feldman — IBM Almaden Research Center

In this talk I'll describe Valiant's recent work on a computational model of Darwinian evolution and several new results regarding his model. Valiant's model addresses the question of how complex mechanisms such as those found in living organisms can result from the evolutionary process of random mutations guided by natural selection. He suggested that the process can be viewed as a constrained form of learning a circuit implementing the desired behavior, in which at each step the most fit candidate mechanism is chosen from a pool of variants of the current mechanism. The mechanisms are described in terms of the multi-argument functions they implement, and the fitness of such a mechanism is measured by evaluating its correlation with the function representing the most beneficial behavior. I'll outline recent results demonstrating that evolvability in Valiant's model is closely related to learnability by statistical query algorithms, and show that many natural variants of the model lead to the same notion.
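
To make the selection step concrete, here is a minimal illustrative sketch in Python (not from the talk, and much simpler than Valiant's formal definitions, which also involve tolerances, distributions, and performance thresholds). Hypotheses are monotone conjunctions over Boolean variables, a mutation adds or drops a single variable, and the variant with the highest empirical correlation to the target behavior survives; the names N, TARGET, mutate, and selection_step are all hypothetical.

    import random

    # Illustrative sketch only: hypotheses are monotone conjunctions over
    # N Boolean variables, represented as frozensets of variable indices.
    # TARGET plays the role of "the most beneficial behavior".
    N = 10
    TARGET = frozenset({0, 3, 7})   # hypothetical ideal: AND of x0, x3, x7

    def evaluate(conj, x):
        """Return +1 if the conjunction accepts the 0/1 input x, else -1."""
        return 1 if all(x[i] for i in conj) else -1

    def correlation(conj, samples):
        """Empirical correlation of a hypothesis with the target behavior."""
        return sum(evaluate(conj, x) * evaluate(TARGET, x)
                   for x in samples) / len(samples)

    def mutate(conj):
        """Random single-variable mutation: add or drop one index."""
        return conj ^ {random.randrange(N)}

    def selection_step(current, pool_size=20, sample_size=2000):
        """Keep the most fit mechanism among the current one and a pool
        of random variants (mutation guided by selection)."""
        samples = [tuple(random.randint(0, 1) for _ in range(N))
                   for _ in range(sample_size)]
        pool = [current] + [mutate(current) for _ in range(pool_size)]
        return max(pool, key=lambda c: correlation(c, samples))

    hypothesis = frozenset()       # start from the empty conjunction
    for _ in range(30):            # a few generations
        hypothesis = selection_step(hypothesis)
    print(sorted(hypothesis))      # tends toward sorted(TARGET)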


Strassen Symmetries
Mike Paterson — University of Warwick


Proving Dichotomy Theorems for Counting Problems
Jin-Yi Cai — University of Wisconsin-Madison

Leslie Valiant not only laid the foundation for the complexity theory of counting problems, but also pioneered some of the most elegant and versatile proof techniques in this study, including in particular the method of interpolation. Recently he has added to this arsenal the novel technique of holographic algorithms and reductions. With contributions from many researchers in the field, a number of very strong complexity dichotomy theorems have been proved, including for counting CSP problems, Holant problems, and graph homomorphisms. For over 30 years, Valiant's vision and influence have been an unmistakable guiding force in this area.


The Permanent: Algorithms
Mark Jerrum — Queen Mary, University of London

In a seminal paper, Valiant used the permanent function to demonstrate that a super-polynomial gap can exist between the complexity of the decision and counting versions of a search problem (under a reasonable complexity-theoretic assumption). This gave the permanent a psychologically privileged role in the complexity theory of counting problems, akin to that of SAT in the theory of NP-completeness. In this context, the permanent could have been replaced by any natural polynomial-time decision problem whose counting version is #P-complete. But I shall argue that the particular choice of the permanent function was felicitous for the development of algorithms for counting problems.
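
For reference, the permanent of an n x n matrix A is per(A) = the sum, over all permutations s of {1, ..., n}, of the products A[1, s(1)] * ... * A[n, s(n)]; for a 0/1 matrix this counts the perfect matchings of the associated bipartite graph. The Python sketch below (an illustration, not material from the talk) evaluates it with Ryser's classical inclusion-exclusion formula in O(2^n * n^2) time; the exponential cost is what Valiant's #P-completeness result leads one to expect.

    from itertools import combinations

    def permanent(A):
        """Permanent of a square matrix via Ryser's inclusion-exclusion
        formula: per(A) = (-1)^n * sum over nonempty column sets S of
        (-1)^|S| * prod_i (sum of row i restricted to S)."""
        n = len(A)
        total = 0
        for k in range(1, n + 1):
            for S in combinations(range(n), k):
                prod = 1
                for row in A:
                    prod *= sum(row[j] for j in S)
                total += (-1) ** k * prod
        return (-1) ** n * total

    # For a 0/1 matrix, per(A) counts the perfect matchings of the
    # associated bipartite graph; deciding whether one exists is in P,
    # but counting them is #P-complete.
    print(permanent([[1, 1, 0],
                     [1, 1, 1],
                     [0, 1, 1]]))   # -> 3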


Valiant's Permanent Gift to Complexity Theory
Avi Wigderson — Institute for Advanced Study

The Permanent function was brought into focus in TCS in several major works of Les Valiant 30 years ago, and has since served as a tool, a model, and a challenge in numerous others. I'll try to survey some of the diverse results that make the standing of this gift so unique and central to computational complexity theory.

