Speaker and Abstract
Affiliation: Tata Institute
Title: Making Classical Zero Knowledge Protocols Secure Against Quantum Attacks
Abstract: A zero knowledge protocol is an interactive proof whereby a prover can
convince the verifier of the truth of a statement without revealing any
additional information about the statement. The zero knowledge
condition should hold even when the verifier cheats, that is, deviates
from the prescribed protocol. The prospect of practical quantum
computers raises the question of what happens to the security of a
zero knowledge protocol when the verifier cheats quantumly. Since
problems like factoring and discrete logarithm, which are at the heart
of many classical cryptosystems and protocols, become tractable in the
presence of quantum computation, it is a priori conceivable that a
cheating quantum verifier can break the security of a classical zero
knowledge protocol.
Nevertheless, in this talk, we shall see that any problem that has a
classical zero-knowledge protocol also has, under a reasonable
condition, another classical zero-knowledge protocol which is secure
against all classical and quantum polynomial time verifiers, even
cheating ones. This answers an open question of Watrous, and
generalizes classical results on zero knowledge due to Goldreich and Sahai.
This is joint work with Sean Hallgren, Alexandra Kolla and Shengyu
Zhang, and appeared at the ICALP 2008 conference.
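For context, a central building block of classical zero knowledge protocols is the bit commitment, which lets the prover fix a value without disclosing it. A minimal hash-based sketch in Python; the function names and the use of SHA-256 are my own illustrative choices, not details from the talk:

```python
import hashlib
import secrets

def commit(bit):
    # Toy hash-based bit commitment: publish c now (hiding the bit),
    # reveal (r, bit) later (binding the committer to it).
    # Illustrative only -- real zero knowledge protocols use commitment
    # schemes with provable hiding and binding properties.
    r = secrets.token_bytes(16)                    # random blinding value
    c = hashlib.sha256(r + bytes([bit])).digest()
    return c, (r, bit)                             # commitment, opening

def verify(c, opening):
    r, bit = opening
    return hashlib.sha256(r + bytes([bit])).digest() == c

c, opening = commit(1)
honest = verify(c, opening)            # correct reveal is accepted
cheating = verify(c, (opening[0], 0))  # flipped bit is rejected
```

Commitments whose security rests on problems like factoring or discrete logarithm are exactly the kind of component a quantum cheater could attack, which is why making the overall protocol quantum-secure is nontrivial.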
Affiliation: University of Toronto
Title: Model-Based Human Pose Tracking
Abstract: Recovery and analysis of human pose and motion from video is a key
enabling technology for myriad potential applications, such as smart
mobile devices, advanced surveillance systems, and new perceptual
man-machine interfaces. Despite more than a decade of focused research,
however, the general problem of detecting and tracking people in natural
environments, from monocular observations, remains challenging. Pose
estimation is especially challenging because image measurements are
often insufficient to fully constrain 3D pose, and as a consequence,
current techniques rely heavily on prior models of human pose and motion.
This talk will describe the nature of the problem and some recent
advances in modeling human motion based on kinematic data,
physics-based constraints, and biomechanical considerations.
Biography of Speaker:
David Fleet is professor of computer science at the University of Toronto. He received the PhD in Computer Science from the University of Toronto in 1991. From 1991 to 2000 he was on faculty at Queen's University, Canada, in the Department of Computing and Information Science, with cross-appointments in Psychology and Electrical Engineering. In 1999 he joined the Palo Alto Research Center (PARC) where he managed the Digital Video Analysis Group and the Perceptual Document Analysis Group. He returned to the University of Toronto in October 2003.
In 1996 Dr. Fleet was awarded an Alfred P. Sloan Research Fellowship for his research on biological vision. His 1999 paper with Michael Black on probabilistic detection and tracking of motion boundaries received Honorable Mention for the Marr Prize at the IEEE International Conference on Computer Vision. His 2001 paper with Allan Jepson and Thomas El-Maraghi on robust appearance models for visual tracking was awarded runner-up for the best paper at the IEEE Conference on Computer Vision and Pattern Recognition. In 2003, his paper with Eric Saund, James Mahoney and Dan Larner won the best paper award at ACM UIST '03. He was Associate Editor of IEEE Transactions on Pattern Analysis and Machine Intelligence (2000-2004), Program Co-Chair for the IEEE Conference on Computer Vision and Pattern Recognition in 2003, and Associate Editor-In-Chief for IEEE Transactions on Pattern Analysis and Machine Intelligence (2005-2008). He is a Fellow of the Canadian Institute for Advanced Research.
His research interests include computer vision, image processing, visual perception, and visual neuroscience. He has published research articles and one book on various topics including the estimation of optical flow and stereoscopic disparity, probabilistic methods in motion analysis, 3D people tracking, modeling appearance in image sequences, non-Fourier motion and stereo perception, and the neural basis of stereo vision.
Affiliation: National Institute of Informatics, Japan
Algorithms and Data Structures
Title: Locality-sensitive analysis for rank-based similarity search
Abstract: The similarity search problem, also known as nearest-neighbour search problem,
is defined as follows: given a collection of data objects, build a
data structure which, for a given query object, returns a list of data
objects most similar to the query.
Similarity search is an operation of fundamental importance in
many application areas, including data mining, information retrieval,
and multimedia databases.
The effectiveness of known methods for similarity search is
generally limited due to a phenomenon known as the "curse of dimensionality",
whereby the query or preprocessing time (or both) depends exponentially
on the number of features used to represent the objects.
When the number of features is large (as is often the case in practical
applications), the query cost approaches that of a sequential search
through the entire data collection.
In recent years, a number of similarity search strategies
have been proposed in an attempt to mitigate the effects of the curse
of dimensionality, either by trading the exactness of the query result
for improvements in query processing time, or by relating query performance
to measures of the so-called "intrinsic" dimensionality of the data set.
In this presentation, we will survey several of these strategies
- locality-sensitive hashing (LSH), the cover tree, and the SASH
hierarchical search structure - and introduce a new similarity search
structure, the rank cover tree (RCT), that blends the characteristics
of all three approaches.
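To illustrate the first of the surveyed strategies, here is a minimal Python sketch of random-hyperplane locality-sensitive hashing for cosine similarity; the function names are illustrative, and this is background rather than code from the talk:

```python
import random

def make_lsh(dim, n_bits, seed=0):
    # One random hyperplane per output bit; each bit records which side
    # of the hyperplane a vector falls on. Vectors separated by a small
    # angle agree on most bits, so they tend to share hash buckets.
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_bits)]
    def h(v):
        return tuple(int(sum(p_i * v_i for p_i, v_i in zip(p, v)) >= 0)
                     for p in planes)
    return h

h = make_lsh(dim=3, n_bits=8)
sig = h((1.0, 0.9, 1.1))   # an 8-bit 0/1 signature for this vector
```

A full LSH index stores each point under its signature in several such tables and probes only the query's buckets; the cover tree and SASH take different, distance-based routes to the same goal.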
Biography of Speaker:
Michael Houle obtained his PhD degree from McGill University in 1989,
in the area of computational geometry. Since then, he developed research
interests in algorithmics, data structures, and relational visualization,
first as a research associate at Kyushu University and the University of Tokyo
in Japan, and from 1992 at the University of Newcastle and the University of
Sydney in Australia. From 2001 to 2004, he was a Visiting Scientist at
IBM Japan's Tokyo Research Laboratory, where he first began working on
approximate similarity search and shared-neighbor clustering methods
for data mining applications. Currently, he is a Visiting Professor
at the National Institute of Informatics (NII), Japan.
Affiliation: Rice University
Title: Accurate Inference of Phylogenetic Relationships from Multi-locus Data
Abstract: Accurate inference of phylogenetic relationships of species, and understanding their relationships with gene trees are two central themes in molecular and evolutionary biology. Traditionally, a species tree is inferred by (1) sequencing a genomic region of interest from the group of species under study, (2) reconstructing its evolutionary history, and (3) declaring it to be the estimate of the species tree. However, recent analyses of increasingly available multi-locus data from various groups of organisms have demonstrated that different genomic regions may have evolutionary histories (called “gene trees”) that may disagree with each other, as well as with that of the species. This observation has called into question the suitability of the traditional approach to species tree inference. Further, when some, or all, of these disagreements are caused by reticulate evolutionary events, such as hybridization, then the phylogenetic relationship of the species is more appropriately modeled by a phylogenetic network than a tree. As a result, a new, post-genomic paradigm has emerged, in which multiple genomic regions are analyzed simultaneously, and their evolutionary histories are reconciled in order to infer the evolutionary history of the species, which may not necessarily be treelike.
In this talk, I will describe our recent work on developing mathematical criteria and algorithmic techniques for analyzing incongruence among gene trees, and inferring phylogenetic relationships among species despite such incongruence. This includes work on lineage sorting, reticulate evolution, as well as simultaneous treatment of both. If time permits, I will describe our recent work on population genomic analysis of bacterial data, and the implications on the evolutionary forces shaping the genomic diversity in these populations.
Biography of Speaker:
Luay Nakhleh is an Assistant Professor of Computer Science and Biochemistry and Cell Biology at Rice University, and an adjunct Assistant Professor of Systems Biology at MD Anderson Cancer Center. He received the B.Sc. degree from the Technion, Israel Institute of Technology, in 1996, the Master's degree from Texas A&M University in 1998, and the PhD degree from the University of Texas at Austin in 2004, all three degrees in Computer Science. His research interests fall into the general areas of computational biology and bioinformatics; in particular, he works on computational phylogenomics and population genomics, and their connections with other fields in biology. Luay has published over 50 manuscripts on his work, supervised the dissertations of two recent PhD graduates, and currently supervises the dissertations of six PhD students. Luay has received several awards, including the Texas Excellence Teaching Award from UT Austin in 2001, the Outstanding Dissertation Award from UT Austin in 2005, the Roy E. Campbell Faculty Development Award from Rice University in 2006, the DOE Early Career Award in 2006, the NSF CAREER Award in 2009, the Phi Beta Kappa Teaching Prize in 2009, and an Alfred P. Sloan Foundation Fellowship in 2010.
Title: Towards Computational Hybrid System Semantics for Time-Based Block Diagram Modeling
Abstract: The use of computational models in the design of engineered systems has been shown to provide significant advantages over the use of physical models and paper documents by enabling Model-Based Design. This has resulted in an autocatalytic trend where computational models have been key to unlocking the potential of model transformation. These transformations, in turn, can be effectively captured by computational models. In order to precisely capture a model transformation, it is essential that the semantics of the source and target representations be unambiguously and rigorously defined.
In the design of engineered systems, time-based block diagrams are extensively used and formal semantics of the discrete-time aspects have previously been established. For analytic purposes, continuous-time behavior of a hybrid system is often considered from a mathematical perspective. Design, however, typically relies on numerical simulation to obtain insight into the system behavior. As a result, the corresponding computational implementation determines the semantics of the continuous-time behavior. A fixed-step forward Euler numerical integration scheme is frequently exploited for hybrid systems simulation, though its poor stability characteristics limit the applicability. Variable-step numerical solvers have superior performance characteristics but their computational semantics are significantly more difficult to define. Moreover, the variable-step nature is difficult to reconcile with discrete-time formalizations. This paper exploits a stream-based approach to establish a framework for defining the computational semantics of both the discrete-time aspects as well as a variable-step numerical solver that generates continuous-time behavior.
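To make the stability point concrete, here is a small Python sketch of fixed-step forward Euler on the stiff test equation dy/dt = -k*y; this is the standard textbook illustration of the scheme's limits, not the paper's stream-based framework:

```python
def forward_euler(f, y0, t0, t1, h):
    # Fixed-step forward Euler: y_{n+1} = y_n + h * f(t_n, y_n).
    t, y = t0, y0
    while t < t1 - 1e-12:        # small guard against float drift
        y = y + h * f(t, y)
        t = t + h
    return y

# dy/dt = -k*y decays to 0, but forward Euler multiplies y by
# (1 - h*k) each step, which only converges when |1 - h*k| < 1,
# i.e. when the step size satisfies h < 2/k.
k = 50.0
f = lambda t, y: -k * y
stable   = forward_euler(f, 1.0, 0.0, 1.0, h=0.01)  # h*k = 0.5: decays
unstable = forward_euler(f, 1.0, 0.0, 1.0, h=0.05)  # h*k = 2.5: blows up
```

A variable-step solver would shrink h where the dynamics are fast and grow it elsewhere, which is precisely the behavior that is hard to reconcile with a fixed discrete-time formalization.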
Biography of Speaker:
Dr. Mosterman is a senior research scientist at The MathWorks, Inc. in Natick, MA. Previously, he held a research position at the German Aerospace Center (DLR) in Oberpfaffenhofen for four years, funded by the Deutsche Forschungsgemeinschaft (German Research Foundation), to investigate object-oriented modeling of physical systems, specifically the mixed continuous-time/discrete-event behavior that often results in models of a hybrid system nature.
Dr. Mosterman's primary research interests are in Computer Automated Multiparadigm Modeling (CAMPaM) with principal applications in training systems and fault detection, isolation, and reconfiguration. He designed several modeling and simulation environments such as the Electronics Laboratory Simulator, nominated for The Computerworld Smithsonian Award by Microsoft Corporation, a first version of Transcend for diagnosis of continuous-time systems in transient operating regions, and MAsim for simulating differential and algebraic equation based models from Modelica. He was awarded the IMechE Donald Julius Groen Prize for a paper on HyBrSim, a hybrid bond graph modeling and simulation environment. Specific areas of interest are modeling of physical systems, meta-modeling, and model and formalism transformation in computer aided control system design (CACSD) and in Electronics Design Automation (EDA). An important aspect concerns behavior generation for heterogeneous models, which requires a hybrid dynamic systems approach.
Dr. Mosterman is currently Editor-in-Chief of Simulation: Transactions of The Society for Modeling and Simulation International for the Methodology section, and Associate Editor of IEEE Transactions on Control Systems Technology and of Applied Intelligence. He was Guest Editor of special issues of ACM Transactions on Modeling and Computer Simulation and IEEE Transactions on Control Systems Technology on the topic of CAMPaM. He co-organized the Model-Based Design for Embedded Systems track at the Design Automation and Test in Europe conference and exhibition in 2007 and 2008, the Computational Modeling and Simulation of Embedded Systems track at the 2007 Summer Computer Simulation Conference, the International Conference on High Level Simulation Languages and Applications in 2007, the annual International Bellairs CAMPaM Workshop since 2004, and the 14th International Workshop on Principles of Diagnosis in 2003.
Title: SciDB: a Science-Oriented Database Management System
Abstract: In this talk I will introduce SciDB, a new open-source and massively parallel platform for array data storage, processing and analysis. I will review a number of motivating use-cases from both the scientific and commercial spheres, describing how these use-cases have determined the features and functionality of the system. I will introduce the SciDB data model, and will describe some of the key architectural features of the system including columnar storage, parallel user-defined functions and overlapping array partitioning. Finally, I will introduce SSDB, a new benchmark for scientific data management systems, and will explain why SciDB is two orders of magnitude faster than traditional database systems on common large-scale array processing tasks.
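One of those architectural features, overlapping array partitioning, can be sketched in a few lines of Python; this shows the principle only (SciDB's actual chunking is multidimensional, and the function name here is my own):

```python
def overlapping_chunks(arr, chunk, overlap):
    # Split a 1-D array into chunks that each carry `overlap` extra
    # cells from their neighbours, so window operations (e.g. a moving
    # average) can run on every chunk in parallel without fetching
    # data from adjacent chunks.
    out = []
    for start in range(0, len(arr), chunk):
        lo = max(0, start - overlap)
        hi = min(len(arr), start + chunk + overlap)
        out.append(arr[lo:hi])
    return out

chunks = overlapping_chunks(list(range(10)), chunk=4, overlap=1)
```

With one cell of overlap on each side, a 3-point window centered on any owned cell finds all its inputs locally, which is what makes the partitions independently processable.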
Biography of Speaker:
Philippe Cudre-Mauroux is a postdoctoral associate working in the Database Systems group at MIT. He received his Ph.D. from the Swiss Federal Institute of Technology EPFL, where he won the Doctorate Award and the EPFL Press Mention in 2007. Before joining MIT, he worked on distributed information management systems for HP, IBM T.J. Watson, and Microsoft Research Asia. His research interests are in large-scale distributed infrastructures for non-relational data such as spatiotemporal, scientific, or Semantic Web data.
Affiliation: Harvard University
Title: Incentive Mechanism Engineering in the Internet Age
Abstract: (talk given in MC603) The dawn of mechanism design came in the 1960s with a principled debate about the "in principle" power of economic systems to support socially desirable outcomes. The next couple of decades brought tremendous theoretical advances in understanding the possible, and the impossible, when making decisions under constraints of distributed private information and self-interest. Computational considerations also gained in importance, in adapting methods of mechanism design to combinatorial problems, such as those of wireless spectrum allocation. Fast forward to the Internet age. Today there is a thirst for practical, engineered incentive mechanisms that can coordinate everything from the deployment and use of the very resources that form the Internet, to the myriad of multi-user systems enabled by the Internet; e.g., to advertising on search engines, to crowdsourcing of software development, and to knowledge markets. A common thread is a transition from events to processes, from an isolated auction to a platform that provides ongoing coordination to a dynamic user population. In this talk I will trace some of these developments in mechanism design, highlight some recent advances in the design and analysis of dynamic mechanisms and market platforms, and outline a few of the many open research challenges.
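As a concrete instance of an engineered incentive mechanism, here is the textbook second-price sealed-bid (Vickrey) auction sketched in Python; the names are illustrative, and this is background rather than a mechanism from the talk:

```python
def vickrey_auction(bids):
    # Second-price sealed-bid auction: the highest bidder wins but
    # pays the second-highest bid, which makes truthful bidding a
    # dominant strategy for every bidder.
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0
    return winner, price

winner, price = vickrey_auction({"alice": 10, "bob": 7, "carol": 4})
# alice wins and pays bob's bid of 7
```

Search-engine ad auctions generalize this idea to many slots at once (the generalized second-price auction), one of the Internet-age applications the abstract mentions.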
Biography of Speaker:
David Parkes is Gordon McKay Professor of Computer Science in
the School of Engineering and Applied Science at Harvard
University. His research interests are in the area of multi-agent
systems, game theory and mechanism design. This year, he is serving
as the general chair of the ACM Conference on Electronic Commerce.