
| Date (Winter 2010) | Category | Seminar Info |
| 2010/04/29 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Jürg
Wullschleger
Affiliation: University of Bristol
Area:
theoretical computer science
Title: Extracting Correlations
Abstract: This week I will give the second part of my presentation of the paper
"Extracting Correlations" by Ishai, Kushilevitz, Ostrovsky and Sahai
from STOC 2009.
I will talk about the following:
- how oblivious transfer (OT) can be implemented from Randomized OT.
- what (decomposable) randomized encodings are, and how they can be used.
- the OT-extractor protocol from the "Extracting Correlations" paper.
|
| 2010/04/23 |
Bioinformatics |
Place: Leacock 26
Time: 14:30 - 16:00
Speaker:
Guillaume
Bourque
Affiliation:
Title: Functional genomics in stem cells, evolution and cancer
Abstract: In the first part of this talk, I will describe a study where we generated occupancy maps for three key regulatory proteins (OCT4, NANOG and CTCF), together with knockdown expression experiments (OCT4), in human and mouse embryonic stem (ES) cells. Using these data we found that the binding profiles of OCT4 and NANOG were drastically different between the two species, with only ~5% of the regions
homologously occupied. Moreover, I will show that species-specific transposable elements have contributed up to 25% of the regulatory sites in both lineages and have wired new genes into the core regulatory network of human ES cells.
In the second part of this talk, I will look at the forces shaping vertebrate genome architectures and present a new algorithm called EMRAE that can predict a wide-range of rearrangement events in the ancestry of a group of species. Applying this tool to
a dataset with six genomes (human, chimpanzee, rhesus macaque, mouse, rat, and dog), I will show that regions of high sequence identity have been associated with rearrangement events throughout the mammalian phylogeny.
Finally, in the third part of this talk, I will show how new paired-end-tag sequencing strategies can efficiently provide the map of all structural changes in a given genome. Applying this technique to normal samples, cancer tumors and cancer cell lines and performing a comparative analysis can reveal different patterns of mutations and help identify important aberrations.
|
| 2010/04/22 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Jürg
Wullschleger
Affiliation: University of Bristol, UK
Area:
theoretical computer science
Title: Extracting Correlations
Abstract: I will present a paper by Ishai, Kushilevitz, Ostrovsky and Sahai from
STOC 2009, called "Extracting Correlations". I will divide it into two sessions.
This week, I will
- explain what extractors, OT, OT-combiners, and OT-extractors are.
- present an extractor using eps-biased sets.
- show what randomized encodings are, and outline how they can be
constructed and used.
|
| 2010/04/15 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Ladan
Mahabadi
Affiliation: McGill
Area:
theoretical computer science
Title: Pseudorandom generators
Abstract: Prevalence of randomized algorithms has made the design of
pseudorandom generators (PRG) a significant issue. Blum, Micali and
Yao in the early 80s used hardness of a one-way permutation to create
a pseudorandom generator (BMY PRG) that fools any probabilistic
algorithm running in polynomial time. In the early 90s, a seminal work
of Nisan and Wigderson, introduced a new type of PRGs (NW PRG) that
required a weaker complexity assumption. The key idea behind this
construction is to allow the PRG to run in exponential time since its
output is to be used for derandomization. This construction led to a
new wave of analysis and construction of PRGs. In this talk, I will
introduce the Blum-Micali-Yao, Blum-Blum-Shub and the Nisan-Wigderson
PRGs. Time permitting, I'll also mention the application of
extractors - efficient algorithms that given a weak source of
randomness and a truly random seed, output a distribution that is
statistically close to the uniform distribution - to improve such
constructions.
|
| 2010/04/08 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Michael
Cadilhac
Affiliation: University of Montreal
Area:
theoretical computer science
Title: The Homomorphism Preservation Theorem for the finite.
Abstract: Last week, we saw what a preservation theorem is, and why the usual proofs in
classical model theory do not go through in finite model theory. Following
Benjamin Rossman's proof (LICS'05), we saw a new approach on proving the
Homomorphism Preservation Theorem (HPT) in the classical case.
This week, we will see why this proof does not work as-is for the finite HPT,
and look at some new tools (I even foresee some exciting use of EF games!) to
salvage most of the ideas exposed previously. Last week's presentation is
not a requirement in order to understand this one.
|
| 2010/04/07 |
CQIL - Cryptography and Quantum Information |
Place: McConnell 103
Time: 16:00 - 17:00
Speaker:
Todd
Brun
Affiliation: University of Southern California
Title: Quantum Steganography with Noisy Channels
Abstract: Steganography is the process of hiding secret information by embedding it in an "innocent" message. We present protocols for hiding either classical or quantum information in a codeword of a quantum error-correcting code passing through a channel. Using either a shared secret key or shared entanglement, the sender (Alice) disguises her information as errors in the channel. The receiver (Bob) can retrieve the hidden information, but an eavesdropper (Eve) with the power to monitor the channel, but without the secret key, cannot distinguish the message from channel noise. We analyze how difficult it is for Eve to detect the presence of secret messages, and estimate rates of steganographic communication and secret key consumption for certain protocols.
|
| 2010/04/01 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Michael
Cadilhac
Affiliation: University of Montreal
Area:
theoretical computer science
Title: The Homomorphism Preservation Theorem for the finite.
Abstract: The Łoś-Tarski theorem is one of the most natural results in classical model
theory. It states that a first-order formula f is preserved by extensions
iff f is existential, that is, f can be written using only positive
existential quantifiers. A formula f is said to be preserved by extensions
if whenever a structure A (i.e., a universe and a set of relations) is a
model of f and a structure B is an extension of A, B is a model of f. This
theorem is one of the so-called Preservation Theorems, which correlate
semantic and syntactic properties of formulas. Another example is the
Homomorphism Preservation Theorem (HPT), which states that a first-order
formula f is preserved by homomorphisms iff f is existential and positive.
When one is interested in finite model theory, that is, satisfaction for
finite structures, most of the preservation theorems fail due to the fact
that the Compactness Theorem does not hold. In this talk, we will see that
the HPT has the remarkable property of remaining true in the finite. For that
purpose, we will walk through Benjamin Rossman's proof (LICS'2005) of the
finite HPT. If there is interest, I will focus on particular spots of the
proof in a second talk.
|
| 2010/03/26 |
DMO - Discrete Mathematics and Optimization |
Place: McConnell 320
Time: 14:00 - 15:00
Speaker:
Lisa
Fleischer
Affiliation: Dartmouth College
Title: Unrelated Machine Selection and Scheduling
Abstract: We look at problems of scheduling jobs to machines when the processing time of a job
is machine dependent. Common objectives in this framework are to minimize the
maximum load on a machine, or to minimize the average completion time of jobs. These
are well-studied problems. We consider the related problem of trying to select a
subset of machines to use to minimize machine costs subject to bounds on the maximum
load or average completion time of the corresponding schedule. These problems are
NP-hard and include set-cover as a special case. Thus we focus on approximation
algorithms and get tight, or almost tight approximation guarantees.
|
| 2010/03/25 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Pierre
McKenzie
Affiliation: University of Montreal
Area:
Theoretical computer science
Title: Branching programs for the tree evaluation problem
Abstract: I'll recall the relationship between branching programs and Turing machines, then talk about branching program lower
bounds of various types for the problem of evaluating a tree (I'll explain the problem). Joint work with Mark Braverman,
Steve Cook, Rahul Santhanam and Dustin Wehr, presented at MFCS and FSTTCS last year.
|
| 2010/03/22 |
DMO - Discrete Mathematics and Optimization |
Place: Burnside 1205
Time: 16:00 - 17:00
Speaker:
Mohit
Singh
Affiliation: McGill University
Title: Incentives in Secretary Problems and Online Auctions via Linear Programming
Abstract: In the secretary problem, an employer would like to choose the best candidate in an online manner among n competing candidates. The candidates are assumed to arrive in a random order. The optimal mechanism is to interview the first n/e candidates for the purpose of evaluation, and then hire the first candidate that is better than all previous candidates. This simple mechanism suffers from a crucial drawback: candidates arriving early in the process have an incentive to delay their interview. When candidates can influence their position, using this mechanism can lead to sub-optimal results and challenges the basic assumption that the candidates arrive in a random order.
In this talk, I will describe a linear programming framework which characterizes all mechanisms for the secretary problem and its extensions. This characterization helps us obtain optimal incentive-compatible mechanisms for the secretary problem. We will also show an application of this result to an auction scenario where a single item has to be sold to n competing bidders arriving in an online manner.
This is joint work with Niv Buchbinder and Kamal Jain.
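The classical observe-then-hire rule described above is easy to simulate. The sketch below is our own minimal illustration in Python, not material from the talk: it plays the rule (skip the first n/e candidates, then hire the first one better than everyone seen so far) on random arrival orders and estimates its success probability, which famously tends to 1/e.

```python
import math
import random

def secretary_trial(n, rng):
    """One run of the classical rule: observe the first n/e candidates,
    then hire the first later candidate better than everyone seen so far."""
    ranks = list(range(n))       # rank 0 is the best candidate
    rng.shuffle(ranks)           # candidates arrive in random order
    cutoff = int(n / math.e)
    best_seen = min(ranks[:cutoff], default=n)
    for r in ranks[cutoff:]:
        if r < best_seen:
            return r == 0        # hired: success iff truly the best
    return ranks[-1] == 0        # nobody beat the sample: take the last

rng = random.Random(0)
n, trials = 50, 20000
wins = sum(secretary_trial(n, rng) for _ in range(trials))
print(wins / trials)             # empirically close to 1/e ≈ 0.368
```

With n = 50 the empirical success probability settles near 0.37, matching the 1/e guarantee; the point of the talk is that strategic candidates can break the random-order assumption this rule relies on.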
|
| 2010/03/18 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Marc
Kaplan
Affiliation: University of Montreal
Area:
theoretical computer science
Title: Some applications of Grothendieck's inequality to computer science (part II)
Abstract: Grothendieck's inequality is a fundamental result in functional analysis. In
the last ten years it has appeared to be an extremely useful tool in computer
science. Perhaps the most surprising fact about it is that it has been used in
very different areas. On one side, it is a cornerstone of the approximation
algorithm theory and unique game conjecture. On the other side, it is a very
natural tool to approach various problems of quantum information processing. In
a series of two talks, I will present these two aspects.
During the first talk, I will present the inequality and the algorithm of Alon
and Naor for approximating the cut norm. The core of the algorithm will give us
a complete proof of Grothendieck's inequality. During the second talk, we will
focus on quantum information processing. The connection between Grothendieck's
inequality and the quantum world is made clear by Tsirelson's theorem on
quantum probability distributions. The combination of these two results will
allow us to give new insight into Linial and Shraibman's work on algebraic
lower bounds on quantum communication complexity.
Biography of Speaker:
-
Marc Kaplan is currently a postdoc at the University of Montreal, hosted by Prof. Gilles Brassard. He obtained his PhD at LRI, Orsay, under the supervision of Prof. Sophie Laplante.
|
| 2010/03/15 |
DMO - Discrete Mathematics and Optimization |
Place: Burnside 1205
Time: 16:30 - 17:30
Speaker:
Xavier
Dahan
Affiliation: Kyushu University
Title: Ramanujan graphs of larger girth
Abstract: Ramanujan graphs are optimal (spectral) expanders. We will first
introduce the concept of expansion in a regular graph, then define
expanders, which have dozens of applications in Computer and
Information science [2]. We will review the construction of [1], i.e.
Cayley graphs on discrete quaternionic groups, then show what is
changing when using octonions instead. We obtain families of regular
graphs with a larger girth, and hence with the largest girth known.
(joint work with J.-P. Tillich)
[1] Lubotzky, Phillips and Sarnak. Ramanujan graphs. Combinatorica, 1988.
[2] Hoory, Linial and Wigderson. Expander graphs and their
applications, Bull of the AMS, 2006.
|
| 2010/03/12 |
Algorithms |
Place: Trottier 2100
Time: 10:30 - 11:30
Speaker:
Bill
Cook
Affiliation: Georgia Tech
Title: In Pursuit of the Salesman: Mathematics at the Limits of Computation
Abstract: The traveling salesman problem, or TSP for short, is easy to state: given a number of cities along with the cost of travel between each pair of them, find the cheapest way to visit them all and return to your starting point. Easy to state, but difficult to solve! Despite decades of research by top applied mathematicians around the world, in general it is not known how to significantly improve upon simple brute-force checking. It is a real possibility that there may never exist an efficient method that is guaranteed to solve every instance of the problem. This is a deep mathematical question: Is there an efficient solution method or not? The topic goes to the core of complexity theory concerning the limits of feasible computation. For the stout-hearted who would like to tackle the general version of the TSP, the Clay Mathematics Institute will hand over a $1,000,000 prize to anyone who can either produce an efficient general method or prove an impossibility result.
The complexity question that is the subject of the Clay Prize is the Holy Grail of traveling-salesman-problem research and we may be far from seeing its resolution. This is not to say that mathematicians have thus far come away empty-handed. Within the theoretical community the problem has led to a large number of results and conjectures that are both beautiful and deep. In the arena of exact computation, an 85,900-city challenge problem was solved in 2006, in a computation that took the equivalent of 136 years on top-of-the-line computer workstations. On the practical side, solution methods are used to compute optimal or near-optimal tours for a host of applied problems on a daily basis, from genome sequencing to arranging music on iPods.
In this talk we discuss the history, applications, and computation of the TSP, together with an open research area connecting planar graphs and the linear-programming approach to the problem.
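As a toy illustration of the "brute-force checking" mentioned above (our own sketch, not material from the talk), the following enumerates every tour of a small hypothetical instance. With city 0 fixed as the start there are (n-1)! orderings, exactly the exponential search that no known method significantly improves on in general.

```python
from itertools import permutations

def tour_cost(cost, tour):
    """Cost of visiting the cities in order and returning to the start."""
    return sum(cost[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

def brute_force_tsp(cost):
    """Fix city 0 as the start and try all (n-1)! orderings of the rest."""
    n = len(cost)
    best = min(permutations(range(1, n)),
               key=lambda p: tour_cost(cost, (0,) + p))
    return (0,) + best, tour_cost(cost, (0,) + best)

# A hypothetical 4-city symmetric instance
cost = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]
tour, c = brute_force_tsp(cost)
print(tour, c)   # (0, 1, 3, 2) with cost 23
```

Four cities mean only 3! = 6 tours; at 85,900 cities this enumeration is hopeless, which is why the record computations mentioned above rely on the linear-programming machinery instead.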
|
| 2010/03/11 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Marc
Kaplan
Affiliation: University of Montréal
Area:
Theoretical computer science
Title: Some applications of Grothendieck's inequality to computer science
Abstract: Grothendieck's inequality is a fundamental result in functional analysis. In the last ten years it has appeared to be an
extremely useful tool in computer science. Perhaps the most surprising fact about it is that it has been used in very
different areas. On one side, it is a cornerstone of the approximation algorithm theory and unique game conjecture. On the
other side, it is a very natural tool to approach various problems of quantum information processing. In a series of two
talks, I will present these two aspects.
During the first talk, I will present the inequality and the algorithm of Alon and Naor for approximating the cut norm.
The core of the algorithm will give us a complete proof of Grothendieck's inequality. During the second talk, we will
focus on quantum information processing. The connection between Grothendieck's inequality and the quantum world is made
clear by Tsirelson's theorem on quantum probability distributions. The combination of these two results will allow us to
give new insight into Linial and Shraibman's work on algebraic lower bounds on quantum communication complexity.
|
| 2010/03/08 |
DMO - Discrete Mathematics and Optimization |
Place: Burnside 1205
Time: 16:00 - 17:00
Speaker:
Guyslain
Naves
Affiliation: McGill University
Title: Integer Multicommodity Flows in Planar Graphs
Abstract: A commodity in a capacitated graph G is a pair of vertices (source, sink) and a demand value associated to this pair. Given a set of commodities H, a multicommodity flow is a set of flows, one for each commodity, such that the sum of the flows on each edge e is at most the capacity of e. We want to decide if there is a multicommodity flow such that the flow for each commodity is equal to its demand. When there is only one commodity, this is the decision version of the classic flow problem. If we force the flows to be integral, it defines the integer multicommodity flow problem. When the capacities are uniformly equal to one, this is the edge-disjoint paths problem. I will survey the main results in the field of integer multicommodity flows, and then introduce two new results when G is a planar graph:
- the problem is NP-complete with only two commodities (in undirected and directed acyclic cases),
- the uncapacitated problem is polynomial when G is a planar acyclic digraph, G+H is Eulerian, and the number of commodities is fixed.
|
| 2010/03/01 |
DMO - Discrete Mathematics and Optimization |
Place: Burnside 920
Time: 16:30 - 17:30
Speaker:
Andrei
Krokhin
Affiliation: Durham, UK
Title: On the hardness of losing weight
Abstract:
|
| 2010/02/22 |
DMO - Discrete Mathematics and Optimization |
Place: Burnside 1205
Time: 16:00 - 17:00
Speaker:
Alex
Slivkins
Affiliation: Microsoft Research
Title: Multi-Armed Bandits in Metric Spaces
Abstract: In a multi-armed bandit problem, an online algorithm chooses from a fixed set of
alternatives (a.k.a. "strategies" or "arms") in a sequence of trials so as to
maximize the total payoff of the chosen strategies. While the performance of bandit
algorithms with a small finite strategy set is quite well understood, bandit
problems with large strategy sets are still a topic of very active investigation,
motivated by practical applications such as online auctions and web advertisement.
The goal of such research is to identify broad and natural classes of strategy sets
and payoff functions which enable the design of efficient solutions. In this work we
study a very general setting for the multi-armed bandit problem in which the
strategies form a metric space, and the payoff function satisfies a Lipschitz
condition with respect to the metric.
Joint work with Bobby Kleinberg and Eli Upfal.
Based on papers in STOC'08 and SODA'10 and some recent work.
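As a concrete baseline for the setting described above, the sketch below (our own illustration, not the algorithm from these papers) discretizes a Lipschitz bandit on [0,1] into finitely many evenly spaced arms and runs the standard UCB1 index policy on them; the payoff function is a hypothetical example. The algorithms in the papers improve on exactly this kind of fixed discretization by adapting it to the metric.

```python
import math
import random

def lipschitz_bandit(payoff, horizon, n_arms, rng):
    """Discretize [0,1] into n_arms evenly spaced strategies and run the
    standard UCB1 index policy on them; returns the average payoff."""
    arms = [i / (n_arms - 1) for i in range(n_arms)]
    counts = [0] * n_arms
    means = [0.0] * n_arms
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= n_arms:
            i = t - 1            # play every arm once to initialize
        else:                    # then pick the highest upper confidence bound
            i = max(range(n_arms),
                    key=lambda j: means[j] + math.sqrt(2 * math.log(t) / counts[j]))
        r = payoff(arms[i], rng)
        counts[i] += 1
        means[i] += (r - means[i]) / counts[i]
        total += r
    return total / horizon

def payoff(x, rng):
    """A hypothetical 1-Lipschitz payoff: Bernoulli with mean 1 - |x - 0.7|."""
    return 1.0 if rng.random() < 1.0 - abs(x - 0.7) else 0.0

print(lipschitz_bandit(payoff, horizon=5000, n_arms=11, rng=random.Random(1)))
```

The average payoff approaches the optimum at x = 0.7; the cost of the fixed grid is that the best arm may sit between grid points, which is what the adaptive-discretization results address.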
|
| 2010/02/19 |
CQIL - Cryptography and Quantum Information |
Place: McConnell 103
Time: 11:30 - 12:30
Speaker:
Dave
Touchette
Affiliation: McGill University
Title: Trade-off capacities of quantum Hadamard channels
Abstract: An important goal of Quantum Shannon Theory is to determine the maximum rates at which information can be transmitted reliably over noisy quantum channels. However, the computation of these optimal rates is generally intractable, requiring an optimization over an arbitrary number of parallel uses of the channel. In those rare cases where this computation is tractable, the channel can be used to its full potential to perform the task at hand. We show that this is the case for a general class of quantum channels, the Hadamard channels, considering the triple resource capacity region of entanglement-assisted quantum and classical communication. Three natural subclasses are generalized dephasing channels, cloning channels and Unruh channels. Parametrizing the trade-off region for these channels, we find that the optimal coding strategy beats the naive time-sharing one, and we introduce a measure to quantify this improvement.
|
| 2010/02/18 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Phuong
Nguyen
Affiliation: McGill
Area:
theory
Title: Gowers uniformity norm
Abstract: Szemeredi's Theorem states that for any delta between 0 and 1, and any k
greater than 2, there is a number N so that for all n > N, every subset
A of {1,2,...,n} of density delta contains an arithmetic progression
of length k.
The Gowers uniformity norm was introduced in Gowers's proof of Szemeredi's
Theorem, and independently by Alon et al. Essentially, the norm $U^d(f)$ measures
the correlation of a given function $f$ with polynomials of degree at
most $d$. Gowers's proof is inspired by Roth's proof of the theorem for the
case of length-3 arithmetic progressions, and this notion is needed for the
generalization. It has shown up in circuit lower bounds and communication
complexity as well.
In this talk I will define the norm in the context of an outline of Gowers's
proof. If time allows, I will mention some applications.
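To make the definition concrete, here is a small numerical sketch of our own (not material from the talk) computing the fourth power of the $U^2$ norm of a $\pm 1$-valued function on Z_n, i.e. the average of $f(x)f(x+a)f(x+b)f(x+a+b)$: a degree-1 phase attains the maximal value 1, while a random sign pattern has small norm.

```python
import random

def u2_norm_4th(f, n):
    """Fourth power of the Gowers U^2 norm of a +/-1-valued f on Z_n:
    the average of f(x) f(x+a) f(x+b) f(x+a+b) over all x, a, b."""
    total = sum(f[x] * f[(x + a) % n] * f[(x + b) % n] * f[(x + a + b) % n]
                for x in range(n) for a in range(n) for b in range(n))
    return total / n ** 3

n = 32
linear = [(-1) ** x for x in range(n)]            # a degree-1 phase
rng = random.Random(0)
noise = [rng.choice((-1, 1)) for _ in range(n)]   # a random sign pattern

print(u2_norm_4th(linear, n))   # 1.0: maximal, f is a linear phase
print(u2_norm_4th(noise, n))    # small: no linear structure to detect
```

This is the sense in which the norm "measures correlation with low-degree structure": high $U^2$ norm forces correlation with a linear phase, which is the engine of the density-increment argument in the proof.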
Biography of Speaker:
-
Postdoc
|
| 2010/02/17 |
CQIL - Cryptography and Quantum Information |
Place: McConnell 321
Time: 11:30 - 12:30
Speaker:
Robert
Spekkens
Affiliation: Perimeter Institute for Theoretical Physics
Title: Why the quantum? Insights from classical theories with a statistical restriction
Abstract: A significant part of quantum theory can be obtained by
postulating a single conceptual innovation relative to classical
theories, namely, that agents face a fundamental restriction on what
they can know about the physical state of any system. This talk will
consider a particular sort of statistical restriction wherein only
classical variables with vanishing Poisson bracket can be known
simultaneously. When this principle is applied to a classical
statistical theory of three-level systems (trits), the resulting theory
is found to be operationally equivalent to the stabilizer formalism for
qutrits. Applied to a classical theory of harmonic oscillators, it
yields quantum mechanics restricted to quadrature eigenstates and
observables. Finally, applied to a classical statistical theory of
bits, it yields a theory that is almost equivalent to (but interestingly
different from) the stabilizer formalism for qubits. I will discuss the
significance of these results for the project of deriving the formalism
of quantum theory from physical principles. Joint work with Olaf
Schreiber.
|
| 2010/02/11 |
Theory |
Place: MC320
Time: 16:00 - 17:30
Speaker:
Anil
Ada
Affiliation: McGill, SOCS
Area:
Theoretical computer science, Ramsey theory
Title: Roth's Theorem
Abstract: In extremal combinatorics, one is interested in how large or small an
object satisfying certain properties can be. The objects are usually
graphs/hypergraphs or set systems, and the properties of interest may be
cuts, sizes of intersections etc. In additive combinatorics, one studies
integers or more generally abelian groups, and the properties of interest
can be phrased in terms of linear equations.
In recent years, additive combinatorics has been a very productive
area of mathematics, bringing together techniques from graph theory,
analysis and ergodic theory. A major success story in this area of
research has been the Green-Tao theorem that shows that prime numbers
contain arbitrarily long arithmetic progressions. Techniques used in
additive combinatorics have also found applications in theoretical
computer science such as property testing, PCP constructions, lower bounds
and extractor constructions.
The famous Szemeredi Theorem is one of the fundamental results of this
area. It states that:
Given any delta and k, we can find an N such that every subset A of
{1,2,...,N} of size at least delta*N contains an arithmetic progression of
length k.
In our talk, we are going to present Roth's proof for the k=3 case. This
proof is considered one of the gems of analytic number theory. The
proof uses basic Fourier analysis; we will not assume any prior
knowledge and will provide all relevant definitions.
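The k=3 case can be explored by brute force on small instances. The sketch below is our own illustration, not part of the talk: it detects 3-term arithmetic progressions directly, exhibits a progression-free set of size 5 in {1,...,9}, and verifies that every 6-element subset of {1,...,9} contains a progression.

```python
from itertools import combinations

def has_3ap(s):
    """True iff the integer set s contains a 3-term arithmetic progression:
    some pair a < c whose integer midpoint (a + c) / 2 also lies in s."""
    s = sorted(set(s))
    members = set(s)
    return any((a + c) % 2 == 0 and (a + c) // 2 in members
               for a, c in combinations(s, 2))

print(has_3ap({1, 2, 4, 8, 9}))   # False: progression-free, size 5
print(all(has_3ap(c) for c in combinations(range(1, 10), 6)))   # True
```

So for N = 9 a density of 5/9 is not yet enough, but 6/9 is; Roth's theorem says that for any fixed positive density delta, such a threshold N always exists.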
Biography of Speaker:
-
PhD student.
|
| 2010/02/10 |
CQIL - Cryptography and Quantum Information |
Place: McConnell 437
Time: 11:30 - 12:30
Speaker:
David
Poulin
Affiliation: University of Sherbrooke
Title: Quantum Metropolis algorithm
Abstract: Quantum computers have emerged as the natural architecture to study the physics of strongly correlated quantum systems, thus providing a major new impetus to the field of many-body quantum physics. While the method of choice for simulating classical many-body systems has long since been the ubiquitous Monte Carlo method, the formulation of a generalization of this method to the quantum regime has been impeded by the fundamental peculiarities of quantum mechanics, including interference effects and the no-cloning theorem. In this work, we overcome those difficulties by constructing an efficient quantum algorithm to sample from the Gibbs distribution of an arbitrary quantum Hamiltonian at arbitrary temperatures, both for bosonic and fermionic systems. This validates the quantum computer as a full quantum simulator, with a wealth of possible applications to quantum chemistry, condensed matter physics and high energy physics.
|
| 2010/02/08 |
DMO - Discrete Mathematics and Optimization |
Place: Burnside 920
Time: 16:30 - 17:30
Speaker:
Amina
Lamghari
Affiliation: McGill University
Title: Models and Methods for the Judge Assignment Problem
Abstract: The judge assignment problem is a practical combinatorial optimization problem. It consists of determining which judges to assign to each individual competition of a tournament so as to satisfy, as far as possible, the objectives set by the organizers. We illustrate the judge assignment problem in the particular case of the John Molson International case competition that takes place every year at Concordia University. We first consider the problem for a single round, because the solution methods developed there are then used to solve the whole problem over several rounds. The constraints taken into account, their nature, their complexity, and the horizon of the assignment all distinguish these two problems. We propose mathematical models to formulate them, and metaheuristic-based methods to solve them. Computational experiments indicate the efficiency of the proposed approach. One of the developed methods has been embedded into user-friendly software, which has enabled substantial savings in the organizers' time and consistently better assignments.
|
| 2010/02/02 |
CIM - Centre for Intelligent Machines |
Place: McDonald-Harrington Bldg., Room G-10
Time: 11:30 - 12:30
Speaker:
Robert
D. Howe
Affiliation: Harvard University
Title: Fixing the Beating Heart: Ultrasound Guidance for Robotic Intracardiac Surgery
Abstract: To treat defects within the heart, surgeons currently use stopped-heart techniques. These procedures are highly invasive and incur a significant risk of neurological impairment. We are developing methods for performing surgery within the heart while it is beating. New real-time 3-D ultrasound imaging allows visualization through the opaque blood pool, but this imaging modality poses difficult image processing challenges, including poor resolution, acoustic artifacts, and data rates of 30 to 40 million voxels per second. To track instruments within the heart we have developed a Radon transform-based algorithm, which is readily implemented in real time on graphics processor units. For manipulation of rapidly moving cardiac tissue we have created a fast robotic device that can track the tissue based on ultrasound image features. This allows the surgeon to interact with the heart as if it were stationary. To integrate ultrasound imaging with the robotic device we have developed a predictive controller that compensates for the 50-100 ms imaging and image processing delays to ensure good tracking performance. In vivo studies show that this approach enhances dexterity and lowers applied forces. Clinical applications of this technology include atrial septal defect closure and mitral valve annuloplasty.
|
| 2010/01/11 |
Algorithms |
Place: Burnside 920
Time: 16:30 - 17:30
Speaker:
Vasek
Chvatal
Affiliation: Concordia University
Area:
Discrete Mathematics
Title: Betweenness
Abstract: A point in the plane is said to lie between points A and C if it is the interior point of the line segment joining A and C. In his development of geometry, Euclid neglected to give the notion of betweenness the same axiomatic treatment that he gave, for instance, to the notion of equality. This omission was rectified twenty-two centuries later by Moritz Pasch:
http://www-groups.dcs.st-and.ac.uk/~history/Biographies/Pasch.html
During the twentieth century, geometric betweenness was generalized in diverse branches of mathematics to ternary relations of metric betweenness, lattice betweenness, and algebraic betweenness. I will talk about three settings where such abstract betweennesses show up.
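For the planar case defined above, betweenness is easy to state computationally. The following sketch is our own illustration, not material from the talk: it tests whether a point with integer coordinates lies strictly between two others, using exact integer arithmetic (so it doubles as the metric betweenness |ab| + |bc| = |ac| with b interior).

```python
def between(a, b, c):
    """True iff b lies strictly between a and c in the plane: b is
    collinear with a and c and an interior point of the segment a-c."""
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    # collinearity via the cross product, in exact integer arithmetic
    if (bx - ax) * (cy - ay) - (by - ay) * (cx - ax) != 0:
        return False
    if b == a or b == c:
        return False
    # interior point: the vectors a->b and b->c point the same way
    return (bx - ax) * (cx - bx) + (by - ay) * (cy - by) > 0

print(between((0, 0), (1, 1), (2, 2)))   # True
print(between((0, 0), (2, 2), (1, 1)))   # False: (2,2) lies beyond (1,1)
print(between((0, 0), (1, 2), (2, 2)))   # False: not collinear
```

The abstract betweennesses discussed in the talk axiomatize properties of exactly this ternary relation without reference to coordinates.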
The first of these settings is ordered geometry; there, primitive notions of points and lines are linked by the relation of incidence and by axioms of betweenness; two classic theorems here are the Sylvester-Gallai theorem
http://mathworld.wolfram.com/SylvestersLineProblem.html
and the de Bruijn-Erdos theorem. I conjectured in 1998
http://users.encs.concordia.ca/~chvatal/newsg.pdf
and Xiaomin Chen proved in 2003
http://dimacs.rutgers.edu/TechnicalReports/abstracts/2003/2003-32.html
that the Sylvester-Gallai theorem generalizes to metric spaces when lines in these spaces are defined right; together, we conjectured
http://arxiv.org/abs/math.CO/0610036
that the de Bruijn-Erdos theorem also generalizes to metric spaces when lines in these spaces are defined right (with "right" having a different sense in each of the two instances); the two of us and Ehsan Chiniforooshan have partial results on this conjecture.
The second of the three settings is abstract convexity; there, families of sets called "convex" obey certain axioms. Such finite structures are called convex geometries when they have the Minkowski-Krein-Milman property: every set is the convex hull of its extreme points. Two classical examples of convex geometries come from shelling of partially ordered sets and simplicial shelling of triangulated graphs. Last June I characterized, by a five-point condition, a class of betweennesses generating a class of convex geometries that subsumes the two examples.
http://users.encs.concordia.ca/~chvatal/abc.pdf
Laurent Beaudou, Ehsan Chiniforooshan, and I have additional results on such betweennesses.
The last setting lies between physics and philosophy: in his effort to develop a causal theory of time, Hans Reichenbach
http://en.wikipedia.org/wiki/Hans_Reichenbach
introduced the notion of causal betweenness, which is a ternary relation defined on events in probability spaces. This January, Baoyindureng Wu and I characterized, by easily verifiable properties, abstract ternary relations isomorphic to Reichenbach's causal betweenness.
http://arxiv.org/abs/0902.1763
A nice connection with a 1979 theorem of Jarda Opatrny
http://dx.doi.org/10.1137/0208008
appears here.
The joint work with Laurent Beaudou, Ehsan Chiniforooshan, and Baoyindureng Wu was done in our research group ConCoCO
http://users.encs.concordia.ca/~concoco/
(Concordia Computational Combinatorial Optimization).
|