Efficient and playable game implementations / Modelling and analyzing different facets of game structure
Designing and modelling an agent comparable in size and complexity to a commercial AI using the formalism of layered statecharts. As a reference point, we examined the behaviour tree of Halo and reconstructed it using the new approach.
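A layered statechart organizes an agent's behaviour into nested states, with outer layers acting as defaults for inner ones. The minimal Python sketch below illustrates the idea; the state and event names are our own illustrative choices, not taken from the Halo behaviour tree or the project's actual formalism.

```python
# Minimal sketch of a hierarchical (layered) state machine for a game
# agent. State names and events are illustrative only.

class State:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent          # enclosing superstate, or None
        self.transitions = {}         # event -> target State

    def on(self, event, target):
        self.transitions[event] = target
        return self

class Statechart:
    def __init__(self, initial):
        self.current = initial

    def dispatch(self, event):
        # Walk outward through enclosing layers until some state handles
        # the event; outer layers act as defaults for inner ones.
        state = self.current
        while state is not None:
            if event in state.transitions:
                self.current = state.transitions[event]
                return self.current.name
            state = state.parent
        return self.current.name      # event ignored

# Two layers: a "combat" superstate with "attack"/"retreat" substates.
combat = State("combat")
attack = State("attack", parent=combat)
retreat = State("retreat", parent=combat)
patrol = State("patrol")

attack.on("low_health", retreat)
combat.on("enemy_lost", patrol)       # inherited by attack and retreat
patrol.on("enemy_seen", attack)

agent = Statechart(patrol)
print(agent.dispatch("enemy_seen"))   # attack
print(agent.dispatch("low_health"))   # retreat
print(agent.dispatch("enemy_lost"))   # patrol (handled by the combat layer)
```

Note how "enemy_lost" is defined once on the combat layer yet applies in both substates; this reuse across layers is one reason statecharts scale better than flat state machines.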
Aims to model computer narratives and develop analysis techniques for detecting narrative flaws and other narrative properties, currently built around an investigation of Interactive Fiction as a relatively pure source of game narratives.
Using a World of Warcraft client-side plugin to record data about players' progress through a cooperative scenario, we analyze the data to quantify the level of difficulty, in order to design scalable and adaptable scenarios that challenge players.
Organizations, governments, and companies are interested in understanding the people in groups they’re interacting with. We're developing new techniques for inferring the demographics of physical populations using data from online networks like Twitter and Facebook.
Social groups grow, decay, and dissipate. We're interested in observing and measuring these processes using computational and mathematical models. We hope that such knowledge aids in building more vibrant, constructive societies.
Despite the complex biochemical processes at work in cells, you can mess with a cell's DNA, douse it in toxins, and expose it to extreme temperatures and it will survive. We're working to uncover aspects of cellular design that evolution has used to make living systems so resilient.
Developing a parallel environment for gate-level simulation that incorporates our research on synchronization and load-balancing algorithms, focusing on multi-core architectures, GPUs, and embedded circuit simulation.
Studying the motion of particles under the influence of their mutual gravitational attraction, making use of direct methods for the solution of the equations of motion and parallel discrete-event simulation instead of continuous simulation.
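The direct method mentioned above amounts to an O(N²) pairwise force sum combined with a time-reversible integrator. The sketch below is a minimal illustration of that idea, not the project's code; the units, softening parameter, and two-body setup are our own choices.

```python
# Minimal sketch of the direct (pairwise) method for gravitational N-body
# simulation: O(N^2) force summation plus a kick-drift-kick leapfrog step.

import numpy as np

G = 1.0  # gravitational constant in simulation units (illustrative)

def accelerations(pos, mass, eps=1e-3):
    """Pairwise gravitational accelerations with a small softening eps."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = pos[j] - pos[i]
            d = np.sqrt(r @ r + eps**2)
            acc[i] += G * mass[j] * r / d**3
    return acc

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog step (time-reversible)."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel

# Two equal masses on a circular orbit about their common centre.
pos = np.array([[-0.5, 0.0], [0.5, 0.0]])
vel = np.array([[0.0, -0.7071], [0.0, 0.7071]])
mass = np.array([1.0, 1.0])
for _ in range(1000):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```

Because leapfrog is symplectic and time-reversible, the orbit's separation stays close to 1.0 over many periods; reversibility is also what makes such integrators attractive for the reverse-computation techniques used in parallel discrete-event simulation.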
Integrating continuous and discrete-event paradigms to provide a new and powerful approach to continuous-simulation problems, such as those in astrophysics and weather modelling, using reverse computation as a key component.
Large-scale distributed data management / Cloud computing / Data consistency
Managing consistency for transactional applications when data is distributed across many components is challenging. ConsAD detects and quantifies anomalies as they occur, and we build algorithms that maintain the desired level of consistency across all components.
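One concrete anomaly such a detector can quantify is the stale read: a read that returns a version older than the newest version committed before the read began. The sketch below illustrates the idea on a logged trace; it is our own illustration, not ConsAD's actual detection algorithm.

```python
# Hedged sketch of stale-read detection over a trace of committed writes
# and observed reads, per key. Counts anomalies and their version lag.

from dataclasses import dataclass

@dataclass
class Write:
    key: str
    version: int
    commit_time: float

@dataclass
class Read:
    key: str
    version: int       # version the read actually returned
    start_time: float

def stale_reads(writes, reads):
    """Return (anomaly_count, total_version_lag) over the trace."""
    count, lag = 0, 0
    for r in reads:
        latest = max(
            (w.version for w in writes
             if w.key == r.key and w.commit_time <= r.start_time),
            default=r.version,
        )
        if r.version < latest:
            count += 1
            lag += latest - r.version     # how many versions behind
    return count, lag

writes = [Write("x", 1, 0.0), Write("x", 2, 1.0), Write("x", 3, 2.0)]
reads = [Read("x", 2, 2.5), Read("x", 3, 3.0)]
print(stale_reads(writes, reads))  # (1, 1): one read lagged one version
```

Quantifying the lag, rather than just flagging the violation, is what lets a system trade consistency against performance at a controlled, measurable level.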
The goal of our research is to understand the challenges of distributing a game engine across a distributed cluster or a peer-to-peer network and to find solutions for update dissemination, load-balancing, cheat prevention, movement prediction, and much more.
CumuloNimbo is a novel platform as a service (PaaS) that will provide high scalability (100+ nodes), high availability, and adaptability for traditional transactional workloads without sacrificing data consistency or ease of programming, as is the norm in today's PaaS offerings.
Algorithmic and machine learning approaches to biological sequence analysis / Comparative genomics / Genome evolution
Can we tell what the genome of the first mammal looked like? Yes, with a clever analysis of genomes of extant species. We use inferred ancestral genomes and evolutionary scenarios to map the functional regions of the human genome and predict the impact of mutations.
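A classical building block for this kind of inference is Fitch parsimony: given a species tree and the states observed in extant species, it computes the ancestral state sets that minimize the number of mutations. The sketch below handles one alignment column on a toy tree; real ancestral genome reconstruction also infers gene order and uses richer probabilistic models.

```python
# Minimal sketch of ancestral state inference by Fitch parsimony on a
# binary species tree, one alignment column at a time. The tree and the
# column are illustrative toy data.

def fitch_site(tree, leaf_states):
    """tree: nested tuples of leaf names; returns the set of most
    parsimonious ancestral states at the root for one column."""
    if isinstance(tree, str):                 # leaf
        return {leaf_states[tree]}
    left, right = tree
    a, b = fitch_site(left, leaf_states), fitch_site(right, leaf_states)
    return a & b if a & b else a | b          # intersection, else union

tree = (("human", "mouse"), ("dog", "cow"))
column = {"human": "A", "mouse": "A", "dog": "G", "cow": "A"}
print(fitch_site(tree, column))  # {'A'}: most parsimonious root state
```

Here three of four species carry "A", so the most parsimonious scenario is a single A-to-G mutation on the dog branch, leaving "A" as the inferred ancestral state.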
Protein–protein interactions occur when two or more proteins bind together to carry out their biological function. In collaboration with the Coulombe Lab, we are developing the experimental and computational techniques to map this network and the graph algorithms required to analyze it.
Two metres of DNA fit into the nucleus of each cell. The precise way DNA is packaged is highly organized, dynamic, and has major implications on how genes are expressed. In collaboration with the Dostie Lab, we investigate experimental and computational approaches to studying this organization.
Mobile and pervasive computing / Networked embedded systems / Privacy-preserving computing / Cloud systems
Design challenge for pervasive computing researchers: continuous sensing is useful, but how can we design such systems so that they use sensing when and where it is needed while respecting the privacy and comfort of users and others who may be monitored?
By encrypting your wireless communications and requiring users to authenticate before connecting, you can ensure that unauthorized users do not intrude on your WLAN and that your wireless data cannot be intercepted.
Location-based services (LBS) are software services that use location and time data as control features. LBS have many uses in social networking today, with mobile devices determining their geographical position through the network.
Probabilistic systems / Markov processes / Decision making under uncertainty / Reinforcement learning
The robot senses obstacles, uses programmed maps to localize and navigate through its environment, and helps individuals with mobility impairments achieve greater autonomy; users can communicate with the robot through a speech interface or touchscreen.
Developing theoretical tools for analyzing continuous-state systems, using duality theory, metrics and logic. Algorithms for automatically computing approximations whose behaviour can be guaranteed to be “close” to that of the original system.
Algorithms for learning good representations of time series data and how to control a stochastic, complex environment to maximize a long-term objective. Applications in medical decision making, energy management, e-commerce, music, etc.
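A standard algorithm for maximizing a long-term objective in a stochastic environment is tabular Q-learning. The sketch below runs it on a toy chain world of our own invention (not one of the listed applications) where reward is only received at the far end, so the agent must learn to value delayed outcomes.

```python
# Minimal sketch of tabular Q-learning on a toy chain environment:
# states 0..N-1, reward 1.0 only on reaching the last state.

import random

N = 5                      # chain length (illustrative)
ACTIONS = (1, -1)          # move right or left

def step(s, a):
    s2 = min(max(s + a, 0), N - 1)
    reward = 1.0 if s2 == N - 1 else 0.0
    return s2, reward, s2 == N - 1

random.seed(0)
Q = {(s, a): 0.0 for s in range(N) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.1   # learning rate, discount, exploration

for episode in range(500):
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection
        a = random.choice(ACTIONS) if random.random() < eps \
            else max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r, done = step(s, a)
        # Q-learning update: bootstrap from the best next action
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# The learned policy moves right from every state toward the reward.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N - 1)}
print(policy)  # {0: 1, 1: 1, 2: 1, 3: 1}
```

The discount factor gamma is what encodes the "long-term" part of the objective: rewards several steps away still propagate back through the Q-values to shape early decisions.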
Computer-aided multi-paradigm modelling and simulation / Model compilers & languages / Rapid application development / Reactive systems
A $16.6-million national research network created to tackle the technological challenges arising from the growing complexity of automotive software systems. Together with GM of Canada Ltd and IBM Canada, it mobilizes leading software engineers at seven universities and in Montreal.
Meta-modelling refers to the description, or modelling, of the different kinds of formalisms used to model systems. Model transformation refers to the process of converting, translating, or modifying a model in a given formalism into another model, which may be in the same formalism.
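A small worked example of a model transformation between formalisms: translating a finite state machine into a place/transition Petri net, where each FSM state becomes a place and each FSM transition becomes a Petri-net transition consuming a token from its source place and producing one in its target place. The model encodings below are our own illustrative choices, not the tools used by the project.

```python
# Hedged sketch of an FSM -> Petri net model transformation.
# Each state becomes a place; each transition becomes a Petri-net
# transition with one input arc and one output arc.

def fsm_to_petri(fsm):
    places = list(fsm["states"])
    transitions = []
    for (src, event, dst) in fsm["transitions"]:
        transitions.append({
            "name": event,
            "consume": [src],     # input arc from the source place
            "produce": [dst],     # output arc to the target place
        })
    # The initial marking puts a single token in the initial state's place.
    marking = {p: (1 if p == fsm["initial"] else 0) for p in places}
    return {"places": places, "transitions": transitions, "marking": marking}

fsm = {
    "states": ["idle", "busy"],
    "initial": "idle",
    "transitions": [("idle", "start", "busy"), ("busy", "done", "idle")],
}
net = fsm_to_petri(fsm)
print(net["marking"])  # {'idle': 1, 'busy': 0} -- one token, in the initial place
```

The single-token marking mirrors the FSM invariant that exactly one state is active at a time; richer transformations would also carry over guards, hierarchy, and concurrency, which the Petri-net formalism can express directly.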
Designing secure e-ID card apps to protect citizens’ privacy with research in digital credentials, secure integration of biometrics and cryptology, reliable dispute handling, trusted modules for securing applications, and services and legal aspects of trust in an open network.
Design of secure and trusted networked computing systems / Authentication / Access control / Cloud computing
A framework for combining resources from public clouds and/or resources donated by participants with trusted resources from private clouds to deliver higher levels of service at lower cost for resource-intensive applications such as image and video processing.