Rohan Shiloh Shah
EMAIL: rohan dot shiloh dot shah at gmail dot com
WINTER 2007: MSc Thesis Support Vector Machines for Classification, Regression and Density Estimation: [PDF]
Abstract: In the last decade, Support Vector Machines (SVMs) have emerged as an important learning technique for solving classification and regression problems in various fields, most notably in computational biology, finance, and text categorization. This is due in part to built-in mechanisms that ensure good generalization and hence accurate prediction, the use of kernel functions to model non-linear distributions, the ability to train relatively quickly on large data sets using novel mathematical optimization techniques, and, most significantly, the possibility of theoretical analysis using computational learning theory. In this thesis, we discuss the theoretical basis of and computational approaches to Support Vector Machines.
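The abstract mentions kernel functions as the mechanism that lets an SVM model non-linear distributions. A minimal sketch of that idea, using a Gaussian (RBF) kernel and an SVM-style decision function; the support vectors, multipliers, and bias below are made up for illustration, not taken from the thesis:

```python
import math

def rbf_kernel(x, z, gamma=0.5):
    """Gaussian (RBF) kernel: k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

def decision_function(x, support_vectors, alphas, labels, b):
    """Kernel SVM decision value: f(x) = sum_i alpha_i * y_i * k(x_i, x) + b.
    The sign of f(x) gives the predicted class."""
    return sum(a * y * rbf_kernel(sv, x)
               for a, y, sv in zip(alphas, labels, support_vectors)) + b

# Hypothetical support vectors and multipliers (for illustration only):
svs = [(0.0, 0.0), (2.0, 2.0)]
alphas = [1.0, 1.0]
labels = [+1, -1]
b = 0.0

print(decision_function((0.1, 0.1), svs, alphas, labels, b))
```

Points near the first support vector get a positive decision value and points near the second get a negative one, without ever mapping the inputs to feature space explicitly.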
FALL 2006: Pattern Recognition Kernel Probability Density Estimation: [PDF] [Applet] [HTML]
WINTER 2006: Mining Biological Sequences Presentation on Statistical Methods For Finding Transcription Factor Binding Sites: [PDF Slides]
FALL 2005: Discrete Optimisation Presentation on Quadratic Programming in Support Vector Machines: [PDF Slides]
WINTER 2005: Numerical Estimation Least Squares Support Vector Machines: [PDF]
WINTER 2005: Advanced Probability Theory 2 Approximating Martingales in Continuous and Discrete Time Markov Processes: [PDF]
SUMMER 2003: Comparing Sarsa and MCESP for Partially Observable Markov Decision Processes Preliminary Results: [Postscript] [PDF]
WINTER 2003: Function Approximation in Partially Observable Markov Decision Processes Report: [Postscript] [PDF]
SUMMER 2002: Honours Project - Bayesian Networks. I implemented exact inference using Variable Elimination and several approximate inference methods, including Rejection Sampling, Likelihood Weighting (a form of Importance Sampling), and Gibbs Sampling. Please see the provided documentation for details.
Documentation:
Code:
Archive of everything (TAR):
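The project entry above lists Likelihood Weighting among the implemented approximate inference methods. A minimal sketch of that technique on a hypothetical two-node network (Rain -> WetGrass), not taken from the project code: non-evidence variables are sampled from their priors, and each sample is weighted by the likelihood of the evidence given its parents.

```python
import random

# Hypothetical two-node network for illustration: Rain -> WetGrass.
P_RAIN = 0.2
P_WET_GIVEN = {True: 0.9, False: 0.1}  # P(WetGrass=true | Rain)

def likelihood_weighting(evidence_wet, n=20000, seed=0):
    """Estimate P(Rain=true | WetGrass=evidence_wet) by likelihood weighting:
    sample Rain from its prior, then weight the sample by the probability
    of the observed WetGrass value under that sample."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        rain = rng.random() < P_RAIN              # sample non-evidence variable
        p_wet = P_WET_GIVEN[rain]
        w = p_wet if evidence_wet else 1 - p_wet  # evidence likelihood weight
        den += w
        if rain:
            num += w
    return num / den

print(likelihood_weighting(True))
```

For this network the exact posterior is P(Rain | WetGrass) = (0.2 * 0.9) / (0.2 * 0.9 + 0.8 * 0.1) = 0.18 / 0.26, roughly 0.692, so the weighted estimate should land close to that value.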