Prof. Dino Sejdinovic
Assoc. Prof. Dino Sejdinovic is broadly interested in the statistical foundations underpinning large-scale machine learning algorithms, with a particular emphasis on nonparametric and kernel methods and the expressive models they yield. His recent research has focused on methods for discovering higher-order interactions in datasets (where weak individual causes combine nonlinearly to produce a strong effect), as well as on adaptive simulation suited to target distributions with nonlinear dependencies. Further interests include trade-offs between the statistical, computational, and communication efficiency of learning algorithms, as well as applications in communications engineering and signal processing.
Selected Publications
J. Mitrovic, D. Sejdinovic, and Y. W. Teh, DR-ABC: Approximate Bayesian Computation with Kernel-Based Distribution Regression, in International Conference on Machine Learning (ICML), 2016.
S. Flaxman, D. Sejdinovic, J. P. Cunningham, and S. Filippi, Bayesian Learning of Kernel Embeddings, in Uncertainty in Artificial Intelligence (UAI), 2016.
H. Strathmann, D. Sejdinovic, S. Livingstone, Z. Szabo, and A. Gretton, Gradient-free Hamiltonian Monte Carlo with Efficient Kernel Exponential Families, in Advances in Neural Information Processing Systems (NIPS), vol. 28, 2015, pp. 955–963.
O. Johnson, D. Sejdinovic, J. Cruise, R. Piechocki, and A. Ganesh, Non-Parametric Change-Point Estimation using String Matching Algorithms, Methodology and Computing in Applied Probability, vol. 16, no. 4, pp. 987–1008, 2014.
D. Sejdinovic, B. Sriperumbudur, A. Gretton, and K. Fukumizu, Equivalence of Distance-Based and RKHS-Based Statistics in Hypothesis Testing, The Annals of Statistics, vol. 41, no. 5, pp. 2263–2291, Oct. 2013.
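As a concrete illustration of the RKHS-based statistics studied in the last reference above, the following minimal sketch computes a biased estimate of the squared maximum mean discrepancy (MMD) between two samples using a Gaussian kernel. All function names, the bandwidth choice, and the synthetic data are illustrative assumptions, not code from any of the listed papers.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and the rows of Y.
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    # Biased estimate of squared MMD: mean(Kxx) + mean(Kyy) - 2*mean(Kxy).
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))   # sample from P
Y = rng.normal(0.5, 1.0, size=(200, 2))   # sample from a mean-shifted Q
Z = rng.normal(0.0, 1.0, size=(200, 2))   # a second sample from P

# The statistic is larger when the underlying distributions differ.
print(mmd2_biased(X, Y))
print(mmd2_biased(X, Z))
```

A large value of the statistic is evidence against the null hypothesis that the two samples come from the same distribution; in practice the null distribution is calibrated by permutation.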