The Neural Information Processing Systems (NIPS) conference is a place where computational neuroscience meets machine learning. Due to the rise of deep learning (DL) in the…

This is a cover story for the research paper by Anna Leontjeva and Ilya Kuzovkin, "Combining Static and Dynamic Features for Multivariate Sequence Classification", 2016 IEEE International…

So you’ve decided to throw a Deep Neural Network at your data. Great! You try a basic setup and it kinda works, but you are…

Day 1 Tutorial: Causal Inference for Observational Studies The topic of causal inference was strongly presented this year. The reason for that was, however, unclear…

Mastering the game of Go with deep neural networks and tree search (article overview), from Ilya Kuzovkin. A talk given at the Reinforcement Learning Seminar…

Paper overview: "Deep Residual Learning for Image Recognition", from Ilya Kuzovkin. A talk given at the Computational Neuroscience Seminar @ University of Tartu. We discuss…

Deep Learning: Theory, History, State of the Art & Practical Tools, from Ilya Kuzovkin. An introductory talk about deep learning given at Machine Learning Estonia…

Article overview: Unsupervised Learning of Visual Structure Using Predictive Generative Networks, from Ilya Kuzovkin. This set of slides goes over the recent article that tries…

Article overview: Deep Neural Networks Reveal a Gradient in the Complexity of Neural Representations across the Ventral Stream, from Ilya Kuzovkin. The article presents the…

Caffe is a framework for deep learning. In a deep learning net it is quite hard to find good parameters (learning rate, dropout, size of…
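One common way to search for such parameters is plain random search over the hyperparameter space. As a minimal, framework-agnostic sketch (the search-space names and the `evaluate` callback below are illustrative assumptions, not Caffe's actual solver keys):

```python
import random

# Hypothetical search space; each entry draws one random value.
# These names are illustrative, not tied to any specific framework.
space = {
    "learning_rate": lambda: 10 ** random.uniform(-5, -1),  # log-uniform
    "dropout": lambda: random.uniform(0.2, 0.8),
    "hidden_units": lambda: random.choice([64, 128, 256, 512]),
}

def sample_config(space):
    """Draw one random configuration from the search space."""
    return {name: draw() for name, draw in space.items()}

def random_search(evaluate, space, n_trials=20):
    """Try n_trials random configurations; return the best (score, config).

    `evaluate` is assumed to train a net with the given config and
    return a score to maximize, e.g. validation accuracy.
    """
    best = None
    for _ in range(n_trials):
        cfg = sample_config(space)
        score = evaluate(cfg)
        if best is None or score > best[0]:
            best = (score, cfg)
    return best
```

In practice one would replace `evaluate` with a function that writes the config into a solver file, trains the net, and reads back the validation score.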

While reading “An Introduction to the Conjugate Gradient Method Without the Agonizing Pain” I decided to boost my understanding by repeating the story told there in…
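The method the paper builds up to can be sketched in a few lines. This is a minimal textbook implementation of conjugate gradients for solving Ax = b with a symmetric positive-definite A, written by me for illustration rather than taken from the post:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x            # residual
    d = r.copy()             # initial search direction = steepest descent
    rs_old = r @ r
    for _ in range(max_iter):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)        # exact step size along d
        x = x + alpha * d
        r = r - alpha * Ad
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d    # next A-conjugate direction
        rs_old = rs_new
    return x

# Tiny example: a 2x2 SPD system converges in at most 2 iterations.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

For an n-dimensional SPD system, exact arithmetic gives convergence in at most n iterations, which is the punchline of the paper's derivation.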

The year 2014 passed under the sign of Deep Learning. At the last computational neuroscience seminar I presented a very recent article, which looks into…