Update

I’m an applied researcher at RStudio, where I contribute to the r-tensorflow family of packages (e.g., tfprobability).

I write about doing deep learning from R on the TensorFlow for R blog. If you’re interested in DL/ML/probability and R, perhaps you’d like to check out some of the posts there.

Thanks for stopping by!

Deep learning, concepts and frameworks: Find your way through the jungle (talk)

Today at OOP in Munich, I gave an in-depth talk on deep learning, covering applications and basic concepts as well as practical demos with TensorFlow, Keras and PyTorch.

As usual, the slides are on RPubs, split into two parts because of the many images included – lossy PNG compression worked wonders, but there’s only so much you can expect 😉 – so there’s a part 1 and a part 2.

There’s also the github repository with the demo notebooks.

Thanks to everyone who attended, and thank you for the interesting questions!

Practical Deep Learning (talk)

Yesterday at IT Tage 2017, I gave an introductory-level talk on deep learning.

After giving an overview of concepts and frameworks, I zoomed in on the task of image classification using Keras, TensorFlow and PyTorch – not aiming for high classification accuracy, but rather trying to convey the different “look and feel” of these frameworks.
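To give a flavour of that “look and feel” from the R side, here’s a minimal sketch of a small Keras convnet – not the actual notebook code from the talk; input shape, layer sizes and the three classes are made up for illustration:

```r
# Minimal Keras (R interface) sketch: a small convnet for, say,
# 64x64 RGB images and 3 classes. Sizes are purely illustrative.
library(keras)

model <- keras_model_sequential() %>%
  layer_conv_2d(filters = 32, kernel_size = c(3, 3), activation = "relu",
                input_shape = c(64, 64, 3)) %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_conv_2d(filters = 64, kernel_size = c(3, 3), activation = "relu") %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_flatten() %>%
  layer_dense(units = 64, activation = "relu") %>%
  layer_dense(units = 3, activation = "softmax")

model %>% compile(optimizer = "adam",
                  loss = "categorical_crossentropy",
                  metrics = "accuracy")
```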

(By sheer chance, the use case chosen happened to be about telling apart different types of endurance sports ;-))

Here are the slides, and here are the Jupyter notebooks.

Thanks to everyone who attended & thanks for reading!

I’m a developer, why should I care about matrices or calculus? (talk at MLConference 2017)

Yesterday at ML Conference, which took place this year for the first time, I gave a talk on cool bits of calculus and linear algebra that are useful and fun to know if you’re writing code for deep learning and/or machine learning.

Originally, the title was something like “What every interested ML/DL developer should know about matrices and calculus”, but I didn’t like the schoolmasterly tone that had – what I really wanted to convey was the fun and the fascination of it …
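To give one tiny taste of the kind of thing meant here (my own illustration, not necessarily a slide from the talk): the chain rule is all there is behind backpropagation, and any analytic gradient can be sanity-checked numerically.

```r
# Gradient of f(w) = (sigmoid(w * x) - y)^2 with respect to w,
# derived via the chain rule and verified by finite differences.
sigmoid <- function(z) 1 / (1 + exp(-z))

x <- 1.5; y <- 0.2; w <- 0.8

a        <- sigmoid(w * x)
grad     <- 2 * (a - y) * a * (1 - a) * x   # chain rule, step by step

f        <- function(w) (sigmoid(w * x) - y)^2
eps      <- 1e-6
grad_num <- (f(w + eps) - f(w - eps)) / (2 * eps)

c(analytic = grad, numeric = grad_num)      # should agree closely
```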

So, without further ado, here are the slides and the raw presentation on github.

Thanks for reading!

Plus/minus what? Let’s talk about uncertainty (talk)

Last week at DOAG 2017, I gave two talks: one about deep learning with DL4J (slides here), and one about how to communicate uncertainty – or rather, how to construct prediction intervals for various methods / in various frameworks, ranging from simple linear regression through Bayesian statistics to neural networks.

TL;DR: The most important thing about communicating uncertainty is that you do it at all.
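To illustrate with the simplest case (my own minimal example, not taken from the slides): for linear regression, base R hands you a prediction interval in a single line.

```r
# Prediction interval for a new observation in simple linear regression,
# using the built-in cars dataset (stopping distance vs. speed).
fit <- lm(dist ~ speed, data = cars)

# 95% prediction interval at speed = 21; returns columns fit / lwr / upr
predict(fit, newdata = data.frame(speed = 21), interval = "prediction")
```

The interval for a single new observation is much wider than the confidence interval for the mean – and that difference is exactly the uncertainty worth communicating.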

Want all the formulae? presentation, github

🙂
 

Dynamic forecasts – with Bayesian linear models and neural networks (talk at Predictive Analytics World Berlin)

I really wish I had the time to write an article about the conference, instead of just posting the slides!

Predictive Analytics World was super inspiring, not just technically but also with regard to the broader picture of today’s data science / AI explosion, including its political, sociological and personal implications.

As I really don’t have the time, I’m not even going to try – let me just point you to my talk, which was about time series forecasting using two (as yet) under-used methods: Dynamic Linear Models (think: Kalman filter) and Recurrent Neural Networks (LSTMs, to be precise).
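For the DLM side, here’s roughly what the workflow looks like – a minimal sketch assuming the dlm package and a simple local level model, not the exact code from the talk:

```r
# Local level model (the simplest Kalman filter setup) with the dlm package.
library(dlm)

y <- Nile  # built-in series: annual Nile river flow

# dV = observation variance, dW = state evolution variance.
# The values below are rough guesses; in practice you'd estimate them via dlmMLE().
mod  <- dlmModPoly(order = 1, dV = 15000, dW = 1500)
filt <- dlmFilter(y, mod)

# Dynamic forecasts, 10 steps ahead; point forecasts in $f, variances in $Q
fc <- dlmForecast(filt, nAhead = 10)
fc$f
```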

So, here are the slides, and as usual, here’s the link to the github repo, containing some more example code.

For me, experimentation with time series forecasting seems to form a time series in itself – I’m sure there’s still much to be explored 🙂
Thanks for reading!

Deep Learning with Keras – using R (talk)

This week in Kassel, [R]Kenntnistage 2017 took place, organised by EODA. It was all about Data Science (with R, mostly, as you could guess): Speakers presented interesting applications in industry, manufacturing, ecology, journalism and other fields, including use cases such as predictive maintenance, forecasting and risk analysis.

I had the honour of giving a talk too (thanks guys!), combining two of my favorite topics – deep learning and R. The slides are on RPubs as usual, and the source code (including complete examples) can be found on github.

Last but not least, it’s great to see data science, and R, gaining momentum like that (this is Europe, so I can still write such a sentence ;-))
If you’ll allow me a little advertisement here: if you’re wondering what insight might come out of your data, at Trivadis we’re a still smallish but super motivated team of data scientists and machine learning practitioners, happy to help!

Time series shootout: ARIMA vs. LSTM (talk)

Yesterday, the Munich datageeks Data Day took place. It was a totally fun event – great to see how much is going on, data-science-wise, in and around Munich, and how many people are interested in the topic! (By the way, I think that more than half the talks were about deep learning!)

I also had a talk, “Time series shootout: ARIMA vs. LSTM” (slides on RPubs, github).

Title notwithstanding, it was really about a systematic comparison of forecasting using ARIMA and LSTM, on synthetic as well as real datasets. I find it amazing how little is needed to get a very decent result with LSTM – how little data, how little hyperparameter tuning, how few training epochs.
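For reference, the ARIMA half of such a shootout takes just a few lines of R with the forecast package – an illustrative sketch, not the talk’s actual code or data:

```r
# Automatic ARIMA baseline on a classic monthly series with trend and
# (single) seasonality.
library(forecast)

y   <- AirPassengers
fit <- auto.arima(y)          # automatic order selection
fc  <- forecast(fit, h = 24)  # 24 months ahead, with prediction intervals
accuracy(fc)                  # error measures (RMSE, MAE, ...)
plot(fc)
```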

Of course, it gets most interesting when we look at datasets where ARIMA has problems, as with multiple seasonality. I have such an example in the talk (in fact, it’s the main climax ;-)), but it’s definitely also an interesting direction for further experiments.

Thanks for reading!

Automatic Crack Detection – with Deep Learning

On Friday at DOAG Big Data Days, I presented one possible application of deep learning: automatic crack detection – with some background theory, a Keras model trained from scratch, and the use of VGG16 pretrained on ImageNet. The amount of input data really was minimal, and the resulting accuracy, under these circumstances, was not bad at all! Here are the slides.
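The transfer learning part follows the standard recipe – here’s a sketch in R with keras, using VGG16 as a frozen feature extractor (image size, layer sizes, and the binary crack / no-crack head are illustrative, not the talk’s exact code):

```r
library(keras)

# Pretrained convolutional base, without the ImageNet classifier on top
base_model <- application_vgg16(weights = "imagenet", include_top = FALSE,
                                input_shape = c(224, 224, 3))
freeze_weights(base_model)  # keep the pretrained weights fixed

# Small classification head on top of the frozen base
predictions <- base_model$output %>%
  layer_flatten() %>%
  layer_dense(units = 256, activation = "relu") %>%
  layer_dense(units = 1, activation = "sigmoid")  # crack vs. no crack

model <- keras_model(inputs = base_model$input, outputs = predictions)
model %>% compile(optimizer = "adam",
                  loss = "binary_crossentropy",
                  metrics = "accuracy")
```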

If you’re interested, I’ll be giving a webcast on this as part of the Trivadis tricast series (registration). The talk will be in German though, so I guess some working knowledge of German would be helpful 🙂

Thanks for reading!

Deep Learning, deeplearning4j and Outlier Detection: Talks at Trivadis Tech Event

Last weekend, another edition of Trivadis Tech Event took place. As usual, it was great fun and a great source of inspiration.
I had the occasion to talk about deep learning twice. One talk was an intro to DL4J (deeplearning4j), zooming in on a few aspects I’ve found especially nice and useful, while at the same time providing a general introduction to deep learning. The audience was great, and the framework really is fun to work with, so this was a totally pleasant experience! Here are the slides, and here’s the example code.

The second talk was a joint session with my colleague Olaf on outlier / anomaly detection. We covered both ML and DL algorithms. For DL, I focused on variational autoencoders, the special challenge being to successfully apply the algorithm to datasets other than MNIST… and especially, datasets with a mix of categorical and continuous variables of different scales. As I say on the habitual “conclusion” slide, I don’t think I’ve arrived at a conclusion yet – any comments / suggestions are very welcome! Here’s the VAE presentation on RPubs, and here it is on github.
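The general recipe, sketched with a plain (non-variational) autoencoder for brevity – the VAE in the talk adds a sampling layer and a KL term on top of this: score each observation by its reconstruction error and treat the highest-scoring rows as outlier candidates. Here, x_train and x_test are assumed to be numeric matrices, already scaled; the 30 columns are made up.

```r
library(keras)

# Plain autoencoder: compress to 8 dimensions, reconstruct the 30 inputs
input   <- layer_input(shape = c(30))
encoded <- input %>% layer_dense(units = 8, activation = "relu")
decoded <- encoded %>% layer_dense(units = 30, activation = "linear")

autoencoder <- keras_model(input, decoded)
autoencoder %>% compile(optimizer = "adam", loss = "mse")
autoencoder %>% fit(x_train, x_train, epochs = 30, batch_size = 32)

# Anomaly score: per-row reconstruction error; large = outlier candidate
recon  <- predict(autoencoder, x_test)
scores <- rowMeans((x_test - recon)^2)
```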
Thanks for reading!