Plus/minus what? Let’s talk about uncertainty (talk)

Last week at DOAG 2017, I gave two talks: one about deep learning with DL4J (slides here), and one about how to communicate uncertainty (or rather: how to construct prediction intervals for various methods and in various frameworks, ranging from simple linear regression through Bayesian statistics to neural networks).

TL;DR: The most important thing about communicating uncertainty is that you do it at all.

Want all the formulae? presentation, github
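For a taste of the simplest case covered in the talk, here is a prediction interval for ordinary least squares, computed by hand. This is just a minimal sketch with toy data and parameter values of my own choosing, not code from the talk:

```python
import numpy as np
from scipy import stats

# Toy data (illustrative only): a linear trend plus Gaussian noise
rng = np.random.default_rng(42)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=2.0, size=x.size)

# Ordinary least squares fit of y = a + b * x
n = x.size
b, a = np.polyfit(x, y, 1)                  # polyfit returns highest degree first
resid = y - (a + b * x)
s = np.sqrt(np.sum(resid ** 2) / (n - 2))   # residual standard error

# 95% prediction interval for a new observation at x0
x0 = 12.0
y0 = a + b * x0
se_pred = s * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / np.sum((x - x.mean()) ** 2))
t_crit = stats.t.ppf(0.975, df=n - 2)
lower, upper = y0 - t_crit * se_pred, y0 + t_crit * se_pred
print(f"prediction at x0={x0}: {y0:.2f}, 95% PI: [{lower:.2f}, {upper:.2f}]")
```

Note the `1 +` under the square root: a prediction interval accounts for the noise in a single new observation, which is why it is wider than a confidence interval for the fitted mean.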

🙂
 


Dynamic forecasts – with Bayesian linear models and neural networks (talk at Predictive Analytics World Berlin)

I really wish I had the time to write an article about the conference, instead of just posting the slides!

Predictive Analytics World was super inspiring, not just technically but also with regard to the broader picture of today’s data science / AI explosion, including its political, sociological and personal implications.

As I really don’t have the time, I’m not even gonna try, so let me just point you to my talk, which was about time series forecasting using two as yet under-employed methods: Dynamic Linear Models (think: Kalman filter) and Recurrent Neural Networks (LSTMs, to be precise).

So, here are the slides, and as usual, here’s the link to the github repo, containing some more example code.
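To give a flavor of the DLM side, the simplest Dynamic Linear Model is the local-level model, filtered with the Kalman recursions. The sketch below is my own illustration with made-up noise variances and simulated data; it is not code from the repo:

```python
import numpy as np

def kalman_local_level(y, q=0.1, r=1.0, m0=0.0, p0=1e6):
    """One-step-ahead forecasts for a local-level DLM:
       state: mu_t = mu_{t-1} + w_t,  w_t ~ N(0, q)
       obs:   y_t  = mu_t + v_t,      v_t ~ N(0, r)"""
    m, p = m0, p0                         # filtered state mean and variance
    fc_mean, fc_var = [], []
    for obs in y:
        m_pred, p_pred = m, p + q         # predict step (random-walk state)
        fc_mean.append(m_pred)
        fc_var.append(p_pred + r)         # forecast variance includes obs noise
        k = p_pred / (p_pred + r)         # Kalman gain
        m = m_pred + k * (obs - m_pred)   # update step
        p = (1 - k) * p_pred
    return np.array(fc_mean), np.array(fc_var)

# Simulated random walk as toy data
rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=100))
mean, var = kalman_local_level(y)

# 95% forecast intervals fall straight out of the filter
lower = mean - 1.96 * np.sqrt(var)
upper = mean + 1.96 * np.sqrt(var)
```

The nice part, and the reason DLMs fit the theme of the uncertainty talk so well: the forecast variance is a by-product of the filter itself, so prediction intervals come for free instead of needing a separate procedure.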

For me, experimentation with time series forecasting seems to form a time series in itself – I’m sure there’s still plenty left to explore 🙂
Thanks for reading!