Time series shootout: ARIMA vs. LSTM (talk)

Yesterday, the Munich datageeks Data Day took place. It was a totally fun event – great to see how much is going on, data-science-wise, in and around Munich, and how many people are interested in the topic! (By the way, I think that more than half the talks were about deep learning!)

I also had a talk, “Time series shootout: ARIMA vs. LSTM” (slides on RPubs, github).

Whatever the title suggests, it was really a systematic comparison of forecasting with ARIMA and with LSTM, on synthetic as well as real datasets. I find it amazing how little is needed to get a very decent result with LSTM – how little data, how little hyperparameter tuning, how few training epochs.
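To give an idea of the setup (a minimal sketch, not the talk's actual code, with AirPassengers standing in for the datasets used): both contenders fit in a handful of lines of R.

```r
library(forecast)
library(keras)

# contender 1: ARIMA, with automatic order selection
fit_arima <- auto.arima(AirPassengers)
fc_arima  <- forecast(fit_arima, h = 12)

# contender 2: a tiny LSTM, fed sliding windows of the scaled series
x <- as.numeric(scale(AirPassengers))
n_steps <- 12
n <- length(x) - n_steps
X <- t(sapply(1:n, function(i) x[i:(i + n_steps - 1)]))
y <- x[(n_steps + 1):length(x)]
dim(X) <- c(n, n_steps, 1)   # keras expects [samples, timesteps, features]

model <- keras_model_sequential() %>%
  layer_lstm(units = 32, input_shape = c(n_steps, 1)) %>%
  layer_dense(units = 1)
model %>% compile(loss = "mse", optimizer = "adam")
model %>% fit(X, y, epochs = 50, batch_size = 16)
```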

Of course, it gets most interesting when we look at datasets where ARIMA has problems, as with multiple seasonality. I have such an example in the talk (in fact, it’s the main climax ;-)), but it’s definitely also an interesting direction for further experiments.
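For illustration (again, not the talk's code): a series with two superimposed seasonal periods is easy to generate, and it's exactly the case where the single seasonal component of an ARIMA model no longer suffices.

```r
library(forecast)

# synthetic hourly series with a daily (24) and a weekly (168) cycle
t <- 1:(24 * 7 * 8)
y <- msts(10 + 2 * sin(2 * pi * t / 24) + sin(2 * pi * t / 168) +
            rnorm(length(t), sd = 0.2),
          seasonal.periods = c(24, 168))
plot(y)

# ARIMA's seasonal part handles a single period only, so here
# one of the two cycles necessarily goes unmodeled
fit <- auto.arima(y)
```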

Thanks for reading!

Automatic Crack Detection – with Deep Learning

On Friday at DOAG Big Data Days, I presented one possible application of deep learning: automatic crack detection – with some background theory, a Keras model trained from scratch, and the use of VGG16 pretrained on ImageNet. The amount of input data really was minimal, and the resulting accuracy, under these circumstances, was not bad at all! Here are the slides.
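The slides have the details; as a rough sketch (not the talk's exact code, and assuming 150x150 RGB input images), the transfer learning variant looks about like this in R's keras:

```r
library(keras)

# load VGG16's convolutional base, pretrained on ImageNet
conv_base <- application_vgg16(weights = "imagenet",
                               include_top = FALSE,
                               input_shape = c(150, 150, 3))
freeze_weights(conv_base)   # keep the pretrained features fixed

# stack a small binary classifier (crack / no crack) on top
model <- keras_model_sequential() %>%
  conv_base %>%
  layer_flatten() %>%
  layer_dense(units = 256, activation = "relu") %>%
  layer_dense(units = 1, activation = "sigmoid")

model %>% compile(optimizer = optimizer_rmsprop(lr = 2e-5),
                  loss = "binary_crossentropy",
                  metrics = "accuracy")
```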

If you’re interested, I’ll have a webcast on this as part of the Trivadis tricast series (registration). The talk will be in German though, so I guess some working knowledge of German would be helpful 🙂

Thanks for reading!

Deep Learning, deeplearning4j and Outlier Detection: Talks at Trivadis Tech Event

Last weekend, another edition of Trivadis Tech Event took place. As usual, it was great fun and a great source of inspiration.
I had the occasion to talk about deep learning twice: One talk was an intro to DL4J (deeplearning4j), zooming in on a few aspects I’ve found especially nice and useful while trying to provide a general introduction to deep learning at the same time. The audience was great, and the framework really is fun to work with, so this was a totally pleasant experience! Here are the slides, and here’s the example code.

The second talk was a joint session with my colleague Olaf on outlier / anomaly detection. We covered both ML and DL algorithms. For DL, I focused on variational autoencoders, the special challenge being to successfully apply the algorithm to datasets other than MNIST… and especially, datasets with a mix of categorical and continuous variables of different scale. As I say in the habitual “conclusion” slide, I don’t think I’ve arrived at a conclusion yet… any comments / suggestions are very welcome! Here’s the VAE presentation on RPubs, and here on github.
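To make the "mixed variables" challenge concrete: before a VAE even enters the picture, the inputs have to be brought to comparable scales and representations. A hedged sketch, with purely hypothetical column names:

```r
library(keras)

# df is a hypothetical data frame with mixed column types
# continuous columns: center and scale, so no variable dominates the loss
num_part <- scale(df[, c("amount", "duration")])
# categorical column: one-hot encode (factor levels -> 0-based integers)
cat_part <- to_categorical(as.integer(df$category) - 1)
x <- cbind(num_part, cat_part)

# the decoder's reconstruction loss then has to match the encoding:
# e.g., mean squared error on the continuous block,
# categorical crossentropy on the one-hot block
```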
Thanks for reading!

Haskell, R, and HaskellR: Combining the best of two worlds (talk at UseR! 2017)

Earlier today, I presented at UseR! 2017 about HaskellR: a great piece of software, developed by Tweag I/O, that allows you to seamlessly use R from Haskell.

It was my first UseR!, and it was a great experience; if I had the time, I'd write a separate blog post about it, as there were things that did not quite align with my prior expectations… Food for thought, but not the topic of this post. (Mainly, this would be about how the academic talks compared to the non-academic ones.)

So, why HaskellR? If you allow me one personal note… For the ex-psychologist, ex-software-developer, ex-database-administrator, now "in over my head" data scientist and machine learning/deep learning person that I am (see this post for that story), there has always been one fixed point of interest (an ideal, you could say), and that is the elegance of functional programming. It all started with SICP, which I first read as a (Java) programmer and recently read again (in part) when preparing R 4 hackers, a talk focused in large part on the functional programming features of R.

For a database administrator, unless you’re very lucky, it’s hard to integrate use of a functional programming language into your work. How about deep learning and/or data science?
For deep learning, there’s Chris Olah’s wonderful blog post linking deep networks to functional programs, but the reality (of the widely used frameworks) looks different: TensorFlow, Keras, PyTorch… it’s mostly Python around there, and whatever the niceties of Python (readability, list comprehensions…), writing Python certainly does not feel like writing FP code at all (much less so than writing R!).

So in practice, the connections between data science/machine learning/deep learning and functional programming are scarce. If you look for connections, you will quickly stumble upon the Tweag I/O guys’ work: they’ve not just created HaskellR, they’ve also made Haskell run on Spark, thus enabling Haskell applications to use Spark’s MLlib for large-scale machine learning.

What, then, is HaskellR? It’s a way to seamlessly mix R code and Haskell code, with full interoperability in both directions. You can do that in source files, of course, but you can also quickly play around in the interpreter, appropriately called H (no, I was not thinking of its addictive potential here ;-)), and even use Jupyter notebooks with HaskellR! In fact, that’s what I did in the demos.

If you’re interested in the technicalities of the implementation, you’ll find that documented in great detail on the HaskellR website (and even more, in their IFL 2014 paper), but otherwise I suggest you take a look at the demos from my talk: First, there’s a notebook showing how to use HaskellR, how to get values from Haskell to R and vice versa, and then, there’s the trading app scenario notebook: Suppose you have a trading app written in Haskell – it’s gotta be lightning fast and as bug-free as possible, right?
But how about nice visualizations, time series diagnostics, all kinds of sophisticated statistical and machine learning algorithms? Chances are, someone has already implemented that algorithm in R! Take ARIMA – one line of code with auto.arima() from Rob J. Hyndman’s forecast package! Visualization? ggplot2, of course! And last but not least, there’s an easy way to do deep learning with R’s keras package (interfacing to Python Keras).
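Just to spell out how little R code that is (a sketch, not necessarily the notebook's exact code):

```r
library(forecast)
library(ggplot2)

# ARIMA in one line: automatic model selection via the forecast package
fit <- auto.arima(AirPassengers)

# ... and a ggplot2-based forecast plot in one more
autoplot(forecast(fit, h = 24))
```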

Besides the notebooks, you might also want to check out the slides, especially if you’re an R user who hasn’t had much contact with Haskell. Ever wondered why the pipe looks the way it looks, or what the partial and compose functions are doing?
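As a teaser for the pipe question: stripped of all of magrittr's conveniences (the dot placeholder, pipelines as functions…), a pipe is just an infix operator that applies its right-hand side to its left-hand side. A deliberately over-simplified sketch:

```r
# a toy pipe, NOT the real magrittr implementation:
# apply the function on the right to the value on the left
`%>%` <- function(lhs, rhs) rhs(lhs)

16 %>% sqrt   # 4
```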

Last but not least, a thousand thanks to the guys over at Tweag I/O, who’ve been incredibly helpful in getting the whole setup to run. (The best way to get it up and running on Fedora is using Nix, which I didn’t have any prior experience with… and, opening a second level of parentheses, I think I’d now like to know more about Nix – the package manager as well as the OS – too ;-)) This really is the great thing about open source: the cool stuff people build, and how helpful they are! So thanks again, guys – I hope to be doing things “at the interface” of ML/DL and FP more often in the future!

Update:
The talk was recorded, and can be viewed here.

Time series prediction – with deep learning

More and more often, and in more and more different areas, deep learning is making its appearance in the world around us.
Many small and medium businesses, however, will probably still think: deep learning, that’s for Google, Facebook & co., for the guys with big data and even bigger computing power (barely resisting the temptation to write “yuge power” here).

Partly, this may be true – certainly when it comes to running through immense permutations of hyperparameter settings. The question, however, is whether we can’t obtain good results at more modest scales, too – in areas where traditional methods of data science / machine learning prevail. Prevail as of today, that is.

One such area is time series prediction, with ARIMA & co. at the top of the leaderboard. Can deep learning be a serious competitor here? In what cases? Why? Exploring this is like starting out on an unknown road, fascinated by the magical things that may await us 😉
In any case, I’ve started walking down that road (not running!), in a rather take-your-time-and-explore-the-surroundings way. That means there’s still much to come, and this is really just a beginning.

Here, anyway, is the travel report – the presentation slides, I mean: best viewed on RPubs, available as RMarkdown on github, or downloadable as pdf.
Enjoy!

R 4 hackers

Yesterday at Trivadis Tech Event, I talked about R for Hackers. It was the first session slot on Sunday morning, it was a crazy, nerdy topic, and yet there were, like, 30 people attending! An emphatic thank you to everyone who came!

R, a crazy, nerdy topic – why that, you may ask? What’s so nerdy about using R?
Well, it was about R. But it was neither an introduction (“how to get things done quickly with R”), nor was it even about data science. True, you do get things done super efficiently with R, and true, R is great for data science – but this time, it really was about R as a language!

Because as a language, too, R is cool. In contrast to most object-oriented languages, it uses (at least in its most widely used OO system, S3) generic-function OO, not message-passing OO. (OK, I don’t know if that is cool, but it’s really instructive to see how little you need to implement an OO system!)
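A quick illustration of how little that is (just a sketch, not taken from the talk): an S3 "class" is nothing but an attribute, and a generic is nothing but a dispatch instruction.

```r
# a generic: all it does is dispatch on the class of its argument
area <- function(shape) UseMethod("area")

# methods: ordinary functions, found via naming convention
area.circle    <- function(shape) pi * shape$r^2
area.rectangle <- function(shape) shape$a * shape$b

# an "object": a list with a class attribute
c1 <- structure(list(r = 2), class = "circle")
area(c1)   # dispatches to area.circle: 12.56637
```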

What definitely is cool, though, is how much of a functional programming language R is! Even using base R only, you can write in a functional style, and then there’s Hadley Wickham’s purrr, which implements things like function composition and partial application.
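For example (a minimal sketch, not from the slides):

```r
library(purrr)

add    <- function(x, y) x + y
inc    <- partial(add, y = 1)      # partial application: fix y at 1
square <- function(x) x^2
f      <- compose(square, inc)     # composition: applied right to left

f(3)   # square(inc(3)) = 16
```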

Finally, the talk goes into base object internals – closures, builtins, specials… and it ends with a promise … 😉
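If you’d like a peek ahead (nothing beyond base R needed):

```r
# three kinds of functions, as reported by typeof()
typeof(mean)   # "closure": an R-level function plus its enclosing environment
typeof(sum)    # "builtin": internal, arguments evaluated eagerly
typeof(`if`)   # "special": internal, arguments passed unevaluated

# ... and the promise: function arguments are evaluated lazily
f <- function(x) { cat("entered f\n"); x; invisible(NULL) }
f({ cat("now evaluating x\n"); 42 })
# "entered f" prints first: x's evaluation is deferred until it is used
```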
So, here’s the talk: rpubs, pdf, github. Enjoy!

R for SQListas (3): Classifying Digits with TensorFlow

Yesterday at PASS Meetup Munich, I talked about R for SQListas – thanks again for your interest and attention, guys; it was a very nice evening!
Actually, in addition to the content of that original presentation, which I’ve also covered in two recent blog posts (R for SQListas (1): Welcome to the tidyverse and R for SQListas (2): Forecasting the future), there was a new, third part this time: an introduction to machine learning with R, using that most classic of examples, MNIST – with a special focus on RStudio’s tensorflow package for R.
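To give a flavor of that third part – a sketch along the lines of the classic softmax-regression MNIST tutorial that the tensorflow package's documentation ports to R (so, the simplest possible model, not the talk's complete code):

```r
library(tensorflow)

# load MNIST, with one-hot encoded labels
datasets <- tf$contrib$learn$datasets
mnist <- datasets$mnist$read_data_sets("MNIST-data", one_hot = TRUE)

# softmax regression: y = softmax(xW + b)
x  <- tf$placeholder(tf$float32, shape(NULL, 784L))
W  <- tf$Variable(tf$zeros(shape(784L, 10L)))
b  <- tf$Variable(tf$zeros(shape(10L)))
y  <- tf$nn$softmax(tf$matmul(x, W) + b)
y_ <- tf$placeholder(tf$float32, shape(NULL, 10L))

# cross-entropy loss, minimized by plain gradient descent
cross_entropy <- tf$reduce_mean(
  -tf$reduce_sum(y_ * tf$log(y), reduction_indices = 1L))
train_step <- tf$train$GradientDescentOptimizer(0.5)$minimize(cross_entropy)

sess <- tf$Session()
sess$run(tf$global_variables_initializer())
for (i in 1:1000) {
  batch <- mnist$train$next_batch(100L)
  sess$run(train_step, feed_dict = dict(x = batch[[1]], y_ = batch[[2]]))
}

# accuracy on the test set
correct  <- tf$equal(tf$argmax(y, 1L), tf$argmax(y_, 1L))
accuracy <- tf$reduce_mean(tf$cast(correct, tf$float32))
sess$run(accuracy, feed_dict = dict(x = mnist$test$images,
                                    y_ = mnist$test$labels))
```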
While I hope I’ll find the time to write a post on that part too, I’m not sure when that will be, so I’ve already uploaded the slides and added links to the pdf, github repo, and RPubs publication to the Presentations/Papers section. Enjoy!