# Helmholtz AI FFT seminar series #3: <br> Benedikt Heidrich

###### tags: `HelmholtzAI`,`FFT`

[ToC]

## :memo: Seminar details

**10 June 2021, 11:00 - 12:00**

- Speaker: **Benedikt Heidrich**, Helmholtz AI research group - PhD student @ Karlsruhe Institute of Technology (KIT)
- Title: **Profile Neural Networks (PNNs)**
- Chair: **Peter Steinbach**, Head of Helmholtz AI consultant team @ Helmholtz-Zentrum Dresden-Rossendorf (HZDR)
- Abstract: This talk addresses challenges in (energy) time series forecasting, in particular periodicities and how to handle them in neural networks. Three solutions will be presented: a periodic bias, the Exponential Smoothing Recurrent Neural Network, and the Profile Neural Network. The talk focuses on the last of these (Profile Neural Networks), since it is the only one aimed specifically at forecasting energy time series.

### VC details

Access to online venue: https://global.gotomeeting.com/join/809969773
Access Code: 809-969-773

Join from a video-conferencing room or system.
Dial in or type: or inroomlink.goto.com
Meeting ID: 809 969 773
Or dial directly: 809969773@ or

## :memo: Notes

:::info
:bulb: Write down notes and/or interesting information from the seminar, for example observations auxiliary to the content that are not contained in the slide deck.
:::

- an introduction to SARIMA I found: https://machinelearningmastery.com/sarima-for-time-series-forecasting-in-python/
- M4 competition: https://en.wikipedia.org/wiki/Makridakis_Competitions
    - "Named after the lead organizer, Spyros Makridakis, the M Competition has been one of the most important events in the forecasting community since 1982."
- challenge blogpost by Uber: https://eng.uber.com/m4-forecasting-competition/
- PNN repo: https://github.com/benHeid/Profile-Neural-Network
- pyWATTS repo: https://github.com/KIT-IAI/pyWATTS
- Jenia: a systematic approach to dealing with irregular time series with missing data etc.: Neural Ordinary Differential Equations (Neural ODEs) and follow-up work
    - Latent ODEs, ODE-RNNs, e.g. https://proceedings.neurips.cc/paper/2019/hash/42a6845a557bef704ad8ac9cb4461d43-Abstract.html
    - can handle any temporal signal form and infer latent components (e.g. periodic trends)
- if you are working on time series and would like to continue discussing, please put your name below:
    -

## :question: Questions for the speaker

:::info
:bulb: Write down any questions or topics you wish to discuss during the seminar
:::

:arrow_right: Question by someone not familiar with time series - how is global deseasonalisation performed? "Only" by stratifying by the annual quarters (and subsequently training only on these quarterly time segments)?
- there are plenty of methods, e.g. the Holt-Winters method
- calculate the average periodicity per season and take this

:arrow_right: Question on periodic biases - how is this implemented? Like `x + sin^2(x)`, e.g. only for `x >= 0`?
- unclear, something to research

:arrow_right: Question on PNN - how is aggregation performed? Is this a `concatenate` call? Plus 1!
- weighted addition - the importance of each module is weighted (the weights are trained)
- Follow-up: can the module handle trends in the periodicity, e.g. a linear increase in the periodic part? Answer: yes, that should be captured in the (classical) ... module.

:arrow_right: Why is a periodic signal so special in the sense of being difficult to deal with? As it is an unknown function f(x), and deep neural nets are good at fitting arbitrarily complex functions, what makes it hard in this case? (e.g. in images
as spatial signals, there are periodic components that seem to be handled well by vision conv networks)
- Smyl argued that ANNs lack a notion of time -> this may be the reason for it
- missing periodic inductive bias -> a periodic inductive bias might help
- why not use a CNN on time series (with several layers to capture periods at several scales)?
- or perhaps consider using FFTs as an input transformation

:arrow_right: How does it perform against standard signal decomposition, e.g. ICA, FFT, ..., and, by extension, have you tried (deep-)learning the parameters of "templates" gained from the signal decompositions?
- typically people use ICA/FFT to decompose the signal into data on different time scales
- not tried yet

:arrow_right: On the prediction: how much training data was used, and what period was used for evaluation?
- trained on 3 years, 1 measurement per hour (much higher resolution available)
- one-week-ahead forecast

## :question: Your Feedback

:::info
:bulb: Write down your feedback about the seminar
:::

- when citing, it is better to provide a bit more info on the work (names, journal) so that it becomes possible to find it. Only a first name and year may be too ambiguous
    - Opposing opinion (SK): I find name and year usually enough; a full citation can clutter slides.
    - (BH) At the end of the slides, there is a list of the full citations. The slides will be shared.

### Share something that you learned or liked :+1:

- PNNs are a cool idea, solid results
- I like that it was relatively easy to follow the content of the presentation also without deeper background knowledge

### Share something that you didn’t like or would like us to improve :-1:

- in general, the use of "this is just..." or "we simply do ..."
can be discouraging to people not familiar with sequence modelling or time series (who eventually might offer feedback or thoughts on your work that you didn't consider before); it helps to avoid these formulations
- I think it would be better if the questions above were written down with the name of the person who wrote them, maybe even encouraging the person to ask for themselves
    - PS: true, I should have emphasized that everyone is free to either state their name or put questions anonymously (the latter being the gold standard to encourage questions from a most inclusive subset of the audience)

:::info
:pushpin: Want to learn more? ➜ [HackMD Tutorials](https://hackmd.io/c/tutorials)
:::
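
### Addendum: sketch of the periodic bias and PNN aggregation

Two of the Q&A items above lend themselves to a small code sketch: the `x + sin^2(x)` periodic bias (this form matches the "snake" activation; the speaker's actual implementation was noted as unclear, so treat this as a guess) and the PNN aggregation described as a weighted addition of module outputs. In the sketch below the function names, the frequency parameter `a`, and the fixed weights are all hypothetical; in the PNN the aggregation weights are trained, not supplied by hand.

```python
import numpy as np

def snake(x, a=1.0):
    """Periodic bias x + sin^2(a*x) ("snake" activation).

    A guess at the `x + sin^2(x)` form raised in the Q&A; the
    frequency parameter `a` is a hypothetical addition.
    """
    return x + np.sin(a * x) ** 2

def aggregate(module_outputs, weights):
    """Weighted addition of module forecasts, as described for PNN
    aggregation in the Q&A. Here the weights are fixed inputs; in
    the network they would be trainable parameters."""
    w = np.asarray(weights, dtype=float)
    return np.tensordot(w, np.stack(module_outputs), axes=1)

# Example: combine a trend forecast and a periodic forecast
# over 4 time steps with hand-picked weights.
trend = np.linspace(1.0, 2.0, 4)
periodic = snake(np.linspace(0.0, np.pi, 4))
forecast = aggregate([trend, periodic], [0.7, 0.3])
```

Note that `snake` keeps the identity's global growth (the `x` term) while adding a bounded periodic component, which is one way to bake a periodic inductive bias into an otherwise standard network.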