Arvid Kingl: Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting

17,530 views

nPlan

1 day ago

Machine Learning Paper Club (Dec 3, 2020)

COMMENTS: 15
@SuperHddf
@SuperHddf 1 year ago
Thank you! 😄
@mateuszsmendowski2677
@mateuszsmendowski2677 9 months ago
This explanation is quite useful once you have a basic understanding of the TFT architecture, so reading the original manuscript beforehand is recommended to get the most out of this video.
@user-nq4uq6ue3w
@user-nq4uq6ue3w 11 months ago
This was very helpful after reading the paper! Is there any chance that the slides used in this video are available somewhere?
@divugoel
@divugoel 9 months ago
Hello, Were you able to find the slides from this video?
@user-nq4uq6ue3w
@user-nq4uq6ue3w 9 months ago
@@divugoel No, I haven't. Still hoping someone who has them will see this.
@user-dx5gq6el8k
@user-dx5gq6el8k 8 months ago
Hi @nPlan, can you share a link where we can find the slides?
@nPlan
@nPlan 7 months ago
Arvid was kind enough to share his slides with us. Here is a link to the slides storage.googleapis.com/dockertest-191011/jc_temporal_fusion.html#1
@axe863
@axe863 4 months ago
Is the variable selection technique robust to high concurvity, non-cointegrating relationships, varying degrees of persistence among predictors, pattern-destroying non-stationarity (EMH), and so on?
@AK-wn1rm
@AK-wn1rm 4 months ago
I would think generally not. That is the most general data science problem: how do you predict things that are wildly different from the past data you have seen? It doesn't mean the forecast isn't somewhat tracking, since the latest data does go in and most forecasts will be guided by the latest "is" state. Because you typically predict over many entities, you would also hope that some generalisation occurs naturally, since similar patterns have already been observed in other entities. These models are not truly causal, at least not out of the box, and they cannot be, because causality cannot be inferred from data alone (caveats apply). So it falls back to the modeller to provide sensible covariates. As long as the causal pathways don't break down (and sometimes that can happen, at least temporarily), the model generalises. If one only throws features at the model and hopes it finds everything it needs by itself, one might be in for a bad surprise.
@damianwysokinski3285
@damianwysokinski3285 3 years ago
As Arvid showed us, TFT can be applied to forecasting tasks, e.g. forecasting 30 days into the future when we have 90 days of past data. Can TFT be used to forecast 90 days into the future when we have only 30 days of past data? It looks impossible because of the TFT architecture, but I'd really like to know your answer to that question.
@CalogeroZarbo
@CalogeroZarbo 2 years ago
I'm using it to forecast 4 quarters ahead, having only the present data. It's working pretty well; I'm building my startup around this.
@damianwysokinski3285
@damianwysokinski3285 2 years ago
@@CalogeroZarbo how do you deal with data processing? Is it possible for you to share example code?
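[Editor's note: for readers with the same question, the usual preprocessing for a fixed past window and a longer forecast horizon is sliding-window slicing: each training sample pairs an observed stretch of the series with the steps that follow it. A minimal sketch in plain NumPy; the `make_windows` helper and the 30-day/90-day sizes are illustrative, not the commenter's actual pipeline.]

```python
import numpy as np

def make_windows(series, encoder_len=30, horizon=90):
    """Slice a 1-D series into (past, future) training pairs.

    Each sample pairs `encoder_len` observed steps with the
    `horizon` steps that follow them, which is the shape a
    seq2seq forecaster such as TFT trains on.
    """
    series = np.asarray(series, dtype=float)
    n_samples = len(series) - encoder_len - horizon + 1
    if n_samples <= 0:
        raise ValueError("series too short for the requested window sizes")
    past = np.stack([series[i : i + encoder_len] for i in range(n_samples)])
    future = np.stack(
        [series[i + encoder_len : i + encoder_len + horizon] for i in range(n_samples)]
    )
    return past, future

# Example: two years of daily data -> 30-day inputs, 90-day targets.
daily = np.arange(730.0)
past, future = make_windows(daily)
# past.shape == (611, 30), future.shape == (611, 90)
```

Note that nothing in this windowing forbids a horizon longer than the encoder window; whether the model forecasts well at that horizon is a separate question.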
@carlosbahia5895
@carlosbahia5895 1 year ago
@@CalogeroZarbo Hi, can you share some code (Python)? What is your startup's website? Thank you
@G.94
@G.94 1 year ago
@@carlosbahia5895 I think he is joking around