# DIN/DIEN

###### tags: `Accelerator`

### DIN Config

```
Not every item a consumer has browsed is something she intends to buy, so DIN uses an activation unit to weight the browsing history and pick out the items most relevant to the candidate ad.
```

A minimal sketch of this activation unit is given after the reference list.

### DIEN Config

```
GRU makes recurrent units adaptively capture dependencies of different time scales. See Fig. 1(b) for a graphical illustration.
```

```
A traditional Seq2Seq model does not discriminate among the elements of the input sequence X, so its performance drops sharply when X grows long. The attention mechanism solves this problem. The lower-left figure shows the model before, and the lower-right figure after, adding attention.
```

```
AUGRU (GRU with attentional update gate) is derived from AGRU (attention-based GRU).
```

A minimal sketch of an AUGRU cell is also given after the reference list.

### Reference

1. 2018, [Deep Interest Network for Click-Through Rate Prediction](https://arxiv.org/pdf/1706.06978.pdf)
2. 2018, [Deep Interest Evolution Network for Click-Through Rate Prediction](https://arxiv.org/pdf/1809.03672.pdf)
3. 2014, [Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling](https://arxiv.org/pdf/1412.3555.pdf)
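### Sketch: DIN-style activation unit

A minimal PyTorch-style sketch of the activation unit described under DIN Config. It is illustrative only: the embedding size, hidden width, and the `[history, candidate, history * candidate]` interaction feeding the MLP are assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class ActivationUnit(nn.Module):
    """Score each browsed-item embedding against the candidate ad, then sum-pool."""

    def __init__(self, emb_dim: int, hidden_dim: int = 36):
        super().__init__()
        # Input: [history, candidate, history * candidate] -> scalar relevance score.
        self.mlp = nn.Sequential(
            nn.Linear(emb_dim * 3, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, hist_emb: torch.Tensor, cand_emb: torch.Tensor) -> torch.Tensor:
        # hist_emb: (batch, seq_len, emb_dim), cand_emb: (batch, emb_dim)
        seq_len = hist_emb.size(1)
        cand = cand_emb.unsqueeze(1).expand(-1, seq_len, -1)            # (B, T, D)
        score_in = torch.cat([hist_emb, cand, hist_emb * cand], dim=-1)
        scores = self.mlp(score_in)                                     # (B, T, 1)
        # DIN keeps the scores unnormalized (no softmax), so the pooled vector's
        # magnitude can reflect how strongly the history matches the candidate.
        return (scores * hist_emb).sum(dim=1)                           # (B, D)


if __name__ == "__main__":
    unit = ActivationUnit(emb_dim=16)
    hist = torch.randn(4, 10, 16)   # 4 users, 10 browsed items each
    cand = torch.randn(4, 16)       # candidate ad embedding per user
    print(unit(hist, cand).shape)   # torch.Size([4, 16])
```

The key design point is that irrelevant history items receive scores near zero and contribute little to the pooled interest vector, rather than being averaged in uniformly.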
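### Sketch: AUGRU cell

A minimal sketch of the AUGRU (GRU with attentional update gate) mentioned under DIEN Config, assuming the per-step attention score against the target ad is already computed; layer shapes and the driver loop are illustrative.

```python
import torch
import torch.nn as nn

class AUGRUCell(nn.Module):
    """GRU cell whose update gate is rescaled by an attention score (AUGRU)."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.gates = nn.Linear(input_dim + hidden_dim, 2 * hidden_dim)  # update + reset
        self.candidate = nn.Linear(input_dim + hidden_dim, hidden_dim)

    def forward(self, x_t, h_prev, att_t):
        # x_t: (B, input_dim), h_prev: (B, hidden_dim), att_t: (B, 1) attention score
        u, r = self.gates(torch.cat([x_t, h_prev], dim=-1)).chunk(2, dim=-1)
        u, r = torch.sigmoid(u), torch.sigmoid(r)
        h_tilde = torch.tanh(self.candidate(torch.cat([x_t, r * h_prev], dim=-1)))
        u = att_t * u                          # attentional update gate
        return (1 - u) * h_prev + u * h_tilde  # low-attention steps barely move the state


if __name__ == "__main__":
    cell = AUGRUCell(input_dim=16, hidden_dim=32)
    B, T = 4, 10
    xs = torch.randn(B, T, 16)
    att = torch.rand(B, T, 1)   # e.g. attention scores of each step vs. the target ad
    h = torch.zeros(B, 32)
    for t in range(T):
        h = cell(xs[:, t], h, att[:, t])
    print(h.shape)              # torch.Size([4, 32])
```

Compared with AGRU, which replaces the whole update gate with the scalar attention score, AUGRU keeps the learned gate and only scales it, so per-dimension gating is preserved while irrelevant interests still have little influence on the evolved state.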