# DIN/DIEN

###### tags: `Accelerator`

### DIN Config

![](https://i.imgur.com/M7qH38E.png)
![](https://i.imgur.com/Mdxi9pd.png =400x)

```
Not every item a consumer has browsed is something she intends to buy,
so an activation unit is used to select the items most likely to matter.
```

![](https://i.imgur.com/UBkRu2O.png =400x)

### DIEN Config

![](https://i.imgur.com/dDHRwG5.png =700x)

```
GRU makes recurrent units adaptively capture dependencies of different time scales.
See Fig. 1(b) for a graphical illustration.
```

![](https://i.imgur.com/BAv2MxA.png =500x)

```
A traditional Seq2Seq model does not discriminate among the positions of the input
sequence X, so model performance degrades sharply when X is long.
The attention mechanism was introduced to solve this problem.
The lower-left figure shows the model before, and the lower-right figure after,
adding attention.
```

![](https://i.imgur.com/bUkbv1g.png =300x)
![](https://i.imgur.com/11CU9M0.png =300x)

```
AUGRU (GRU with attentional update gate) is derived from AGRU (attention-based GRU).
```

### Reference

1. 2018, [Deep Interest Network for Click-Through Rate Prediction](https://arxiv.org/pdf/1706.06978.pdf)
2. 2018, [Deep Interest Evolution Network for Click-Through Rate Prediction](https://arxiv.org/pdf/1809.03672.pdf)
3. 2014, [Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling](https://arxiv.org/pdf/1412.3555.pdf)
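As a supplement to the DIN notes above, here is a minimal NumPy sketch of the idea behind DIN's activation unit: browsed-item embeddings are pooled with candidate-dependent weights. This is a simplified stand-in, not the paper's method — DIN scores each pair with a small MLP over the item, the candidate, and their interaction, and does not softmax-normalize the weights; the dot-product score, softmax, and the name `din_activation_pooling` here are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def din_activation_pooling(behaviors, candidate):
    """Candidate-aware weighted sum-pooling of browsed-item embeddings.

    behaviors : (T, d) embeddings of items the user browsed.
    candidate : (d,)   embedding of the candidate ad/item.
    Items similar to the candidate get larger weights, so the pooled
    user-interest vector adapts to the candidate being scored.
    """
    scores = behaviors @ candidate   # relevance of each browsed item (simplified)
    weights = softmax(scores)        # activation weights (DIN itself skips softmax)
    return weights @ behaviors       # (d,) interest representation

rng = np.random.default_rng(0)
hist = rng.normal(size=(5, 8))       # 5 browsed items, embedding dim 8
cand = rng.normal(size=8)
interest = din_activation_pooling(hist, cand)
print(interest.shape)                # (8,)
```

Changing `cand` changes the weights, which is the point of the activation unit: the same history yields a different interest vector for each candidate.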
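To make the AUGRU caption above concrete, here is a minimal NumPy sketch of a single AUGRU step, assuming the standard GRU equations with the update gate rescaled by an attention score, `u_t' = a_t * u_t`. The class name `AUGRUCell`, the weight initialization, and the omission of bias terms are assumptions for illustration, not the DIEN implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class AUGRUCell:
    """GRU cell whose update gate is scaled by an attention score a_t
    (AUGRU): low-relevance steps barely move the hidden state."""

    def __init__(self, d_in, d_h, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix per gate, acting on the concatenation [x_t, h_{t-1}].
        self.Wu = rng.normal(scale=0.1, size=(d_in + d_h, d_h))
        self.Wr = rng.normal(scale=0.1, size=(d_in + d_h, d_h))
        self.Wh = rng.normal(scale=0.1, size=(d_in + d_h, d_h))

    def step(self, x, h, a):
        xh = np.concatenate([x, h])
        u = sigmoid(xh @ self.Wu)                               # update gate
        r = sigmoid(xh @ self.Wr)                               # reset gate
        h_tilde = np.tanh(np.concatenate([x, r * h]) @ self.Wh) # candidate state
        u = a * u                                               # attentional update gate
        return (1 - u) * h + u * h_tilde

cell = AUGRUCell(d_in=4, d_h=6)
h = np.zeros(6)
x = np.ones(4)
h_low  = cell.step(x, h, a=0.0)   # attention 0: state is left unchanged
h_high = cell.step(x, h, a=1.0)   # attention 1: reduces to a plain GRU update
print(np.allclose(h_low, h))      # True
```

With `a = 0` the update gate closes completely and the hidden state passes through untouched, which is how AUGRU suppresses behaviors irrelevant to the candidate; with `a = 1` it behaves like an ordinary GRU.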