---
tags: Biometrics
---
# Feature Decomposition and Reconstruction Learning for Effective Facial Expression Recognition
[Bilibili resource](https://www.bilibili.com/video/BV1S5411w7Yo?p=3)
## Contribution
- Views facial expression information as a combination of shared information (similarities across expressions) and expression-specific information
- Propose a novel Feature Decomposition and Reconstruction Learning
- Feature Decomposition Network (FDN): decomposes the face feature into a set of latent features
- Feature Reconstruction Network (FRN): reconstructs the facial representation from the latent features
- Intra-feature Relation Modeling module (Intra-RM):
	- Assigns an intra-feature relation weight to each latent feature according to that feature's importance
	- Each latent feature is given an importance weight; multiplying the weight by the latent feature yields that feature's intra-aware feature
- Inter-feature Relation Modeling module (Inter-RM):
- Learn an Inter-feature Relation Weight (Inter-W) between intra-aware features based on GNN
	- Based on the distances between latent features, an inter-aware feature is obtained for each latent feature

## Pipeline of Feature Decomposition and Reconstruction Learning
### FDN
- Decomposes the basic feature into a set of facial action-aware latent features
- Basic feature of the i-th facial image: $x_i$; the j-th latent feature of the i-th facial image is
$$
l_{i,j}=\sigma_{1}(W_{d_j}x_i), \quad \mbox{for } j=1...M
$$
- Compact Loss
$$
L_{C}=\frac{1}{N}\sum_{i=1}^{N}\sum_{j=1}^{M}||l_{i,j}-c_{j}||^2_2
$$
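The decomposition and compact loss above can be sketched in numpy. This is a minimal single-image sketch, not the paper's implementation: the dimensions `D`, `K`, the choice of ReLU for $\sigma_1$, and the random weights/centers are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # sigma_1 is assumed to be ReLU here
    return np.maximum(x, 0.0)

D, M, K = 8, 4, 6                        # basic-feature dim, #latent features, latent dim (assumed)
x_i = rng.standard_normal(D)             # basic feature x_i of the i-th image
W_d = rng.standard_normal((M, K, D))     # one projection matrix W_{d_j} per latent feature

# l_{i,j} = sigma_1(W_{d_j} x_i): decompose x_i into M latent features
L = np.array([relu(W_d[j] @ x_i) for j in range(M)])   # shape (M, K)

# Compact loss for this single image: pull each l_{i,j} toward its center c_j
C = rng.standard_normal((M, K))          # centers c_j (learnable in the paper, random here)
L_C = np.sum((L - C) ** 2)               # sum over j of ||l_{i,j} - c_j||_2^2
```

In training, the centers $c_j$ would be learnable parameters and the loss averaged over the batch of $N$ images.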

### FRN
- Intra-RM
- Intra-feature relation Weight
$$
\alpha_{i,j}=||\sigma_2(W_{s_j}l_{i,j})||_1
$$
- intra-aware feature for the i-th facial image
$$
f_{i,j}=\alpha_{i,j}l_{i,j}, \quad \mbox{for } j=1...M
$$
	- Distribution Loss
$$
L_{D}=\frac{1}{N}\sum_{i=1}^{N}||w_{i}-w_{k_i}||^2_2
$$
		where $w_{i}=[\alpha_{i,1}, \alpha_{i,2}, ..., \alpha_{i,M}]$ and $w_{k_i}$ denotes the class center corresponding to the $k_i$-th expression category.
	- Balance Loss
$$
L_{B}=||\bar{w}-w_u||_1
$$
		where $\bar{w}=[\bar{\alpha}_{1}, \bar{\alpha}_{2}, ..., \bar{\alpha}_{M}]$ and $w_u=[\frac{1}{M}, \frac{1}{M}, ..., \frac{1}{M}]$
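A numpy sketch of the Intra-RM step: computing the weights $\alpha_{i,j}$, the intra-aware features $f_{i,j}$, and the balance loss for one image. Assumptions: $\sigma_2$ is a sigmoid, the weights are normalized over the $M$ features, and a single image stands in for the batch mean $\bar{w}$.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    # sigma_2 is assumed to be a sigmoid
    return 1.0 / (1.0 + np.exp(-x))

M, K = 4, 6                              # #latent features and latent dim (assumed)
L = rng.standard_normal((M, K))          # latent features l_{i,j} from the FDN
W_s = rng.standard_normal((M, K, K))     # per-feature weight matrices W_{s_j}

# alpha_{i,j} = || sigma_2(W_{s_j} l_{i,j}) ||_1, normalized over j (assumption)
alpha = np.array([np.abs(sigmoid(W_s[j] @ L[j])).sum() for j in range(M)])
alpha = alpha / alpha.sum()

# intra-aware features f_{i,j} = alpha_{i,j} * l_{i,j}
F = alpha[:, None] * L                   # shape (M, K)

# balance loss: pull the weight vector toward the uniform vector w_u
w_u = np.full(M, 1.0 / M)
L_B = np.abs(alpha - w_u).sum()          # L1 distance to the uniform distribution
```

With batch training, $\bar{w}$ would be the mean of the per-image weight vectors rather than a single image's weights.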
- Inter-RM
- First fed into a message network for feature encoding
$$
g_{i,j}=\sigma_{1}(W_{e_j}f_{i,j}), \quad \mbox{for } j=1...M
$$
	- The relation weight between node $g_{i,j}$ and node $g_{i,m}$
$$
w_{i}(j,m) =
\begin{cases}
\sigma_3(S(g_{i,j}, g_{i,m})), & j\neq m \\
0, & \mbox{otherwise}
\end{cases}
$$
	- The j-th inter-aware feature of the i-th image
$$
\hat{f}_{i,j}=\sum_{m=1}^{M}w_{i}(j,m)g_{i,m}, \quad \mbox{for } j=1...M
$$
- final expression feature
$$
y_{i}=\sum_{j=1}^{M}\left(\delta f_{i,j}+(1-\delta) \hat{f}_{i,j}\right)
$$
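The Inter-RM step and the final feature can be sketched as follows. Assumptions here: $\sigma_1$ is ReLU, the similarity $S(\cdot,\cdot)$ is an inner product, $\sigma_3$ is a row-wise softmax (with the diagonal masked out so $w_i(j,j)=0$), and $\delta=0.5$ is an arbitrary trade-off value.

```python
import numpy as np

rng = np.random.default_rng(2)

def relu(x):
    # sigma_1 is assumed to be ReLU
    return np.maximum(x, 0.0)

M, K = 4, 6                              # #latent features and latent dim (assumed)
F = rng.standard_normal((M, K))          # intra-aware features f_{i,j} from Intra-RM
W_e = rng.standard_normal((M, K, K))     # message-network weights W_{e_j}

# g_{i,j} = sigma_1(W_{e_j} f_{i,j}): encode each node for message passing
G = np.array([relu(W_e[j] @ F[j]) for j in range(M)])

# relation weights w_i(j, m): softmax over similarities, zero for j == m
S = G @ G.T                              # S(g_j, g_m) as an inner product (assumption)
np.fill_diagonal(S, -np.inf)             # exclude j == m before the softmax
W = np.exp(S - S.max(axis=1, keepdims=True))
W = W / W.sum(axis=1, keepdims=True)     # each row sums to 1, diagonal is 0

# inter-aware features: aggregate the neighbours' messages
F_hat = W @ G                            # \hat{f}_{i,j} = sum_m w_i(j, m) g_{i,m}

# final expression feature with trade-off delta (value assumed)
delta = 0.5
y = (delta * F + (1 - delta) * F_hat).sum(axis=0)
```

Masking the diagonal with `-inf` before the softmax reproduces the case split in $w_i(j,m)$: the exponential of `-inf` is 0, so each node attends only to the other $M-1$ nodes.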
### Total Loss
$$
L=L_{cls}+\lambda_{1}L_{C}+\lambda_{2}L_{B}+\lambda_{3}L_{D}
$$