{%hackmd @themes/dracula %}

## NLLLoss

$l_n = -w_{y_n}\,x_{n, y_n}$, where $x_{n, y_n}$ is the log-probability of the target class $y_n$ and $w_{y_n}$ is its class weight.

## CrossEntropyLoss = logSoftmax + NLLLoss

loss = $-\sum_x p(x)\log q(x)$

## KL

loss = $\sum_k p_k\log\frac{p_k}{q_k}$ = $\sum_k p_k(\log p_k - \log q_k)$
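
Below is a minimal sketch (assuming PyTorch, since the loss names match `torch.nn`) that checks the two identities above numerically: `cross_entropy` equals `log_softmax` followed by `nll_loss`, and the manual $\sum_k p_k(\log p_k - \log q_k)$ matches `F.kl_div`. Tensor shapes and names here are illustrative, not from the original note.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 5)           # raw scores for 4 samples, 5 classes (hypothetical)
targets = torch.randint(0, 5, (4,))  # ground-truth class indices

# CrossEntropyLoss = logSoftmax + NLLLoss
ce = F.cross_entropy(logits, targets)
nll = F.nll_loss(F.log_softmax(logits, dim=1), targets)
print(torch.allclose(ce, nll))       # True

# KL divergence: sum_k p_k * (log p_k - log q_k)
p = F.softmax(torch.randn(4, 5), dim=1)   # target distribution p
log_q = F.log_softmax(logits, dim=1)      # model's log q
kl_manual = (p * (p.log() - log_q)).sum(dim=1).mean()
kl_builtin = F.kl_div(log_q, p, reduction='batchmean')
print(torch.allclose(kl_manual, kl_builtin))  # True
```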