{%hackmd @themes/dracula %}

## NLLLoss

$l_n = -w_{y_n} x_{n, y_n}$

where $x_{n, y_n}$ is the log-probability the model assigns to the target class $y_n$, and $w_{y_n}$ is the weight of that class.

## CrossEntropyLoss = LogSoftmax + NLLLoss

$\text{loss} = -\sum_x p(x)\log q(x)$

## KL divergence

$\text{loss} = \sum_k p_k\log\frac{p_k}{q_k} = \sum_k p_k(\log p_k - \log q_k)$
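The relations above can be checked numerically. The sketch below (plain NumPy rather than PyTorch, with class weights $w$ taken as 1) verifies that NLL on log-softmax outputs equals cross-entropy with a one-hot $p$, and that KL divergence equals cross-entropy minus the entropy of $p$:

```python
import numpy as np

def log_softmax(z):
    # Numerically stable log-softmax: subtract the max before exponentiating.
    z = z - z.max()
    return z - np.log(np.exp(z).sum())

logits = np.array([2.0, 1.0, 0.1])
target = 0  # index of the true class

# NLLLoss on log-probabilities: l = -log q(y_n)  (weights w = 1)
log_q = log_softmax(logits)
nll = -log_q[target]

# CrossEntropyLoss computed directly: -sum_x p(x) log q(x), with p one-hot.
p = np.zeros_like(logits)
p[target] = 1.0
ce = -(p * log_q).sum()
assert np.isclose(nll, ce)  # logSoftmax + NLLLoss == CrossEntropyLoss

# KL divergence with a soft target distribution:
# sum_k p_k (log p_k - log q_k) == cross-entropy(p, q) - entropy(p)
p_soft = np.array([0.7, 0.2, 0.1])
kl = (p_soft * (np.log(p_soft) - log_q)).sum()
ce_soft = -(p_soft * log_q).sum()
entropy = -(p_soft * np.log(p_soft)).sum()
assert np.isclose(kl, ce_soft - entropy)
```

Because the entropy term does not depend on the model, minimizing KL divergence against a fixed target distribution is equivalent to minimizing cross-entropy.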