# Deep Learning
# a)Method descriptions:
Build a CNN network and use it to predict hand gestures.
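The report itself does not reproduce the network definition, so the following is only a minimal sketch of what such a CNN gesture classifier might look like in PyTorch. The input size (1×64×64 grayscale), the number of classes (3), and the layer widths are assumptions for illustration and may differ from the actual reg_benjamin.py.

```python
# Minimal sketch of a CNN gesture classifier (assumed shapes, not the actual reg_benjamin.py).
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    def __init__(self, num_classes=3):  # number of classes is an assumption
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x64x64 -> 16x64x64
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 16x32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),   # -> 32x32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 32x16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)
        # Give LogSoftmax an explicit dim to avoid the deprecation warning seen in the log.
        self.logsoftmax = nn.LogSoftmax(dim=1)

    def forward(self, x):
        out = self.features(x)
        out = out.flatten(1)
        out = self.classifier(out)
        return self.logsoftmax(out)

model = GestureCNN()
print(model(torch.randn(4, 1, 64, 64)).shape)  # torch.Size([4, 3])
```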
# b)Source code explanations:
Explanations are provided as comments within the source code.
# c)Experimental results:
Epoch {1}
-------------------------------
Warning (from warnings module):
File "C:\Users\TW-Recruit\Desktop\深度\reg_benjamin.py", line 65
return self.logsoftmax(out)
UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
loss: 0.065235 [ 0/ 540]
loss: 3.464860 [ 30/ 540]
loss: 5.801835 [ 60/ 540]
loss: 6.929997 [ 90/ 540]
loss: 8.269307 [ 120/ 540]
loss: 0.029482 [ 150/ 540]
loss: 0.027458 [ 180/ 540]
loss: 2.933991 [ 210/ 540]
loss: 0.101763 [ 240/ 540]
loss: 2.714937 [ 270/ 540]
loss: 0.061495 [ 300/ 540]
loss: 5.872705 [ 330/ 540]
loss: 0.012855 [ 360/ 540]
loss: 2.500899 [ 390/ 540]
loss: 6.737005 [ 420/ 540]
loss: 5.976598 [ 450/ 540]
loss: 3.504619 [ 480/ 540]
loss: 7.985864 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Epoch {2}
-------------------------------
loss: 4.753898 [ 0/ 540]
loss: 0.018671 [ 30/ 540]
loss: 3.410061 [ 60/ 540]
loss: 0.035747 [ 90/ 540]
loss: 0.047177 [ 120/ 540]
loss: 0.025933 [ 150/ 540]
loss: 0.111312 [ 180/ 540]
loss: 3.929190 [ 210/ 540]
loss: 5.559365 [ 240/ 540]
loss: 2.515308 [ 270/ 540]
loss: 3.223219 [ 300/ 540]
loss: 3.876003 [ 330/ 540]
loss: 3.214012 [ 360/ 540]
loss: 4.303778 [ 390/ 540]
loss: 3.509511 [ 420/ 540]
loss: 0.029173 [ 450/ 540]
loss: 7.044427 [ 480/ 540]
loss: 3.585757 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Epoch {3}
-------------------------------
loss: 0.024747 [ 0/ 540]
loss: 5.028611 [ 30/ 540]
loss: 0.021400 [ 60/ 540]
loss: 5.498536 [ 90/ 540]
loss: 3.509511 [ 120/ 540]
loss: 0.007420 [ 150/ 540]
loss: 5.240959 [ 180/ 540]
loss: 2.243244 [ 210/ 540]
loss: 2.651394 [ 240/ 540]
loss: 5.364343 [ 270/ 540]
loss: 4.068261 [ 300/ 540]
loss: 6.352154 [ 330/ 540]
loss: 4.598450 [ 360/ 540]
loss: 0.008729 [ 390/ 540]
loss: 4.538314 [ 420/ 540]
loss: 0.049507 [ 450/ 540]
loss: 4.117374 [ 480/ 540]
loss: 5.559365 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Epoch {4}
-------------------------------
loss: 3.225171 [ 0/ 540]
loss: 4.171563 [ 30/ 540]
loss: 0.027249 [ 60/ 540]
loss: 2.741087 [ 90/ 540]
loss: 5.107327 [ 120/ 540]
loss: 4.053771 [ 150/ 540]
loss: 0.085870 [ 180/ 540]
loss: 0.019637 [ 210/ 540]
loss: 3.394980 [ 240/ 540]
loss: 3.182096 [ 270/ 540]
loss: 0.030831 [ 300/ 540]
loss: 5.398232 [ 330/ 540]
loss: 0.048474 [ 360/ 540]
loss: 0.025428 [ 390/ 540]
loss: 4.476646 [ 420/ 540]
loss: 5.801835 [ 450/ 540]
loss: 0.030398 [ 480/ 540]
loss: 0.024315 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Epoch {5}
-------------------------------
loss: 5.471781 [ 0/ 540]
loss: 6.473519 [ 30/ 540]
loss: 0.038137 [ 60/ 540]
loss: 2.714937 [ 90/ 540]
loss: 2.441057 [ 120/ 540]
loss: 2.658318 [ 150/ 540]
loss: 3.598749 [ 180/ 540]
loss: 5.231395 [ 210/ 540]
loss: 0.076402 [ 240/ 540]
loss: 4.265241 [ 270/ 540]
loss: 2.676218 [ 300/ 540]
loss: 3.642813 [ 330/ 540]
loss: 6.154784 [ 360/ 540]
loss: 0.028266 [ 390/ 540]
loss: 0.026811 [ 420/ 540]
loss: 6.068089 [ 450/ 540]
loss: 0.042659 [ 480/ 540]
loss: 0.059371 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Epoch {6}
-------------------------------
loss: 0.036411 [ 0/ 540]
loss: 3.888846 [ 30/ 540]
loss: 3.585757 [ 60/ 540]
loss: 5.112121 [ 90/ 540]
loss: 3.980577 [ 120/ 540]
loss: 5.711401 [ 150/ 540]
loss: 0.089743 [ 180/ 540]
loss: 4.249712 [ 210/ 540]
loss: 0.024166 [ 240/ 540]
loss: 0.028800 [ 270/ 540]
loss: 0.020797 [ 300/ 540]
loss: 0.054998 [ 330/ 540]
loss: 4.393620 [ 360/ 540]
loss: 4.587889 [ 390/ 540]
loss: 0.072655 [ 420/ 540]
loss: 8.086723 [ 450/ 540]
loss: 0.030410 [ 480/ 540]
loss: 3.054413 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Epoch {7}
-------------------------------
loss: 0.054998 [ 0/ 540]
loss: 2.606077 [ 30/ 540]
loss: 5.646023 [ 60/ 540]
loss: 4.988642 [ 90/ 540]
loss: 0.128827 [ 120/ 540]
loss: 0.039510 [ 150/ 540]
loss: 0.072655 [ 180/ 540]
loss: 5.276203 [ 210/ 540]
loss: 2.894662 [ 240/ 540]
loss: 4.393620 [ 270/ 540]
loss: 3.238527 [ 300/ 540]
loss: 4.781704 [ 330/ 540]
loss: 4.825458 [ 360/ 540]
loss: 0.010187 [ 390/ 540]
loss: 2.737911 [ 420/ 540]
loss: 5.169029 [ 450/ 540]
loss: 0.030363 [ 480/ 540]
loss: 4.576992 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Epoch {8}
-------------------------------
loss: 2.916328 [ 0/ 540]
loss: 3.882244 [ 30/ 540]
loss: 0.057494 [ 60/ 540]
loss: 3.049276 [ 90/ 540]
loss: 0.004427 [ 120/ 540]
loss: 0.059371 [ 150/ 540]
loss: 2.646603 [ 180/ 540]
loss: 2.645606 [ 210/ 540]
loss: 4.145267 [ 240/ 540]
loss: 6.411925 [ 270/ 540]
loss: 0.094388 [ 300/ 540]
loss: 0.013396 [ 330/ 540]
loss: 6.015685 [ 360/ 540]
loss: 3.841204 [ 390/ 540]
loss: 2.720193 [ 420/ 540]
loss: 5.908076 [ 450/ 540]
loss: 3.238726 [ 480/ 540]
loss: 3.039895 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Epoch {9}
-------------------------------
loss: 0.021725 [ 0/ 540]
loss: 3.692283 [ 30/ 540]
loss: 4.298891 [ 60/ 540]
loss: 4.587889 [ 90/ 540]
loss: 2.676075 [ 120/ 540]
loss: 3.858973 [ 150/ 540]
loss: 5.422315 [ 180/ 540]
loss: 4.377876 [ 210/ 540]
loss: 5.509836 [ 240/ 540]
loss: 3.464860 [ 270/ 540]
loss: 0.037129 [ 300/ 540]
loss: 3.509511 [ 330/ 540]
loss: 0.012486 [ 360/ 540]
loss: 0.019883 [ 390/ 540]
loss: 0.060805 [ 420/ 540]
loss: 0.131757 [ 450/ 540]
loss: 5.028611 [ 480/ 540]
loss: 2.916328 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Epoch {10}
-------------------------------
loss: 0.023889 [ 0/ 540]
loss: 3.642813 [ 30/ 540]
loss: 4.305746 [ 60/ 540]
loss: 2.803155 [ 90/ 540]
loss: 6.737005 [ 120/ 540]
loss: 0.019637 [ 150/ 540]
loss: 0.056732 [ 180/ 540]
loss: 5.102562 [ 210/ 540]
loss: 0.010071 [ 240/ 540]
loss: 0.038178 [ 270/ 540]
loss: 2.954634 [ 300/ 540]
loss: 0.034217 [ 330/ 540]
loss: 5.945993 [ 360/ 540]
loss: 0.042593 [ 390/ 540]
loss: 4.317887 [ 420/ 540]
loss: 6.785609 [ 450/ 540]
loss: 2.676075 [ 480/ 540]
loss: 5.811920 [ 510/ 540]
Accuracy: 33.3%, Avg loss: 3.201234
Done!
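The UserWarning printed at the start of epoch 1 comes from calling log_softmax without an explicit dimension (line 65 of reg_benjamin.py, `return self.logsoftmax(out)`). Without seeing the full file, a hedged sketch of the usual fix, assuming the output is a (batch, num_classes) tensor, is:

```python
import torch
import torch.nn.functional as F

out = torch.randn(4, 3)                 # (batch, num_classes) logits, for illustration only
log_probs = F.log_softmax(out, dim=1)   # explicit dim=1 removes the UserWarning
print(log_probs.shape)                  # torch.Size([4, 3])
```

The equivalent module form is `nn.LogSoftmax(dim=1)` in place of `nn.LogSoftmax()`.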
# d)Discussions on the results:
This assignment did not go well. Many parts still needed to be learned and debugged, but I did not manage my time well enough to work through them, and in the end I could not reach an accuracy above 35%. The test accuracy stayed at exactly 33.3% with the same average loss (3.201234) for all ten epochs, which suggests the model was effectively not learning.
# e)Concluding remarks:
The main problem this time was how to read in the data. I found a solution only after discussing it with classmates and looking at the work of the two classmates who had already submitted on the teacher's website. I also rewatched the teacher's videos several times, which gave me a clearer understanding of the CNN architecture. I hope to complete the next assignment successfully.
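Since data loading was the main obstacle, here is a minimal sketch of one common way to load image data in PyTorch. It assumes the gesture images are stored as files in one folder per class; the actual dataset format used in this assignment (and the image size and batch size) may well be different.

```python
# Minimal data-loading sketch using torchvision's ImageFolder.
# Folder layout ("data/train" with one sub-directory per gesture class),
# image size, and batch size are assumptions for illustration only.
import torch
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),  # match a single-channel CNN input
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("data/train", transform=transform)  # hypothetical path
train_loader = DataLoader(train_set, batch_size=30, shuffle=True)

for images, labels in train_loader:
    print(images.shape, labels.shape)  # e.g. torch.Size([30, 1, 64, 64]), torch.Size([30])
    break
```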