Backpropagation

Gradient Descent

  • Gradient Descent requires the partial derivative of the Loss Function with respect to every weight. We use Backpropagation to compute these derivatives (a minimal sketch of the update rule follows below).
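
As a minimal illustration of why these partial derivatives are needed (the symbols here are my own shorthand, not from the original note: θ collects all the weights, L is the Loss Function, and η is the learning rate), the Gradient Descent update consumes exactly the vector of partial derivatives that Backpropagation produces:

```latex
\nabla L(\theta) =
\begin{bmatrix}
  \partial L / \partial w_1 \\
  \partial L / \partial w_2 \\
  \vdots
\end{bmatrix},
\qquad
\theta^{t+1} = \theta^{t} - \eta \, \nabla L(\theta^{t})
```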

Computation

  • Conclusion:
    • Split the computation into a Forward Pass and a Backward Pass and compute them separately. The former is simply the neuron's input; the latter must be computed recursively.
    • The partial derivative of the cost function C with respect to z (a neuron's weighted input) can be viewed as a signal propagating backward through the Network: feed in the derivative from the output end, multiply by the weights and sum, then multiply by the derivative of the sigmoid function; the result is the derivative we want.
  • Derivation (see the worked equations after this list):
    • The partial derivative of the Loss with respect to a weight can be viewed as the sum of the partial derivatives of the per-example cost function with respect to that weight.
    • Split that partial derivative with the chain rule: the derivative of z with respect to the weight, times the derivative of the cost with respect to z.
    • Decompose the latter term into the derivative of the sigmoid, times the partial derivatives of the next layer's neurons, and compute it recursively.
    • Continue until the last layer, where we compute the derivative of the cost function with respect to the output.
  • Advantage
    Computing the derivatives in the forward direction requires a tree-like expansion that evaluates every term separately (Top-Down), whereas the backward direction avoids the repeated computation (similar to a Bottom-Up structure in algorithms); see the sketch at the end of this note.
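
The worked equations below restate the derivation above in symbols. The notation is my own shorthand, not from the original note: C is the per-example cost, w a weight, a the input that multiplies w, z the weighted input to the sigmoid, y the network output, and z'_k the weighted inputs of the next-layer neurons reached through weights w_k.

```latex
\begin{align}
  \frac{\partial L}{\partial w} &= \sum_{n} \frac{\partial C^{n}}{\partial w}
    && \text{(sum over training examples)} \\
  \frac{\partial C}{\partial w} &= \frac{\partial z}{\partial w}\,\frac{\partial C}{\partial z}
    = a\,\frac{\partial C}{\partial z}
    && \text{(the Forward Pass supplies } a\text{)} \\
  \frac{\partial C}{\partial z} &= \sigma'(z)\sum_{k} w_{k}\,\frac{\partial C}{\partial z'_{k}}
    && \text{(the Backward Pass recursion)} \\
  \frac{\partial C}{\partial z_{\mathrm{out}}} &= \sigma'(z_{\mathrm{out}})\,\frac{\partial C}{\partial y}
    && \text{(base case at the output layer)}
\end{align}
```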
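
As a complementary sketch, here is a minimal Python implementation of the two passes for a small fully-connected sigmoid network with a squared-error cost. Everything here (function names, layer sizes, the omission of bias gradients) is an illustrative assumption rather than code from the course; the point is that a single backward sweep caches every ∂C/∂z (the "delta") bottom-up, so each weight gradient is just a cached delta times the cached input.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(weights, biases, x):
    """Forward Pass: cache each layer's activation a and weighted input z."""
    a, activations, zs = x, [x], []
    for W, b in zip(weights, biases):
        z = W @ a + b          # weighted input to this layer
        a = sigmoid(z)         # output of this layer
        zs.append(z)
        activations.append(a)
    return activations, zs

def backward(weights, activations, zs, y):
    """Backward Pass: one bottom-up sweep that reuses cached deltas.
    Bias gradients are omitted for brevity."""
    grads_W = [None] * len(weights)
    # Base case at the output: dC/dz = sigma'(z) * dC/dy, with C = 0.5 * ||a - y||^2
    delta = sigmoid(zs[-1]) * (1 - sigmoid(zs[-1])) * (activations[-1] - y)
    grads_W[-1] = np.outer(delta, activations[-2])
    # Recursion: dC/dz = sigma'(z) * (next layer's weights)^T @ (next layer's deltas)
    for l in range(len(weights) - 2, -1, -1):
        sp = sigmoid(zs[l]) * (1 - sigmoid(zs[l]))
        delta = sp * (weights[l + 1].T @ delta)   # reuse the cached delta, no recomputation
        grads_W[l] = np.outer(delta, activations[l])
    return grads_W

# Usage example: one Gradient Descent step on a tiny 3-2-1 network
rng = np.random.default_rng(0)
sizes = [3, 2, 1]
weights = [rng.standard_normal((m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal(m) for m in sizes[1:]]
x, y = rng.standard_normal(3), np.array([1.0])

activations, zs = forward(weights, biases, x)
grads = backward(weights, activations, zs, y)
eta = 0.1                                         # learning rate
weights = [W - eta * g for W, g in zip(weights, grads)]
```

The backward loop never walks past layer l+1: everything downstream is already summarized in the cached delta, which is exactly the Bottom-Up saving described in the Advantage bullet above.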
tags: ML2020