# 4.2.5 Python Implementation of a Two-Layer Neural Network

* Use `TwoLayerNN` to represent a two-layer neural network.

```python
# Call the train function to train the model
nn = TwoLayerNN(2, 100, 3)
W1, b1, W2, b2 = nn.train(X, y)
```

Training output:

```
iteration 0: loss 1.098627
iteration 1000: loss 0.115216
iteration 2000: loss 0.053218
iteration 3000: loss 0.038299
iteration 4000: loss 0.031767
iteration 5000: loss 0.028016
iteration 6000: loss 0.025411
iteration 7000: loss 0.023476
iteration 8000: loss 0.022009
iteration 9000: loss 0.020872
training accuracy: 0.99
```

```python
# Plot the decision boundary
import numpy as np
import matplotlib.pyplot as plt

h = 0.02
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                     np.arange(y_min, y_max, h))
XX = np.c_[xx.ravel(), yy.ravel()]
Z = nn.predict(XX)
Z = np.argmax(Z, axis=1)
Z = Z.reshape(xx.shape)
fig = plt.figure()
plt.contourf(xx, yy, Z, cmap=plt.cm.Spectral, alpha=0.8)
plt.scatter(X[:, 0], X[:, 1], c=y, s=20, cmap=plt.cm.spring)
plt.xlim(xx.min(), xx.max())
plt.ylim(yy.min(), yy.max())
plt.show()
```

# 4.2.6 Backward Differentiation for a Neural Network with an Arbitrary Number of Layers

* The weighted sum of layer $l$ is $z^{[l]} = W^{[l]} a^{[l-1]} + b^{[l]}$.
* The partial derivatives of the loss function with respect to each quantity are derived with vectorized weighted sums via the chain rule; see equations 4-61 to 4-66 in the book.
* Key formulas for backward differentiation:
![image](https://hackmd.io/_uploads/rJAmyilpp.png)
* Activation function for binary or multi-class classification: sigmoid or softmax.

Question: What does the code above do?
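The linked image in §4.2.6 holds the key backward-differentiation formulas (equations 4-61 to 4-66 in the book). For reference, the standard vectorized form of these recursions is given below; this is a reconstruction under the notation $z^{[l]} = W^{[l]} a^{[l-1]} + b^{[l]}$ with activation $g$, loss $J$, and error term $\delta^{[l]} = \partial J / \partial z^{[l]}$, and may differ cosmetically from the book's exact presentation.

```latex
\begin{aligned}
\delta^{[L]} &= a^{[L]} - y
  && \text{(output layer, softmax or sigmoid with cross-entropy)} \\
\delta^{[l]} &= \left(W^{[l+1]}\right)^{\top} \delta^{[l+1]} \odot g'\!\left(z^{[l]}\right)
  && \text{(propagate the error backward)} \\
\frac{\partial J}{\partial W^{[l]}} &= \delta^{[l]} \left(a^{[l-1]}\right)^{\top},
\qquad
\frac{\partial J}{\partial b^{[l]}} = \delta^{[l]}
  && \text{(parameter gradients)}
\end{aligned}
```

Each layer's gradients thus need only its own cached $z^{[l]}$ and $a^{[l-1]}$ plus the error $\delta^{[l+1]}$ from the layer above, which is why one backward sweep computes all gradients.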
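The `TwoLayerNN` class used in §4.2.5 is defined in the book; as a reference, here is a minimal sketch of what such a class might look like, assuming a ReLU hidden layer, a softmax output, cross-entropy loss, L2 regularization, and plain gradient descent. The constructor signature `TwoLayerNN(n_in, n_hidden, n_out)` matches the call `TwoLayerNN(2, 100, 3)` above; the learning rate, regularization strength, and iteration count are assumptions, not the book's values.

```python
import numpy as np

class TwoLayerNN:
    """Hypothetical two-layer network: input -> ReLU hidden -> softmax."""

    def __init__(self, n_in, n_hidden, n_out, lr=1.0, reg=1e-3):
        rng = np.random.default_rng(0)
        self.W1 = 0.01 * rng.standard_normal((n_in, n_hidden))
        self.b1 = np.zeros((1, n_hidden))
        self.W2 = 0.01 * rng.standard_normal((n_hidden, n_out))
        self.b2 = np.zeros((1, n_out))
        self.lr, self.reg = lr, reg

    def predict(self, X):
        # Forward pass: ReLU hidden layer, softmax output probabilities.
        hidden = np.maximum(0, X @ self.W1 + self.b1)
        scores = hidden @ self.W2 + self.b2
        e = np.exp(scores - scores.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    def train(self, X, y, iters=10000):
        n = X.shape[0]
        for i in range(iters):
            # Forward pass (cached for the backward pass)
            hidden = np.maximum(0, X @ self.W1 + self.b1)
            scores = hidden @ self.W2 + self.b2
            e = np.exp(scores - scores.max(axis=1, keepdims=True))
            probs = e / e.sum(axis=1, keepdims=True)
            loss = (-np.log(probs[np.arange(n), y]).mean()
                    + 0.5 * self.reg * ((self.W1**2).sum() + (self.W2**2).sum()))
            if i % 1000 == 0:
                print(f"iteration {i}: loss {loss:.6f}")
            # Backward pass: for softmax + cross-entropy, dL/dz2 = probs - onehot(y)
            dz2 = probs.copy()
            dz2[np.arange(n), y] -= 1
            dz2 /= n
            dW2 = hidden.T @ dz2 + self.reg * self.W2
            db2 = dz2.sum(axis=0, keepdims=True)
            dhidden = dz2 @ self.W2.T
            dhidden[hidden <= 0] = 0          # ReLU gradient gate
            dW1 = X.T @ dhidden + self.reg * self.W1
            db1 = dhidden.sum(axis=0, keepdims=True)
            # Gradient-descent update
            self.W1 -= self.lr * dW1; self.b1 -= self.lr * db1
            self.W2 -= self.lr * dW2; self.b2 -= self.lr * db2
        acc = (self.predict(X).argmax(axis=1) == y).mean()
        print(f"training accuracy: {acc:.2f}")
        return self.W1, self.b1, self.W2, self.b2
```

With this sketch, the training call and decision-boundary code from §4.2.5 run unchanged on any 2-feature, 3-class dataset `X`, `y`.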
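The recursion in §4.2.6 extends the two-layer case to any depth: one forward sweep caches each layer's $z^{[l]}$ and $a^{[l]}$, then one backward sweep applies the delta recursion. A hedged sketch of that pattern, assuming ReLU hidden layers, a softmax output, and mean cross-entropy loss (column-vector convention: each column of the data matrix is one example; function names here are illustrative, not the book's):

```python
import numpy as np

def forward(Ws, bs, X):
    """One forward sweep: cache pre-activations Z and activations A per layer."""
    A = [X]
    Z = []
    L = len(Ws)
    for l in range(L):
        z = Ws[l] @ A[-1] + bs[l]          # z[l] = W[l] a[l-1] + b[l]
        Z.append(z)
        if l < L - 1:
            A.append(np.maximum(0, z))     # ReLU hidden layer
        else:
            e = np.exp(z - z.max(axis=0, keepdims=True))
            A.append(e / e.sum(axis=0, keepdims=True))  # softmax output
    return Z, A

def backward(Ws, Z, A, Y):
    """One backward sweep: gradients of mean cross-entropy w.r.t. each W, b."""
    m = Y.shape[1]                         # number of examples (columns)
    L = len(Ws)
    dW, db = [None] * L, [None] * L
    delta = (A[-1] - Y) / m                # delta[L] for softmax + cross-entropy
    for l in reversed(range(L)):
        dW[l] = delta @ A[l].T             # dJ/dW[l] = delta[l] a[l-1]^T
        db[l] = delta.sum(axis=1, keepdims=True)
        if l > 0:                          # delta[l-1] = W[l]^T delta[l] ⊙ ReLU'(z[l-1])
            delta = (Ws[l].T @ delta) * (Z[l - 1] > 0)
    return dW, db
```

Because `Ws` and `bs` are plain lists, the same two functions handle a network of any depth; only the list lengths change.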