---
title: 'Deep Learning Study Notes - 6'
disqus: hackmd
---
Deep Learning Study Notes
Part 6
===


---
## CNN CODE
---
1. As usual, MNIST is the running example
2. Editor: Jupyter Notebook
3. Choose a comfortable chair :kissing_heart:
4. Start learning!
----
CNN workflow
---
1. Input an image
2. Convolution layer
3. Max pooling
4. Repeat steps 2 & 3
5. Flatten
6. DNN (fully connected layers)
---
CNN workflow
---

1. Convolution2D(25,3,3): 25 filters, each 3x3
2. Input_shape(1,28,28): 1 color channel
(3 for an RGB image) // 28x28 is the image size
3. MaxPooling2D(2,2): pool each 2x2 block of the feature map (see the sketch below)
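
A minimal Keras sketch of just these layers, assuming the standalone Keras API used later in this note, a channels-last input of shape (28,28,1), and 'valid' (no-padding) convolutions so the sizes match the next two slides; the 25/50 filter counts come from the slides, the rest is illustrative:
```python=
# Sketch of the convolution / max-pooling stack described above.
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D

sketch = Sequential()
sketch.add(Conv2D(25, (3, 3), activation='relu', padding='valid',
                  input_shape=(28, 28, 1)))                      # 28x28x1  -> 26x26x25
sketch.add(MaxPooling2D(pool_size=(2, 2)))                       # 26x26x25 -> 13x13x25
sketch.add(Conv2D(50, (3, 3), activation='relu', padding='valid'))  # 13x13x25 -> 11x11x50
sketch.add(MaxPooling2D(pool_size=(2, 2)))                       # 11x11x50 -> 5x5x50
sketch.summary()                                                 # prints the shapes above
```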
----
CNN workflow: image size at each step
---

1. Input = 28x28 / Conv1 = 26x26
2. Max1 = 13x13 / Conv2 = 11x11
3. Max2 = 5x5
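
These numbers follow from two simple rules (stated here as an aid, not taken from the slides): a 3x3 convolution without padding shrinks each side from n to n - 3 + 1, and 2x2 max pooling halves it, rounding down. A quick check:
```python=
# Reproduce the size changes above for 3x3 'valid' convolutions and 2x2 pooling.
def conv_out(n, k=3):
    return n - k + 1      # no padding, stride 1

def pool_out(n, p=2):
    return n // p         # 2x2 max pooling, floor division

sizes = [28]                                   # Input
for step in (conv_out, pool_out, conv_out, pool_out):
    sizes.append(step(sizes[-1]))
print(sizes)                                   # [28, 26, 13, 11, 5]
```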
----
CNN workflow: number of feature maps at each step
---

1. Input = 1
2. Conv1 = 25 / Max1 = 25
3. Conv2 = 50 / Max2 = 50
----
Flatten
---
:::success
Once the convolution - max pooling cycle is finished,
flatten the feature maps and feed them into the fully connected layers
:::
1. Flatten() turns the 2-D feature maps into a 1-D vector
2. model.add(Flatten())
3. After Flatten comes an ordinary DNN (see the sketch below)
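
A minimal sketch of this step, assuming the 5x5x50 feature maps from the earlier slides (the classifier size is illustrative):
```python=
# Flatten the 5x5x50 feature maps into one long vector, then classify it
# with an ordinary fully connected (DNN) softmax layer.
from keras.models import Sequential
from keras.layers import Flatten, Dense

head = Sequential()
head.add(Flatten(input_shape=(5, 5, 50)))         # 5*5*50 = 1250 values
head.add(Dense(units=10, activation='softmax'))   # 10 digit classes
```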
---
Analyzing the filters
---

1. The filters help us pick out features
2. Take the k-th filter; its output is an 11x11 feature map
3. We can use gradient ascent to find the input
that activates this filter the most (see the sketch below)
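
A minimal gradient-ascent sketch of this idea, assuming TensorFlow 2.x / tf.keras rather than the older Keras used elsewhere in this note; the layer name, step count, and learning rate are placeholders, not from the course:
```python=
# Start from a random image and climb the gradient of the k-th filter's
# mean activation, so the image turns into whatever excites that filter most.
import numpy as np
import tensorflow as tf

def visualize_filter(model, layer_name, filter_index, steps=40, lr=1.0):
    extractor = tf.keras.Model(model.inputs,
                               model.get_layer(layer_name).output)
    x = tf.Variable(np.random.uniform(0, 1, (1, 28, 28, 1)).astype('float32'))
    for _ in range(steps):
        with tf.GradientTape() as tape:
            activation = extractor(x)
            # "degree of activation" of filter k = mean of its feature map
            score = tf.reduce_mean(activation[:, :, :, filter_index])
        grad = tape.gradient(score, x)
        x.assign_add(lr * grad)        # ascent: move x to increase the score
    return x.numpy()[0, :, :, 0]
```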
----
Analyzing the filters (gradient ascent)
---

1. Doing this shows that each filter is in charge of detecting some small texture
2. **They do not look like digits at all** :octopus:
----
Analyzing the filters (gradient ascent)
---

1. But if we do the same thing for the neurons in the DNN (fully connected) part,
the results are no longer small textures; they look much more like whole patterns
2. That is because a fully connected neuron sees the entire image, not just a patch
----
Analyzing the filters (gradient ascent)
---
**The resulting outputs still do not look like digits**
**they look more like TV static**

----
Analyzing the filters (gradient ascent)
---

1. How do we make the result look more like a digit? **Add some constraints on x**
2. In a digit image only a small part contains pen strokes
and the rest is blank (white), so we suppress that white part (see the objective below)
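
One common way to write such a constraint (a hedged reconstruction, not taken from the slides; $a^{k}_{ij}$ is the k-th filter's output and $\lambda$ an assumed weighting factor) is to penalize the total pixel intensity while maximizing the activation, so that most of the image stays blank:

$$
x^{*}=\arg\max_{x}\Big(\sum_{i,j}a^{k}_{ij}-\lambda\sum_{i,j}\lvert x_{ij}\rvert\Big)
$$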
---
Deep Dream (a CNN application)
---
1. Give a photo, the machine adds what it sees
2. Feed an image into the network, pick out some of the values it computes,
then polarize them (positive ones become more positive, negative ones more negative); a rough sketch follows below
:::info
Polarization example
Original : w1 = +3 , w2 = -1
After : w1 = +6 , w2 = -4
:::
[Deep Dream website](http://deepdreamgenerator.com/)
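
A rough sketch of that idea (an assumed TensorFlow 2.x implementation, not the course's or the website's code; the chosen layer and step size are placeholders, and `photo` is assumed to be a float32 tensor of shape (1, H, W, C)):
```python=
# One Deep Dream style update: push the chosen layer's values to be more
# extreme, then nudge the photo itself in the direction of that gradient.
import tensorflow as tf

def dream_step(model, layer_name, photo, lr=0.01):
    extractor = tf.keras.Model(model.inputs,
                               model.get_layer(layer_name).output)
    with tf.GradientTape() as tape:
        tape.watch(photo)
        activation = extractor(photo)
        # Squaring rewards pushing positive values higher and negative values
        # lower, i.e. the "polarization" described above.
        score = tf.reduce_sum(tf.square(activation))
    grad = tape.gradient(score, photo)
    return photo + lr * grad / (tf.norm(grad) + 1e-8)   # gradient ascent on the photo
```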
----
Deep Dream
---
**Original image**

**Deep Dream**

----
Deep Style
---
**Original image**

**Deep Style**

----
Deep Style
---
1. Find an image (the output) that best satisfies the condition of
*looking as much as possible like my original input image*
2. while its filter values are as close as possible to those of the chosen artwork's style
3. style = these filter values (see the objective sketch below)
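
A hedged way to write this as an objective (a standard formulation, not taken from the slides; $L_{\text{content}}$ compares layer outputs against the original photo, $L_{\text{style}}$ compares filter-output statistics against the artwork, and $\lambda$ is an assumed weighting factor):

$$
x^{*}=\arg\min_{x}\Big(L_{\text{content}}(x,\ \text{photo})+\lambda\,L_{\text{style}}(x,\ \text{art})\Big)
$$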

---
CNN and Go
---
1. Besides images, CNNs are also somewhat relevant to the game of Go
2. First property: some patterns can be recognized without looking at the whole board
Ex. atari (a group with only one liberty left)
3. Second property: the same pattern appearing at different positions means the same thing
----
CNN and Go
---
Point 1

Point 2

----
CNN and Go
---
1. Is max pooling relevant to Go?
2. Probably not; AlphaGo does not use max pooling
3. CNNs can also be applied to NLP and speech; the figure below is a speech spectrogram

---
MNIST CNN CODE1
```python=
# import the required libraries
import numpy as np
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.layers import Conv2D, MaxPooling2D, Flatten
from keras.optimizers import SGD, Adam
from keras.utils import np_utils
from keras.datasets import mnist
import matplotlib.pyplot as plt
```
----
MNIST CNN CODE2
```python=
# labels are one-hot encoded for categorical_crossentropy
def load_data():
    (x_train, y_train), (x_test, y_test) = mnist.load_data()
    number = 10000                     # use only the first 10000 training images
    x_train = x_train[0:number]
    y_train = y_train[0:number]
    # reshape to (samples, 28, 28, 1) for the channels-last Conv2D layers
    x_train = x_train.reshape(number, 28, 28, 1)
    x_test = x_test.reshape(x_test.shape[0], 28, 28, 1)
    x_train = x_train.astype('float32')
    x_test = x_test.astype('float32')
    # convert class vectors to binary class matrices
    y_train = np_utils.to_categorical(y_train, 10)
    y_test = np_utils.to_categorical(y_test, 10)
    # x_test = np.random.normal(x_test)   # optional: add noise to the test set
    x_train = x_train / 255              # scale pixel values to [0, 1]
    x_test = x_test / 255
    return (x_train, y_train), (x_test, y_test)

def show_train_history(train_history, train, validation):
    # plot the training and validation curves recorded by model.fit()
    plt.plot(train_history.history[train])
    plt.plot(train_history.history[validation])
    plt.title('Train History')
    plt.ylabel(train)
    plt.xlabel('Epoch')
    plt.legend(['train', 'validation'], loc='upper left')
    plt.show()
```
----
MNIST CNN CODE3
```python=
(x_train, y_train), (x_test, y_test) = load_data()

# define the network structure: two convolution + max-pooling stages,
# then flatten and a softmax classifier
model = Sequential()
model.add(Conv2D(25, (3, 3), activation='relu', padding='same',
                 input_shape=(28, 28, 1)))      # 25 filters; 'same' padding keeps 28x28
model.add(MaxPooling2D(pool_size=(2, 2)))       # 28x28 -> 14x14
# model.add(Dropout(0.5))
model.add(Conv2D(50, (3, 3), activation='relu', padding='same'))
model.add(MaxPooling2D(pool_size=(2, 2)))       # 14x14 -> 7x7
model.add(Flatten())
model.add(Dense(units=10, activation='softmax'))

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])
train_history = model.fit(x_train, y_train, batch_size=100, epochs=20,
                          validation_split=0.2)

result = model.evaluate(x_train, y_train)
print('Train Accuracy:', result[1])
result = model.evaluate(x_test, y_test)
print('Test Accuracy:', result[1])

show_train_history(train_history, 'acc', 'val_acc')
show_train_history(train_history, 'loss', 'val_loss')
```
----
MNIST CNN CODE4
```python=
from keras.models import load_model

# save the trained model to an HDF5 file ...
model.save('CNN-1.h5')

# ... and load it back to confirm the file is usable
restored = load_model('CNN-1.h5')
if restored:
    print('Yeee~')
```
----
More resources:
---
:::success
Use your finger :100:
:::

----
## Next Lesson ...
1. Tips for deep learning
###### tags: `Deep learning` `beginner` `python` `keras` `tutorial`