# 6/30 PM
## Current Situation
### Paper reading
[Neural Architecture Search with Reinforcement Learning (NAS)](https://arxiv.org/abs/1611.01578)
* An RNN (the controller) generates the hyperparameters of a convolutional network; that network is then trained and validated, and the resulting accuracy is fed back as a reward to update the RNN's weights. (The convolutional network is trained on CIFAR-10.)

* The kinds of hyperparameters generated by the RNN include:

* To increase the RNN's expressiveness, anchor points similar to those in GoogleNet and Residual Net are added to the original architecture, letting the controller decide which of the previous N-1 layers are used as inputs to the current layer N.

* The detailed steps inside each RNN cell (this is the part I find hardest to follow)

* The convolutional network:
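The sample-train-reward loop described above can be sketched with a heavily simplified REINFORCE controller: a single stateless softmax policy over one hyperparameter (filter height) instead of the paper's RNN, and a toy reward standing in for CIFAR-10 validation accuracy. `toy_reward`, the candidate values, and all constants here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy search space: one hyperparameter (filter height) instead of the
# paper's full per-layer sequence; a stateless softmax policy instead of
# the RNN controller; a fabricated reward in place of CIFAR-10 accuracy.
choices = [1, 3, 5, 7]           # candidate filter heights
logits = np.zeros(len(choices))  # controller parameters theta
lr, baseline = 0.5, 0.0

def toy_reward(h):
    # Stand-in for "train the sampled CNN and measure validation accuracy".
    return 1.0 if h == 5 else 0.2

for step in range(500):
    probs = np.exp(logits) / np.exp(logits).sum()
    a = rng.choice(len(choices), p=probs)    # sample an architecture
    r = toy_reward(choices[a])               # "train and validate" it
    baseline = 0.9 * baseline + 0.1 * r      # EMA baseline reduces variance
    grad = -probs
    grad[a] += 1.0                           # d log pi(a) / d logits
    logits += lr * (r - baseline) * grad     # REINFORCE update

# The policy should concentrate on the highest-reward choice (filter height 5).
print(choices[int(np.argmax(logits))])
```

The EMA baseline plays the role of the moving-average reward baseline the paper uses to reduce the variance of the policy-gradient estimate.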

### Programming
[English-to-Chinese translation implementation](https://leemeng.tw/neural-machine-translation-with-transformer-and-tensorflow2.html)
* The building-block functions are done, including Matmul, Scale, Mask, etc., as well as Multi-Head Attention.


* However, a bug appears while building the Encoder and Decoder, and I have not found a fix yet. (I tried switching the tensorflow version, but it did not help.)
```
InvalidArgumentError Traceback (most recent call last)
<ipython-input-35-5a7154c0fb1c> in <module>
5 enc_layer = EncoderLayer(d_model, num_heads, dff)
6 padding_mask = create_padding_mask(inp)
----> 7 enc_out = enc_layer(emb_inp, training=False, mask=padding_mask)
8
9 print("inp:", inp)
~\anaconda3\lib\site-packages\tensorflow\python\keras\engine\base_layer.py in __call__(self, *args, **kwargs)
966 with base_layer_utils.autocast_context_manager(
967 self._compute_dtype):
--> 968 outputs = self.call(cast_inputs, *args, **kwargs)
969 self._handle_activity_regularization(inputs, outputs)
970 self._set_mask_metadata(inputs, outputs, input_masks)
<ipython-input-28-bacdd11d1ac9> in call(self, x, training, mask)
13
14 def call(self, x, training, mask):
---> 15 attn_output, attn = self.mha(x, x, x, mask)
16 attn_output = self.dropout1(attn_output, training=training)
17 out1 = self.layernorm1(x + attn_output)
~\anaconda3\lib\site-packages\tensorflow\python\keras\engine\base_layer.py in __call__(self, *args, **kwargs)
966 with base_layer_utils.autocast_context_manager(
967 self._compute_dtype):
--> 968 outputs = self.call(cast_inputs, *args, **kwargs)
969 self._handle_activity_regularization(inputs, outputs)
970 self._set_mask_metadata(inputs, outputs, input_masks)
<ipython-input-25-f272a694a943> in call(self, v, k, q, mask)
28 v = self.split_heads(v, batch_size)
29
---> 30 scaled_attention, attention_weights = scaled_dot_product_attention(q, k, v, mask)
31 scaled_attention = tf.transpose(scaled_attention, perm=[0, 2, 1, 3])
32 concat_attention = tf.reshape(scaled_attention, (batch_size, -1, self.d_model))
<ipython-input-20-ffac29a2bec7> in scaled_dot_product_attention(q, k, v, mask)
6
7 if mask is not None:
----> 8 scaled_attention_logits += (mask * -1e9)
9
10 attention_weights = tf.nn.softmax(scaled_attention_logits, axis=-1)
~\anaconda3\lib\site-packages\tensorflow\python\ops\math_ops.py in binary_op_wrapper(x, y)
982 with ops.name_scope(None, op_name, [x, y]) as name:
983 if isinstance(x, ops.Tensor) and isinstance(y, ops.Tensor):
--> 984 return func(x, y, name=name)
985 elif not isinstance(y, sparse_tensor.SparseTensor):
986 try:
~\anaconda3\lib\site-packages\tensorflow\python\ops\math_ops.py in _add_dispatch(x, y, name)
1274 return gen_math_ops.add(x, y, name=name)
1275 else:
-> 1276 return gen_math_ops.add_v2(x, y, name=name)
1277
1278
~\anaconda3\lib\site-packages\tensorflow\python\ops\gen_math_ops.py in add_v2(x, y, name)
478 pass # Add nodes to the TensorFlow graph.
479 except _core._NotOkStatusException as e:
--> 480 _ops.raise_from_not_ok_status(e, name)
481 # Add nodes to the TensorFlow graph.
482 _, _, _op, _outputs = _op_def_library._apply_op_helper(
~\anaconda3\lib\site-packages\tensorflow\python\framework\ops.py in raise_from_not_ok_status(e, name)
6651 message = e.message + (" name: " + name if name is not None else "")
6652 # pylint: disable=protected-access
-> 6653 six.raise_from(core._status_to_exception(e.code, message), None)
6654 # pylint: enable=protected-access
6655
~\anaconda3\lib\site-packages\six.py in raise_from(value, from_value)
InvalidArgumentError: Incompatible shapes: [2,2,4,4] vs. [2,1,1,8] [Op:AddV2]
```
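One plausible reading of `Incompatible shapes: [2,2,4,4] vs. [2,1,1,8]`: the attention logits have shape `(batch, num_heads, seq_len_q, seq_len_k) = (2, 2, 4, 4)`, while the padding mask has shape `(batch, 1, 1, 8)`, i.e. it was built from a sequence of length 8 (`inp`) while the embedded input fed to the encoder layer has length 4 — so the mask's last axis no longer matches `seq_len_k` and the broadcast fails. A minimal NumPy sketch (not the notebook's TensorFlow code; TF broadcasting follows NumPy's rules) of the shape contract:

```python
import numpy as np

# Minimal sketch of scaled dot-product attention to show the shape contract:
# the padding mask (batch, 1, 1, seq_len_k) must broadcast against logits
# of shape (batch, num_heads, seq_len_q, seq_len_k).
def scaled_dot_product_attention(q, k, v, mask=None):
    d_k = q.shape[-1]
    logits = q @ k.transpose(0, 1, 3, 2) / np.sqrt(d_k)
    if mask is not None:
        logits = logits + mask * -1e9  # requires mask's last axis == seq_len_k
    # Numerically stable softmax over the key axis.
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

batch, heads, seq_len, depth = 2, 2, 4, 16
q = k = v = np.ones((batch, heads, seq_len, depth))

good_mask = np.zeros((batch, 1, 1, seq_len))  # last axis matches seq_len_k = 4
out, w = scaled_dot_product_attention(q, k, v, good_mask)
print(out.shape)  # (2, 2, 4, 16)

bad_mask = np.zeros((batch, 1, 1, 8))  # last axis 8, as in the traceback
try:
    scaled_dot_product_attention(q, k, v, bad_mask)
except ValueError:
    print("broadcast failed, same mismatch as the InvalidArgumentError above")
```

If this diagnosis is right, the fix is to make sure the mask is created from the same (padded) sequence that is embedded and passed to the encoder, so both share one `seq_len`.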
### Others (e.g., projects)
* TOEIC practice questions
---
## Next Step
### Paper reading
### Programming
English-to-Chinese translation implementation
### Others
TOEIC