# Intel Movidius NCS 2
## Setting up Raspberry Pi
**Note: The NCS2 currently doesn't support the Raspberry Pi**
### Configure locale
Edit the file `/etc/locale.gen`, uncomment the lines for the locales you want and run
```
sudo locale-gen
sudo dpkg-reconfigure locales
```
### Install some needed packages (optional)
#### vim
```
sudo apt-get update
sudo apt-get install vim
```
Add all the lines below to `~/.vimrc` (e.g. open it with `vim ~/.vimrc`)
```
" ~/.vimrc
syntax on
set t_Co=256
set encoding=utf-8
set nu
set ts=4
set shiftwidth=4
set smartindent
```
## Setting host in Virtualbox
Download Ubuntu 16.04 LTS from [here](https://www.ubuntu-tw.org/modules/tinyd0/)
Download the Oracle VM Extension Pack from [here](https://www.virtualbox.org/wiki/Downloads)
**Note: The version of the Extension Pack must match the version of VirtualBox you are using!**
Create an Ubuntu 16.04 VM (the settings below are all personal choices)
* Choose 4096 MB of RAM for now
* Select 2 virtual CPUs
* Set up a 40GB dynamically allocated VDI
### VM USB passthrough setting
Go to the *"Settings"* for your VM and edit *"Ports > USB"* to select a *"USB 3.0 (xHCI) Controller"*
You need to set **USB2** and **USB3** device filters for the Movidius to stay seamlessly connected.
To do this, click *"Add new USB filter"* in the filter list

Create two USB Device Filters. Most of the fields can be left blank except Name and Vendor ID.
* Name: **Movidius1**, Vendor ID: **03e7**
* Name: **Movidius2**, Vendor ID: **040e**
Now boot the Ubuntu 16.04 VM; the OS installation might take a while...
### After ubuntu 16.04 has been installed
#### Update the system
```
sudo apt-get update
sudo apt-get upgrade
```
#### Install guest additions
Go to the *"Devices"* menu of VirtualBox and click *"Insert Guest Additions CD Image…"*
#### Install openVINO toolkit
Download the latest toolkit from [here](https://software.seek.intel.com/openvino-toolkit)
After downloading, open the terminal and run these commands
```
cd ~/Downloads
tar xvf l_openvino_toolkit_<VERSION>.tgz
cd l_openvino_toolkit_<VERSION>
sudo ./install_cv_sdk_dependencies.sh
./install_GUI.sh
```
#### Setup environment variable
1. Edit `.bashrc` in `<user_directory>`
```
vi ~/.bashrc
```
2. Add this line at the end of the file
```
source ~/intel/computer_vision_sdk/bin/setupvars.sh
```
3. To test your change, open a new terminal. You should see
```
[setupvars.sh] OpenVINO environment initialized
```
#### Configure model optimizer
```
cd ~/intel/computer_vision_sdk/deployment_tools/model_optimizer/install_prerequisites
sudo ./install_prerequisites.sh
```
**Note:** If the error message `locale.Error: unsupported locale setting` pops up, enter these commands and then re-run the commands above
```
export LC_ALL="en_US.UTF-8"
export LC_CTYPE="en_US.UTF-8"
sudo dpkg-reconfigure locales
```
### Additional step (required)
Add yourself to the `users` group; the USB rules installed below grant this group access to the device
```
sudo usermod -a -G users "$(whoami)"
```
To perform inference on the Movidius NCS, install the USB rules as follows
```
cat <<EOF > 97-usbboot.rules
SUBSYSTEM=="usb", ATTRS{idProduct}=="2150", ATTRS{idVendor}=="03e7", GROUP="users", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1"
SUBSYSTEM=="usb", ATTRS{idProduct}=="2485", ATTRS{idVendor}=="03e7", GROUP="users", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1"
SUBSYSTEM=="usb", ATTRS{idProduct}=="f63b", ATTRS{idVendor}=="03e7", GROUP="users", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1"
EOF
```
Then use the following commands
```
sudo cp 97-usbboot.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules
sudo udevadm trigger
sudo ldconfig
rm 97-usbboot.rules
```
## Compiling Tensorflow model with openVINO
### Training
When writing your training code, add the lines marked with `+` below
They will save your trained model
```
+ saver = tf.train.Saver()
...
with tf.Session() as sess:
+     save_path = saver.save(sess, './model')
```
After training completes, there will be four files in your current directory
```
model.data-00000-of-00001
model.index
model.meta
checkpoint
```
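To sanity-check that training produced all four files, a small pure-Python helper (hypothetical, matching the `./model` Saver prefix used above) can verify them:

```python
import os

def checkpoint_complete(prefix, directory="."):
    """Check that a TF 1.x Saver wrote all four checkpoint files.

    Assumes a single-shard checkpoint, as produced by the code above.
    """
    expected = [
        prefix + ".data-00000-of-00001",
        prefix + ".index",
        prefix + ".meta",
        "checkpoint",
    ]
    return all(os.path.exists(os.path.join(directory, f)) for f in expected)
```

For example, `checkpoint_complete("model")` should return `True` when run in the training directory after a successful save.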
### Create inference model
Copy your training code to an inference-only script with the following command
```
cp <your_training_code> <any_name_you_want_for_inference_only_code>
```
Remove all training-specific code from the copied inference-only script, such as
```
dropout layers
training-data loading
cross-entropy loss
accuracy computation
all placeholders except the input
... etc
```
Now remove all the code below `with tf.Session() as sess:` and add the lines marked with `+`
```
with tf.Session() as sess:
+     sess.run(init)
+     saver.restore(sess, './model')
+     save_path = saver.save(sess, './model_inference')
```
If the inference code runs without error, there will likewise be four files in the current directory
```
model_inference.data-00000-of-00001
model_inference.index
model_inference.meta
checkpoint
```
These four files are **very important**: the `openVINO` toolkit will convert them into **IR** (Intermediate Representation, a graph format understandable by the Movidius NCS2).
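The IR is a pair of files: an `.xml` describing the network topology and a `.bin` holding the weights. A heavily trimmed sketch of what the `.xml` looks like, and how to list its layers with the standard library (the real file carries many more attributes; this example is illustrative only):

```python
import xml.etree.ElementTree as ET

# Trimmed, illustrative example of an IR topology file (.xml).
IR_XML = """
<net name="model_inference" version="4">
  <layers>
    <layer id="0" name="input" type="Input"/>
    <layer id="1" name="conv1" type="Convolution"/>
    <layer id="2" name="prob" type="SoftMax"/>
  </layers>
  <edges>
    <edge from-layer="0" from-port="0" to-layer="1" to-port="0"/>
    <edge from-layer="1" from-port="1" to-layer="2" to-port="0"/>
  </edges>
</net>
"""

def layer_names(ir_xml):
    """Return the layer names in an IR topology description."""
    root = ET.fromstring(ir_xml)
    return [layer.get("name") for layer in root.find("layers")]
```

Listing the layers of the generated `.xml` this way can be handy for finding the input and output node names later.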
### Compile inference model with model optimizer
Go to the `~/intel/computer_vision_sdk/deployment_tools/model_optimizer` directory and run
```
python3 mo_tf.py --input_meta_graph <path/to/your/meta/graph> --input_shape <shape_of_input> --output_dir <where/you/want/to/store/the/output> --data_type=FP16
```
For example
```
python3 mo_tf.py --input_meta_graph ~/Desktop/dl/lab2/host/inference/inference.ckpt.meta --input_shape [1,48,48,1] --output_dir ~/Desktop/dl/lab2/movidius/model/ --data_type=FP16
```
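The `--input_shape` for a TensorFlow model is in NHWC order (batch, height, width, channels), so `[1,48,48,1]` above means one 48x48 grayscale image. A quick NumPy sketch of preparing such an input:

```python
import numpy as np

# NHWC layout: [batch, height, width, channels].
# [1,48,48,1] is a single 48x48 single-channel (grayscale) image.
frame = np.zeros((48, 48), dtype=np.float32)    # one raw grayscale frame
batch = frame[np.newaxis, :, :, np.newaxis]     # add batch & channel dims
assert batch.shape == (1, 48, 48, 1)
```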
Note that `--data_type=FP16` is required if you are going to run inference on the NCS2;
use `--data_type=FP32` instead if you are going to run inference on your CPU.
If the command succeeds, you should see output similar to
```
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: None
- Path for generated IR: /home/hwy/Desktop/
- IR output name: model_inference.ckpt
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: [1,28,28,3]
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP16
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: False
- Reverse input channels: False
TensorFlow specific parameters:
- Input model in text protobuf format: False
- Offload unsupported operations: False
- Path to model dump for TensorBoard: None
- List of shared libraries with TensorFlow custom layers implementation: None
- Update the configuration file with input/output node names: None
- Use configuration file used to generate the model with Object Detection API: None
- Operations to offload: None
- Patterns to offload: None
- Use the config file: None
Model Optimizer version: 1.4.292.6ef7232d
[ SUCCESS ] Generated IR model.
[ SUCCESS ] XML file: /home/hwy/Desktop/model_inference.ckpt.xml
[ SUCCESS ] BIN file: /home/hwy/Desktop/model_inference.ckpt.bin
[ SUCCESS ] Total execution time: 1.36 seconds.
```
### Inference with intel neural compute stick
```
python classification.py --model ./inference.ckpt.xml --input images/ --device MYRIAD --labels mapping
```
## Some Errors
`locale.Error: unsupported locale setting`
[Stack Overflow discussion](https://stackoverflow.com/questions/14547631/python-locale-error-unsupported-locale-setting)
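This error is raised by Python's `locale` module when the requested locale is not installed on the system, which is what the model-optimizer scripts hit when `LC_ALL` is unset or invalid. A minimal reproduction:

```python
import locale

# Requesting a locale that is not installed raises locale.Error.
try:
    locale.setlocale(locale.LC_ALL, "xx_FAKE.UTF-8")
except locale.Error as err:
    print(err)  # unsupported locale setting
```

Exporting a valid locale (as shown in the model-optimizer section above) makes the call succeed and the scripts run.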