# Use POT to Quantize yolo-v3-tf Public Model
###### tags: `POT`
## Use the OpenVINO Docker Hub image
```
docker run -it -v ~/Downloads:/mnt -u root --rm openvino/ubuntu20_data_dev:latest
```
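If the COCO images and annotations are already available on the host, they can be bind-mounted into the container instead of being downloaded again in step 0. A possible variant (the host path `~/coco_dataset` is an assumption; adjust it to wherever the dataset lives):
```=bash
# Assumes ~/coco_dataset on the host already holds val2017/ and annotations/
docker run -it --rm -u root \
    -v ~/Downloads:/mnt \
    -v ~/coco_dataset:/home/openvino/coco_dataset \
    openvino/ubuntu20_data_dev:latest
```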
## Run Accuracy Checker and POT
All of the following steps are run inside the ubuntu20_data_dev container.
#### 0. Download the COCO 2017 validation images and annotations
```=bash
cd /home/openvino
apt update
apt install unzip
mkdir coco_dataset
cd coco_dataset
curl http://images.cocodataset.org/zips/val2017.zip -o val2017.zip
unzip val2017.zip
curl http://images.cocodataset.org/annotations/annotations_trainval2017.zip -o trainval_2017.zip
unzip trainval_2017.zip
```
##### coco_dataset content
```
coco_dataset/
|-- annotations/
|   |-- captions_train2017.json
|   |-- captions_val2017.json
|   |-- instances_train2017.json
|   |-- instances_val2017.json
|   |-- person_keypoints_train2017.json
|   `-- person_keypoints_val2017.json
`-- val2017/
    |-- 000000042102.jpg
    |-- 000000060102.jpg
    |-- 000000245102.jpg
    ...
    `-- 000000364102.jpg
```
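As a quick sanity check, val2017 should contain 5,000 images and the annotation file used by the configs below should be present (the accuracy checker log later also reports 5000 annotations):
```=bash
cd /home/openvino/coco_dataset
ls val2017 | wc -l                      # expect 5000
ls annotations/instances_val2017.json   # annotation file referenced in yolo-v3-tf-int8.yml
```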
#### 1. Download yolo-v3-tf
```=bash
python3 /opt/intel/openvino_2021.3.394/deployment_tools/tools/model_downloader/downloader.py --name yolo-v3-tf -o /home/openvino/openvino_models
```
#### 2. Convert yolo-v3-tf to IR
```=bash
python3 /opt/intel/openvino_2021.3.394/deployment_tools/tools/model_downloader/converter.py --name yolo-v3-tf -d /home/openvino/openvino_models -o /home/openvino/openvino_models
```
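converter.py produces FP16 and FP32 IRs by default. If only one precision is needed, the `--precisions` option of the OMZ converter can restrict the conversion; a sketch, assuming the flag is available in this OMZ release:
```=bash
python3 /opt/intel/openvino_2021.3.394/deployment_tools/tools/model_downloader/converter.py \
    --name yolo-v3-tf --precisions FP32 \
    -d /home/openvino/openvino_models -o /home/openvino/openvino_models
```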
#### 3. Run Accuracy Checker on yolo-v3-tf
```=bash
accuracy_check -c yolo-v3-tf-int8.yml -m /home/openvino/openvino_models/public/yolo-v3-tf/FP32 -d /opt/intel/openvino_2021.3.394/deployment_tools/open_model_zoo/tools/accuracy_checker/dataset_definitions.yml -ss 300
```
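`-ss 300` limits the evaluation to the first 300 images, which keeps this baseline run short (the log below reports "300 objects processed"). For a full 5000-image baseline, drop the switch:
```=bash
accuracy_check -c yolo-v3-tf-int8.yml \
    -m /home/openvino/openvino_models/public/yolo-v3-tf/FP32 \
    -d /opt/intel/openvino_2021.3.394/deployment_tools/open_model_zoo/tools/accuracy_checker/dataset_definitions.yml
```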
#### 4. Run POT on yolo-v3-tf
```=bash
pot -c yolo-v3-tf-int8.json -e
```
#### 5. Copy yolo-v3-tf FP16-INT8 IR
```=bash
mkdir /home/openvino/openvino_models/public/yolo-v3-tf/FP16-INT8/
cp -ar results/yolo-v3-tf_DefaultQuantization/2021-04-13_04-01-17/optimized/* /home/openvino/openvino_models/public/yolo-v3-tf/FP16-INT8/
# Note: copy the openvino_models folder to /mnt; it is then accessible as ~/Downloads on the host and /mnt in the container.
```
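To sanity-check the quantized IR, it can be loaded and benchmarked with the benchmark tool shipped in the image (a rough sketch; throughput depends on the host CPU):
```=bash
python3 /opt/intel/openvino_2021.3.394/deployment_tools/tools/benchmark_tool/benchmark_app.py \
    -m /home/openvino/openvino_models/public/yolo-v3-tf/FP16-INT8/yolo-v3-tf.xml -d CPU
```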
## Reference
Working directory contents under /home/openvino after completing the steps above:
```
drwxr-xr-x 4 openvino openvino 4096 Apr 13 03:27 coco_dataset
drwxr-xr-x 3 root root 4096 Apr 13 03:36 openvino_models
drwxr-xr-x 3 root root 4096 Apr 13 04:01 results
./results/yolo-v3-tf_DefaultQuantization/2021-04-13_04-01-17/log.txt
./results/yolo-v3-tf_DefaultQuantization/2021-04-13_04-01-17/optimized
./results/yolo-v3-tf_DefaultQuantization/2021-04-13_04-01-17/optimized/yolo-v3-tf.mapping
./results/yolo-v3-tf_DefaultQuantization/2021-04-13_04-01-17/optimized/yolo-v3-tf.bin
./results/yolo-v3-tf_DefaultQuantization/2021-04-13_04-01-17/optimized/yolo-v3-tf.xml
-rw-r--r-- 1 root root 591 Apr 13 03:54 yolo-v3-tf-int8.json
-rw-r--r-- 1 root root 1386 Apr 13 03:53 yolo-v3-tf-int8.yml
openvino_models/
`-- public
    `-- yolo-v3-tf
        |-- FP16
        |   |-- yolo-v3-tf.bin
        |   |-- yolo-v3-tf.mapping
        |   `-- yolo-v3-tf.xml
        |-- FP16-INT8
        |   |-- yolo-v3-tf.bin
        |   |-- yolo-v3-tf.mapping
        |   `-- yolo-v3-tf.xml
        |-- FP32
        |   |-- yolo-v3-tf.bin
        |   |-- yolo-v3-tf.mapping
        |   `-- yolo-v3-tf.xml
        |-- yolo-v3.json
        `-- yolo-v3.pb
```
### yolo-v3-tf-int8.yml
```
models:
  - name: yolo_v3_tf

    launchers:
      - framework: dlsdk
        device: CPU
        adapter:
          type: yolo_v3
          anchors: "10,13, 16,30, 33,23, 30,61, 62,45, 59,119, 116,90, 156,198, 373,326"
          num: 9
          coords: 4
          classes: 80
          anchor_masks: [[6, 7, 8], [3, 4, 5], [0, 1, 2]]
          outputs:
            - conv2d_58/Conv2D/YoloRegion
            - conv2d_66/Conv2D/YoloRegion
            - conv2d_74/Conv2D/YoloRegion

    datasets:
      - name: ms_coco_detection_80_class_without_background
        annotation_conversion:
          converter: mscoco_detection
          annotation_file: /home/openvino/coco_dataset/annotations/instances_val2017.json
        data_source: /home/openvino/coco_dataset/val2017

        preprocessing:
          - type: resize
            size: 416

        postprocessing:
          - type: resize_prediction_boxes
          - type: filter
            apply_to: prediction
            min_confidence: 0.001
            remove_filtered: True
          - type: nms
            overlap: 0.5
          - type: clip_boxes
            apply_to: prediction

        metrics:
          - type: map
            integral: 11point
            ignore_difficult: true
            presenter: print_scalar
          - type: coco_precision
            max_detections: 100
            threshold: 0.5
```
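The adapter settings also explain the output shapes printed in the accuracy checker log: each YOLO head predicts 3 anchors per cell (one `anchor_masks` group), and each anchor carries 4 box coordinates, 1 objectness score, and 80 class scores, i.e. 255 channels per head:
```=bash
echo $((3 * (4 + 1 + 80)))   # 255, matching the [1, 255, H, W] output shapes
```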
### yolo-v3-tf-int8.json
```
{
    "model": {
        "model_name": "yolo-v3-tf",
        "model": "/home/openvino/openvino_models/public/yolo-v3-tf/FP16/yolo-v3-tf.xml",
        "weights": "/home/openvino/openvino_models/public/yolo-v3-tf/FP16/yolo-v3-tf.bin"
    },
    "engine": {
        "config": "/home/openvino/yolo-v3-tf-int8.yml"
    },
    "compression": {
        "algorithms": [
            {
                "name": "DefaultQuantization",
                "params": {
                    "preset": "performance",
                    "stat_subset_size": 300
                }
            }
        ]
    }
}
```
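If DefaultQuantization costs more accuracy than acceptable, POT also offers the AccuracyAwareQuantization algorithm, which keeps the metric drop within a budget by reverting the most sensitive layers to floating point. A hypothetical variant of the config above (the file name and the 1% `maximal_drop` are assumptions), written as a heredoc so it can be pasted into the same shell session:
```=bash
cat > /home/openvino/yolo-v3-tf-int8-aa.json << 'EOF'
{
    "model": {
        "model_name": "yolo-v3-tf",
        "model": "/home/openvino/openvino_models/public/yolo-v3-tf/FP16/yolo-v3-tf.xml",
        "weights": "/home/openvino/openvino_models/public/yolo-v3-tf/FP16/yolo-v3-tf.bin"
    },
    "engine": {
        "config": "/home/openvino/yolo-v3-tf-int8.yml"
    },
    "compression": {
        "algorithms": [
            {
                "name": "AccuracyAwareQuantization",
                "params": {
                    "preset": "performance",
                    "stat_subset_size": 300,
                    "maximal_drop": 0.01
                }
            }
        ]
    }
}
EOF
pot -c /home/openvino/yolo-v3-tf-int8-aa.json -e
```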
### accuracy_checker log
```
Processing info:
model: yolo_v3_tf
launcher: dlsdk
device: CPU
dataset: ms_coco_detection_80_class_without_background
OpenCV version: 4.5.2-openvino
Annotation conversion for ms_coco_detection_80_class_without_background dataset has been started
Parameters to be used for conversion:
converter: mscoco_detection
annotation_file: /home/openvino/coco_dataset/annotations/instances_val2017.json
Total annotations size: 5000
100 / 5000 processed in 0.507s
200 / 5000 processed in 0.466s
300 / 5000 processed in 0.452s
400 / 5000 processed in 0.443s
500 / 5000 processed in 0.440s
600 / 5000 processed in 0.434s
700 / 5000 processed in 0.436s
800 / 5000 processed in 0.432s
900 / 5000 processed in 0.433s
1000 / 5000 processed in 0.432s
1100 / 5000 processed in 0.431s
1200 / 5000 processed in 0.433s
1300 / 5000 processed in 0.432s
1400 / 5000 processed in 0.518s
1500 / 5000 processed in 0.503s
1600 / 5000 processed in 0.477s
1700 / 5000 processed in 0.463s
1800 / 5000 processed in 0.458s
1900 / 5000 processed in 0.453s
2000 / 5000 processed in 0.451s
2100 / 5000 processed in 0.449s
2200 / 5000 processed in 0.445s
2300 / 5000 processed in 0.454s
2400 / 5000 processed in 0.446s
2500 / 5000 processed in 0.445s
2600 / 5000 processed in 0.441s
2700 / 5000 processed in 0.443s
2800 / 5000 processed in 0.440s
2900 / 5000 processed in 0.441s
3000 / 5000 processed in 0.437s
3100 / 5000 processed in 0.436s
3200 / 5000 processed in 0.438s
3300 / 5000 processed in 0.432s
3400 / 5000 processed in 0.434s
3500 / 5000 processed in 0.434s
3600 / 5000 processed in 0.438s
3700 / 5000 processed in 0.440s
3800 / 5000 processed in 0.432s
3900 / 5000 processed in 0.430s
4000 / 5000 processed in 0.424s
4100 / 5000 processed in 0.471s
4200 / 5000 processed in 0.461s
4300 / 5000 processed in 0.445s
4400 / 5000 processed in 0.435s
4500 / 5000 processed in 0.429s
4600 / 5000 processed in 0.425s
4700 / 5000 processed in 0.421s
4800 / 5000 processed in 0.418s
4900 / 5000 processed in 0.414s
5000 / 5000 processed in 0.414s
5000 objects processed in 22.209 seconds
Annotation conversion for ms_coco_detection_80_class_without_background dataset has been finished
ms_coco_detection_80_class_without_background dataset metadata will be saved to mscoco_det_80.json
Converted annotation for ms_coco_detection_80_class_without_background dataset will be saved to mscoco_det_80.pickle
IE version: 2.1.2021.3.0-2787-60059f2c755-releases/2021/3
Loaded CPU plugin version:
CPU - MKLDNNPlugin: 2.1.2021.3.0-2787-60059f2c755-releases/2021/3
Found model /home/openvino/openvino_models/public/yolo-v3-tf/FP32/yolo-v3-tf.xml
Found weights /home/openvino/openvino_models/public/yolo-v3-tf/FP32/yolo-v3-tf.bin
Input info:
Layer name: input_1
precision: FP32
shape [1, 3, 416, 416]
Output info
Layer name: conv2d_58/Conv2D/YoloRegion
precision: FP32
shape: [1, 255, 13, 13]
Layer name: conv2d_66/Conv2D/YoloRegion
precision: FP32
shape: [1, 255, 26, 26]
Layer name: conv2d_74/Conv2D/YoloRegion
precision: FP32
shape: [1, 255, 52, 52]
300 objects processed in 133.030 seconds
map: 64.34%
coco_precision: 70.45%
```
### pot log
```
pot -c yolo-v3-tf-int8.json -e
04:01:17 accuracy_checker WARNING: /usr/local/lib/python3.8/dist-packages/networkx/classes/graph.py:23: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.9 it will stop working
from collections import Mapping
04:01:17 accuracy_checker WARNING: /usr/local/lib/python3.8/dist-packages/networkx/classes/reportviews.py:95: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.9 it will stop working
from collections import Mapping, Set, Iterable
04:01:17 accuracy_checker WARNING: /opt/intel/openvino/deployment_tools/tools/post_training_optimization_toolkit/compression/algorithms/quantization/optimization/algorithm.py:41: UserWarning: Nevergrad package could not be imported. If you are planning to useany hyperparameter optimization algo, consider installing itusing pip. This implies advanced usage of the tool.Note that nevergrad is compatible only with Python 3.6+
warnings.warn(
04:01:17 accuracy_checker WARNING: /usr/local/lib/python3.8/dist-packages/past/builtins/misc.py:45: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
from imp import reload
INFO:app.run:Output log dir: ./results/yolo-v3-tf_DefaultQuantization/2021-04-13_04-01-17
INFO:app.run:Creating pipeline:
Algorithm: DefaultQuantization
Parameters:
preset : performance
stat_subset_size : 300
target_device : ANY
model_type : None
dump_intermediate_model : False
exec_log_dir : ./results/yolo-v3-tf_DefaultQuantization/2021-04-13_04-01-17
===========================================================================
IE version: 2.1.2021.3.0-2787-60059f2c755-releases/2021/3
Loaded CPU plugin version:
CPU - MKLDNNPlugin: 2.1.2021.3.0-2787-60059f2c755-releases/2021/3
Annotation conversion for ms_coco_detection_80_class_without_background dataset has been started
Parameters to be used for conversion:
converter: mscoco_detection
annotation_file: /home/openvino/coco_dataset/annotations/instances_val2017.json
Total annotations size: 5000
100 / 5000 processed in 0.524s
200 / 5000 processed in 0.489s
300 / 5000 processed in 0.465s
400 / 5000 processed in 0.453s
500 / 5000 processed in 0.446s
600 / 5000 processed in 0.443s
700 / 5000 processed in 0.441s
800 / 5000 processed in 0.439s
900 / 5000 processed in 0.438s
1000 / 5000 processed in 0.449s
1100 / 5000 processed in 0.442s
1200 / 5000 processed in 0.440s
1300 / 5000 processed in 0.444s
1400 / 5000 processed in 0.442s
1500 / 5000 processed in 0.441s
1600 / 5000 processed in 0.441s
1700 / 5000 processed in 0.441s
1800 / 5000 processed in 0.442s
1900 / 5000 processed in 0.442s
2000 / 5000 processed in 0.442s
2100 / 5000 processed in 0.442s
2200 / 5000 processed in 0.442s
2300 / 5000 processed in 0.445s
2400 / 5000 processed in 0.443s
2500 / 5000 processed in 0.446s
2600 / 5000 processed in 0.453s
2700 / 5000 processed in 0.458s
2800 / 5000 processed in 0.504s
2900 / 5000 processed in 0.481s
3000 / 5000 processed in 0.473s
3100 / 5000 processed in 0.459s
3200 / 5000 processed in 0.455s
3300 / 5000 processed in 0.451s
3400 / 5000 processed in 0.446s
3500 / 5000 processed in 0.444s
3600 / 5000 processed in 0.443s
3700 / 5000 processed in 0.438s
3800 / 5000 processed in 0.437s
3900 / 5000 processed in 0.435s
4000 / 5000 processed in 0.432s
4100 / 5000 processed in 0.431s
4200 / 5000 processed in 0.429s
4300 / 5000 processed in 0.426s
4400 / 5000 processed in 0.426s
4500 / 5000 processed in 0.430s
4600 / 5000 processed in 0.425s
4700 / 5000 processed in 0.421s
4800 / 5000 processed in 0.424s
4900 / 5000 processed in 0.420s
5000 / 5000 processed in 0.416s
5000 objects processed in 22.279 seconds
Annotation conversion for ms_coco_detection_80_class_without_background dataset has been finished
INFO:compression.statistics.collector:Start computing statistics for algorithms : DefaultQuantization
INFO:compression.statistics.collector:Computing statistics finished
INFO:compression.pipeline.pipeline:Start algorithm: DefaultQuantization
INFO:compression.algorithms.quantization.default.algorithm:Start computing statistics for algorithm : ActivationChannelAlignment
INFO:compression.algorithms.quantization.default.algorithm:Computing statistics finished
INFO:compression.algorithms.quantization.default.algorithm:Start computing statistics for algorithms : MinMaxQuantization,FastBiasCorrection
04:01:59 accuracy_checker WARNING: /opt/intel/openvino/deployment_tools/model_optimizer/mo/back/ie_ir_ver_2/emitter.py:243: DeprecationWarning: This method will be removed in future versions. Use 'list(elem)' or iteration over elem instead.
if len(element.attrib) == 0 and len(element.getchildren()) == 0:
INFO:compression.algorithms.quantization.default.algorithm:Computing statistics finished
INFO:compression.pipeline.pipeline:Finished: DefaultQuantization
===========================================================================
INFO:compression.pipeline.pipeline:Evaluation of generated model
INFO:compression.engines.ac_engine:Start inference on the whole dataset
Total dataset size: 5000
1000 / 5000 processed in 430.922s
2000 / 5000 processed in 438.432s
3000 / 5000 processed in 434.308s
4000 / 5000 processed in 433.565s
5000 / 5000 processed in 430.274s
5000 objects processed in 2167.502 seconds
INFO:compression.engines.ac_engine:Inference finished
INFO:app.run:map : 0.6228774878840078
INFO:app.run:coco_precision : 0.6797002936757407
```