python /home/admin/mtr/script_for_cron.py -j coverage -m 9 -a '' -s coverage -M 0 -S 0 -U 100,100,120
import MySQLdb succeeded
root_folder /data_4/data_log/job/2026/March/08032026/coverage/
git_velours : /home/admin/workarea/git/Velours/
out_folder_name htmlcov
output_folder /data_4/data_log/job/2026/March/08032026/coverage/htmlcov
new path : /data_4/data_log/job/2026/March/08032026/coverage/
command : coverage3 run /home/admin/workarea/git/Velours/python/tests/python_tests.py --short_python3 `cat ~/.fotonower_pass/bdd.py.pass`
cat: /home/admin/.fotonower_pass/bdd.py.pass: No such file or directory
import MySQLdb succeeded
Import error (python version)
python version = 3
warning, we can't find thcl infos in json_data
warning, we can't find pdt infos in json_data
python version used : 3
#&_# BEGIN OF TEST : tests/mask_test #&_#
/home/admin/workarea/git/Velours/python/tests/mask_test.py
Test mask-detection
python version used : 3
############################### TEST memory used ################################
free memory at beginning :
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 10998
run mask_detect
Inside batchDatouExec : verbose : False
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case.
We can either keep the dependence or, better, keep an order compatible with the step ids when steps have no sons, i.e. a lexical order on (number_son, step_id).
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
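The comments above describe the datou step re-ordering: once the dependency tree is built and checked for cycles, steps with no dependence between them are ordered lexically on (number_son, step_id). A minimal sketch of that idea, with illustrative names (`Step`, `reorder_steps`) that are not taken from the Velours code, and assuming fewer sons sort first (the log does not say which direction the order uses):

```python
# Hypothetical sketch of the step re-ordering described above: a Kahn-style
# topological sort where ready steps are picked in lexical
# (number_of_sons, step_id) order. Names are illustrative only.
from collections import namedtuple

Step = namedtuple("Step", ["step_id", "sons"])  # sons: ids of dependent steps

def reorder_steps(steps):
    """Return step ids in execution order; raise on a dependency cycle."""
    by_id = {s.step_id: s for s in steps}
    indegree = {s.step_id: 0 for s in steps}
    for s in steps:
        for son in s.sons:
            indegree[son] += 1
    key = lambda sid: (len(by_id[sid].sons), sid)  # lexical tie-break
    ready = sorted((sid for sid, d in indegree.items() if d == 0), key=key)
    ordered = []
    while ready:
        sid = ready.pop(0)
        ordered.append(sid)
        for son in by_id[sid].sons:
            indegree[son] -= 1
            if indegree[son] == 0:
                ready.append(son)
        ready.sort(key=key)
    if len(ordered) != len(steps):
        raise ValueError("cycle detected")  # mirrors what checkNoCycle guards
    return ordered

# linear chain tile -> detect -> glue style ordering:
# reorder_steps([Step(1, [2]), Step(2, [3]), Step(3, [])]) -> [1, 2, 3]
```

The tie-break only matters for steps with no dependence between them; dependent steps always follow their parents.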
List Step Type Loaded in datou : mask_detect
list_input_json : []
origin BF
we have missing 0 photos in the step downloads :
photo missing : []
try to delete the photos missing in DB
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
time to download the photos : 0.19269299507141113
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
Calling datou_exec
Inside datou_exec : verbose : False
number of steps : 1
step1:mask_detect
Sun Mar 8 05:20:28 2026
VR 17-11-17 : for now, only for linear exec dependency trees; some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step (id = -1), built in the code of datou_exec
VR 22-3-18 : we should manage the first-step case here instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 10998
max_wait_temp : 1 max_wait : 0 gpu_flag : 0
/home/admin/workarea/git/Velours/python/tests/python_tests.py:11: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
  import imp
2026-03-08 05:20:32.756113: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2026-03-08 05:20:32.786632: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493010000 Hz
2026-03-08 05:20:32.789077: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f7458000b60 initialized for platform Host (this does not guarantee that XLA will be used).
Devices:
2026-03-08 05:20:32.789123: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version
2026-03-08 05:20:32.793859: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2026-03-08 05:20:33.126901: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x155d4060 initialized for platform CUDA (this does not guarantee that XLA will be used).
Devices:
2026-03-08 05:20:33.126952: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5
2026-03-08 05:20:33.128754: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties:
pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2026-03-08 05:20:33.130990: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2026-03-08 05:20:33.164447: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2026-03-08 05:20:33.183325: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2026-03-08 05:20:33.187163: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2026-03-08 05:20:33.217312: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2026-03-08 05:20:33.222542: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2026-03-08 05:20:33.278506: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2026-03-08 05:20:33.280282: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2026-03-08 05:20:33.280691: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2026-03-08 05:20:33.282262: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2026-03-08 05:20:33.282312: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108]      0
2026-03-08 05:20:33.282323: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0:   N
2026-03-08 05:20:33.284352: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10193 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead.
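The "check gpu memory" lines above report free GPU memory (in MiB) before and after detection, and the test waits on it via `gpu_flag`/`max_wait`. One plausible way to obtain that figure is to query `nvidia-smi`; the sketch below does so, with parsing split out so it can be exercised without a GPU. The helper names are illustrative, not the actual Velours functions.

```python
# Minimal sketch, assuming nvidia-smi is on the PATH: query free GPU memory
# the way the log's "free memory gpu now : NNNN" check plausibly does.
import subprocess

def parse_free_mib(smi_output):
    """Parse output of
    `nvidia-smi --query-gpu=memory.free --format=csv,noheader,nounits`:
    one integer (MiB) per line, one line per GPU."""
    return [int(line) for line in smi_output.strip().splitlines()]

def gpu_free_memory_mib():
    """Return a list of free-memory values in MiB, one per visible GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.free",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_free_mib(out)

# e.g. on the machine in this log, gpu_free_memory_mib() would return
# something like [10998] when the GPU is idle.
```

Separating the parse step also makes it easy to unit-test the check on canned `nvidia-smi` output.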
Inside mask_sub_process
Inside mask_detect
About to load cache.load_thcl_param
To do loadFromThcl(), then load ParamDescType : thcl454
thcls : [{'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}]
Update svm_hashtag_type_desc : 3473
FOUND : 1
Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21))
{'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0}
list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush']
Configurations:
BACKBONE                       resnet101
BACKBONE_SHAPES                [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]]
BACKBONE_STRIDES               [4, 8, 16, 32, 64]
BATCH_SIZE                     1
BBOX_STD_DEV                   [0.1 0.1 0.2 0.2]
DETECTION_MAX_INSTANCES        100
DETECTION_MIN_CONFIDENCE       0.3
DETECTION_NMS_THRESHOLD        0.3
GPU_COUNT                      1
IMAGES_PER_GPU                 1
IMAGE_MAX_DIM                  640
IMAGE_MIN_DIM                  640
IMAGE_PADDING                  True
IMAGE_SHAPE                    [640 640 3]
LEARNING_MOMENTUM              0.9
LEARNING_RATE                  0.001
LOSS_WEIGHTS                   {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0}
MASK_POOL_SIZE                 14
MASK_SHAPE                     [28, 28]
MAX_GT_INSTANCES               100
MEAN_PIXEL                     [123.7 116.8 103.9]
MINI_MASK_SHAPE                (56, 56)
NAME                           mask_coco_origin
NUM_CLASSES                    81
POOL_SIZE                      7
POST_NMS_ROIS_INFERENCE        1000
POST_NMS_ROIS_TRAINING         2000
ROI_POSITIVE_RATIO             0.33
RPN_ANCHOR_RATIOS              [0.5, 1, 2]
RPN_ANCHOR_SCALES              (16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE              1
2026-03-08 05:20:34.351266: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties:
pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2026-03-08 05:20:34.351344: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2026-03-08 05:20:34.351366: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2026-03-08 05:20:34.351385: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2026-03-08 05:20:34.351404: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2026-03-08 05:20:34.351422: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2026-03-08 05:20:34.351440: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2026-03-08 05:20:34.351459: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2026-03-08 05:20:34.353057: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2026-03-08 05:20:34.354364: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties:
pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2026-03-08 05:20:34.354401: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2026-03-08 05:20:34.354420: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2026-03-08 05:20:34.354438: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2026-03-08 05:20:34.354456: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2026-03-08 05:20:34.354473: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2026-03-08 05:20:34.354491: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2026-03-08 05:20:34.354508: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2026-03-08 05:20:34.356087: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2026-03-08 05:20:34.356117: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2026-03-08 05:20:34.356128: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108]      0
2026-03-08 05:20:34.356138: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0:   N
2026-03-08 05:20:34.357761: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10193 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
Using TensorFlow backend.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version.
Instructions for updating:
box_ind is deprecated, use box_indices instead
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.cast` instead.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.cast` instead.
RPN_BBOX_STD_DEV               [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD              0.7
RPN_TRAIN_ANCHORS_PER_IMAGE    256
STEPS_PER_EPOCH                1000
TRAIN_ROIS_PER_IMAGE           200
USE_MINI_MASK                  True
USE_RPN_ROIS                   True
VALIDATION_STEPS               50
WEIGHT_DECAY                   0.0001
model_param file didn't exist
model_name : mask_coco_origin
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files existing in s3 : ['mask_model.h5']
files missing in s3 : []
2026-03-08 05:20:44.566253: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2026-03-08 05:20:44.875359: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
local folder : /data/models_weight/mask_coco_origin
/data/models_weight/mask_coco_origin/mask_model.h5
size_local : 257557808 size in s3 : 257557808
create time local : 2026-03-05 05:20:41 create time in s3 : 2026-03-04 16:35:32
mask_model.h5 already exists and doesn't need to be updated
list_images length : 1
NEW PHOTO
Processing 1 images
image          shape: (480, 640, 3)     min:    0.00000  max:  255.00000
molded_images  shape: (1, 640, 640, 3)  min: -123.70000  max:  151.10000
image_metas    shape: (1, 89)           min:    0.00000  max:  640.00000
number of objects found : 5
Detection mask done !
Trying to reset tf kernel 2275801
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 5706
tf kernel not reset
sub process len(results) : 1 len(list_Values) 0 None
max_time_sub_proc : 3600
parent process len(results) : 1 len(list_Values) 0
process is alive
finished correctly or not : True
after detect
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 10998
list_Values should be empty []
To do loadFromThcl(), then load ParamDescType : thcl454
Caught exception ! Connect or reconnect !
Update svm_hashtag_type_desc : 3473
DEBUG bbox = [22, 0, 282, 186]
DEBUG masks shape = (480, 640)
time to compute the mask position with numpy : 0.0005438327789306641
nb_pixel_total : 15553
time to create 1 rle with old method : 0.03418874740600586
length of segment : 256
DEBUG bbox = [24, 29, 419, 591]
DEBUG masks shape = (480, 640)
time to compute the mask position with numpy : 0.0024623870849609375
nb_pixel_total : 145327
time to create 1 rle with old method : 0.30223703384399414
length of segment : 371
DEBUG bbox = [23, 485, 174, 636]
DEBUG masks shape = (480, 640)
time to compute the mask position with numpy : 0.00023984909057617188
nb_pixel_total : 14254
time to create 1 rle with old method : 0.030112028121948242
length of segment : 151
DEBUG bbox = [2, 280, 55, 481]
DEBUG masks shape = (480, 640)
time to compute the mask position with numpy : 0.00012111663818359375
nb_pixel_total : 5613
time to create 1 rle with old method : 0.012255430221557617
length of segment : 48
DEBUG bbox = [6, 456, 45, 547]
DEBUG masks shape = (480, 640)
time to compute the mask position with numpy : 6.031990051269531e-05
nb_pixel_total : 1824
time to create 1 rle with old method : 0.004097461700439453
length of segment : 39
time spent for convertir_results : 1.2209010124206543
time spent for datou_step_exec : 23.96342134475708
time spent to save output : 3.910064697265625e-05
total time spent for step 1 : 23.963460445404053
caffe_path_current :
About to save ! 1
Inside saveOutput : final : True verbose : False
eke 12-6-18 : saveMask needs to be cleaned for the new output !
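The DEBUG lines above time building one RLE per detected mask ("time to create 1 rle with old method"). Below is a generic, vectorized run-length encoding of a binary mask with NumPy; the actual Velours segment format is not fully shown in the log, so the (start, length) runs over the flattened mask used here are only illustrative.

```python
# Illustrative RLE of a binary mask with NumPy, in the spirit of the
# "time to create 1 rle" step. Encodes the flattened mask as
# (start_index, run_length) pairs of foreground pixels; also returns the
# foreground pixel count (cf. nb_pixel_total in the log).
import numpy as np

def rle_encode(mask):
    """mask: 2-D bool/0-1 array -> (runs, nb_pixel_total)."""
    flat = np.asarray(mask, dtype=bool).ravel()
    # pad with zeros so every run has a detectable start and end
    padded = np.concatenate([[False], flat, [False]])
    changes = np.flatnonzero(padded[1:] != padded[:-1])
    starts, ends = changes[0::2], changes[1::2]
    runs = list(zip(starts.tolist(), (ends - starts).tolist()))
    return runs, int(flat.sum())

mask = np.zeros((4, 5), dtype=bool)
mask[1, 1:4] = True   # one 3-pixel run starting at flat index 6
runs, nb_pixel_total = rle_encode(mask)
# runs == [(6, 3)], nb_pixel_total == 3
```

Being fully vectorized, this avoids the per-pixel Python loop that the "old method" timings above suggest.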
Number saved : None batch 1 Loaded 3424 chid ids of type : 445 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++Number RLEs to save : 0 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.0214691162109375 save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'957285035': [[(957285035, 492601069, 445, 0, 186, 22, 282, 0.9954869, [(140, 26, 6), (135, 27, 15), (133, 28, 18), (131, 29, 22), (127, 30, 27), (10, 31, 1), (120, 31, 35), (8, 32, 13), (27, 32, 3), (115, 32, 41), (7, 33, 52), (109, 33, 48), (6, 34, 70), (103, 34, 55), (5, 35, 154), (4, 36, 155), (3, 37, 156), (3, 38, 156), (3, 39, 156), (2, 40, 157), (2, 41, 157), (2, 42, 157), (2, 43, 157), (2, 44, 157), (2, 45, 157), (1, 46, 158), (1, 47, 158), (1, 48, 158), (1, 49, 157), (1, 50, 157), (1, 51, 156), (1, 52, 156), (1, 53, 155), (1, 54, 154), (1, 55, 152), (1, 56, 149), (1, 57, 145), (1, 58, 141), (1, 59, 136), (1, 60, 133), (1, 61, 130), (1, 62, 127), (1, 63, 126), (1, 64, 124), (1, 65, 123), (1, 66, 121), (1, 67, 120), (1, 68, 118), (1, 69, 117), (1, 70, 116), (1, 71, 115), (1, 72, 114), (1, 73, 113), (1, 74, 112), (1, 75, 111), (1, 76, 110), (1, 77, 108), (1, 78, 108), (1, 79, 107), (1, 80, 106), (1, 81, 105), (2, 82, 104), (2, 83, 103), (2, 84, 103), (2, 85, 102), (2, 86, 102), (2, 87, 101), (2, 88, 100), (2, 89, 99), (2, 90, 99), (2, 91, 98), (2, 92, 97), (2, 93, 96), (2, 94, 95), 
(2, 95, 93), (2, 96, 91), (2, 97, 90), (2, 98, 89), (2, 99, 87), (2, 100, 86), (2, 101, 86), (2, 102, 85), (2, 103, 84), (2, 104, 83), (2, 105, 83), (2, 106, 82), (2, 107, 81), (2, 108, 80), (2, 109, 80), (2, 110, 79), (2, 111, 78), (2, 112, 77), (2, 113, 76), (1, 114, 76), (1, 115, 75), (1, 116, 74), (1, 117, 73), (1, 118, 72), (1, 119, 71), (1, 120, 71), (1, 121, 70), (1, 122, 69), (1, 123, 69), (1, 124, 68), (1, 125, 68), (1, 126, 67), (1, 127, 67), (1, 128, 66), (1, 129, 66), (1, 130, 66), (1, 131, 65), (1, 132, 65), (1, 133, 64), (1, 134, 63), (1, 135, 63), (1, 136, 62), (1, 137, 61), (1, 138, 60), (1, 139, 60), (1, 140, 59), (1, 141, 58), (1, 142, 58), (1, 143, 57), (1, 144, 56), (1, 145, 56), (1, 146, 55), (1, 147, 54), (1, 148, 54), (1, 149, 53), (1, 150, 52), (1, 151, 52), (1, 152, 51), (1, 153, 50), (1, 154, 49), (1, 155, 48), (1, 156, 47), (1, 157, 46), (1, 158, 46), (1, 159, 45), (1, 160, 44), (1, 161, 43), (1, 162, 42), (1, 163, 42), (1, 164, 41), (1, 165, 40), (1, 166, 40), (1, 167, 39), (1, 168, 38), (1, 169, 37), (1, 170, 36), (1, 171, 35), (1, 172, 34), (1, 173, 34), (1, 174, 33), (1, 175, 33), (1, 176, 32), (1, 177, 32), (1, 178, 32), (1, 179, 32), (1, 180, 31), (1, 181, 31), (1, 182, 31), (1, 183, 30), (1, 184, 30), (1, 185, 30), (1, 186, 29), (1, 187, 29), (1, 188, 29), (1, 189, 28), (1, 190, 28), (1, 191, 27), (1, 192, 27), (1, 193, 26), (1, 194, 26), (1, 195, 26), (1, 196, 26), (1, 197, 26), (1, 198, 26), (1, 199, 26), (1, 200, 25), (1, 201, 25), (1, 202, 25), (1, 203, 25), (1, 204, 25), (1, 205, 25), (1, 206, 25), (1, 207, 25), (1, 208, 25), (1, 209, 25), (1, 210, 25), (1, 211, 25), (1, 212, 25), (1, 213, 25), (1, 214, 25), (1, 215, 25), (1, 216, 25), (1, 217, 25), (1, 218, 25), (1, 219, 25), (1, 220, 24), (1, 221, 24), (1, 222, 24), (1, 223, 24), (1, 224, 24), (1, 225, 24), (1, 226, 25), (1, 227, 25), (1, 228, 25), (2, 229, 24), (2, 230, 24), (2, 231, 24), (2, 232, 23), (2, 233, 23), (2, 234, 23), (2, 235, 23), (2, 236, 23), (2, 237, 23), 
(2, 238, 23), (2, 239, 23), (2, 240, 23), (2, 241, 23), (2, 242, 23), (2, 243, 23), (2, 244, 23), (2, 245, 23), (2, 246, 23), (2, 247, 23), (2, 248, 23), (2, 249, 24), (2, 250, 24), (2, 251, 23), (2, 252, 23), (2, 253, 23), (2, 254, 23), (2, 255, 23), (2, 256, 23), (2, 257, 23), (2, 258, 23), (2, 259, 23), (2, 260, 23), (2, 261, 23), (3, 262, 22), (3, 263, 22), (3, 264, 22), (3, 265, 22), (4, 266, 21), (4, 267, 21), (5, 268, 20), (5, 269, 20), (6, 270, 19), (7, 271, 17), (8, 272, 16), (8, 273, 16), (9, 274, 13), (11, 275, 9), (15, 276, 2)], ['16,276,8,273,2,261,2,229,1,228,1,114,2,113,2,82,1,81,1,46,3,37,8,32,20,32,21,33,58,33,59,34,75,34,76,35,102,35,114,33,120,31,130,30,135,27,145,26,152,29,158,35,158,48,154,54,141,58,128,61,119,67,105,81,103,86,96,94,89,98,81,109,71,119,65,132,60,138,52,151,42,162,40,166,34,172,29,188,26,193,25,200,25,219,24,232,24,270,23,273']), (957285035, 492601069, 445, 29, 591, 24, 419, 0.9923813, [(315, 37, 24), (272, 38, 86), (253, 39, 130), (238, 40, 151), (199, 41, 196), (189, 42, 213), (180, 43, 238), (175, 44, 250), (172, 45, 257), (169, 46, 265), (166, 47, 274), (162, 48, 284), (159, 49, 294), (157, 50, 304), (155, 51, 310), (153, 52, 317), (151, 53, 323), (149, 54, 330), (148, 55, 334), (146, 56, 337), (144, 57, 341), (142, 58, 344), (140, 59, 347), (138, 60, 350), (136, 61, 353), (134, 62, 356), (132, 63, 358), (130, 64, 361), (128, 65, 364), (126, 66, 367), (124, 67, 370), (122, 68, 373), (120, 69, 376), (118, 70, 379), (117, 71, 381), (115, 72, 385), (114, 73, 387), (113, 74, 389), (112, 75, 391), (112, 76, 393), (111, 77, 395), (110, 78, 397), (109, 79, 399), (109, 80, 400), (108, 81, 402), (107, 82, 404), (107, 83, 404), (106, 84, 406), (105, 85, 408), (105, 86, 409), (104, 87, 410), (104, 88, 411), (103, 89, 413), (102, 90, 415), (101, 91, 417), (100, 92, 420), (98, 93, 423), (97, 94, 426), (96, 95, 428), (94, 96, 431), (93, 97, 433), (92, 98, 435), (91, 99, 437), (90, 100, 439), (89, 101, 441), (89, 102, 441), (89, 103, 442), 
(89, 104, 443), (89, 105, 444), (89, 106, 444), (89, 107, 445), (89, 108, 446), (89, 109, 447), (89, 110, 448), (89, 111, 449), (89, 112, 450), (89, 113, 451), (89, 114, 453), (89, 115, 454), (89, 116, 455), (88, 117, 456), (88, 118, 457), (87, 119, 459), (87, 120, 459), (86, 121, 461), (86, 122, 461), (85, 123, 463), (84, 124, 464), (84, 125, 465), (83, 126, 466), (82, 127, 468), (82, 128, 468), (81, 129, 470), (80, 130, 471), (78, 131, 473), (77, 132, 475), (75, 133, 477), (73, 134, 480), (71, 135, 482), (70, 136, 484), (68, 137, 486), (67, 138, 488), (65, 139, 490), (64, 140, 492), (63, 141, 493), (61, 142, 496), (60, 143, 497), (59, 144, 499), (58, 145, 501), (58, 146, 501), (57, 147, 503), (57, 148, 504), (57, 149, 505), (56, 150, 507), (56, 151, 507), (55, 152, 509), (55, 153, 510), (54, 154, 511), (54, 155, 512), (54, 156, 513), (53, 157, 514), (53, 158, 514), (52, 159, 515), (52, 160, 516), (52, 161, 516), (51, 162, 517), (51, 163, 517), (50, 164, 518), (50, 165, 518), (49, 166, 519), (49, 167, 520), (48, 168, 521), (48, 169, 521), (47, 170, 522), (47, 171, 522), (46, 172, 523), (46, 173, 523), (46, 174, 523), (45, 175, 524), (45, 176, 523), (44, 177, 524), (44, 178, 524), (44, 179, 524), (43, 180, 525), (43, 181, 525), (42, 182, 525), (42, 183, 525), (42, 184, 525), (41, 185, 526), (41, 186, 526), (40, 187, 526), (39, 188, 526), (39, 189, 525), (38, 190, 526), (38, 191, 525), (37, 192, 525), (37, 193, 523), (36, 194, 523), (36, 195, 522), (36, 196, 522), (35, 197, 522), (35, 198, 521), (34, 199, 521), (34, 200, 521), (34, 201, 520), (34, 202, 520), (34, 203, 520), (34, 204, 519), (34, 205, 519), (33, 206, 520), (33, 207, 519), (33, 208, 519), (33, 209, 519), (33, 210, 518), (33, 211, 518), (33, 212, 518), (33, 213, 517), (32, 214, 518), (32, 215, 517), (32, 216, 517), (32, 217, 516), (32, 218, 515), (32, 219, 514), (32, 220, 513), (32, 221, 512), (32, 222, 511), (32, 223, 510), (32, 224, 508), (32, 225, 507), (32, 226, 505), (32, 227, 504), (32, 228, 503), 
(32, 229, 502), (32, 230, 502), (32, 231, 501), (32, 232, 500), (32, 233, 499), (32, 234, 498), (32, 235, 497), (31, 236, 496), (31, 237, 495), (31, 238, 494), (31, 239, 493), (31, 240, 491), (31, 241, 490), (31, 242, 488), (31, 243, 487), (31, 244, 486), (31, 245, 485), (31, 246, 483), (31, 247, 482), (31, 248, 480), (31, 249, 479), (31, 250, 477), (31, 251, 475), (31, 252, 474), (31, 253, 472), (31, 254, 470), (31, 255, 468), (31, 256, 467), (31, 257, 465), (31, 258, 464), (31, 259, 463), (31, 260, 462), (31, 261, 461), (31, 262, 459), (31, 263, 458), (31, 264, 456), (31, 265, 455), (31, 266, 453), (31, 267, 451), (31, 268, 449), (31, 269, 448), (31, 270, 447), (31, 271, 445), (31, 272, 444), (31, 273, 443), (32, 274, 441), (32, 275, 440), (32, 276, 438), (32, 277, 437), (32, 278, 435), (32, 279, 434), (32, 280, 432), (33, 281, 429), (33, 282, 427), (33, 283, 426), (33, 284, 424), (33, 285, 423), (34, 286, 421), (34, 287, 420), (34, 288, 419), (35, 289, 416), (35, 290, 415), (35, 291, 414), (36, 292, 411), (36, 293, 410), (37, 294, 407), (37, 295, 406), (38, 296, 403), (38, 297, 401), (39, 298, 399), (39, 299, 397), (41, 300, 394), (42, 301, 392), (43, 302, 389), (44, 303, 387), (45, 304, 385), (46, 305, 382), (47, 306, 380), (47, 307, 378), (48, 308, 376), (49, 309, 373), (50, 310, 370), (51, 311, 368), (51, 312, 367), (52, 313, 365), (54, 314, 362), (55, 315, 360), (56, 316, 359), (58, 317, 356), (61, 318, 352), (64, 319, 349), (67, 320, 345), (70, 321, 341), (73, 322, 338), (75, 323, 335), (78, 324, 332), (80, 325, 329), (82, 326, 327), (84, 327, 324), (86, 328, 322), (88, 329, 320), (90, 330, 317), (93, 331, 314), (96, 332, 311), (99, 333, 307), (102, 334, 304), (105, 335, 300), (108, 336, 297), (111, 337, 294), (113, 338, 291), (115, 339, 289), (117, 340, 286), (119, 341, 283), (121, 342, 281), (123, 343, 278), (125, 344, 275), (127, 345, 272), (129, 346, 269), (132, 347, 266), (135, 348, 262), (137, 349, 259), (141, 350, 255), (143, 351, 252), (145, 352, 
250), (147, 353, 247), (149, 354, 245), (151, 355, 242), (152, 356, 241), (154, 357, 239), (156, 358, 237), (159, 359, 233), (161, 360, 231), (163, 361, 229), (165, 362, 227), (167, 363, 224), (169, 364, 222), (170, 365, 221), (172, 366, 219), (173, 367, 218), (174, 368, 216), (175, 369, 215), (177, 370, 213), (178, 371, 212), (180, 372, 209), (183, 373, 206), (185, 374, 204), (188, 375, 200), (191, 376, 197), (194, 377, 193), (196, 378, 191), (199, 379, 188), (201, 380, 185), (203, 381, 183), (205, 382, 180), (207, 383, 178), (208, 384, 176), (210, 385, 174), (212, 386, 171), (213, 387, 169), (215, 388, 166), (218, 389, 162), (221, 390, 158), (225, 391, 153), (228, 392, 149), (232, 393, 144), (235, 394, 140), (238, 395, 136), (241, 396, 133), (245, 397, 128), (248, 398, 124), (252, 399, 119), (257, 400, 113), (263, 401, 105), (272, 402, 94), (283, 403, 82), (297, 404, 65), (306, 405, 53), (313, 406, 38), (321, 407, 23)], ['321,407,296,403,263,401,215,388,193,376,178,371,168,363,140,349,110,336,90,330,77,323,56,316,39,299,31,273,31,236,34,199,42,184,58,145,82,128,89,116,89,101,104,88,115,72,159,49,180,43,199,41,237,41,272,38,338,37,382,39,402,43,460,50,481,55,543,116,556,143,566,156,568,167,566,186,554,199,548,216,528,235,496,256,471,275,460,281,414,315,403,339,392,355,383,385,369,400,358,405']), (957285035, 492601069, 445, 485, 636, 23, 174, 0.97114336, [(540, 24, 21), (626, 24, 3), (531, 25, 49), (594, 25, 40), (527, 26, 107), (523, 27, 111), (520, 28, 114), (518, 29, 117), (516, 30, 119), (515, 31, 120), (513, 32, 122), (512, 33, 123), (510, 34, 125), (509, 35, 126), (507, 36, 128), (506, 37, 129), (504, 38, 131), (503, 39, 132), (501, 40, 134), (500, 41, 135), (499, 42, 136), (498, 43, 137), (497, 44, 138), (496, 45, 139), (496, 46, 139), (495, 47, 140), (495, 48, 140), (494, 49, 141), (493, 50, 142), (492, 51, 143), (491, 52, 144), (491, 53, 144), (490, 54, 145), (490, 55, 145), (490, 56, 145), (490, 57, 146), (490, 58, 146), (490, 59, 146), (491, 60, 145), 
(491, 61, 145), (491, 62, 145), (492, 63, 144), (493, 64, 143), (494, 65, 142), (495, 66, 141), (496, 67, 140), (497, 68, 138), (498, 69, 138), (499, 70, 137), (500, 71, 136), (501, 72, 135), (503, 73, 133), (503, 74, 133), (505, 75, 131), (506, 76, 130), (507, 77, 129), (508, 78, 128), (509, 79, 127), (510, 80, 126), (511, 81, 125), (512, 82, 124), (513, 83, 123), (514, 84, 122), (515, 85, 121), (516, 86, 120), (517, 87, 119), (518, 88, 118), (519, 89, 117), (521, 90, 115), (521, 91, 115), (522, 92, 114), (523, 93, 113), (524, 94, 112), (525, 95, 111), (526, 96, 110), (527, 97, 109), (529, 98, 107), (530, 99, 106), (532, 100, 104), (533, 101, 103), (534, 102, 102), (535, 103, 101), (536, 104, 100), (538, 105, 98), (540, 106, 96), (541, 107, 95), (543, 108, 93), (546, 109, 90), (548, 110, 88), (549, 111, 87), (551, 112, 84), (552, 113, 83), (553, 114, 82), (555, 115, 80), (556, 116, 79), (556, 117, 79), (557, 118, 78), (558, 119, 77), (559, 120, 76), (560, 121, 75), (560, 122, 75), (561, 123, 74), (561, 124, 74), (561, 125, 74), (562, 126, 73), (562, 127, 73), (563, 128, 72), (563, 129, 72), (564, 130, 70), (564, 131, 70), (565, 132, 69), (565, 133, 68), (565, 134, 68), (565, 135, 67), (566, 136, 65), (566, 137, 64), (566, 138, 64), (566, 139, 62), (566, 140, 61), (566, 141, 59), (566, 142, 57), (566, 143, 56), (566, 144, 55), (566, 145, 54), (567, 146, 53), (567, 147, 52), (567, 148, 51), (568, 149, 50), (568, 150, 49), (568, 151, 48), (568, 152, 47), (569, 153, 45), (569, 154, 44), (570, 155, 42), (570, 156, 42), (570, 157, 41), (571, 158, 39), (571, 159, 39), (572, 160, 37), (572, 161, 37), (573, 162, 35), (573, 163, 34), (573, 164, 34), (574, 165, 32), (575, 166, 30), (577, 167, 28), (578, 168, 26), (581, 169, 22), (584, 170, 19), (587, 171, 15), (591, 172, 8)], 
['598,172,591,172,590,171,578,168,573,164,573,162,568,152,568,149,566,145,566,136,565,132,561,125,560,121,556,116,547,109,543,108,536,104,531,99,527,97,491,62,490,54,495,48,496,45,502,40,516,30,523,27,531,25,539,25,540,24,560,24,561,25,579,25,580,26,593,26,594,25,633,25,634,29,634,56,635,57,635,111,634,112,634,129,632,134,629,138,623,141,619,145,617,149,611,155,608,161,604,166']), (957285035, 492601069, 445, 280, 481, 2, 55, 0.8298871, [(292, 3, 128), (284, 4, 146), (282, 5, 151), (281, 6, 154), (281, 7, 156), (281, 8, 157), (281, 9, 158), (281, 10, 160), (281, 11, 162), (281, 12, 165), (281, 13, 167), (281, 14, 169), (281, 15, 171), (281, 16, 173), (281, 17, 174), (281, 18, 175), (281, 19, 177), (281, 20, 178), (281, 21, 179), (281, 22, 180), (281, 23, 181), (281, 24, 182), (281, 25, 183), (281, 26, 184), (281, 27, 185), (281, 28, 185), (281, 29, 185), (282, 30, 185), (283, 31, 27), (337, 31, 131), (371, 32, 97), (401, 33, 68), (409, 34, 61), (419, 35, 52), (424, 36, 48), (429, 37, 44), (432, 38, 41), (434, 39, 40), (436, 40, 39), (438, 41, 37), (441, 42, 35), (444, 43, 32), (448, 44, 29), (452, 45, 25), (454, 46, 23), (459, 47, 17), (463, 48, 12), (468, 49, 5)], ['472,49,468,49,467,48,459,47,458,46,454,46,451,44,448,44,447,43,444,43,440,41,438,41,428,36,424,36,423,35,419,35,418,34,409,34,408,33,401,33,400,32,371,32,370,31,337,31,336,30,283,31,281,29,281,6,284,4,291,4,292,3,419,3,420,4,429,4,430,5,432,5,436,7,441,11,445,12,453,16,456,19,457,19,465,27,465,29,472,37,476,44,476,46']), (957285035, 492601069, 445, 456, 547, 6, 45, 0.7401094, [(482, 8, 19), (464, 9, 3), (481, 9, 44), (457, 10, 12), (479, 10, 50), (457, 11, 13), (476, 11, 56), (457, 12, 15), (475, 12, 65), (457, 13, 84), (457, 14, 85), (457, 15, 89), (457, 16, 89), (458, 17, 88), (459, 18, 87), (460, 19, 86), (461, 20, 80), (464, 21, 71), (466, 22, 63), (467, 23, 59), (468, 24, 55), (469, 25, 52), (469, 26, 51), (470, 27, 48), (471, 28, 46), (471, 29, 44), (472, 30, 42), (473, 31, 39), (473, 32, 38), 
(474, 33, 36), (475, 34, 33), (475, 35, 32), (476, 36, 30), (476, 37, 29), (477, 38, 26), (478, 39, 23), (479, 40, 20), (480, 41, 17), (488, 42, 5)], ['492,42,488,42,487,41,480,41,476,37,475,34,473,32,469,25,465,21,461,20,457,16,457,10,463,10,464,9,466,9,470,12,474,13,476,11,480,10,482,8,500,8,501,9,524,9,525,10,528,10,532,12,539,12,542,15,545,15,545,19,535,20,534,21,529,21,525,23,523,23,513,30,512,30,504,37,496,41,493,41'])], 'temp/1772943628_2275741_957285035_a42482e51c93c8025d243dd179aee85b.jpg']}
free memory after detection :
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 10998
############################### TEST detect object ################################
run mask_detect
Inside batchDatouExec : verbose : False
# VR 17-11-17 : to create in DB ! Here we check the datou graph and we re-order the steps !
Tree built and cycle-checked; now we need to re-order the steps !
We currently have an error because there is no dependence on the last step in the tile - detect - glue case.
We could keep the dependence, but it is better to keep an order compatible with the step ids when there are no sons, i.e. a lexical order : (number_son, step_id)
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
List Step Type Loaded in datou : mask_detect
list_input_json : []
origin BF
we have 0 missing photos in the step downloads : photo missing : []
try to delete the photos missing in the DB
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
time to download the photos : 0.20987820625305176
About to test input to load
we should then remove the video here; this would fix the bug of datou_current !
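The log above describes a cycle check (checkNoCycle) followed by re-ordering the steps with the lexical key (number_son, step_id). A minimal sketch of that logic, with hypothetical names and an assumed meaning of "son" (a step that depends on the given step) — the actual Velours implementation is not shown in this log:

```python
# Hypothetical sketch of the step-ordering logic the log describes.
# `deps` maps step_id -> list of step_ids it depends on.
# Function names and the meaning of "son" are assumptions, not the real API.
from collections import deque


def check_no_cycle(deps):
    """Return True when the dependency graph is acyclic (Kahn's algorithm)."""
    indegree = {s: 0 for s in deps}
    for s in deps:
        for d in deps[s]:
            indegree[d] += 1
    queue = deque(s for s, n in indegree.items() if n == 0)
    seen = 0
    while queue:
        s = queue.popleft()
        seen += 1
        for d in deps[s]:
            indegree[d] -= 1
            if indegree[d] == 0:
                queue.append(d)
    # Every step was reachable without hitting a cycle
    return seen == len(deps)


def order_steps(deps):
    """Order steps by the lexical key (number_of_sons, step_id).

    Here a "son" of step d is assumed to be a step that depends on d,
    so steps with no sons fall back to plain step_id order.
    """
    nb_sons = {s: 0 for s in deps}
    for s in deps:
        for d in deps[s]:
            nb_sons[d] += 1
    return sorted(deps, key=lambda s: (nb_sons[s], s))
```

With a single step (the case in this test run) the cycle check trivially passes and the ordering is the identity, which matches the "number of steps : 1" path below.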
Calling datou_exec
Inside datou_exec : verbose : False
number of steps : 1
step1:mask_detect Sun Mar 8 05:20:57 2026
VR 17-11-17 : for now, only for a linear exec dependency tree, some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 10998
max_wait_temp : 1 max_wait : 0 gpu_flag : 0
2026-03-08 05:21:00.339409: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2026-03-08 05:21:00.370471: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493010000 Hz
2026-03-08 05:21:00.372613: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f7458000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2026-03-08 05:21:00.372680: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2026-03-08 05:21:00.376749: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2026-03-08 05:21:00.685067: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x12debe60 initialized for platform CUDA (this does not guarantee that XLA will be used).
Devices: 2026-03-08 05:21:00.685121: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2026-03-08 05:21:00.686583: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2026-03-08 05:21:00.686988: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2026-03-08 05:21:00.689978: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2026-03-08 05:21:00.692625: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2026-03-08 05:21:00.693105: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2026-03-08 05:21:00.696129: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2026-03-08 05:21:00.697105: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2026-03-08 05:21:00.701241: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2026-03-08 05:21:00.702675: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2026-03-08 05:21:00.702747: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2026-03-08 05:21:00.703525: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2026-03-08 05:21:00.703540: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2026-03-08 05:21:00.703549: I 
tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2026-03-08 05:21:00.704887: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10193 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2026-03-08 05:21:00.812038: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2026-03-08 05:21:00.812125: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2026-03-08 05:21:00.812149: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2026-03-08 05:21:00.812171: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2026-03-08 05:21:00.812192: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2026-03-08 05:21:00.812213: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2026-03-08 05:21:00.812247: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2026-03-08 05:21:00.812268: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2026-03-08 05:21:00.813864: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2026-03-08 
05:21:00.815197: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2026-03-08 05:21:00.815236: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2026-03-08 05:21:00.815258: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2026-03-08 05:21:00.815278: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2026-03-08 05:21:00.815298: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2026-03-08 05:21:00.815318: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2026-03-08 05:21:00.815337: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2026-03-08 05:21:00.815357: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2026-03-08 05:21:00.816953: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2026-03-08 05:21:00.816983: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2026-03-08 05:21:00.816993: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2026-03-08 05:21:00.817003: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2026-03-08 05:21:00.818670: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10193 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, 
compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': 
'0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': 
'0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME mask_coco_origin NUM_CLASSES 81 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES 
(16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001
model_param file did not exist
model_name : mask_coco_origin
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files existing in s3 : ['mask_model.h5']
files missing in s3 : []
2026-03-08 05:21:09.856212: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2026-03-08 05:21:10.026698: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
local folder : /data/models_weight/mask_coco_origin
/data/models_weight/mask_coco_origin/mask_model.h5
size_local : 257557808 size in s3 : 257557808
create time local : 2026-03-05 05:20:41 create time in s3 : 2026-03-04 16:35:32
mask_model.h5 already exists and did not need updating
list_images length : 1
NEW PHOTO
Processing 1 images
image shape: (720, 1280, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 89) min: 0.00000 max: 1280.00000
number of objects found : 4
Detection mask done !
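The weight-file check above ("size_local : 257557808 size in s3 : 257557808 ... already exist and didn't need to update") compares the cached copy against the S3 object before deciding whether to re-download. A minimal sketch of that decision, assuming it is based on file size alone (the real code may also compare timestamps or checksums); `needs_update` is a hypothetical name:

```python
# Hypothetical sketch of the cached-weights freshness check seen in the log.
# `remote_size` would come from S3 metadata (e.g. an object's ContentLength);
# the size-only comparison is an assumption about the real logic.
import os


def needs_update(local_path, remote_size):
    """Return True when the cached weight file must be (re)downloaded."""
    if not os.path.exists(local_path):
        # No local copy at all: must download
        return True
    # Sizes differ: local copy is stale or truncated
    return os.path.getsize(local_path) != remote_size
```

In the run above both sizes are 257557808 bytes, so the check concludes that mask_model.h5 does not need updating even though the local and S3 creation times differ.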
Trying to reset tf kernel 2276196
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 5706
tf kernel not reset
sub process len(results) : 1 len(list_Values) 0 None
max_time_sub_proc : 3600
parent process len(results) : 1 len(list_Values) 0
process is alive
finished correctly or not : True
after detect
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 10998
list_Values should be empty []
['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush']
DEBUG bbox = [0, 1092, 108, 1280] DEBUG masks shape = (720, 1280)
time to compute the mask position with numpy : 0.0007686614990234375
nb_pixel_total : 16902
time to create 1 rle with old method : 0.038160085678100586
length of segment : 107
DEBUG bbox = [16, 52, 668, 1128] DEBUG masks shape = (720, 1280)
time to compute the mask position with numpy : 0.019735097885131836
nb_pixel_total : 480730
time to create 1 rle with new method : 0.03329825401306152
length of segment : 632
DEBUG bbox = [0, 0, 116, 438] DEBUG masks shape = (720, 1280)
time to compute the mask position with numpy : 0.0008428096771240234
nb_pixel_total : 36585
time to create 1 rle with old method :
0.07491564750671387
length of segment : 132
DEBUG bbox = [0, 390, 54, 550] DEBUG masks shape = (720, 1280)
time to compute the mask position with numpy : 0.00010085105895996094
nb_pixel_total : 4793
time to create 1 rle with old method : 0.010236024856567383
length of segment : 51
time spent for convertir_results : 0.42479681968688965
time spent for datou_step_exec : 18.85175848007202
time spent to save output : 7.796287536621094e-05
total time spent for step 1 : 18.851836442947388
caffe_path_current :
About to save ! 1
Inside saveOutput : final : True verbose : False
eke 12-6-18 : saveMask needs to be cleaned for the new output !
Caught exception ! Connect or reconnect !
Number saved : None
batch 1
Loaded 447 child ids of type : 445
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Number RLEs to save : 0
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1
time used for this insertion : 0.013677597045898438
save missing photos in datou_result :
After save, about to update current !
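The triples in the detection output (e.g. (1205, 1, 58)) look like per-row runs of mask pixels, and the timings above measure building such a run-length encoding from a binary mask. A minimal sketch of encoding a mask as (col_start, row, length) runs; the exact field order in the real output and the function name are assumptions:

```python
# Hypothetical sketch of the per-row RLE construction timed in the log.
# The (col_start, row, length) tuple layout mirrors the triples visible
# in the detection output, but is an assumption about the real format.

def mask_to_row_runs(mask):
    """Encode a binary mask (list of rows of 0/1) as horizontal runs.

    Returns one (col_start, row, length) tuple per maximal run of set
    pixels, scanning rows top to bottom.
    """
    runs = []
    for row_idx, row in enumerate(mask):
        col = 0
        width = len(row)
        while col < width:
            if row[col]:
                start = col
                while col < width and row[col]:
                    col += 1
                runs.append((start, row_idx, col - start))
            else:
                col += 1
    return runs
```

The sum of the run lengths equals the mask's pixel count, which is consistent with nb_pixel_total being reported alongside each RLE timing above.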
datou_cur_ids : [] len(datou.list_steps) : 1 output : {'917855882': [[(917855882, 492601069, 445, 1092, 1280, 0, 108, 0.99883765, [(1205, 1, 58), (1165, 2, 105), (1159, 3, 113), (1149, 4, 124), (1113, 5, 161), (1100, 6, 174), (1097, 7, 177), (1095, 8, 179), (1095, 9, 179), (1095, 10, 179), (1095, 11, 179), (1095, 12, 179), (1095, 13, 179), (1095, 14, 178), (1095, 15, 178), (1095, 16, 178), (1095, 17, 178), (1095, 18, 177), (1095, 19, 177), (1095, 20, 177), (1095, 21, 177), (1095, 22, 177), (1095, 23, 178), (1095, 24, 178), (1095, 25, 178), (1095, 26, 179), (1095, 27, 179), (1095, 28, 180), (1095, 29, 181), (1095, 30, 182), (1095, 31, 183), (1095, 32, 183), (1095, 33, 184), (1095, 34, 184), (1096, 35, 183), (1096, 36, 183), (1096, 37, 184), (1097, 38, 183), (1097, 39, 183), (1097, 40, 183), (1098, 41, 182), (1098, 42, 182), (1098, 43, 182), (1099, 44, 181), (1099, 45, 181), (1099, 46, 181), (1100, 47, 180), (1100, 48, 180), (1101, 49, 179), (1101, 50, 179), (1102, 51, 178), (1102, 52, 178), (1103, 53, 177), (1103, 54, 177), (1104, 55, 176), (1104, 56, 176), (1104, 57, 176), (1104, 58, 176), (1105, 59, 175), (1105, 60, 175), (1105, 61, 175), (1105, 62, 175), (1105, 63, 175), (1106, 64, 174), (1106, 65, 174), (1106, 66, 174), (1106, 67, 174), (1106, 68, 174), (1106, 69, 174), (1106, 70, 174), (1106, 71, 174), (1106, 72, 174), (1106, 73, 174), (1107, 74, 173), (1107, 75, 173), (1107, 76, 173), (1107, 77, 173), (1107, 78, 173), (1107, 79, 173), (1108, 80, 172), (1108, 81, 172), (1109, 82, 171), (1110, 83, 170), (1110, 84, 170), (1111, 85, 169), (1112, 86, 168), (1113, 87, 166), (1114, 88, 165), (1115, 89, 164), (1117, 90, 162), (1120, 91, 159), (1138, 92, 141), (1146, 93, 133), (1154, 94, 125), (1167, 95, 112), (1177, 96, 102), (1183, 97, 95), (1185, 98, 93), (1187, 99, 90), (1188, 100, 55), (1264, 100, 12), (1190, 101, 50), (1191, 102, 46), (1194, 103, 40), (1197, 104, 34), (1202, 105, 25), (1207, 106, 16)], 
[raw mask_detect output elided: several detection tuples for photo 917855882 (photo_hashtag_type 445, confidence scores 0.998, 0.992, 0.939), each followed by thousands of (x, y, run) mask triples and a polygon vertex string]
'temp/1772943657_2275741_917855882_da0fa7b7e6b5b551fe26c0ba8713276d.jpg']}

############################### TEST POLYGON ################################
Inside batchDatouExec : verbose : False
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case.
We can either keep the dependence or, better, keep an order compatible with the step ids when steps have no sons, i.e. a lexical order : (number_son, step_id)
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
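The (number_son, step_id) ordering rule described above can be sketched in a few lines. This is a hypothetical illustration of the tie-break, not the project's actual datou code; `order_steps` and its arguments are invented names.

```python
# Hypothetical sketch of the tie-break rule described in the log: when steps
# have no dependence between them, order them by the lexical key
# (number_of_sons, step_id) so execution stays compatible with step ids.
def order_steps(step_ids, sons):
    """step_ids: list of step ids; sons: dict mapping step id -> son ids."""
    return sorted(step_ids, key=lambda s: (len(sons.get(s, [])), s))

# Example: steps 2 and 3 have no sons, so they come first, ordered by id.
print(order_steps([3, 1, 2], {1: [2], 2: [], 3: []}))  # [2, 3, 1]
```

Sorting on the pair gives childless steps priority while keeping a deterministic order among equals.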
List Step Type Loaded in datou : mask_detect
list_input_json : []
origin BF
we have 0 missing photos in the step downloads : photo missing : []
try to delete the photos missing in DB
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
time to download the photos : 0.16629910469055176
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
Calling datou_exec
Inside datou_exec : verbose : False
number of steps : 1
step1:mask_detect
Sun Mar 8 05:21:17 2026
VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 10998
max_wait_temp : 1 max_wait : 0 gpu_flag : 0
2026-03-08 05:21:20.997005: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2026-03-08 05:21:21.022484: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493010000 Hz
2026-03-08 05:21:21.024455: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f7458000b60 initialized for platform Host (this does not guarantee that XLA will be used).
Devices:
2026-03-08 05:21:21.024504: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2026-03-08 05:21:21.027981: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2026-03-08 05:21:21.313116: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1333a8c0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
2026-03-08 05:21:21.313158: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5
2026-03-08 05:21:21.314559: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2026-03-08 05:21:21.314961: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2026-03-08 05:21:21.317891: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2026-03-08 05:21:21.320460: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2026-03-08 05:21:21.320941: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2026-03-08 05:21:21.323782: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2026-03-08 05:21:21.324750: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2026-03-08 05:21:21.328630: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2026-03-08 05:21:21.329990: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2026-03-08 05:21:21.330053: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2026-03-08 05:21:21.330838: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2026-03-08 05:21:21.330855: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2026-03-08 05:21:21.330864: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2026-03-08 05:21:21.332267: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10193 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead.
2026-03-08 05:21:21.448320: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2026-03-08 05:21:21.448406: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2026-03-08 05:21:21.448426: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2026-03-08 05:21:21.448444: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2026-03-08 05:21:21.448462: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2026-03-08 05:21:21.448479: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2026-03-08 05:21:21.448496: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2026-03-08 05:21:21.448525: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2026-03-08 05:21:21.449733: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2026-03-08 05:21:21.450910: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2026-03-08 05:21:21.450943: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2026-03-08 05:21:21.450961: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2026-03-08 05:21:21.450977: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2026-03-08 05:21:21.450993: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2026-03-08 05:21:21.451010: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2026-03-08 05:21:21.451026: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2026-03-08 05:21:21.451042: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2026-03-08 05:21:21.452314: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2026-03-08 05:21:21.452345: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2026-03-08 05:21:21.452354: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2026-03-08 05:21:21.452362: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2026-03-08 05:21:21.453690: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10193 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
Using TensorFlow backend.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version.
Instructions for updating:
box_ind is deprecated, use box_indices instead
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use `tf.cast` instead.
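The "check gpu memory" / "free memory gpu now" messages that bracket each detection can be reproduced outside TensorFlow by querying `nvidia-smi`. This is a minimal sketch assuming `nvidia-smi` is on the PATH; `parse_free_mib` and `gpu_free_memory` are hypothetical helper names, not the project's actual functions.

```python
# Minimal sketch of a "free memory gpu now" check via nvidia-smi.
import subprocess

def parse_free_mib(output):
    """Parse nvidia-smi CSV output: one free-memory value (MiB) per GPU line."""
    return [int(line.strip()) for line in output.strip().splitlines()]

def gpu_free_memory():
    """Query free GPU memory in MiB for every visible GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.free",
         "--format=csv,noheader,nounits"], text=True)
    return parse_free_mib(out)

print(parse_free_mib("10998\n"))  # [10998]
```

With `nounits` the tool prints bare integers, so parsing stays a one-liner even with several GPUs.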
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 
'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 
'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush']
Configurations:
BACKBONE                       resnet101
BACKBONE_SHAPES                [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]]
BACKBONE_STRIDES               [4, 8, 16, 32, 64]
BATCH_SIZE                     1
BBOX_STD_DEV                   [0.1 0.1 0.2 0.2]
DETECTION_MAX_INSTANCES        100
DETECTION_MIN_CONFIDENCE       0.3
DETECTION_NMS_THRESHOLD        0.3
GPU_COUNT                      1
IMAGES_PER_GPU                 1
IMAGE_MAX_DIM                  640
IMAGE_MIN_DIM                  640
IMAGE_PADDING                  True
IMAGE_SHAPE                    [640 640 3]
LEARNING_MOMENTUM              0.9
LEARNING_RATE                  0.001
LOSS_WEIGHTS                   {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0}
MASK_POOL_SIZE                 14
MASK_SHAPE                     [28, 28]
MAX_GT_INSTANCES               100
MEAN_PIXEL                     [123.7 116.8 103.9]
MINI_MASK_SHAPE                (56, 56)
NAME                           mask_coco_origin
NUM_CLASSES                    81
POOL_SIZE                      7
POST_NMS_ROIS_INFERENCE        1000
POST_NMS_ROIS_TRAINING         2000
ROI_POSITIVE_RATIO             0.33
RPN_ANCHOR_RATIOS              [0.5, 1, 2]
RPN_ANCHOR_SCALES              (16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE              1
RPN_BBOX_STD_DEV               [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD              0.7
RPN_TRAIN_ANCHORS_PER_IMAGE    256
STEPS_PER_EPOCH                1000
TRAIN_ROIS_PER_IMAGE           200
USE_MINI_MASK                  True
USE_RPN_ROIS                   True
VALIDATION_STEPS               50
WEIGHT_DECAY                   0.0001
model_param file didn't exist
model_name : mask_coco_origin
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files present in s3 : ['mask_model.h5']
files missing in s3 : []
2026-03-08 05:21:30.849367: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2026-03-08 05:21:31.015206: I
tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
local folder : /data/models_weight/mask_coco_origin
/data/models_weight/mask_coco_origin/mask_model.h5 size_local : 257557808 size in s3 : 257557808
create time local : 2026-03-05 05:20:41 create time in s3 : 2026-03-04 16:35:32
mask_model.h5 already exists and doesn't need an update
list_images length : 1
NEW PHOTO
Processing 1 images
image shape: (2448, 2448, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 89) min: 0.00000 max: 2448.00000
number of objects found : 1
Detection mask done !
Trying to reset tf kernel 2276587
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 5706
tf kernel not reset
sub process len(results) : 1 len(list_Values) 0
None
max_time_sub_proc : 3600
parent process len(results) : 1 len(list_Values) 0
process is alive
finished correctly or not : True
after detect
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 10998
list_Values should be empty []
['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush']
DEBUG bbox = [118, 7, 2245, 2268]
DEBUG masks shape = (2448, 2448)
time to compute the mask position with numpy : 0.4514944553375244
nb_pixel_total : 3693818
time to create 1 rle with new method : 0.4099764823913574
length of segment : 2042
time spent for convertir_results : 2.1900551319122314
time spent for datou_step_exec : 21.142312049865723
time spent to save output : 2.765655517578125e-05
total time spent for step 1 : 21.1423397064209
caffe_path_current :
About to save ! 1
Inside saveOutput : final : True verbose : False
eke 12-6-18 : saveMask needs to be cleaned for new output !
Caught exception ! Connect or reconnect !
Number saved : None
batch 1
Loaded 726 chid ids of type : 445
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
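The "time to create 1 rle with new method" line above times a run-length encoding of the detected mask. This is a hedged sketch of such an encoder over a flat binary mask, not the project's actual "new method"; `rle_encode` is an illustrative name.

```python
# Hedged sketch: run-length encode a flat 0/1 mask as (value, run_length) pairs.
def rle_encode(mask):
    """Encode a flat 0/1 sequence as (value, run_length) pairs."""
    runs = []
    for v in mask:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([v, 1])       # start a new run
    return [tuple(r) for r in runs]

print(rle_encode([0, 0, 1, 1, 1, 0]))  # [(0, 2), (1, 3), (0, 1)]
```

For a 2448x2448 mask this turns ~6M pixels into a few thousand runs, which is why the log reports a segment length of only 2042.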
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Number RLEs to save : 0
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this
insertion : 0.022861003875732422
save missing photos in datou_result :
After save, about to update current ! datou_cur_ids : []
len(datou.list_steps) : 1
output : {'917877156': [[(917877156, 492601069, 445, 7, 2268, 118, 2245, 0.9848001, [(678, 120, 107), (520, 121, 482), (1050, 121, 381), (503, 122, 946), (487, 123, 980), (471, 124, 1014), (456, 125, 1045), ...
[remaining per-scanline mask tuples elided; the dump is cut off mid-tuple in the original log]
1745), (55, 1810, 1745), (55, 1811, 1745), (55, 1812, 1745), (55, 1813, 1744), (55, 1814, 1744), (55, 1815, 1744), (55, 1816, 1744), (55, 1817, 1743), (55, 1818, 1743), (55, 1819, 1743), (55, 1820, 1743), (55, 1821, 1742), (55, 1822, 1742), (55, 1823, 1742), (55, 1824, 1741), (55, 1825, 1741), (55, 1826, 1741), (55, 1827, 1741), (56, 1828, 1739), (56, 1829, 1739), (57, 1830, 1738), (57, 1831, 1737), (57, 1832, 1737), (58, 1833, 1736), (58, 1834, 1735), (59, 1835, 1734), (59, 1836, 1734), (60, 1837, 1732), (60, 1838, 1732), (61, 1839, 1731), (61, 1840, 1730), (62, 1841, 1729), (62, 1842, 1729), (63, 1843, 1727), (63, 1844, 1727), (63, 1845, 1727), (64, 1846, 1725), (64, 1847, 1725), (65, 1848, 1724), (65, 1849, 1723), (66, 1850, 1722), (66, 1851, 1721), (67, 1852, 1720), (67, 1853, 1720), (68, 1854, 1718), (68, 1855, 1718), (68, 1856, 1718), (69, 1857, 1716), (69, 1858, 1716), (70, 1859, 1715), (70, 1860, 1714), (71, 1861, 1713), (71, 1862, 1712), (72, 1863, 1711), (72, 1864, 1711), (73, 1865, 1709), (73, 1866, 1709), (73, 1867, 1708), (74, 1868, 1707), (74, 1869, 1707), (75, 1870, 1705), (75, 1871, 1705), (76, 1872, 1703), (76, 1873, 1703), (76, 1874, 1703), (77, 1875, 1701), (77, 1876, 1701), (78, 1877, 1699), (78, 1878, 1699), (79, 1879, 1698), (79, 1880, 1697), (79, 1881, 1697), (80, 1882, 1695), (80, 1883, 1695), (81, 1884, 1693), (81, 1885, 1693), (82, 1886, 1692), (82, 1887, 1691), (82, 1888, 1691), (83, 1889, 1689), (83, 1890, 1689), (84, 1891, 1687), (84, 1892, 1687), (85, 1893, 1685), (85, 1894, 1685), (85, 1895, 1685), (86, 1896, 1683), (86, 1897, 1683), (87, 1898, 1681), (87, 1899, 1681), (88, 1900, 1679), (88, 1901, 1679), (88, 1902, 1678), (89, 1903, 1677), (89, 1904, 1676), (89, 1905, 1676), (90, 1906, 1674), (90, 1907, 1674), (91, 1908, 1672), (91, 1909, 1672), (91, 1910, 1671), (92, 1911, 1670), (92, 1912, 1669), (92, 1913, 1669), (93, 1914, 1667), (93, 1915, 1667), (94, 1916, 1665), (94, 1917, 1665), (95, 1918, 1663), (95, 1919, 1662), (95, 1920, 
1662), (96, 1921, 1660), (96, 1922, 1660), (97, 1923, 1658), (97, 1924, 1658), (98, 1925, 1656), (98, 1926, 1655), (98, 1927, 1655), (99, 1928, 1653), (99, 1929, 1652), (100, 1930, 1651), (100, 1931, 1650), (101, 1932, 1649), (101, 1933, 1648), (102, 1934, 1646), (102, 1935, 1646), (103, 1936, 1644), (103, 1937, 1643), (104, 1938, 1642), (104, 1939, 1641), (105, 1940, 1639), (106, 1941, 1637), (106, 1942, 1637), (107, 1943, 1635), (107, 1944, 1634), (108, 1945, 1632), (108, 1946, 1631), (109, 1947, 1630), (110, 1948, 1628), (110, 1949, 1627), (111, 1950, 1625), (111, 1951, 1624), (112, 1952, 1623), (113, 1953, 1621), (113, 1954, 1620), (114, 1955, 1618), (115, 1956, 1616), (115, 1957, 1616), (116, 1958, 1614), (117, 1959, 1612), (117, 1960, 1611), (118, 1961, 1610), (119, 1962, 1608), (120, 1963, 1606), (120, 1964, 1605), (121, 1965, 1604), (122, 1966, 75), (216, 1966, 1508), (123, 1967, 62), (223, 1967, 1500), (124, 1968, 50), (230, 1968, 1493), (124, 1969, 41), (238, 1969, 1484), (125, 1970, 30), (245, 1970, 1476), (126, 1971, 21), (253, 1971, 1467), (127, 1972, 12), (261, 1972, 1459), (128, 1973, 3), (269, 1973, 1450), (277, 1974, 1441), (286, 1975, 1432), (292, 1976, 1425), (297, 1977, 1419), (302, 1978, 1413), (307, 1979, 1408), (312, 1980, 1402), (317, 1981, 1396), (323, 1982, 1389), (328, 1983, 1383), (334, 1984, 1375), (340, 1985, 1368), (347, 1986, 1360), (354, 1987, 1352), (361, 1988, 1344), (369, 1989, 1335), (373, 1990, 1330), (377, 1991, 1324), (381, 1992, 1319), (385, 1993, 1314), (389, 1994, 1309), (394, 1995, 1302), (398, 1996, 1297), (402, 1997, 1292), (407, 1998, 1285), (411, 1999, 1280), (416, 2000, 1274), (420, 2001, 1268), (425, 2002, 1262), (430, 2003, 1255), (435, 2004, 1249), (440, 2005, 1242), (445, 2006, 1236), (451, 2007, 1228), (455, 2008, 1222), (460, 2009, 1216), (465, 2010, 1209), (470, 2011, 1202), (475, 2012, 1195), (480, 2013, 1189), (485, 2014, 1182), (490, 2015, 1175), (495, 2016, 1168), (501, 2017, 1080), (506, 2018, 1071), 
(512, 2019, 1061), (517, 2020, 1051), (523, 2021, 1041), (529, 2022, 1031), (534, 2023, 1022), (539, 2024, 1013), (544, 2025, 1004), (549, 2026, 995), (554, 2027, 986), (559, 2028, 977), (564, 2029, 969), (568, 2030, 961), (573, 2031, 952), (578, 2032, 943), (582, 2033, 936), (586, 2034, 928), (591, 2035, 920), (595, 2036, 912), (599, 2037, 905), (603, 2038, 895), (607, 2039, 880), (612, 2040, 863), (615, 2041, 849), (617, 2042, 835), (620, 2043, 821), (623, 2044, 807), (625, 2045, 795), (628, 2046, 788), (630, 2047, 782), (633, 2048, 776), (635, 2049, 770), (637, 2050, 765), (639, 2051, 760), (642, 2052, 753), (644, 2053, 748), (646, 2054, 743), (648, 2055, 738), (649, 2056, 734), (651, 2057, 729), (653, 2058, 724), (655, 2059, 719), (657, 2060, 714), (658, 2061, 710), (660, 2062, 705), (662, 2063, 699), (664, 2064, 694), (667, 2065, 687), (669, 2066, 682), (671, 2067, 676), (673, 2068, 670), (675, 2069, 665), (678, 2070, 657), (680, 2071, 651), (683, 2072, 643), (685, 2073, 636), (688, 2074, 628), (690, 2075, 621), (693, 2076, 613), (697, 2077, 604), (702, 2078, 594), (707, 2079, 584), (711, 2080, 575), (716, 2081, 564), (720, 2082, 555), (725, 2083, 544), (729, 2084, 534), (733, 2085, 524), (738, 2086, 514), (742, 2087, 504), (747, 2088, 494), (751, 2089, 485), (755, 2090, 476), (760, 2091, 465), (764, 2092, 456), (768, 2093, 448), (773, 2094, 438), (776, 2095, 430), (780, 2096, 421), (783, 2097, 414), (787, 2098, 405), (790, 2099, 398), (794, 2100, 389), (797, 2101, 382), (801, 2102, 375), (804, 2103, 370), (807, 2104, 364), (811, 2105, 358), (814, 2106, 352), (818, 2107, 346), (821, 2108, 341), (825, 2109, 334), (828, 2110, 329), (831, 2111, 324), (835, 2112, 318), (838, 2113, 313), (842, 2114, 307), (845, 2115, 302), (848, 2116, 297), (852, 2117, 291), (855, 2118, 286), (859, 2119, 280), (862, 2120, 275), (865, 2121, 271), (868, 2122, 266), (871, 2123, 261), (874, 2124, 256), (877, 2125, 252), (880, 2126, 247), (883, 2127, 242), (886, 2128, 238), (889, 2129, 
233), (891, 2130, 230), (894, 2131, 225), (897, 2132, 220), (900, 2133, 215), (903, 2134, 210), (906, 2135, 205), (910, 2136, 199), (913, 2137, 194), (916, 2138, 189), (920, 2139, 183), (923, 2140, 178), (927, 2141, 172), (930, 2142, 165), (934, 2143, 153), (940, 2144, 140), (949, 2145, 124), (958, 2146, 107), (967, 2147, 91), (976, 2148, 74), (985, 2149, 57), (995, 2150, 38), (1006, 2151, 19), (1016, 2152, 1)], ['940,2144,772,2093,693,2076,611,2039,368,1988,215,1965,128,1973,104,1939,55,1827,39,1678,39,1444,30,1305,27,759,24,600,39,459,94,306,208,207,289,180,374,132,520,121,1430,121,1584,128,1663,142,1771,179,1904,204,2015,298,2076,379,2149,537,2168,613,2171,717,2164,840,2129,912,2113,991,2080,1069,2029,1137,2008,1193,1966,1257,1924,1382,1878,1447,1846,1671,1776,1879,1714,1979,1662,2016,1581,2016,1497,2038,1420,2044,1330,2071,1179,2100,1098,2141,1017,2151'])], 'temp/1772943677_2275741_917877156_a9c2d4b99270c9302def4ed40606e685.jpg']} nb pixel non reg : 3692295 nb pixel common : 3690017 proportion of common points : 0.9993830395458652 #&_# TEST SUCCEEDED #&_# : tests/mask_test #&_# #&_# END OF TEST #&_# : tests/mask_test #&_# #&_# BEGIN OF TEST : tests/datou_test #&_# /home/admin/workarea/git/Velours/python/tests/datou_test.py Datou All Test python version used : 3 ############################### TEST sam ################################ TEST SAM Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! 
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
List Step Type Loaded in datou : sam
list_input_json : []
origin BF
we have 0 missing photos in the step downloads : photo missing : []
try to delete the missing photos in DB
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
time to download the photos : 0.21728134155273438
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
WARNING : we have an input that is not a photo, we should get rid of it
Calling datou_exec
Inside datou_exec : verbose : False
number of steps : 1
step1:sam
Sun Mar 8 05:21:44 2026
VR 17-11-17 : for now, only for linear exec dependency trees, some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Beginning of datou step sam !
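The mask_test block earlier in this log reports "nb pixel non reg", "nb pixel common", and a "proportion of common points" (0.9993…). A minimal sketch of that overlap metric on two binary masks; `common_point_proportion` is a hypothetical helper, not the project's code, and mask_test.py's exact definition is not shown in the log:

```python
import numpy as np

def common_point_proportion(mask_ref: np.ndarray, mask_new: np.ndarray) -> float:
    """Share of reference-mask pixels also present in the new mask.

    Hypothetical reading of the log's 'proportion of common points'.
    """
    nb_non_reg = int(mask_ref.sum())              # cf. 'nb pixel non reg'
    nb_common = int((mask_ref & mask_new).sum())  # cf. 'nb pixel common'
    return nb_common / nb_non_reg if nb_non_reg else 1.0

ref = np.zeros((4, 4), dtype=bool)
new = np.zeros((4, 4), dtype=bool)
ref[0, :] = True   # 4 reference pixels
new[0, :3] = True  # 3 of them reproduced
print(common_point_proportion(ref, new))  # 0.75
```

A proportion close to 1.0, as in the log, means the new detection reproduces almost all reference pixels.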
Inside sam : nb paths : 1
(640, 960, 3)
time to compute the mask position with numpy : 0.0018508434295654297
nb_pixel_total : 5628
time to create 1 rle with old method : 0.01236104965209961
time to compute the mask position with numpy : 0.0014681816101074219
nb_pixel_total : 7617
time to create 1 rle with old method : 0.0161740779876709
[... roughly a hundred more timing triplets of the same form omitted; each reports the numpy mask-position time, nb_pixel_total, and the time to create 1 rle with the old method ...]
time to compute the mask position with numpy : 0.0013289451599121094
nb_pixel_total : 340
time to create 1 rle with old method : 0.0008304119110107422
batch 1
Loaded 102 chid ids of type : 4677
Number RLEs to save : 9346
TO DO : save crop sub photo, not yet done !
Inside saveOutput : final : True verbose : False
saveOutput not yet implemented for datou_step.type : sam ; we use saveGeneral
[1189321094]
Looping around the photos to save general results
len do output : 1
/1189321094
Didn't retrieve data .
Didn't retrieve data .
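The sam step above repeatedly times two operations per mask: locating the set pixels with numpy and building one RLE with an "old method". A rough sketch of both, assuming a generic flat run-length encoding of the flattened binary mask (COCO-style counts starting with a zero run); the project's actual "old method" and RLE format are not shown in the log:

```python
import numpy as np

def mask_positions(mask: np.ndarray) -> np.ndarray:
    # (row, col) coordinates of all set pixels; nb_pixel_total is len(result)
    return np.argwhere(mask)

def rle_encode(mask: np.ndarray) -> list:
    """Run lengths of the flattened mask, starting with a run of zeros.

    Assumption: generic counts-style RLE, not necessarily the project's format.
    """
    flat = np.asarray(mask, dtype=np.uint8).ravel()
    change = np.flatnonzero(np.diff(flat)) + 1        # indices where value flips
    bounds = np.concatenate(([0], change, [flat.size]))
    runs = np.diff(bounds).tolist()
    if flat[0] == 1:
        runs = [0] + runs  # convention: encoding starts with a zero run
    return runs

m = np.array([[0, 0, 1, 1],
              [1, 0, 0, 0]])
print(len(mask_positions(m)))  # 3  -> cf. nb_pixel_total
print(rle_encode(m))           # [2, 3, 3]
```

A vectorized encoder like this is typically much faster than a per-pixel Python loop, which is one plausible reason the log labels the timed path the "old method".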
before output type
Here is an output not treated by saveGeneral :
Here is an output not treated by saveGeneral :
Managing all output in save_final without adding information in mtr_datou_result
('4573', None, None, None, None, None, None, None, None)
('4573', None, '1189321094', None, None, None, None, None, None)
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 3
time used for this insertion : 0.019391298294067383
save_final
save missing photos in datou_result :
time spent for datou_step_exec : 15.023008823394775
time spent to save output : 0.019664764404296875
total time spent for step 1 : 15.042673587799072
caffe_path_current :
About to save ! 2
After save, about to update current !
datou_cur_ids : []
len(datou.list_steps) : 1
output : {'1189321094': [[... object reprs lost in the captured log ...], 'temp/1772943704_2275741_1189321094_9626af7f95d010f2a4fd524688d4ea22_76896585.png']}
nb_objects detect : 102
############################### TEST frcnn ################################
Inside batchDatouExec : verbose : False
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked; now we need to re-order the steps !
We currently have an error because there is no dependency between the last steps in the tile - detect - glue case !
We could keep that dependency, but it is better to keep an order compatible with the step ids when a step has no sons, so a lexical order : (number_son, step_id) !
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
List Step Type Loaded in datou : frcnn
list_input_json : []
origin BF
we have 0 missing photos in the step downloads : photo missing : []
try to delete the missing photos in DB
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
time to download the photos : 0.13539695739746094
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
Calling datou_exec
Inside datou_exec : verbose : False
number of steps : 1
step1:frcnn
Sun Mar 8 05:22:00 2026
VR 17-11-17 : for now, only for linear exec dependency trees, some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Beginning of datou step Faster rcnn !
To loadFromThcl()
model_param file doesn't exist
model_name : detection_plaque_valcor_010622
model_type : caffe_faster_rcnn
list of files needed : ['caffemodel', 'test.prototxt']
files present in s3 : ['caffemodel', 'test.prototxt']
files missing in s3 : []
[libprotobuf ERROR google/protobuf/text_format.cc:307] Error parsing text-format caffe.NetParameter: 325:21: Message type "caffe.LayerParameter" has no field named "roi_pooling_param".
WARNING: Logging before InitGoogleLogging() is written to STDERR
F0308 05:22:01.738979 2275741 upgrade_proto.cpp:90] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: /data/models_weight/detection_plaque_valcor_010622/test.prototxt
*** Check failure stack trace: ***
Aborted (core dumped)
No data to report.
No data to report.
ret : 34304
command : coverage3 html -i --omit=/usr/local/lib/python3.8/dist-packages/*,/home/admin/.local/lib/python3.8/site-packages/*,/usr/lib/python3/dist-packages/* -d htmlcov
ret : 256
command : coverage3 report -i -m
ret : 256
44.87user 24.37system 1:37.48elapsed 71%CPU (0avgtext+0avgdata 2964556maxresident)k
4110272inputs+4760outputs (13650major+2899012minor)pagefaults 0swaps
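The `ret` values above look like raw 16-bit wait statuses (as returned by `os.system` on POSIX, which is an assumption about script_for_cron.py): the exit code lives in the high byte. Decoding them is consistent with the log: the test run ended with exit 134 (128 + SIGABRT 6, matching the Caffe `Aborted (core dumped)`), and the two coverage commands exited with status 1:

```python
import os

def decode_ret(ret: int) -> str:
    """Decode a raw os.system()/os.wait() status word (POSIX)."""
    if os.WIFEXITED(ret):
        code = os.WEXITSTATUS(ret)
        # shells report 128 + N when the child was killed by signal N
        if code > 128:
            return f"exit {code} (likely killed by signal {code - 128})"
        return f"exit {code}"
    if os.WIFSIGNALED(ret):
        return f"killed by signal {os.WTERMSIG(ret)}"
    return f"raw status {ret}"

print(decode_ret(34304))  # exit 134 (likely killed by signal 6) -> SIGABRT
print(decode_ret(256))    # exit 1
```

Passing `sys.exit(ret >> 8)` instead of the raw status would make the cron job's own exit code meaningful to monitoring.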