python /home/admin/mtr/script_for_cron.py -j coverage -m 9 -a '' -s coverage -M 0 -S 0 -U 100,100,120 import MySQLdb succeeded root_folder /data_4/data_log/job/2025/October/05102025/coverage/ git_velours : /home/admin/workarea/git/Velours/ out_folder_name htmlcov output_folder /data_4/data_log/job/2025/October/05102025/coverage/htmlcov new path : /data_4/data_log/job/2025/October/05102025/coverage/ command : coverage3 run /home/admin/workarea/git/Velours/python/tests/python_tests.py --short_python3 `cat ~/.fotonower_pass/bdd.py.pass` cat: /home/admin/.fotonower_pass/bdd.py.pass: Aucun fichier ou dossier de ce type import MySQLdb succeeded Import error (python version) python version = 3 warning , we can't find thcl infos in json_data warning , we can't find pdt infos in json_data python version used : 3 #&_# BEGIN OF TEST : tests/mask_test #&_# /home/admin/workarea/git/Velours/python/tests/mask_test.py Test mask-detection python version used : 3 ############################### TEST memory used ################################ free memory at begining : begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 10578 run mask_detect Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : mask_detect list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.13689470291137695 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:mask_detect Sun Oct 5 05:20:30 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 10578 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 /home/admin/workarea/git/Velours/python/tests/python_tests.py:11: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses import imp 2025-10-05 05:20:33.934731: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-10-05 05:20:33.964570: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz 2025-10-05 05:20:33.966791: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f74ac000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-10-05 05:20:33.966827: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-10-05 05:20:33.970905: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-10-05 05:20:34.099090: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x8b9e880 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-10-05 05:20:34.099140: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-10-05 05:20:34.100302: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:20:34.100726: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:20:34.103481: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:20:34.106327: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:20:34.106819: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:20:34.109912: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:20:34.110934: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:20:34.115176: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:20:34.116658: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:20:34.116738: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:20:34.117534: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-10-05 05:20:34.117551: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-10-05 05:20:34.117560: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-10-05 05:20:34.122710: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9654 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl454 thcls : [{'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}] thcl {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 3473 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME mask_coco_origin NUM_CLASSES 81 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 2025-10-05 05:20:36.003158: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:20:36.003239: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:20:36.003259: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:20:36.003276: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:20:36.003292: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:20:36.003308: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:20:36.003323: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:20:36.003339: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:20:36.004603: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:20:36.005779: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:20:36.005826: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:20:36.005842: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:20:36.005856: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:20:36.005871: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:20:36.005893: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:20:36.005907: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:20:36.005922: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:20:36.007198: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:20:36.007233: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-10-05 05:20:36.007241: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-10-05 05:20:36.007249: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-10-05 05:20:36.008625: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9654 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : mask_coco_origin model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-10-05 05:20:45.676583: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:20:45.894563: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/mask_coco_origin /data/models_weight/mask_coco_origin/mask_model.h5 size_local : 257557808 size in s3 : 257557808 create time local : 2021-08-09 05:27:17 create time in s3 : 2021-08-06 19:45:17 mask_model.h5 already exist and didn't need to update list_images length : 1 NEW PHOTO Processing 1 images image shape: (480, 640, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 89) min: 0.00000 max: 640.00000 nb d'objets trouves : 5 Detection mask done ! Trying to reset tf kernel 2283145 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 1509 tf kernel not reseted sub process len(results) : 1 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 1 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 6798 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl454 Catched exception ! Connect or reconnect ! thcls : [{'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}] thcl {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 3473 ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] time for calcul the mask position with numpy : 0.0007691383361816406 nb_pixel_total : 15552 time to create 1 rle with old method : 0.03329157829284668 length of segment : 256 time for calcul the mask position with numpy : 0.0028367042541503906 nb_pixel_total : 145329 time to create 1 rle with old method : 0.31108689308166504 length of segment : 371 time for calcul the mask position with numpy : 0.0003018379211425781 nb_pixel_total : 14252 time to create 1 rle with old method : 0.029938220977783203 length of segment : 151 time for calcul the mask position with numpy : 0.00013685226440429688 nb_pixel_total : 5613 time to create 1 rle with old method : 0.012459278106689453 length of segment : 48 time for calcul the mask position with numpy : 6.508827209472656e-05 nb_pixel_total : 1824 time to create 1 rle with old method : 0.0041561126708984375 length of segment : 39 time spent for convertir_results : 2.4897282123565674 time spend for datou_step_exec : 23.587493658065796 time spend to save output : 8.082389831542969e-05 total time spend for step 1 : 23.58757448196411 caffe_path_current : About to save ! 1 Inside saveOutput : final : True verbose : False eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 3397 chid ids of type : 445 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++Number RLEs to save : 0 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.04159069061279297 save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'957285035': [[(957285035, 492601069, 445, 0, 186, 22, 282, 0.995544, [(140, 26, 6), (135, 27, 15), (133, 28, 18), (131, 29, 22), (127, 30, 27), (10, 31, 1), (121, 31, 34), (8, 32, 13), (26, 32, 4), (115, 32, 41), (7, 33, 52), (109, 33, 48), (6, 34, 70), (103, 34, 55), (5, 35, 154), (4, 36, 155), (3, 37, 156), (3, 38, 156), (3, 39, 156), (2, 40, 157), (2, 41, 157), (2, 42, 157), (2, 43, 157), (2, 44, 157), (2, 45, 157), (1, 46, 158), (1, 47, 158), (1, 48, 158), (1, 49, 157), (1, 50, 157), (1, 51, 156), (1, 52, 156), (1, 53, 155), (1, 54, 154), (1, 55, 152), (1, 56, 149), (1, 57, 145), (1, 58, 141), (1, 59, 137), (1, 60, 133), (1, 61, 130), (1, 62, 127), (1, 63, 126), (1, 64, 124), (1, 65, 123), (1, 66, 121), (1, 67, 120), (1, 68, 118), (1, 69, 117), (1, 70, 116), (1, 71, 115), (1, 72, 114), (1, 73, 113), (1, 74, 112), (1, 75, 111), (1, 76, 110), (1, 77, 108), (1, 78, 108), (1, 79, 107), (1, 80, 106), (1, 81, 105), (2, 82, 104), (2, 83, 103), (2, 84, 103), (2, 85, 102), (2, 86, 102), (2, 87, 101), (2, 88, 100), (2, 89, 99), (2, 90, 99), (2, 91, 98), (2, 92, 97), (2, 93, 96), (2, 94, 95), (2, 95, 93), (2, 96, 91), (2, 97, 90), (2, 98, 89), (2, 99, 87), (2, 100, 86), (2, 101, 86), (2, 102, 85), (2, 103, 84), (2, 104, 83), (2, 105, 83), (2, 106, 82), (2, 107, 81), (2, 108, 80), (2, 109, 80), (2, 110, 79), (2, 111, 78), (2, 112, 77), (2, 113, 76), (1, 114, 76), (1, 115, 75), (1, 116, 74), (1, 117, 73), (1, 118, 72), (1, 119, 71), (1, 120, 71), (1, 121, 70), (1, 122, 69), (1, 123, 69), (1, 124, 68), (1, 125, 68), (1, 126, 67), (1, 127, 67), (1, 128, 66), (1, 129, 66), (1, 130, 66), (1, 131, 65), (1, 132, 65), (1, 133, 64), (1, 134, 63), (1, 135, 63), (1, 136, 62), (1, 137, 61), (1, 138, 60), (1, 139, 60), (1, 140, 59), (1, 141, 58), (1, 142, 58), (1, 143, 57), (1, 144, 56), (1, 145, 56), (1, 146, 55), (1, 147, 54), (1, 148, 54), (1, 149, 53), (1, 150, 52), (1, 151, 52), (1, 152, 51), (1, 153, 50), (1, 154, 49), (1, 155, 48), (1, 156, 47), (1, 157, 46), (1, 158, 45), (1, 159, 45), (1, 160, 44), (1, 161, 43), (1, 162, 42), (1, 163, 41), (1, 164, 41), (1, 165, 40), (1, 166, 40), (1, 167, 39), (1, 168, 38), (1, 169, 37), (1, 170, 36), (1, 171, 35), (1, 172, 34), (1, 173, 34), (1, 174, 33), (1, 175, 33), (1, 176, 32), (1, 177, 32), (1, 178, 32), (1, 179, 32), (1, 180, 31), (1, 181, 31), (1, 182, 31), (1, 183, 30), (1, 184, 30), (1, 185, 30), (1, 186, 29), (1, 187, 29), (1, 188, 29), (1, 189, 28), (1, 190, 28), (1, 191, 27), (1, 192, 27), (1, 193, 26), (1, 194, 26), (1, 195, 26), (1, 196, 26), (1, 197, 26), (1, 198, 26), (1, 199, 26), (1, 200, 25), (1, 201, 25), (1, 202, 25), (1, 203, 25), (1, 204, 25), (1, 205, 25), (1, 206, 25), (1, 207, 25), (1, 208, 25), (1, 209, 25), (1, 210, 25), (1, 211, 25), (1, 212, 25), (1, 213, 25), (1, 214, 25), (1, 215, 25), (1, 216, 25), (1, 217, 25), (1, 218, 25), (1, 219, 25), (1, 220, 24), (1, 221, 24), (1, 222, 24), (1, 223, 24), (1, 224, 24), (1, 225, 24), (1, 226, 25), (1, 227, 25), (1, 228, 25), (2, 229, 24), (2, 230, 24), (2, 231, 24), (2, 232, 23), (2, 233, 23), (2, 234, 23), (2, 235, 23), (2, 236, 23), (2, 237, 23), (2, 238, 23), (2, 239, 23), (2, 240, 23), (2, 241, 23), (2, 242, 23), (2, 243, 23), (2, 244, 23), (2, 245, 23), (2, 246, 23), (2, 247, 23), (2, 248, 23), (2, 249, 24), (2, 250, 24), (2, 251, 23), (2, 252, 23), (2, 253, 23), (2, 254, 23), (2, 255, 23), (2, 256, 23), (2, 257, 23), (2, 258, 23), (2, 259, 23), (2, 260, 23), (2, 261, 23), (3, 262, 22), (3, 263, 22), (3, 264, 22), (3, 265, 22), (4, 266, 21), (4, 267, 21), (5, 268, 20), (5, 269, 20), (6, 270, 19), (7, 271, 17), (8, 272, 16), (8, 273, 16), (9, 274, 13), (11, 275, 9), (15, 276, 2)], ['16,276,8,273,2,261,2,229,1,228,1,114,2,113,2,82,1,81,1,46,3,37,8,32,29,32,30,33,58,33,59,34,75,34,76,35,102,35,115,32,126,31,135,27,145,26,152,29,158,35,158,48,154,54,128,61,119,67,105,81,103,86,96,94,89,98,81,109,71,119,66,128,65,132,60,138,52,150,41,163,40,166,34,172,32,176,29,188,26,193,25,200,25,219,24,232,24,270,23,273']), (957285035, 492601069, 445, 29, 591, 24, 419, 0.99236876, [(312, 37, 29), (271, 38, 88), (252, 39, 131), (236, 40, 153), (199, 41, 196), (189, 42, 213), (180, 43, 239), (175, 44, 250), (172, 45, 258), (169, 46, 266), (166, 47, 274), (162, 48, 284), (159, 49, 294), (157, 50, 304), (155, 51, 311), (153, 52, 317), (151, 53, 323), (150, 54, 329), (148, 55, 334), (146, 56, 337), (144, 57, 341), (142, 58, 344), (140, 59, 347), (138, 60, 350), (136, 61, 353), (134, 62, 356), (132, 63, 358), (130, 64, 361), (128, 65, 364), (126, 66, 367), (124, 67, 370), (122, 68, 373), (120, 69, 376), (118, 70, 379), (117, 71, 381), (115, 72, 385), (114, 73, 387), (113, 74, 389), (112, 75, 391), (112, 76, 393), (111, 77, 395), (110, 78, 397), (109, 79, 399), (109, 80, 400), (108, 81, 402), (107, 82, 404), (107, 83, 404), (106, 84, 406), (105, 85, 408), (105, 86, 409), (104, 87, 410), (104, 88, 411), (103, 89, 413), (102, 90, 415), (101, 91, 417), (100, 92, 420), (98, 93, 423), (97, 94, 426), (96, 95, 428), (94, 96, 431), (93, 97, 433), (92, 98, 435), (91, 99, 437), (90, 100, 439), (89, 101, 441), (89, 102, 441), (89, 103, 442), (89, 104, 443), (89, 105, 444), (89, 106, 444), (89, 107, 445), (89, 108, 446), (89, 109, 447), (89, 110, 448), (89, 111, 449), (89, 112, 450), (89, 113, 451), (89, 114, 453), (89, 115, 454), (89, 116, 455), (88, 117, 456), (88, 118, 457), (87, 119, 459), (87, 120, 459), (86, 121, 461), (86, 122, 461), (85, 123, 463), (84, 124, 464), (84, 125, 465), (83, 126, 466), (82, 127, 468), (82, 128, 468), (81, 129, 470), (80, 130, 471), (78, 131, 473), (77, 132, 475), (75, 133, 477), (73, 134, 480), (72, 135, 481), (70, 136, 484), (68, 137, 486), (67, 138, 488), (66, 139, 489), (64, 140, 492), (63, 141, 493), (61, 142, 496), (60, 143, 497), (59, 144, 499), (59, 145, 500), (58, 146, 501), (58, 147, 502), (57, 148, 504), (57, 149, 505), (56, 150, 507), (56, 151, 507), (55, 152, 509), (55, 153, 510), (55, 154, 510), (54, 155, 512), (54, 156, 513), (53, 157, 514), (53, 158, 514), (53, 159, 514), (52, 160, 516), (52, 161, 516), (51, 162, 517), (51, 163, 517), (50, 164, 518), (50, 165, 518), (49, 166, 519), (49, 167, 520), (48, 168, 521), (48, 169, 521), (47, 170, 522), (47, 171, 522), (46, 172, 523), (46, 173, 523), (46, 174, 523), (45, 175, 524), (45, 176, 523), (45, 177, 523), (44, 178, 524), (44, 179, 524), (43, 180, 525), (43, 181, 525), (42, 182, 525), (42, 183, 525), (42, 184, 525), (41, 185, 526), (41, 186, 526), (40, 187, 526), (39, 188, 526), (39, 189, 525), (38, 190, 526), (38, 191, 525), (37, 192, 525), (37, 193, 523), (36, 194, 523), (36, 195, 522), (36, 196, 522), (35, 197, 522), (35, 198, 521), (35, 199, 520), (34, 200, 521), (34, 201, 520), (34, 202, 520), (34, 203, 520), (34, 204, 519), (34, 205, 519), (33, 206, 520), (33, 207, 519), (33, 208, 519), (33, 209, 519), (33, 210, 518), (33, 211, 518), (33, 212, 518), (33, 213, 517), (32, 214, 518), (32, 215, 517), (32, 216, 517), (32, 217, 516), (32, 218, 515), (32, 219, 514), (32, 220, 513), (32, 221, 512), (32, 222, 511), (32, 223, 510), (32, 224, 508), (32, 225, 507), (32, 226, 505), (32, 227, 504), (32, 228, 503), (32, 229, 502), (32, 230, 502), (32, 231, 501), (32, 232, 500), (32, 233, 499), (32, 234, 498), (32, 235, 497), (31, 236, 496), (31, 237, 495), (31, 238, 494), (31, 239, 493), (31, 240, 491), (31, 241, 490), (31, 242, 488), (31, 243, 487), (31, 244, 486), (31, 245, 485), (31, 246, 483), (31, 247, 482), (31, 248, 480), (31, 249, 479), (31, 250, 477), (31, 251, 475), (31, 252, 473), (31, 253, 472), (31, 254, 470), (31, 255, 468), (31, 256, 467), (31, 257, 465), (31, 258, 464), (31, 259, 463), (31, 260, 462), (31, 261, 461), (31, 262, 459), (31, 263, 458), (31, 264, 456), (31, 265, 455), (31, 266, 453), (31, 267, 451), (31, 268, 449), (31, 269, 448), (31, 270, 446), (31, 271, 445), (31, 272, 444), (31, 273, 443), (32, 274, 441), (32, 275, 440), (32, 276, 438), (32, 277, 437), (32, 278, 435), (32, 279, 434), (32, 280, 432), (33, 281, 429), (33, 282, 427), (33, 283, 426), (33, 284, 424), (33, 285, 423), (34, 286, 421), (34, 287, 420), (34, 288, 419), (35, 289, 416), (35, 290, 415), (35, 291, 414), (36, 292, 411), (36, 293, 410), (37, 294, 407), (37, 295, 406), (38, 296, 403), (38, 297, 401), (39, 298, 399), (39, 299, 397), (41, 300, 394), (42, 301, 392), (43, 302, 389), (44, 303, 387), (45, 304, 385), (46, 305, 382), (47, 306, 380), (47, 307, 378), (48, 308, 376), (49, 309, 373), (50, 310, 370), (51, 311, 368), (51, 312, 367), (52, 313, 365), (54, 314, 362), (55, 315, 360), (56, 316, 359), (58, 317, 356), (61, 318, 352), (64, 319, 349), (67, 320, 345), (70, 321, 341), (73, 322, 338), (75, 323, 335), (78, 324, 332), (80, 325, 329), (82, 326, 327), (84, 327, 324), (86, 328, 322), (88, 329, 320), (90, 330, 317), (93, 331, 314), (96, 332, 311), (99, 333, 307), (102, 334, 304), (105, 335, 300), (108, 336, 297), (111, 337, 294), (113, 338, 291), (115, 339, 289), (117, 340, 286), (119, 341, 283), (121, 342, 281), (123, 343, 278), (125, 344, 275), (127, 345, 272), (129, 346, 269), (132, 347, 266), (135, 348, 262), (138, 349, 258), (141, 350, 255), (143, 351, 252), (145, 352, 250), (147, 353, 247), (149, 354, 245), (151, 355, 242), (152, 356, 241), (154, 357, 239), (156, 358, 237), (159, 359, 233), (161, 360, 231), (163, 361, 229), (165, 362, 227), (167, 363, 224), (169, 364, 222), (170, 365, 221), (172, 366, 219), (173, 367, 218), (174, 368, 216), (175, 369, 215), (177, 370, 213), (178, 371, 212), (180, 372, 209), (183, 373, 206), (185, 374, 204), (188, 375, 200), (191, 376, 197), (194, 377, 193), (196, 378, 191), (199, 379, 188), (201, 380, 185), (203, 381, 183), (205, 382, 180), (207, 383, 178), (208, 384, 176), (210, 385, 174), (212, 386, 171), (213, 387, 169), (215, 388, 166), (218, 389, 162), (221, 390, 158), (225, 391, 153), (228, 392, 149), (232, 393, 144), (235, 394, 140), (238, 395, 136), (241, 396, 133), (245, 397, 128), (248, 398, 124), (252, 399, 119), (257, 400, 113), (263, 401, 105), (272, 402, 94), (283, 403, 82), (297, 404, 65), (306, 405, 53), (313, 406, 38), (321, 407, 23)], ['321,407,296,403,263,401,215,388,206,382,178,371,168,363,110,336,90,330,56,316,39,299,31,273,32,214,36,194,59,144,82,128,89,116,89,101,104,88,115,72,159,49,180,43,199,41,235,41,271,38,340,37,382,39,402,43,418,43,460,50,481,55,504,76,543,116,556,143,566,156,568,167,566,186,554,199,548,216,528,235,509,249,477,269,448,291,420,309,407,327,403,339,392,355,383,385,369,400,358,405']), (957285035, 492601069, 445, 485, 636, 23, 174, 0.9711494, [(540, 24, 21), (626, 24, 3), (531, 25, 49), (594, 25, 40), (527, 26, 107), (523, 27, 111), (520, 28, 114), (518, 29, 117), (516, 30, 119), (515, 31, 120), (513, 32, 122), (512, 33, 123), (510, 34, 125), (509, 35, 126), (507, 36, 128), (506, 37, 129), (504, 38, 131), (503, 39, 132), (501, 40, 134), (500, 41, 135), (499, 42, 136), (498, 43, 137), (497, 44, 138), (496, 45, 139), (496, 46, 139), (495, 47, 140), (495, 48, 140), (494, 49, 141), (493, 50, 142), (492, 51, 143), (491, 52, 144), (491, 53, 144), (490, 54, 145), (490, 55, 145), (490, 56, 145), (490, 57, 146), (490, 58, 146), (490, 59, 146), (491, 60, 145), (491, 61, 145), (491, 62, 145), (492, 63, 144), (493, 64, 143), (494, 65, 142), (495, 66, 141), (496, 67, 139), (497, 68, 138), (498, 69, 138), (499, 70, 137), (500, 71, 136), (501, 72, 135), (503, 73, 133), (503, 74, 133), (505, 75, 131), (506, 76, 130), (507, 77, 129), (508, 78, 128), (509, 79, 127), (510, 80, 126), (511, 81, 125), (512, 82, 124), (513, 83, 123), (514, 84, 122), (515, 85, 121), (516, 86, 120), (517, 87, 119), (518, 88, 118), (519, 89, 117), (521, 90, 115), (521, 91, 115), (522, 92, 114), (523, 93, 113), (524, 94, 112), (525, 95, 111), (526, 96, 110), (527, 97, 109), (529, 98, 107), (530, 99, 106), (532, 100, 104), (533, 101, 103), (534, 102, 102), (535, 103, 101), (536, 104, 100), (538, 105, 98), (540, 106, 96), (541, 107, 95), (543, 108, 93), (546, 109, 90), (548, 110, 88), (549, 111, 87), (551, 112, 84), (552, 113, 83), (553, 114, 82), (555, 115, 80), (556, 116, 79), (556, 117, 79), (557, 118, 78), (558, 119, 77), (559, 120, 76), (560, 121, 75), (560, 122, 75), (561, 123, 74), (561, 124, 74), (561, 125, 74), (562, 126, 73), (562, 127, 73), (563, 128, 72), (563, 129, 72), (564, 130, 70), (564, 131, 70), (565, 132, 69), (565, 133, 68), (565, 134, 68), (565, 135, 67), (566, 136, 65), (566, 137, 64), (566, 138, 64), (566, 139, 62), (566, 140, 61), (566, 141, 59), (566, 142, 57), (566, 143, 56), (566, 144, 55), (566, 145, 54), (567, 146, 53), (567, 147, 52), (567, 148, 51), (568, 149, 50), (568, 150, 49), (568, 151, 48), (568, 152, 47), (569, 153, 45), (569, 154, 44), (570, 155, 42), (570, 156, 42), (570, 157, 41), (571, 158, 39), (571, 159, 39), (572, 160, 37), (572, 161, 37), (573, 162, 35), (573, 163, 34), (573, 164, 34), (574, 165, 32), (575, 166, 30), (577, 167, 28), (578, 168, 26), (581, 169, 22), (585, 170, 18), (587, 171, 15), (591, 172, 8)], ['598,172,591,172,584,169,578,168,573,164,573,162,568,152,568,149,566,145,566,136,565,132,561,125,560,121,556,116,547,109,543,108,536,104,531,99,527,97,491,62,490,54,495,48,496,45,502,40,516,30,523,27,526,27,531,25,539,25,540,24,560,24,561,25,579,25,580,26,593,26,594,25,633,25,634,29,634,56,635,57,635,111,634,112,634,129,632,134,629,138,623,141,619,145,617,149,611,155,608,161']), (957285035, 492601069, 445, 280, 481, 2, 55, 0.82984155, [(292, 3, 128), (284, 4, 146), (282, 5, 151), (281, 6, 154), (281, 7, 156), (281, 8, 157), (281, 9, 158), (281, 10, 160), (281, 11, 162), (281, 12, 165), (281, 13, 167), (281, 14, 169), (281, 15, 171), (281, 16, 173), (281, 17, 174), (281, 18, 175), (281, 19, 177), (281, 20, 178), (281, 21, 179), (281, 22, 180), (281, 23, 181), (281, 24, 182), (281, 25, 183), (281, 26, 184), (281, 27, 185), (281, 28, 185), (281, 29, 185), (282, 30, 185), (283, 31, 27), (337, 31, 131), (371, 32, 97), (401, 33, 68), (409, 34, 61), (419, 35, 52), (424, 36, 48), (429, 37, 44), (432, 38, 41), (434, 39, 40), (436, 40, 39), (438, 41, 37), (441, 42, 35), (444, 43, 32), (448, 44, 29), (452, 45, 25), (454, 46, 23), (459, 47, 17), (463, 48, 12), (468, 49, 5)], ['472,49,468,49,467,48,459,47,458,46,454,46,451,44,448,44,447,43,444,43,440,41,438,41,428,36,424,36,423,35,419,35,418,34,409,34,408,33,401,33,400,32,371,32,370,31,337,31,336,30,283,31,281,29,281,6,284,4,291,4,292,3,419,3,420,4,429,4,430,5,432,5,436,7,441,11,445,12,453,16,456,19,457,19,465,27,465,29,472,37,476,44,476,46']), (957285035, 492601069, 445, 456, 547, 6, 45, 0.74088126, [(482, 8, 19), (464, 9, 3), (481, 9, 44), (457, 10, 12), (479, 10, 50), (457, 11, 13), (476, 11, 56), (457, 12, 15), (475, 12, 65), (457, 13, 84), (457, 14, 85), (457, 15, 89), (457, 16, 89), (458, 17, 88), (459, 18, 87), (460, 19, 86), (461, 20, 80), (464, 21, 71), (466, 22, 63), (467, 23, 59), (468, 24, 55), (469, 25, 52), (469, 26, 51), (470, 27, 48), (471, 28, 46), (471, 29, 44), (472, 30, 42), (473, 31, 39), (473, 32, 38), (474, 33, 36), (475, 34, 33), (475, 35, 32), (476, 36, 30), (476, 37, 29), (477, 38, 26), (478, 39, 23), (479, 40, 20), (480, 41, 17), (488, 42, 5)], ['492,42,488,42,487,41,480,41,476,37,475,34,473,32,469,25,465,21,461,20,457,16,457,10,463,10,464,9,466,9,470,12,474,13,476,11,480,10,482,8,500,8,501,9,524,9,525,10,528,10,532,12,539,12,542,15,545,15,545,19,535,20,534,21,529,21,525,23,523,23,513,30,512,30,504,37,496,41,493,41'])], 'temp/1759634430_2282988_957285035_a42482e51c93c8025d243dd179aee85b.jpg']} free memory after detection : begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 6798 error , can't release the memory or there are other process who occupe the free memory ERROR test release memory FAILED ############################### TEST detect object ################################ run mask_detect Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : mask_detect list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.1611626148223877 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:mask_detect Sun Oct 5 05:20:56 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 6798 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-10-05 05:20:59.985058: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-10-05 05:21:00.012635: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz 2025-10-05 05:21:00.015060: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f74b0000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-10-05 05:21:00.015115: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-10-05 05:21:00.018990: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-10-05 05:21:00.165885: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x9abadc0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-10-05 05:21:00.165937: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-10-05 05:21:00.167219: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:21:00.167644: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:21:00.173904: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:21:00.176572: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:21:00.177219: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:21:00.179758: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:21:00.181162: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:21:00.186620: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:21:00.188707: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:21:00.188837: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:21:00.189723: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-10-05 05:21:00.189740: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-10-05 05:21:00.189748: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-10-05 05:21:00.190967: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6247 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-10-05 05:21:00.319416: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:21:00.319505: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:21:00.319525: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:21:00.319543: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:21:00.319560: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:21:00.319578: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:21:00.319595: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:21:00.319612: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:21:00.320712: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:21:00.321781: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:21:00.321813: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:21:00.321831: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:21:00.321847: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:21:00.321863: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:21:00.321880: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:21:00.321896: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:21:00.321912: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:21:00.322969: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:21:00.322996: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-10-05 05:21:00.323005: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-10-05 05:21:00.323012: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-10-05 05:21:00.324118: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6247 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME mask_coco_origin NUM_CLASSES 81 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : mask_coco_origin model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-10-05 05:21:08.678764: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:21:08.854192: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/mask_coco_origin /data/models_weight/mask_coco_origin/mask_model.h5 size_local : 257557808 size in s3 : 257557808 create time local : 2021-08-09 05:27:17 create time in s3 : 2021-08-06 19:45:17 mask_model.h5 already exist and didn't need to update list_images length : 1 NEW PHOTO Processing 1 images image shape: (720, 1280, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 89) min: 0.00000 max: 1280.00000 nb d'objets trouves : 4 Detection mask done ! Trying to reset tf kernel 2285231 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 1881 tf kernel not reseted sub process len(results) : 1 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 1 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 10578 list_Values should be empty [] ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] time for calcul the mask position with numpy : 0.0007266998291015625 nb_pixel_total : 16902 time to create 1 rle with old method : 0.046441078186035156 length of segment : 107 time for calcul the mask position with numpy : 0.02373337745666504 nb_pixel_total : 480741 time to create 1 rle with new method : 0.025307416915893555 length of segment : 632 time for calcul the mask position with numpy : 0.0004668235778808594 nb_pixel_total : 36639 time to create 1 rle with old method : 0.07946109771728516 length of segment : 133 time for calcul the mask position with numpy : 0.00012350082397460938 nb_pixel_total : 4794 time to create 1 rle with old method : 0.011052131652832031 length of segment : 51 time spent for convertir_results : 0.43698549270629883 time spend for datou_step_exec : 18.244992971420288 time spend to save output : 2.3603439331054688e-05 total time spend for step 1 : 18.24501657485962 caffe_path_current : About to save ! 1 Inside saveOutput : final : True verbose : False eke 12-6-18 : saveMask need to be cleaned for new output ! Catched exception ! Connect or reconnect ! Number saved : None batch 1 Loaded 443 chid ids of type : 445 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++Number RLEs to save : 0 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.03862619400024414 save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'917855882': [[(917855882, 492601069, 445, 1092, 1280, 0, 108, 0.9988366, [(1205, 1, 58), (1165, 2, 105), (1159, 3, 113), (1149, 4, 124), (1113, 5, 161), (1100, 6, 174), (1097, 7, 177), (1095, 8, 179), (1095, 9, 179), (1095, 10, 179), (1095, 11, 179), (1095, 12, 179), (1095, 13, 179), (1095, 14, 178), (1095, 15, 178), (1095, 16, 178), (1095, 17, 178), (1095, 18, 177), (1095, 19, 177), (1095, 20, 177), (1095, 21, 177), (1095, 22, 177), (1095, 23, 178), (1095, 24, 178), (1095, 25, 178), (1095, 26, 179), (1095, 27, 179), (1095, 28, 180), (1095, 29, 181), (1095, 30, 182), (1095, 31, 183), (1095, 32, 183), (1095, 33, 184), (1095, 34, 184), (1096, 35, 183), (1096, 36, 183), (1096, 37, 184), (1097, 38, 183), (1097, 39, 183), (1097, 40, 183), (1098, 41, 182), (1098, 42, 182), (1098, 43, 182), (1099, 44, 181), (1099, 45, 181), (1099, 46, 181), (1100, 47, 180), (1100, 48, 180), (1101, 49, 179), (1101, 50, 179), (1102, 51, 178), (1102, 52, 178), (1103, 53, 177), (1103, 54, 177), (1104, 55, 176), (1104, 56, 176), (1104, 57, 176), (1104, 58, 176), (1105, 59, 175), (1105, 60, 175), (1105, 61, 175), (1105, 62, 175), (1105, 63, 175), (1106, 64, 174), (1106, 65, 174), (1106, 66, 174), (1106, 67, 174), (1106, 68, 174), (1106, 69, 174), (1106, 70, 174), (1106, 71, 174), (1106, 72, 174), (1106, 73, 174), (1107, 74, 173), (1107, 75, 173), (1107, 76, 173), (1107, 77, 173), (1107, 78, 173), (1107, 79, 173), (1108, 80, 172), (1108, 81, 172), (1109, 82, 171), (1110, 83, 170), (1110, 84, 170), (1111, 85, 169), (1112, 86, 168), (1113, 87, 166), (1114, 88, 165), (1115, 89, 164), (1117, 90, 162), (1120, 91, 159), (1138, 92, 141), (1146, 93, 133), (1154, 94, 125), (1167, 95, 112), (1177, 96, 102), (1183, 97, 95), (1185, 98, 93), (1187, 99, 90), (1188, 100, 55), (1264, 100, 12), (1190, 101, 50), (1191, 102, 46), (1194, 103, 40), (1197, 104, 34), (1202, 105, 25), (1207, 106, 16)], ['1222,106,1207,106,1206,105,1197,104,1191,102,1182,96,1176,95,1167,95,1166,94,1154,94,1153,93,1146,93,1145,92,1137,91,1120,91,1115,89,1110,84,1107,79,1106,73,1106,64,1104,55,1099,46,1095,34,1095,8,1100,6,1112,6,1113,5,1148,5,1149,4,1158,4,1165,2,1204,2,1205,1,1262,1,1269,2,1273,5,1273,13,1271,18,1271,22,1273,27,1277,31,1279,37,1279,86,1278,87,1278,96,1275,100,1264,100,1263,99,1243,99,1230,104']), (917855882, 492601069, 445, 52, 1128, 16, 668, 0.9977458, [(711, 22, 22), (925, 22, 47), (608, 23, 146), (894, 23, 103), (598, 24, 234), (850, 24, 158), (590, 25, 427), (582, 26, 444), (575, 27, 458), (569, 28, 466), (565, 29, 472), (560, 30, 480), (556, 31, 486), (550, 32, 495), (544, 33, 503), (538, 34, 512), (532, 35, 520), (527, 36, 527), (523, 37, 534), (518, 38, 541), (514, 39, 548), (510, 40, 554), (506, 41, 561), (503, 42, 566), (499, 43, 572), (496, 44, 577), (493, 45, 582), (491, 46, 585), (489, 47, 589), (487, 48, 592), (485, 49, 595), (483, 50, 598), (482, 51, 600), (481, 52, 602), (480, 53, 603), (479, 54, 605), (478, 55, 606), (476, 56, 608), (475, 57, 610), (474, 58, 611), (473, 59, 613), (472, 60, 614), (470, 61, 616), (469, 62, 618), (468, 63, 619), (466, 64, 621), (465, 65, 623), (464, 66, 624), (462, 67, 626), (461, 68, 628), (459, 69, 630), (458, 70, 631), (456, 71, 633), (455, 72, 635), (453, 73, 637), (452, 74, 638), (451, 75, 639), (450, 76, 640), (448, 77, 642), (447, 78, 643), (446, 79, 644), (445, 80, 645), (444, 81, 646), (442, 82, 648), (441, 83, 649), (440, 84, 650), (439, 85, 651), (438, 86, 652), (437, 87, 653), (436, 88, 654), (435, 89, 655), (434, 90, 656), (433, 91, 657), (432, 92, 658), (431, 93, 659), (430, 94, 660), (429, 95, 661), (428, 96, 662), (427, 97, 663), (425, 98, 665), (423, 99, 667), (421, 100, 669), (419, 101, 671), (417, 102, 673), (413, 103, 677), (410, 104, 680), (406, 105, 684), (401, 106, 689), (397, 107, 693), (392, 108, 698), (387, 109, 703), (382, 110, 708), (377, 111, 713), (373, 112, 717), (369, 113, 721), (365, 114, 725), (362, 115, 728), (359, 116, 731), (356, 117, 734), (353, 118, 737), (351, 119, 739), (349, 120, 741), (346, 121, 744), (344, 122, 746), (342, 123, 748), (339, 124, 751), (335, 125, 755), (331, 126, 759), (327, 127, 763), (323, 128, 767), (319, 129, 770), (314, 130, 775), (308, 131, 781), (303, 132, 786), (294, 133, 795), (287, 134, 802), (279, 135, 810), (273, 136, 816), (267, 137, 822), (262, 138, 827), (258, 139, 831), (255, 140, 834), (252, 141, 837), (250, 142, 839), (247, 143, 842), (245, 144, 844), (242, 145, 847), (240, 146, 849), (237, 147, 852), (234, 148, 855), (230, 149, 859), (226, 150, 863), (220, 151, 869), (213, 152, 876), (207, 153, 882), (200, 154, 889), (193, 155, 896), (187, 156, 902), (184, 157, 905), (181, 158, 908), (178, 159, 911), (176, 160, 913), (174, 161, 915), (172, 162, 917), (170, 163, 919), (168, 164, 921), (167, 165, 922), (165, 166, 924), (164, 167, 925), (162, 168, 927), (161, 169, 928), (159, 170, 930), (157, 171, 932), (155, 172, 934), (153, 173, 935), (151, 174, 937), (148, 175, 940), (146, 176, 942), (144, 177, 944), (142, 178, 946), (140, 179, 948), (139, 180, 949), (137, 181, 951), (136, 182, 952), (134, 183, 954), (133, 184, 955), (132, 185, 956), (131, 186, 957), (130, 187, 958), (129, 188, 959), (128, 189, 960), (127, 190, 960), (126, 191, 961), (126, 192, 961), (125, 193, 962), (124, 194, 963), (123, 195, 964), (122, 196, 965), (122, 197, 965), (121, 198, 966), (120, 199, 967), (119, 200, 968), (118, 201, 969), (117, 202, 970), (116, 203, 971), (114, 204, 973), (113, 205, 973), (112, 206, 974), (111, 207, 975), (109, 208, 977), (108, 209, 978), (107, 210, 979), (106, 211, 980), (105, 212, 981), (104, 213, 982), (103, 214, 983), (102, 215, 984), (101, 216, 985), (101, 217, 984), (100, 218, 985), (100, 219, 985), (99, 220, 986), (98, 221, 987), (98, 222, 987), (97, 223, 988), (97, 224, 987), (96, 225, 988), (96, 226, 988), (95, 227, 989), (95, 228, 989), (94, 229, 990), (94, 230, 990), (94, 231, 990), (93, 232, 990), (93, 233, 990), (92, 234, 991), (92, 235, 991), (92, 236, 991), (91, 237, 992), (91, 238, 991), (91, 239, 991), (91, 240, 990), (91, 241, 990), (90, 242, 991), (90, 243, 990), (90, 244, 990), (90, 245, 989), (90, 246, 989), (89, 247, 990), (89, 248, 989), (89, 249, 989), (89, 250, 988), (89, 251, 988), (88, 252, 988), (88, 253, 988), (88, 254, 987), (88, 255, 986), (88, 256, 986), (87, 257, 986), (87, 258, 985), (87, 259, 985), (87, 260, 984), (87, 261, 983), (86, 262, 983), (86, 263, 982), (86, 264, 982), (86, 265, 981), (85, 266, 981), (85, 267, 980), (85, 268, 980), (84, 269, 980), (84, 270, 979), (84, 271, 979), (84, 272, 978), (83, 273, 979), (83, 274, 978), (83, 275, 977), (82, 276, 978), (82, 277, 977), (82, 278, 977), (81, 279, 977), (81, 280, 977), (81, 281, 977), (80, 282, 977), (80, 283, 977), (80, 284, 976), (79, 285, 977), (79, 286, 976), (79, 287, 976), (78, 288, 976), (78, 289, 976), (78, 290, 975), (77, 291, 976), (77, 292, 975), (77, 293, 975), (76, 294, 975), (76, 295, 975), (76, 296, 974), (75, 297, 975), (75, 298, 974), (74, 299, 975), (74, 300, 974), (74, 301, 974), (73, 302, 974), (73, 303, 974), (72, 304, 974), (72, 305, 974), (71, 306, 974), (71, 307, 973), (71, 308, 972), (70, 309, 972), (70, 310, 971), (70, 311, 970), (70, 312, 968), (69, 313, 968), (69, 314, 966), (69, 315, 964), (69, 316, 962), (68, 317, 961), (68, 318, 959), (68, 319, 958), (68, 320, 956), (67, 321, 955), (67, 322, 954), (67, 323, 952), (67, 324, 951), (66, 325, 951), (66, 326, 950), (66, 327, 948), (66, 328, 947), (65, 329, 947), (65, 330, 946), (65, 331, 946), (65, 332, 945), (65, 333, 944), (65, 334, 942), (65, 335, 941), (65, 336, 940), (65, 337, 939), (65, 338, 938), (64, 339, 937), (64, 340, 936), (64, 341, 934), (64, 342, 932), (64, 343, 930), (64, 344, 928), (64, 345, 926), (64, 346, 925), (64, 347, 923), (64, 348, 922), (64, 349, 920), (64, 350, 919), (63, 351, 919), (63, 352, 918), (63, 353, 917), (63, 354, 916), (63, 355, 915), (63, 356, 914), (63, 357, 912), (63, 358, 911), (63, 359, 910), (63, 360, 909), (63, 361, 908), (63, 362, 906), (63, 363, 905), (63, 364, 904), (63, 365, 902), (63, 366, 901), (63, 367, 899), (63, 368, 898), (63, 369, 896), (62, 370, 895), (62, 371, 893), (62, 372, 891), (62, 373, 890), (62, 374, 888), (62, 375, 887), (62, 376, 886), (62, 377, 885), (62, 378, 884), (62, 379, 883), (63, 380, 880), (63, 381, 879), (63, 382, 878), (63, 383, 877), (63, 384, 876), (63, 385, 875), (63, 386, 874), (63, 387, 873), (63, 388, 872), (64, 389, 870), (64, 390, 869), (64, 391, 868), (64, 392, 867), (64, 393, 866), (64, 394, 864), (64, 395, 863), (65, 396, 861), (65, 397, 860), (65, 398, 859), (65, 399, 858), (65, 400, 857), (65, 401, 856), (65, 402, 854), (65, 403, 853), (65, 404, 851), (65, 405, 850), (65, 406, 848), (66, 407, 846), (66, 408, 844), (66, 409, 843), (66, 410, 842), (66, 411, 841), (66, 412, 840), (66, 413, 838), (66, 414, 837), (66, 415, 836), (66, 416, 835), (66, 417, 835), (66, 418, 834), (66, 419, 833), (67, 420, 831), (67, 421, 830), (67, 422, 829), (67, 423, 829), (67, 424, 828), (67, 425, 827), (67, 426, 826), (67, 427, 825), (67, 428, 824), (68, 429, 822), (68, 430, 820), (68, 431, 819), (68, 432, 818), (68, 433, 816), (68, 434, 815), (68, 435, 813), (68, 436, 811), (69, 437, 809), (69, 438, 807), (69, 439, 805), (69, 440, 804), (69, 441, 803), (69, 442, 802), (69, 443, 800), (70, 444, 798), (70, 445, 797), (70, 446, 796), (70, 447, 796), (71, 448, 794), (71, 449, 794), (72, 450, 792), (72, 451, 791), (73, 452, 790), (73, 453, 789), (74, 454, 788), (74, 455, 787), (75, 456, 786), (75, 457, 785), (76, 458, 784), (76, 459, 783), (77, 460, 782), (77, 461, 781), (77, 462, 781), (78, 463, 779), (78, 464, 779), (79, 465, 777), (79, 466, 777), (79, 467, 776), (80, 468, 775), (80, 469, 774), (80, 470, 774), (81, 471, 772), (81, 472, 771), (82, 473, 770), (82, 474, 769), (83, 475, 767), (83, 476, 766), (83, 477, 766), (84, 478, 764), (84, 479, 763), (85, 480, 761), (85, 481, 760), (85, 482, 759), (86, 483, 757), (86, 484, 755), (87, 485, 753), (87, 486, 752), (87, 487, 751), (88, 488, 748), (88, 489, 747), (88, 490, 746), (89, 491, 744), (89, 492, 743), (90, 493, 741), (90, 494, 741), (91, 495, 739), (91, 496, 738), (92, 497, 737), (93, 498, 735), (94, 499, 733), (94, 500, 733), (95, 501, 731), (96, 502, 729), (97, 503, 728), (98, 504, 726), (99, 505, 724), (99, 506, 724), (100, 507, 722), (101, 508, 721), (102, 509, 719), (104, 510, 717), (105, 511, 715), (106, 512, 714), (107, 513, 712), (108, 514, 711), (110, 515, 708), (111, 516, 707), (113, 517, 704), (114, 518, 703), (115, 519, 701), (117, 520, 698), (118, 521, 697), (119, 522, 695), (121, 523, 693), (122, 524, 691), (124, 525, 689), (125, 526, 687), (126, 527, 685), (128, 528, 683), (129, 529, 681), (131, 530, 678), (132, 531, 676), (134, 532, 674), (135, 533, 672), (137, 534, 669), (138, 535, 667), (140, 536, 664), (141, 537, 662), (143, 538, 659), (144, 539, 657), (146, 540, 654), (148, 541, 651), (149, 542, 649), (151, 543, 645), (153, 544, 642), (154, 545, 641), (156, 546, 638), (158, 547, 635), (159, 548, 633), (161, 549, 630), (162, 550, 628), (164, 551, 626), (166, 552, 623), (167, 553, 621), (169, 554, 618), (170, 555, 617), (171, 556, 615), (173, 557, 613), (174, 558, 611), (176, 559, 608), (177, 560, 607), (178, 561, 605), (180, 562, 603), (181, 563, 601), (183, 564, 599), (185, 565, 597), (186, 566, 595), (189, 567, 592), (192, 568, 589), (195, 569, 585), (198, 570, 582), (201, 571, 579), (204, 572, 575), (206, 573, 573), (209, 574, 569), (212, 575, 566), (215, 576, 563), (218, 577, 559), (221, 578, 556), (223, 579, 553), (226, 580, 550), (228, 581, 547), (230, 582, 545), (232, 583, 542), (234, 584, 540), (235, 585, 539), (237, 586, 536), (238, 587, 534), (240, 588, 531), (242, 589, 528), (243, 590, 526), (245, 591, 523), (247, 592, 520), (249, 593, 516), (251, 594, 513), (253, 595, 510), (256, 596, 505), (258, 597, 501), (261, 598, 497), (263, 599, 493), (267, 600, 488), (271, 601, 482), (274, 602, 478), (278, 603, 473), (281, 604, 468), (284, 605, 464), (287, 606, 460), (290, 607, 456), (292, 608, 453), (295, 609, 449), (297, 610, 446), (300, 611, 442), (303, 612, 438), (305, 613, 434), (307, 614, 431), (310, 615, 427), (312, 616, 423), (315, 617, 418), (317, 618, 415), (320, 619, 410), (322, 620, 406), (325, 621, 401), (327, 622, 396), (330, 623, 390), (333, 624, 384), (335, 625, 379), (338, 626, 374), (341, 627, 369), (345, 628, 362), (349, 629, 356), (353, 630, 350), (357, 631, 344), (360, 632, 340), (364, 633, 334), (368, 634, 328), (373, 635, 320), (378, 636, 313), (383, 637, 305), (389, 638, 295), (395, 639, 282), (401, 640, 270), (408, 641, 256), (416, 642, 240), (432, 643, 216), (448, 644, 193), (465, 645, 169), (480, 646, 148), (495, 647, 126), (511, 648, 104), (526, 649, 82), (565, 650, 9)], ['526,649,416,642,341,627,289,606,263,599,220,577,186,566,102,509,91,496,70,447,63,388,65,329,86,265,91,237,101,216,134,183,187,156,225,151,279,135,302,133,341,124,377,111,426,98,442,82,493,45,527,36,608,23,754,24,893,24,925,22,996,23,1032,27,1066,41,1082,52,1089,72,1088,172,1082,237,1045,305,1002,338,950,373,889,429,865,446,851,473,822,505,810,528,786,554,773,585,714,624,683,638,607,649']), (917855882, 492601069, 445, 0, 440, 0, 116, 0.9919559, [(127, 1, 141), (95, 2, 205), (384, 2, 2), (59, 3, 273), (340, 3, 57), (22, 4, 381), (19, 5, 387), (16, 6, 392), (15, 7, 394), (14, 8, 396), (14, 9, 397), (13, 10, 399), (12, 11, 400), (12, 12, 400), (11, 13, 402), (10, 14, 403), (11, 15, 403), (11, 16, 404), (12, 17, 403), (12, 18, 404), (12, 19, 405), (12, 20, 405), (12, 21, 406), (12, 22, 406), (12, 23, 407), (12, 24, 407), (12, 25, 408), (12, 26, 408), (12, 27, 408), (12, 28, 408), (12, 29, 409), (12, 30, 409), (12, 31, 409), (12, 32, 409), (12, 33, 409), (12, 34, 410), (12, 35, 410), (12, 36, 410), (12, 37, 410), (12, 38, 410), (12, 39, 410), (12, 40, 410), (12, 41, 411), (12, 42, 411), (12, 43, 411), (12, 44, 411), (12, 45, 411), (12, 46, 410), (12, 47, 410), (12, 48, 410), (12, 49, 410), (12, 50, 410), (12, 51, 410), (12, 52, 409), (12, 53, 408), (12, 54, 408), (12, 55, 407), (12, 56, 406), (12, 57, 404), (12, 58, 403), (11, 59, 403), (11, 60, 402), (11, 61, 401), (11, 62, 400), (11, 63, 400), (11, 64, 399), (11, 65, 398), (11, 66, 397), (11, 67, 397), (11, 68, 396), (11, 69, 395), (11, 70, 395), (11, 71, 394), (11, 72, 394), (11, 73, 394), (11, 74, 393), (11, 75, 393), (11, 76, 393), (11, 77, 393), (11, 78, 393), (11, 79, 393), (11, 80, 392), (10, 81, 394), (10, 82, 394), (10, 83, 395), (9, 84, 396), (9, 85, 261), (279, 85, 126), (9, 86, 75), (98, 86, 28), (142, 86, 117), (292, 86, 112), (9, 87, 71), (152, 87, 103), (294, 87, 110), (8, 88, 68), (161, 88, 91), (296, 88, 107), (8, 89, 63), (177, 89, 72), (297, 89, 106), (7, 90, 61), (205, 90, 40), (298, 90, 104), (7, 91, 57), (299, 91, 103), (6, 92, 54), (300, 92, 102), (6, 93, 50), (301, 93, 100), (7, 94, 46), (303, 94, 97), (7, 95, 44), (306, 95, 92), (7, 96, 42), (308, 96, 89), (7, 97, 40), (310, 97, 86), (7, 98, 38), (312, 98, 83), (8, 99, 34), (314, 99, 79), (8, 100, 32), (317, 100, 75), (8, 101, 29), (319, 101, 71), (13, 102, 19), (324, 102, 63), (20, 103, 6), (330, 103, 51), (337, 104, 37), (344, 105, 22), (352, 106, 3)], ['344,105,319,101,301,93,291,85,259,85,244,90,205,90,204,89,177,89,176,88,161,88,160,87,142,86,141,85,126,85,125,86,98,86,84,85,56,92,36,101,26,102,8,101,6,92,11,80,11,59,12,58,12,17,10,14,16,6,22,4,94,3,95,2,126,2,127,1,267,1,268,2,396,3,407,6,419,25,421,34,421,51,410,62,404,71,402,80,404,85,401,92,394,98,386,102,374,103,365,105']), (917855882, 492601069, 445, 390, 550, 0, 54, 0.9391806, [(414, 0, 7), (441, 0, 60), (508, 0, 28), (402, 1, 142), (401, 2, 146), (402, 3, 145), (404, 4, 143), (406, 5, 140), (408, 6, 137), (410, 7, 134), (411, 8, 132), (412, 9, 130), (413, 10, 127), (414, 11, 125), (415, 12, 123), (415, 13, 122), (416, 14, 120), (417, 15, 117), (417, 16, 116), (418, 17, 114), (418, 18, 113), (418, 19, 111), (418, 20, 109), (419, 21, 107), (419, 22, 105), (419, 23, 103), (419, 24, 102), (419, 25, 100), (420, 26, 97), (420, 27, 95), (420, 28, 94), (421, 29, 91), (421, 30, 90), (422, 31, 88), (422, 32, 88), (422, 33, 87), (423, 34, 84), (423, 35, 82), (423, 36, 81), (424, 37, 79), (424, 38, 77), (424, 39, 75), (424, 40, 73), (424, 41, 71), (425, 42, 67), (425, 43, 66), (426, 44, 62), (426, 45, 6), (433, 45, 52), (443, 46, 30), (450, 47, 1)], ['450,47,449,46,443,46,442,45,426,45,424,41,424,37,423,36,422,31,419,25,419,21,418,20,418,17,417,15,409,6,402,3,402,1,413,1,414,0,420,0,421,1,440,1,441,0,500,0,501,1,507,1,508,0,535,0,536,1,543,1,546,2,546,4,542,8,530,18,527,19,525,21,522,22,520,24,512,28,508,33,505,34,502,37,494,41,492,41,490,43,488,43,484,45,473,45,472,46,451,46'])], 'temp/1759634456_2282988_917855882_da0fa7b7e6b5b551fe26c0ba8713276d.jpg']} ############################### TEST POLYGON ################################ Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : mask_detect list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.16519880294799805 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:mask_detect Sun Oct 5 05:21:17 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 10578 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-10-05 05:21:20.286716: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-10-05 05:21:20.316558: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz 2025-10-05 05:21:20.318869: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f74b8000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-10-05 05:21:20.318930: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-10-05 05:21:20.322928: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-10-05 05:21:20.507552: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0xa04b020 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-10-05 05:21:20.507620: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-10-05 05:21:20.509167: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:21:20.509631: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:21:20.513119: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:21:20.516278: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:21:20.516839: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:21:20.519392: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:21:20.520519: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:21:20.525401: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:21:20.526893: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:21:20.526976: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:21:20.527784: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-10-05 05:21:20.527800: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-10-05 05:21:20.527810: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-10-05 05:21:20.529205: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9798 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-10-05 05:21:20.656063: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:21:20.656189: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:21:20.656209: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:21:20.656227: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:21:20.656243: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:21:20.656259: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:21:20.656274: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:21:20.656290: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:21:20.657579: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:21:20.658732: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:21:20.658779: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:21:20.658796: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:21:20.658811: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:21:20.658826: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:21:20.658841: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:21:20.658855: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:21:20.658870: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:21:20.660042: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:21:20.660076: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-10-05 05:21:20.660084: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-10-05 05:21:20.660091: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-10-05 05:21:20.661338: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9798 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME mask_coco_origin NUM_CLASSES 81 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : mask_coco_origin model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-10-05 05:21:31.333988: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:21:31.505809: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/mask_coco_origin /data/models_weight/mask_coco_origin/mask_model.h5 size_local : 257557808 size in s3 : 257557808 create time local : 2021-08-09 05:27:17 create time in s3 : 2021-08-06 19:45:17 mask_model.h5 already exist and didn't need to update list_images length : 1 NEW PHOTO Processing 1 images image shape: (2448, 2448, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 89) min: 0.00000 max: 2448.00000 nb d'objets trouves : 1 Detection mask done ! Trying to reset tf kernel 2286308 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 5286 tf kernel not reseted sub process len(results) : 1 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 1 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 10578 list_Values should be empty [] ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] time for calcul the mask position with numpy : 0.13872528076171875 nb_pixel_total : 3693242 time to create 1 rle with new method : 0.4461181163787842 length of segment : 2042 time spent for convertir_results : 1.752823829650879 time spend for datou_step_exec : 22.55493474006653 time spend to save output : 3.528594970703125e-05 total time spend for step 1 : 22.554970026016235 caffe_path_current : About to save ! 1 Inside saveOutput : final : True verbose : False eke 12-6-18 : saveMask need to be cleaned for new output ! Catched exception ! Connect or reconnect ! Number saved : None batch 1 Loaded 724 chid ids of type : 445 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++Number RLEs to save : 0 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.04112553596496582 save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'917877156': [[(917877156, 492601069, 445, 7, 2268, 118, 2241, 0.9849961, [(674, 120, 114), (520, 121, 482), (1050, 121, 381), (503, 122, 947), (486, 123, 982), (470, 124, 1015), (455, 125, 1046), (442, 126, 1092), (429, 127, 1136), (417, 128, 1168), (405, 129, 1187), (394, 130, 1205), (383, 131, 1223), (373, 132, 1239), (368, 133, 1250), (366, 134, 1258), (364, 135, 1266), (361, 136, 1274), (359, 137, 1281), (357, 138, 1288), (355, 139, 1295), (353, 140, 1302), (351, 141, 1309), (349, 142, 1315), (347, 143, 1320), (345, 144, 1326), (343, 145, 1331), (342, 146, 1335), (340, 147, 1340), (338, 148, 1345), (337, 149, 1349), (335, 150, 1354), (334, 151, 1358), (332, 152, 1363), (331, 153, 1366), (330, 154, 1370), (328, 155, 1374), (327, 156, 1378), (326, 157, 1381), (325, 158, 1385), (324, 159, 1389), (322, 160, 1393), (321, 161, 1397), (320, 162, 1401), (318, 163, 1406), (317, 164, 1410), (315, 165, 1415), (314, 166, 1419), (312, 167, 1424), (310, 168, 1429), (309, 169, 1434), (307, 170, 1439), (305, 171, 1444), (304, 172, 1448), (302, 173, 1453), (300, 174, 1458), (298, 175, 1463), (296, 176, 1469), (294, 177, 1474), (292, 178, 1480), (289, 179, 1487), (286, 180, 1493), (283, 181, 1500), (281, 182, 1507), (278, 183, 1514), (275, 184, 1521), (272, 185, 1529), (269, 186, 1536), (266, 187, 1544), (263, 188, 1552), (260, 189, 1561), (257, 190, 1569), (254, 191, 1579), (251, 192, 1588), (248, 193, 1597), (245, 194, 1606), (242, 195, 1615), (240, 196, 1623), (237, 197, 1631), (234, 198, 1640), (231, 199, 1648), (228, 200, 1657), (225, 201, 1665), (222, 202, 1673), (219, 203, 1682), (216, 204, 1689), (213, 205, 1694), (210, 206, 1699), (208, 207, 1702), (206, 208, 1706), (204, 209, 1710), (203, 210, 1712), (201, 211, 1716), (199, 212, 1719), (198, 213, 1721), (196, 214, 1725), (195, 215, 1727), (193, 216, 1730), (192, 217, 1733), (191, 218, 1735), (189, 219, 1738), (188, 220, 1740), (187, 221, 1742), (186, 222, 1744), (185, 223, 1746), (183, 224, 1749), (182, 225, 1751), (181, 226, 1753), (180, 227, 1755), (179, 228, 1757), (178, 229, 1759), (177, 230, 1761), (176, 231, 1762), (176, 232, 1763), (175, 233, 1765), (174, 234, 1767), (173, 235, 1768), (172, 236, 1770), (171, 237, 1772), (170, 238, 1774), (169, 239, 1775), (168, 240, 1777), (167, 241, 1779), (166, 242, 1781), (165, 243, 1783), (164, 244, 1785), (163, 245, 1787), (162, 246, 1789), (161, 247, 1791), (159, 248, 1794), (158, 249, 1796), (157, 250, 1798), (156, 251, 1800), (154, 252, 1803), (153, 253, 1805), (152, 254, 1807), (150, 255, 1810), (149, 256, 1812), (148, 257, 1815), (146, 258, 1818), (145, 259, 1820), (143, 260, 1823), (142, 261, 1826), (140, 262, 1829), (138, 263, 1833), (137, 264, 1835), (135, 265, 1839), (133, 266, 1842), (132, 267, 1845), (130, 268, 1849), (128, 269, 1852), (126, 270, 1856), (125, 271, 1859), (124, 272, 1862), (122, 273, 1865), (121, 274, 1868), (120, 275, 1871), (119, 276, 1873), (118, 277, 1876), (116, 278, 1879), (115, 279, 1881), (114, 280, 1884), (113, 281, 1886), (112, 282, 1888), (111, 283, 1890), (110, 284, 1892), (109, 285, 1895), (108, 286, 1897), (108, 287, 1898), (107, 288, 1900), (106, 289, 1902), (105, 290, 1904), (104, 291, 1906), (103, 292, 1908), (103, 293, 1909), (102, 294, 1911), (101, 295, 1912), (101, 296, 1913), (100, 297, 1915), (99, 298, 1917), (99, 299, 1918), (98, 300, 1919), (97, 301, 1921), (97, 302, 1922), (96, 303, 1924), (95, 304, 1925), (95, 305, 1926), (94, 306, 1928), (94, 307, 1928), (93, 308, 1930), (93, 309, 1930), (93, 310, 1931), (93, 311, 1931), (92, 312, 1933), (92, 313, 1933), (92, 314, 1934), (92, 315, 1934), (91, 316, 1936), (91, 317, 1936), (91, 318, 1937), (91, 319, 1937), (90, 320, 1939), (90, 321, 1939), (90, 322, 1940), (89, 323, 1941), (89, 324, 1942), (89, 325, 1943), (89, 326, 1943), (88, 327, 1945), (88, 328, 1945), (88, 329, 1946), (87, 330, 1948), (87, 331, 1948), (87, 332, 1949), (87, 333, 1949), (86, 334, 1951), (86, 335, 1952), (86, 336, 1952), (85, 337, 1954), (85, 338, 1955), (85, 339, 1956), (85, 340, 1956), (84, 341, 1958), (84, 342, 1959), (84, 343, 1959), (83, 344, 1961), (83, 345, 1962), (83, 346, 1963), (83, 347, 1963), (82, 348, 1965), (82, 349, 1966), (82, 350, 1967), (81, 351, 1969), (81, 352, 1970), (81, 353, 1970), (80, 354, 1972), (80, 355, 1973), (80, 356, 1974), (80, 357, 1975), (79, 358, 1977), (79, 359, 1978), (79, 360, 1979), (78, 361, 1981), (78, 362, 1982), (78, 363, 1983), (77, 364, 1985), (77, 365, 1986), (77, 366, 1987), (76, 367, 1989), (76, 368, 1990), (76, 369, 1991), (76, 370, 1992), (75, 371, 1994), (75, 372, 1995), (75, 373, 1996), (74, 374, 1998), (74, 375, 1999), (74, 376, 2000), (73, 377, 2002), (73, 378, 2003), (73, 379, 2004), (72, 380, 2005), (72, 381, 2006), (72, 382, 2007), (71, 383, 2009), (71, 384, 2009), (71, 385, 2010), (70, 386, 2012), (70, 387, 2012), (70, 388, 2013), (70, 389, 2013), (69, 390, 2015), (69, 391, 2015), (69, 392, 2016), (68, 393, 2018), (68, 394, 2018), (68, 395, 2019), (67, 396, 2020), (67, 397, 2021), (67, 398, 2021), (66, 399, 2023), (66, 400, 2023), (65, 401, 2025), (65, 402, 2025), (65, 403, 2026), (64, 404, 2027), (64, 405, 2028), (64, 406, 2028), (63, 407, 2030), (63, 408, 2030), (63, 409, 2031), (62, 410, 2032), (62, 411, 2033), (61, 412, 2034), (61, 413, 2034), (61, 414, 2035), (60, 415, 2036), (60, 416, 2037), (59, 417, 2038), (59, 418, 2039), (58, 419, 2040), (58, 420, 2041), (58, 421, 2041), (57, 422, 2042), (57, 423, 2043), (56, 424, 2044), (56, 425, 2045), (55, 426, 2046), (55, 427, 2047), (54, 428, 2048), (54, 429, 2048), (53, 430, 2050), (53, 431, 2050), (52, 432, 2052), (52, 433, 2052), (51, 434, 2053), (51, 435, 2054), (50, 436, 2055), (50, 437, 2055), (49, 438, 2057), (49, 439, 2057), (48, 440, 2059), (48, 441, 2059), (47, 442, 2060), (47, 443, 2061), (46, 444, 2062), (46, 445, 2062), (45, 446, 2064), (45, 447, 2064), (44, 448, 2065), (44, 449, 2066), (43, 450, 2067), (43, 451, 2068), (42, 452, 2069), (42, 453, 2069), (41, 454, 2071), (41, 455, 2071), (40, 456, 2072), (40, 457, 2073), (39, 458, 2074), (39, 459, 2074), (39, 460, 2074), (39, 461, 2075), (38, 462, 2076), (38, 463, 2076), (38, 464, 2077), (38, 465, 2077), (38, 466, 2077), (38, 467, 2078), (37, 468, 2079), (37, 469, 2079), (37, 470, 2080), (37, 471, 2080), (37, 472, 2080), (37, 473, 2081), (37, 474, 2081), (36, 475, 2082), (36, 476, 2083), (36, 477, 2083), (36, 478, 2083), (36, 479, 2084), (36, 480, 2084), (35, 481, 2085), (35, 482, 2086), (35, 483, 2086), (35, 484, 2087), (35, 485, 2087), (35, 486, 2087), (34, 487, 2089), (34, 488, 2089), (34, 489, 2089), (34, 490, 2090), (34, 491, 2090), (34, 492, 2091), (34, 493, 2091), (33, 494, 2093), (33, 495, 2093), (33, 496, 2093), (33, 497, 2094), (33, 498, 2094), (33, 499, 2095), (33, 500, 2095), (32, 501, 2097), (32, 502, 2097), (32, 503, 2098), (32, 504, 2098), (32, 505, 2099), (32, 506, 2099), (32, 507, 2100), (31, 508, 2101), (31, 509, 2102), (31, 510, 2102), (31, 511, 2103), (31, 512, 2103), (31, 513, 2104), (31, 514, 2104), (30, 515, 2106), (30, 516, 2106), (30, 517, 2107), (30, 518, 2108), (30, 519, 2108), (30, 520, 2109), (30, 521, 2109), (30, 522, 2110), (29, 523, 2112), (29, 524, 2112), (29, 525, 2113), (29, 526, 2114), (29, 527, 2114), (29, 528, 2115), (29, 529, 2116), (29, 530, 2116), (28, 531, 2118), (28, 532, 2119), (28, 533, 2120), (28, 534, 2120), (28, 535, 2121), (28, 536, 2121), (28, 537, 2121), (28, 538, 2122), (28, 539, 2122), (28, 540, 2122), (28, 541, 2123), (28, 542, 2123), (27, 543, 2124), (27, 544, 2125), (27, 545, 2125), (27, 546, 2125), (27, 547, 2126), (27, 548, 2126), (27, 549, 2126), (27, 550, 2127), (27, 551, 2127), (27, 552, 2127), (27, 553, 2128), (27, 554, 2128), (27, 555, 2128), (27, 556, 2128), (27, 557, 2129), (27, 558, 2129), (27, 559, 2129), (27, 560, 2130), (26, 561, 2131), (26, 562, 2131), (26, 563, 2131), (26, 564, 2132), (26, 565, 2132), (26, 566, 2132), (26, 567, 2132), (26, 568, 2133), (26, 569, 2133), (26, 570, 2133), (26, 571, 2133), (26, 572, 2134), (26, 573, 2134), (26, 574, 2134), (26, 575, 2135), (26, 576, 2135), (26, 577, 2135), (26, 578, 2135), (25, 579, 2136), (25, 580, 2137), (25, 581, 2137), (25, 582, 2137), (25, 583, 2137), (25, 584, 2138), (25, 585, 2138), (25, 586, 2138), (25, 587, 2138), (25, 588, 2139), (25, 589, 2139), (25, 590, 2139), (25, 591, 2139), (25, 592, 2140), (25, 593, 2140), (25, 594, 2140), (25, 595, 2140), (25, 596, 2140), (25, 597, 2141), (24, 598, 2142), (24, 599, 2142), (24, 600, 2142), (24, 601, 2142), (24, 602, 2143), (24, 603, 2143), (24, 604, 2143), (24, 605, 2143), (24, 606, 2143), (24, 607, 2144), (24, 608, 2144), (24, 609, 2144), (24, 610, 2144), (24, 611, 2144), (24, 612, 2144), (24, 613, 2145), (24, 614, 2145), (24, 615, 2145), (24, 616, 2145), (24, 617, 2145), (24, 618, 2145), (23, 619, 2146), (23, 620, 2146), (23, 621, 2146), (23, 622, 2146), (23, 623, 2146), (23, 624, 2146), (23, 625, 2146), (23, 626, 2146), (23, 627, 2146), (23, 628, 2146), (23, 629, 2147), (23, 630, 2147), (23, 631, 2147), (23, 632, 2147), (23, 633, 2147), (23, 634, 2147), (23, 635, 2147), (23, 636, 2147), (23, 637, 2147), (23, 638, 2147), (23, 639, 2147), (23, 640, 2147), (23, 641, 2147), (23, 642, 2147), (22, 643, 2148), (22, 644, 2148), (22, 645, 2149), (22, 646, 2149), (22, 647, 2149), (22, 648, 2149), (22, 649, 2149), (22, 650, 2149), (22, 651, 2149), (22, 652, 2149), (22, 653, 2149), (22, 654, 2149), (22, 655, 2149), (22, 656, 2149), (22, 657, 2149), (22, 658, 2149), (22, 659, 2149), (22, 660, 2149), (22, 661, 2149), (22, 662, 2150), (22, 663, 2150), (22, 664, 2150), (22, 665, 2150), (22, 666, 2150), (22, 667, 2150), (21, 668, 2151), (21, 669, 2151), (21, 670, 2151), (21, 671, 2151), (21, 672, 2151), (21, 673, 2151), (21, 674, 2151), (21, 675, 2151), (21, 676, 2151), (21, 677, 2151), (21, 678, 2151), (21, 679, 2151), (21, 680, 2152), (21, 681, 2152), (21, 682, 2152), (21, 683, 2152), (21, 684, 2152), (21, 685, 2152), (21, 686, 2152), (21, 687, 2152), (21, 688, 2152), (21, 689, 2152), (21, 690, 2152), (21, 691, 2152), (21, 692, 2152), (21, 693, 2152), (21, 694, 2152), (21, 695, 2152), (21, 696, 2151), (22, 697, 2150), (22, 698, 2150), (22, 699, 2150), (22, 700, 2150), (22, 701, 2150), (22, 702, 2150), (22, 703, 2150), (22, 704, 2150), (22, 705, 2150), (22, 706, 2150), (22, 707, 2150), (22, 708, 2150), (22, 709, 2150), (23, 710, 2149), (23, 711, 2149), (23, 712, 2149), (23, 713, 2149), (23, 714, 2149), (23, 715, 2149), (23, 716, 2149), (23, 717, 2149), (23, 718, 2148), (23, 719, 2148), (23, 720, 2148), (23, 721, 2148), (24, 722, 2147), (24, 723, 2147), (24, 724, 2147), (24, 725, 2147), (24, 726, 2147), (24, 727, 2147), (24, 728, 2147), (24, 729, 2147), (24, 730, 2147), (24, 731, 2147), (24, 732, 2147), (24, 733, 2147), (25, 734, 2146), (25, 735, 2146), (25, 736, 2146), (25, 737, 2146), (25, 738, 2146), (25, 739, 2146), (25, 740, 2145), (25, 741, 2145), (25, 742, 2145), (25, 743, 2145), (25, 744, 2145), (25, 745, 2145), (26, 746, 2144), (26, 747, 2144), (26, 748, 2144), (26, 749, 2144), (26, 750, 2144), (26, 751, 2144), (26, 752, 2144), (26, 753, 2144), (26, 754, 2144), (26, 755, 2144), (26, 756, 2144), (27, 757, 2143), (27, 758, 2143), (27, 759, 2143), (27, 760, 2143), (27, 761, 2142), (27, 762, 2142), (27, 763, 2142), (27, 764, 2142), (27, 765, 2142), (27, 766, 2142), (27, 767, 2142), (27, 768, 2142), (27, 769, 2142), (27, 770, 2142), (27, 771, 2142), (27, 772, 2142), (27, 773, 2142), (27, 774, 2142), (27, 775, 2142), (27, 776, 2142), (27, 777, 2142), (27, 778, 2142), (27, 779, 2142), (27, 780, 2141), (27, 781, 2141), (27, 782, 2141), (27, 783, 2141), (27, 784, 2141), (27, 785, 2141), (27, 786, 2141), (27, 787, 2141), (27, 788, 2141), (26, 789, 2142), (26, 790, 2142), (26, 791, 2142), (26, 792, 2142), (26, 793, 2142), (26, 794, 2142), (26, 795, 2142), (26, 796, 2142), (26, 797, 2142), (26, 798, 2141), (26, 799, 2141), (26, 800, 2141), (26, 801, 2141), (26, 802, 2141), (26, 803, 2141), (26, 804, 2141), (26, 805, 2141), (26, 806, 2141), (26, 807, 2141), (26, 808, 2141), (26, 809, 2141), (26, 810, 2141), (26, 811, 2141), (26, 812, 2141), (26, 813, 2141), (26, 814, 2141), (26, 815, 2141), (26, 816, 2141), (26, 817, 2140), (26, 818, 2140), (26, 819, 2140), (26, 820, 2140), (26, 821, 2140), (26, 822, 2140), (26, 823, 2140), (26, 824, 2140), (26, 825, 2140), (26, 826, 2140), (26, 827, 2140), (26, 828, 2140), (26, 829, 2140), (26, 830, 2140), (26, 831, 2140), (26, 832, 2140), (26, 833, 2140), (26, 834, 2139), (26, 835, 2139), (26, 836, 2139), (26, 837, 2139), (26, 838, 2139), (26, 839, 2139), (26, 840, 2138), (26, 841, 2138), (26, 842, 2138), (26, 843, 2137), (26, 844, 2137), (26, 845, 2137), (26, 846, 2136), (26, 847, 2136), (26, 848, 2135), (26, 849, 2135), (26, 850, 2135), (26, 851, 2134), (26, 852, 2134), (26, 853, 2133), (27, 854, 2132), (27, 855, 2131), (27, 856, 2131), (27, 857, 2130), (27, 858, 2130), (27, 859, 2130), (27, 860, 2129), (27, 861, 2129), (27, 862, 2128), (27, 863, 2128), (27, 864, 2127), (27, 865, 2127), (27, 866, 2126), (27, 867, 2126), (27, 868, 2125), (27, 869, 2125), (27, 870, 2124), (27, 871, 2123), (27, 872, 2123), (27, 873, 2122), (28, 874, 2121), (28, 875, 2120), (28, 876, 2120), (28, 877, 2119), (28, 878, 2118), (28, 879, 2118), (28, 880, 2117), (28, 881, 2117), (28, 882, 2116), (28, 883, 2116), (28, 884, 2115), (28, 885, 2114), (28, 886, 2114), (28, 887, 2113), (28, 888, 2113), (28, 889, 2112), (28, 890, 2112), (28, 891, 2111), (29, 892, 2110), (29, 893, 2109), (29, 894, 2109), (29, 895, 2108), (29, 896, 2108), (29, 897, 2108), (29, 898, 2107), (29, 899, 2107), (29, 900, 2106), (29, 901, 2106), (29, 902, 2105), (29, 903, 2105), (29, 904, 2104), (29, 905, 2104), (29, 906, 2104), (29, 907, 2103), (29, 908, 2103), (29, 909, 2102), (30, 910, 2101), (30, 911, 2101), (30, 912, 2100), (30, 913, 2100), (30, 914, 2099), (30, 915, 2099), (30, 916, 2099), (30, 917, 2099), (30, 918, 2098), (30, 919, 2098), (29, 920, 2099), (29, 921, 2099), (29, 922, 2098), (29, 923, 2098), (29, 924, 2098), (29, 925, 2098), (29, 926, 2097), (29, 927, 2097), (29, 928, 2097), (29, 929, 2097), (29, 930, 2097), (29, 931, 2096), (29, 932, 2096), (29, 933, 2096), (29, 934, 2096), (29, 935, 2095), (29, 936, 2095), (29, 937, 2095), (29, 938, 2095), (29, 939, 2094), (29, 940, 2094), (29, 941, 2094), (29, 942, 2094), (29, 943, 2094), (29, 944, 2093), (29, 945, 2093), (29, 946, 2093), (29, 947, 2093), (29, 948, 2093), (28, 949, 2093), (28, 950, 2093), (28, 951, 2093), (28, 952, 2093), (28, 953, 2093), (28, 954, 2092), (28, 955, 2092), (28, 956, 2092), (28, 957, 2092), (28, 958, 2092), (28, 959, 2091), (28, 960, 2091), (28, 961, 2091), (28, 962, 2091), (28, 963, 2091), (28, 964, 2090), (28, 965, 2090), (28, 966, 2090), (28, 967, 2090), (28, 968, 2090), (28, 969, 2089), (28, 970, 2089), (28, 971, 2089), (28, 972, 2089), (28, 973, 2089), (28, 974, 2089), (28, 975, 2088), (28, 976, 2088), (28, 977, 2088), (28, 978, 2088), (27, 979, 2089), (27, 980, 2089), (27, 981, 2088), (27, 982, 2088), (27, 983, 2088), (27, 984, 2088), (27, 985, 2088), (27, 986, 2088), (27, 987, 2087), (27, 988, 2087), (27, 989, 2087), (27, 990, 2087), (27, 991, 2086), (27, 992, 2086), (27, 993, 2086), (27, 994, 2086), (27, 995, 2085), (27, 996, 2085), (27, 997, 2085), (27, 998, 2084), (27, 999, 2084), (27, 1000, 2084), (28, 1001, 2082), (28, 1002, 2082), (28, 1003, 2082), (28, 1004, 2081), (28, 1005, 2081), (28, 1006, 2081), (28, 1007, 2080), (28, 1008, 2080), (28, 1009, 2080), (28, 1010, 2079), (28, 1011, 2079), (28, 1012, 2079), (28, 1013, 2078), (28, 1014, 2078), (28, 1015, 2077), (28, 1016, 2077), (28, 1017, 2077), (28, 1018, 2076), (28, 1019, 2076), (28, 1020, 2076), (28, 1021, 2075), (28, 1022, 2075), (28, 1023, 2074), (28, 1024, 2074), (28, 1025, 2074), (28, 1026, 2073), (28, 1027, 2073), (28, 1028, 2073), (28, 1029, 2072), (29, 1030, 2071), (29, 1031, 2070), (29, 1032, 2070), (29, 1033, 2069), (29, 1034, 2069), (29, 1035, 2069), (29, 1036, 2068), (29, 1037, 2068), (29, 1038, 2067), (29, 1039, 2067), (29, 1040, 2067), (29, 1041, 2066), (29, 1042, 2066), (29, 1043, 2065), (29, 1044, 2065), (29, 1045, 2064), (29, 1046, 2064), (29, 1047, 2063), (29, 1048, 2063), (29, 1049, 2063), (29, 1050, 2062), (29, 1051, 2062), (29, 1052, 2061), (29, 1053, 2061), (29, 1054, 2060), (29, 1055, 2060), (29, 1056, 2059), (29, 1057, 2059), (29, 1058, 2058), (30, 1059, 2057), (30, 1060, 2056), (30, 1061, 2056), (30, 1062, 2055), (30, 1063, 2055), (30, 1064, 2054), (30, 1065, 2054), (30, 1066, 2053), (30, 1067, 2052), (30, 1068, 2052), (30, 1069, 2051), (30, 1070, 2050), (30, 1071, 2049), (30, 1072, 2049), (30, 1073, 2048), (30, 1074, 2047), (30, 1075, 2046), (30, 1076, 2045), (30, 1077, 2044), (30, 1078, 2043), (30, 1079, 2042), (29, 1080, 2042), (29, 1081, 2041), (29, 1082, 2040), (29, 1083, 2039), (29, 1084, 2038), (29, 1085, 2037), (29, 1086, 2036), (29, 1087, 2035), (29, 1088, 2034), (29, 1089, 2033), (29, 1090, 2032), (29, 1091, 2031), (29, 1092, 2030), (29, 1093, 2029), (29, 1094, 2028), (29, 1095, 2027), (29, 1096, 2026), (29, 1097, 2026), (29, 1098, 2025), (29, 1099, 2024), (29, 1100, 2023), (29, 1101, 2022), (29, 1102, 2021), (29, 1103, 2021), (29, 1104, 2020), (29, 1105, 2019), (29, 1106, 2019), (29, 1107, 2018), (29, 1108, 2017), (29, 1109, 2016), (29, 1110, 2016), (29, 1111, 2015), (29, 1112, 2014), (29, 1113, 2014), (29, 1114, 2013), (29, 1115, 2013), (29, 1116, 2012), (29, 1117, 2011), (29, 1118, 2011), (29, 1119, 2010), (29, 1120, 2010), (29, 1121, 2009), (29, 1122, 2009), (29, 1123, 2008), (29, 1124, 2007), (29, 1125, 2007), (29, 1126, 2006), (29, 1127, 2006), (29, 1128, 2005), (29, 1129, 2005), (29, 1130, 2004), (29, 1131, 2004), (29, 1132, 2003), (29, 1133, 2003), (28, 1134, 2004), (28, 1135, 2003), (28, 1136, 2003), (28, 1137, 2002), (28, 1138, 2002), (28, 1139, 2001), (28, 1140, 2001), (28, 1141, 2001), (28, 1142, 2000), (28, 1143, 2000), (28, 1144, 2000), (28, 1145, 1999), (28, 1146, 1999), (28, 1147, 1999), (28, 1148, 1998), (28, 1149, 1998), (28, 1150, 1998), (28, 1151, 1997), (29, 1152, 1996), (29, 1153, 1996), (29, 1154, 1995), (29, 1155, 1995), (29, 1156, 1995), (29, 1157, 1994), (29, 1158, 1994), (29, 1159, 1994), (29, 1160, 1993), (29, 1161, 1993), (29, 1162, 1992), (29, 1163, 1992), (29, 1164, 1992), (29, 1165, 1991), (29, 1166, 1991), (29, 1167, 1991), (29, 1168, 1990), (29, 1169, 1990), (29, 1170, 1989), (29, 1171, 1989), (29, 1172, 1989), (29, 1173, 1988), (29, 1174, 1988), (29, 1175, 1987), (29, 1176, 1987), (29, 1177, 1987), (29, 1178, 1986), (29, 1179, 1986), (29, 1180, 1985), (29, 1181, 1985), (29, 1182, 1985), (29, 1183, 1984), (29, 1184, 1984), (29, 1185, 1983), (29, 1186, 1983), (29, 1187, 1982), (29, 1188, 1982), (29, 1189, 1981), (29, 1190, 1981), (29, 1191, 1981), (29, 1192, 1980), (29, 1193, 1980), (29, 1194, 1979), (29, 1195, 1979), (29, 1196, 1978), (29, 1197, 1978), (29, 1198, 1977), (29, 1199, 1977), (29, 1200, 1976), (29, 1201, 1976), (29, 1202, 1975), (29, 1203, 1975), (29, 1204, 1974), (29, 1205, 1974), (29, 1206, 1973), (29, 1207, 1972), (29, 1208, 1972), (29, 1209, 1971), (29, 1210, 1971), (29, 1211, 1970), (29, 1212, 1970), (29, 1213, 1969), (29, 1214, 1969), (29, 1215, 1968), (29, 1216, 1967), (29, 1217, 1967), (29, 1218, 1966), (29, 1219, 1965), (29, 1220, 1965), (29, 1221, 1964), (29, 1222, 1963), (29, 1223, 1963), (29, 1224, 1962), (29, 1225, 1961), (29, 1226, 1960), (29, 1227, 1960), (29, 1228, 1959), (29, 1229, 1958), (29, 1230, 1957), (29, 1231, 1956), (29, 1232, 1955), (29, 1233, 1955), (29, 1234, 1954), (29, 1235, 1953), (29, 1236, 1952), (29, 1237, 1951), (29, 1238, 1951), (29, 1239, 1950), (29, 1240, 1949), (29, 1241, 1948), (29, 1242, 1948), (29, 1243, 1947), (30, 1244, 1945), (30, 1245, 1945), (30, 1246, 1944), (30, 1247, 1943), (30, 1248, 1943), (30, 1249, 1942), (30, 1250, 1941), (30, 1251, 1941), (30, 1252, 1940), (30, 1253, 1940), (30, 1254, 1939), (30, 1255, 1938), (30, 1256, 1938), (30, 1257, 1937), (30, 1258, 1937), (30, 1259, 1936), (30, 1260, 1936), (30, 1261, 1935), (30, 1262, 1935), (30, 1263, 1934), (30, 1264, 1934), (30, 1265, 1933), (30, 1266, 1933), (30, 1267, 1932), (30, 1268, 1932), (30, 1269, 1931), (30, 1270, 1931), (30, 1271, 1930), (30, 1272, 1930), (30, 1273, 1929), (30, 1274, 1929), (30, 1275, 1929), (30, 1276, 1928), (30, 1277, 1928), (30, 1278, 1927), (30, 1279, 1927), (30, 1280, 1927), (30, 1281, 1926), (30, 1282, 1926), (30, 1283, 1925), (30, 1284, 1925), (30, 1285, 1925), (30, 1286, 1924), (30, 1287, 1924), (30, 1288, 1924), (30, 1289, 1923), (30, 1290, 1923), (30, 1291, 1923), (30, 1292, 1922), (30, 1293, 1922), (30, 1294, 1922), (30, 1295, 1921), (30, 1296, 1921), (30, 1297, 1921), (30, 1298, 1921), (30, 1299, 1920), (30, 1300, 1920), (30, 1301, 1920), (30, 1302, 1920), (30, 1303, 1920), (30, 1304, 1919), (30, 1305, 1919), (30, 1306, 1919), (30, 1307, 1919), (30, 1308, 1918), (30, 1309, 1918), (30, 1310, 1918), (30, 1311, 1918), (30, 1312, 1917), (31, 1313, 1916), (31, 1314, 1916), (31, 1315, 1916), (31, 1316, 1915), (31, 1317, 1915), (31, 1318, 1915), (31, 1319, 1915), (31, 1320, 1914), (31, 1321, 1914), (31, 1322, 1914), (31, 1323, 1914), (31, 1324, 1913), (31, 1325, 1913), (31, 1326, 1913), (31, 1327, 1912), (31, 1328, 1912), (31, 1329, 1912), (31, 1330, 1912), (31, 1331, 1911), (31, 1332, 1911), (31, 1333, 1911), (31, 1334, 1911), (31, 1335, 1910), (31, 1336, 1910), (31, 1337, 1910), (31, 1338, 1909), (31, 1339, 1909), (32, 1340, 1908), (32, 1341, 1908), (32, 1342, 1907), (32, 1343, 1907), (32, 1344, 1907), (32, 1345, 1907), (32, 1346, 1906), (32, 1347, 1906), (32, 1348, 1906), (32, 1349, 1905), (32, 1350, 1905), (32, 1351, 1905), (32, 1352, 1904), (32, 1353, 1904), (32, 1354, 1904), (32, 1355, 1904), (32, 1356, 1903), (32, 1357, 1903), (32, 1358, 1903), (32, 1359, 1902), (32, 1360, 1902), (32, 1361, 1902), (32, 1362, 1901), (32, 1363, 1901), (32, 1364, 1901), (32, 1365, 1900), (32, 1366, 1900), (33, 1367, 1899), (33, 1368, 1899), (33, 1369, 1898), (33, 1370, 1898), (33, 1371, 1897), (33, 1372, 1897), (33, 1373, 1896), (33, 1374, 1896), (33, 1375, 1895), (33, 1376, 1895), (33, 1377, 1894), (33, 1378, 1894), (33, 1379, 1893), (33, 1380, 1892), (33, 1381, 1892), (34, 1382, 1890), (34, 1383, 1890), (34, 1384, 1889), (34, 1385, 1888), (34, 1386, 1888), (34, 1387, 1887), (34, 1388, 1886), (34, 1389, 1886), (34, 1390, 1885), (34, 1391, 1884), (34, 1392, 1884), (34, 1393, 1883), (34, 1394, 1882), (34, 1395, 1882), (35, 1396, 1880), (35, 1397, 1879), (35, 1398, 1878), (35, 1399, 1878), (35, 1400, 1877), (35, 1401, 1876), (35, 1402, 1875), (35, 1403, 1874), (35, 1404, 1873), (35, 1405, 1872), (35, 1406, 1872), (35, 1407, 1871), (35, 1408, 1870), (36, 1409, 1868), (36, 1410, 1867), (36, 1411, 1866), (36, 1412, 1865), (36, 1413, 1864), (36, 1414, 1863), (36, 1415, 1863), (36, 1416, 1862), (36, 1417, 1861), (36, 1418, 1860), (36, 1419, 1859), (36, 1420, 1859), (36, 1421, 1858), (37, 1422, 1856), (37, 1423, 1855), (37, 1424, 1855), (37, 1425, 1854), (37, 1426, 1853), (37, 1427, 1853), (37, 1428, 1852), (37, 1429, 1851), (37, 1430, 1851), (37, 1431, 1850), (37, 1432, 1850), (37, 1433, 1849), (37, 1434, 1848), (38, 1435, 1847), (38, 1436, 1846), (38, 1437, 1846), (38, 1438, 1845), (38, 1439, 1845), (38, 1440, 1844), (38, 1441, 1844), (38, 1442, 1843), (38, 1443, 1843), (38, 1444, 1842), (38, 1445, 1842), (38, 1446, 1842), (38, 1447, 1841), (38, 1448, 1841), (38, 1449, 1841), (38, 1450, 1841), (38, 1451, 1840), (38, 1452, 1840), (38, 1453, 1840), (39, 1454, 1839), (39, 1455, 1839), (39, 1456, 1838), (39, 1457, 1838), (39, 1458, 1838), (39, 1459, 1838), (39, 1460, 1838), (39, 1461, 1837), (39, 1462, 1837), (39, 1463, 1837), (39, 1464, 1837), (39, 1465, 1836), (39, 1466, 1836), (39, 1467, 1836), (39, 1468, 1836), (39, 1469, 1836), (39, 1470, 1835), (39, 1471, 1835), (39, 1472, 1835), (39, 1473, 1835), (39, 1474, 1835), (39, 1475, 1834), (39, 1476, 1834), (39, 1477, 1834), (39, 1478, 1834), (39, 1479, 1834), (39, 1480, 1834), (39, 1481, 1833), (39, 1482, 1833), (39, 1483, 1833), (39, 1484, 1833), (39, 1485, 1833), (39, 1486, 1832), (39, 1487, 1832), (39, 1488, 1832), (39, 1489, 1832), (39, 1490, 1832), (39, 1491, 1831), (39, 1492, 1831), (39, 1493, 1831), (39, 1494, 1831), (39, 1495, 1831), (40, 1496, 1830), (40, 1497, 1829), (40, 1498, 1829), (40, 1499, 1829), (40, 1500, 1829), (40, 1501, 1829), (40, 1502, 1829), (40, 1503, 1828), (40, 1504, 1828), (40, 1505, 1828), (40, 1506, 1828), (40, 1507, 1828), (40, 1508, 1828), (40, 1509, 1827), (40, 1510, 1827), (40, 1511, 1827), (40, 1512, 1827), (40, 1513, 1827), (40, 1514, 1827), (40, 1515, 1826), (40, 1516, 1826), (40, 1517, 1826), (40, 1518, 1826), (40, 1519, 1826), (40, 1520, 1826), (40, 1521, 1825), (40, 1522, 1825), (40, 1523, 1825), (40, 1524, 1825), (40, 1525, 1825), (40, 1526, 1825), (40, 1527, 1825), (40, 1528, 1825), (40, 1529, 1824), (40, 1530, 1824), (40, 1531, 1824), (40, 1532, 1824), (40, 1533, 1824), (40, 1534, 1824), (40, 1535, 1824), (40, 1536, 1824), (40, 1537, 1823), (40, 1538, 1823), (41, 1539, 1822), (41, 1540, 1822), (41, 1541, 1822), (41, 1542, 1822), (41, 1543, 1822), (41, 1544, 1822), (41, 1545, 1821), (41, 1546, 1821), (41, 1547, 1821), (41, 1548, 1821), (41, 1549, 1821), (41, 1550, 1821), (41, 1551, 1821), (41, 1552, 1821), (41, 1553, 1820), (41, 1554, 1820), (41, 1555, 1820), (41, 1556, 1820), (41, 1557, 1820), (41, 1558, 1820), (41, 1559, 1820), (41, 1560, 1820), (41, 1561, 1819), (41, 1562, 1819), (41, 1563, 1819), (41, 1564, 1819), (41, 1565, 1819), (41, 1566, 1819), (41, 1567, 1819), (41, 1568, 1819), (41, 1569, 1818), (41, 1570, 1818), (41, 1571, 1818), (41, 1572, 1818), (41, 1573, 1818), (41, 1574, 1818), (41, 1575, 1818), (41, 1576, 1818), (41, 1577, 1817), (41, 1578, 1817), (41, 1579, 1817), (41, 1580, 1817), (41, 1581, 1817), (41, 1582, 1817), (41, 1583, 1817), (42, 1584, 1815), (42, 1585, 1815), (42, 1586, 1815), (42, 1587, 1815), (42, 1588, 1815), (42, 1589, 1815), (42, 1590, 1815), (42, 1591, 1815), (42, 1592, 1814), (42, 1593, 1814), (42, 1594, 1814), (42, 1595, 1814), (42, 1596, 1814), (42, 1597, 1814), (42, 1598, 1814), (42, 1599, 1814), (42, 1600, 1813), (42, 1601, 1813), (42, 1602, 1813), (41, 1603, 1814), (41, 1604, 1814), (41, 1605, 1814), (41, 1606, 1814), (41, 1607, 1814), (41, 1608, 1814), (41, 1609, 1813), (41, 1610, 1813), (41, 1611, 1813), (41, 1612, 1813), (41, 1613, 1813), (41, 1614, 1813), (41, 1615, 1813), (41, 1616, 1813), (41, 1617, 1813), (41, 1618, 1812), (41, 1619, 1812), (41, 1620, 1812), (41, 1621, 1812), (41, 1622, 1812), (41, 1623, 1812), (41, 1624, 1812), (41, 1625, 1812), (40, 1626, 1812), (40, 1627, 1812), (40, 1628, 1812), (40, 1629, 1812), (40, 1630, 1812), (40, 1631, 1812), (40, 1632, 1812), (40, 1633, 1812), (40, 1634, 1811), (40, 1635, 1811), (40, 1636, 1811), (40, 1637, 1811), (40, 1638, 1811), (40, 1639, 1811), (40, 1640, 1811), (40, 1641, 1811), (40, 1642, 1810), (40, 1643, 1810), (40, 1644, 1810), (40, 1645, 1810), (40, 1646, 1810), (40, 1647, 1810), (40, 1648, 1810), (40, 1649, 1809), (40, 1650, 1809), (39, 1651, 1810), (39, 1652, 1810), (39, 1653, 1810), (39, 1654, 1810), (39, 1655, 1810), (39, 1656, 1810), (39, 1657, 1809), (39, 1658, 1809), (39, 1659, 1809), (39, 1660, 1809), (39, 1661, 1809), (39, 1662, 1809), (39, 1663, 1809), (39, 1664, 1808), (39, 1665, 1808), (39, 1666, 1808), (39, 1667, 1808), (39, 1668, 1808), (39, 1669, 1808), (39, 1670, 1808), (39, 1671, 1807), (39, 1672, 1807), (39, 1673, 1807), (39, 1674, 1807), (39, 1675, 1806), (39, 1676, 1806), (39, 1677, 1806), (40, 1678, 1805), (40, 1679, 1804), (40, 1680, 1804), (40, 1681, 1804), (40, 1682, 1804), (40, 1683, 1803), (41, 1684, 1802), (41, 1685, 1802), (41, 1686, 1802), (41, 1687, 1801), (41, 1688, 1801), (41, 1689, 1801), (41, 1690, 1801), (42, 1691, 1799), (42, 1692, 1799), (42, 1693, 1799), (42, 1694, 1798), (42, 1695, 1798), (42, 1696, 1798), (43, 1697, 1797), (43, 1698, 1796), (43, 1699, 1796), (43, 1700, 1796), (43, 1701, 1795), (43, 1702, 1795), (44, 1703, 1794), (44, 1704, 1793), (44, 1705, 1793), (44, 1706, 1793), (44, 1707, 1793), (45, 1708, 1791), (45, 1709, 1791), (45, 1710, 1791), (45, 1711, 1790), (45, 1712, 1790), (45, 1713, 1790), (46, 1714, 1788), (46, 1715, 1788), (46, 1716, 1788), (46, 1717, 1787), (46, 1718, 1787), (47, 1719, 1786), (47, 1720, 1785), (47, 1721, 1785), (47, 1722, 1784), (47, 1723, 1784), (48, 1724, 1783), (48, 1725, 1782), (48, 1726, 1782), (48, 1727, 1782), (48, 1728, 1781), (49, 1729, 1780), (49, 1730, 1779), (49, 1731, 1779), (49, 1732, 1779), (49, 1733, 1778), (50, 1734, 1777), (50, 1735, 1776), (50, 1736, 1776), (50, 1737, 1776), (50, 1738, 1775), (51, 1739, 1774), (51, 1740, 1773), (51, 1741, 1773), (51, 1742, 1772), (51, 1743, 1772), (52, 1744, 1771), (52, 1745, 1770), (52, 1746, 1770), (52, 1747, 1769), (52, 1748, 1769), (52, 1749, 1768), (52, 1750, 1768), (52, 1751, 1768), (52, 1752, 1767), (52, 1753, 1767), (53, 1754, 1765), (53, 1755, 1765), (53, 1756, 1765), (53, 1757, 1764), (53, 1758, 1764), (53, 1759, 1763), (53, 1760, 1763), (53, 1761, 1763), (53, 1762, 1762), (53, 1763, 1762), (53, 1764, 1762), (53, 1765, 1761), (53, 1766, 1761), (53, 1767, 1761), (53, 1768, 1760), (53, 1769, 1760), (53, 1770, 1759), (53, 1771, 1759), (53, 1772, 1759), (53, 1773, 1758), (53, 1774, 1758), (53, 1775, 1758), (53, 1776, 1757), (53, 1777, 1757), (53, 1778, 1757), (53, 1779, 1756), (53, 1780, 1756), (53, 1781, 1756), (53, 1782, 1755), (53, 1783, 1755), (53, 1784, 1755), (53, 1785, 1754), (53, 1786, 1754), (53, 1787, 1754), (53, 1788, 1753), (53, 1789, 1753), (53, 1790, 1753), (53, 1791, 1753), (53, 1792, 1752), (53, 1793, 1752), (53, 1794, 1752), (53, 1795, 1751), (53, 1796, 1751), (53, 1797, 1751), (53, 1798, 1750), (53, 1799, 1750), (53, 1800, 1750), (53, 1801, 1750), (53, 1802, 1749), (53, 1803, 1749), (53, 1804, 1749), (53, 1805, 1748), (53, 1806, 1748), (53, 1807, 1748), (53, 1808, 1748), (53, 1809, 1747), (53, 1810, 1747), (53, 1811, 1747), (53, 1812, 1746), (53, 1813, 1746), (53, 1814, 1746), (53, 1815, 1746), (54, 1816, 1744), (54, 1817, 1744), (54, 1818, 1744), (54, 1819, 1744), (54, 1820, 1743), (54, 1821, 1743), (54, 1822, 1743), (54, 1823, 1743), (54, 1824, 1742), (54, 1825, 1742), (55, 1826, 1741), (55, 1827, 1740), (56, 1828, 1739), (56, 1829, 1739), (57, 1830, 1737), (57, 1831, 1737), (58, 1832, 1736), (58, 1833, 1735), (59, 1834, 1734), (59, 1835, 1734), (60, 1836, 1732), (60, 1837, 1732), (61, 1838, 1731), (61, 1839, 1730), (61, 1840, 1730), (62, 1841, 1729), (62, 1842, 1728), (63, 1843, 1727), (63, 1844, 1727), (64, 1845, 1725), (64, 1846, 1725), (65, 1847, 1723), (65, 1848, 1723), (66, 1849, 1722), (66, 1850, 1721), (67, 1851, 1720), (67, 1852, 1720), (67, 1853, 1719), (68, 1854, 1718), (68, 1855, 1718), (69, 1856, 1716), (69, 1857, 1716), (70, 1858, 1714), (70, 1859, 1714), (71, 1860, 1713), (71, 1861, 1712), (72, 1862, 1711), (72, 1863, 1711), (72, 1864, 1710), (73, 1865, 1709), (73, 1866, 1708), (74, 1867, 1707), (74, 1868, 1707), (75, 1869, 1705), (75, 1870, 1705), (75, 1871, 1704), (76, 1872, 1703), (76, 1873, 1703), (77, 1874, 1701), (77, 1875, 1701), (78, 1876, 1699), (78, 1877, 1699), (79, 1878, 1698), (79, 1879, 1697), (79, 1880, 1697), (80, 1881, 1695), (80, 1882, 1695), (81, 1883, 1693), (81, 1884, 1693), (82, 1885, 1692), (82, 1886, 1691), (82, 1887, 1691), (83, 1888, 1689), (83, 1889, 1689), (84, 1890, 1687), (84, 1891, 1687), (84, 1892, 1687), (85, 1893, 1685), (85, 1894, 1685), (86, 1895, 1683), (86, 1896, 1683), (87, 1897, 1681), (87, 1898, 1681), (87, 1899, 1680), (88, 1900, 1679), (88, 1901, 1678), (88, 1902, 1678), (89, 1903, 1677), (89, 1904, 1676), (90, 1905, 1675), (90, 1906, 1674), (90, 1907, 1674), (91, 1908, 1672), (91, 1909, 1672), (91, 1910, 1671), (92, 1911, 1670), (92, 1912, 1669), (93, 1913, 1667), (93, 1914, 1667), (93, 1915, 1666), (94, 1916, 1665), (94, 1917, 1664), (95, 1918, 1663), (95, 1919, 1662), (96, 1920, 1661), (96, 1921, 1660), (97, 1922, 1658), (97, 1923, 1658), (97, 1924, 1657), (98, 1925, 1656), (98, 1926, 1655), (99, 1927, 1653), (99, 1928, 1653), (100, 1929, 1651), (100, 1930, 1651), (101, 1931, 1649), (101, 1932, 1648), (102, 1933, 1647), (102, 1934, 1646), (103, 1935, 1644), (103, 1936, 1644), (104, 1937, 1642), (105, 1938, 1640), (105, 1939, 1640), (106, 1940, 1638), (106, 1941, 1637), (107, 1942, 1635), (107, 1943, 1635), (108, 1944, 1633), (109, 1945, 1631), (109, 1946, 1630), (110, 1947, 1628), (110, 1948, 1628), (111, 1949, 1626), (112, 1950, 1624), (112, 1951, 1623), (113, 1952, 1622), (114, 1953, 1620), (114, 1954, 1619), (115, 1955, 1617), (116, 1956, 1616), (116, 1957, 1615), (117, 1958, 1613), (118, 1959, 1611), (119, 1960, 1610), (119, 1961, 1609), (120, 1962, 1607), (121, 1963, 87), (209, 1963, 1517), (122, 1964, 74), (216, 1964, 1510), (123, 1965, 61), (223, 1965, 1502), (123, 1966, 51), (230, 1966, 1494), (124, 1967, 40), (237, 1967, 1487), (125, 1968, 30), (244, 1968, 1479), (126, 1969, 21), (251, 1969, 1471), (127, 1970, 12), (259, 1970, 1463), (128, 1971, 4), (266, 1971, 1455), (274, 1972, 1446), (281, 1973, 1439), (289, 1974, 1430), (294, 1975, 1424), (299, 1976, 1418), (303, 1977, 1413), (308, 1978, 1407), (314, 1979, 1400), (319, 1980, 1394), (325, 1981, 1387), (331, 1982, 1380), (337, 1983, 1373), (344, 1984, 1365), (351, 1985, 1357), (358, 1986, 1349), (366, 1987, 1339), (372, 1988, 1332), (376, 1989, 1327), (380, 1990, 1322), (384, 1991, 1317), (389, 1992, 1310), (393, 1993, 1305), (397, 1994, 1300), (402, 1995, 1293), (406, 1996, 1288), (411, 1997, 1282), (415, 1998, 1276), (420, 1999, 1270), (425, 2000, 1263), (430, 2001, 1257), (435, 2002, 1251), (440, 2003, 1244), (446, 2004, 1236), (451, 2005, 1230), (456, 2006, 1223), (461, 2007, 1217), (466, 2008, 1210), (471, 2009, 1203), (476, 2010, 1196), (481, 2011, 1190), (487, 2012, 1182), (492, 2013, 1175), (497, 2014, 1168), (503, 2015, 1160), (508, 2016, 1074), (514, 2017, 1063), (520, 2018, 1053), (525, 2019, 1044), (531, 2020, 1034), (536, 2021, 1025), (542, 2022, 1016), (546, 2023, 1008), (551, 2024, 999), (556, 2025, 990), (561, 2026, 982), (565, 2027, 974), (570, 2028, 966), (574, 2029, 958), (579, 2030, 950), (583, 2031, 942), (587, 2032, 935), (591, 2033, 928), (596, 2034, 919), (600, 2035, 912), (604, 2036, 905), (607, 2037, 899), (611, 2038, 892), (614, 2039, 883), (617, 2040, 870), (620, 2041, 857), (622, 2042, 845), (625, 2043, 832), (627, 2044, 820), (629, 2045, 809), (631, 2046, 798), (634, 2047, 786), (636, 2048, 780), (638, 2049, 775), (640, 2050, 770), (642, 2051, 765), (644, 2052, 760), (645, 2053, 756), (647, 2054, 751), (649, 2055, 746), (651, 2056, 741), (652, 2057, 737), (654, 2058, 731), (656, 2059, 726), (658, 2060, 720), (660, 2061, 715), (662, 2062, 709), (664, 2063, 704), (666, 2064, 698), (669, 2065, 691), (671, 2066, 685), (673, 2067, 679), (676, 2068, 672), (678, 2069, 666), (680, 2070, 660), (683, 2071, 652), (686, 2072, 643), (688, 2073, 636), (691, 2074, 628), (694, 2075, 619), (699, 2076, 609), (703, 2077, 599), (708, 2078, 588), (712, 2079, 579), (717, 2080, 568), (721, 2081, 558), (726, 2082, 547), (730, 2083, 536), (734, 2084, 526), (739, 2085, 515), (743, 2086, 506), (747, 2087, 497), (751, 2088, 488), (756, 2089, 478), (760, 2090, 469), (764, 2091, 460), (768, 2092, 451), (772, 2093, 443), (776, 2094, 434), (779, 2095, 427), (782, 2096, 419), (786, 2097, 411), (789, 2098, 404), (792, 2099, 397), (795, 2100, 390), (799, 2101, 382), (802, 2102, 375), (805, 2103, 370), (808, 2104, 364), (811, 2105, 359), (815, 2106, 353), (818, 2107, 348), (821, 2108, 342), (824, 2109, 337), (827, 2110, 332), (830, 2111, 327), (833, 2112, 322), (836, 2113, 317), (839, 2114, 312), (842, 2115, 307), (846, 2116, 301), (849, 2117, 296), (852, 2118, 291), (855, 2119, 287), (858, 2120, 282), (861, 2121, 277), (864, 2122, 272), (867, 2123, 268), (869, 2124, 264), (872, 2125, 259), (875, 2126, 255), (878, 2127, 250), (880, 2128, 246), (884, 2129, 241), (887, 2130, 236), (890, 2131, 231), (893, 2132, 226), (896, 2133, 221), (899, 2134, 215), (903, 2135, 209), (906, 2136, 204), (910, 2137, 198), (913, 2138, 193), (917, 2139, 186), (920, 2140, 181), (924, 2141, 174), (928, 2142, 166), (932, 2143, 154), (936, 2144, 142), (946, 2145, 124), (957, 2146, 105), (967, 2147, 87), (978, 2148, 67), (989, 2149, 48), (1001, 2150, 27), (1013, 2151, 6)], ['1001,2150,920,2140,775,2093,694,2075,624,2042,365,1986,215,1963,128,1971,103,1936,54,1825,39,1677,39,1454,30,1312,29,892,21,696,27,543,39,458,93,308,126,270,210,206,291,179,373,132,520,121,1430,121,1584,128,1663,142,1768,178,1904,204,1993,277,2094,411,2148,535,2168,613,2171,717,2165,833,2128,914,2112,994,2031,1132,1958,1273,1931,1368,1879,1444,1846,1670,1782,1863,1719,1973,1662,2015,1582,2015,1496,2039,1420,2046,1347,2068,1177,2101,1093,2142'])], 'temp/1759634476_2282988_917877156_a9c2d4b99270c9302def4ed40606e685.jpg']} nb pixel non reg : 3692295 nb pixel common : 3690405 proportion of common points : 0.9994881232404237 #&_# TEST FAILED #&_# : tests/mask_test #&_# #&_# END OF TEST #&_# : tests/mask_test #&_# #&_# BEGIN OF TEST : tests/datou_test #&_# /home/admin/workarea/git/Velours/python/tests/datou_test.py Datou All Test python version used : 3 ############################### TEST sam ################################ TEST SAM Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : sam list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.2761509418487549 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! WARNING : we have an input that is not a photo, we should get rid of it Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:sam Sun Oct 5 05:21:48 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step sam ! Inside sam : nb paths : 1 (640, 960, 3) time for calcul the mask position with numpy : 0.001809835433959961 nb_pixel_total : 3763 time to create 1 rle with old method : 0.008005857467651367 time for calcul the mask position with numpy : 0.0016634464263916016 nb_pixel_total : 16246 time to create 1 rle with old method : 0.03509259223937988 time for calcul the mask position with numpy : 0.0019505023956298828 nb_pixel_total : 83742 time to create 1 rle with old method : 0.1843419075012207 time for calcul the mask position with numpy : 0.001428365707397461 nb_pixel_total : 5614 time to create 1 rle with old method : 0.012478351593017578 time for calcul the mask position with numpy : 0.0014650821685791016 nb_pixel_total : 16440 time to create 1 rle with old method : 0.03784942626953125 time for calcul the mask position with numpy : 0.0015630722045898438 nb_pixel_total : 10794 time to create 1 rle with old method : 0.025518178939819336 time for calcul the mask position with numpy : 0.0015420913696289062 nb_pixel_total : 1226 time to create 1 rle with old method : 0.003148317337036133 time for calcul the mask position with numpy : 0.0015947818756103516 nb_pixel_total : 7635 time to create 1 rle with old method : 0.018537044525146484 time for calcul the mask position with numpy : 0.0015528202056884766 nb_pixel_total : 5334 time to create 1 rle with old method : 0.012259721755981445 time for calcul the mask position with numpy : 0.0017104148864746094 nb_pixel_total : 29445 time to create 1 rle with old method : 0.06421661376953125 time for calcul the mask position with numpy : 0.001434326171875 nb_pixel_total : 3340 time to create 1 rle with old method : 0.0074367523193359375 time for calcul the mask position with numpy : 0.0014226436614990234 nb_pixel_total : 2940 time to create 1 rle with old method : 0.006257057189941406 time for calcul the mask position with numpy : 0.0013082027435302734 nb_pixel_total : 2778 time to create 1 rle with old method : 0.00609278678894043 time for calcul the mask position with numpy : 0.001440286636352539 nb_pixel_total : 1582 time to create 1 rle with old method : 0.003744840621948242 time for calcul the mask position with numpy : 0.001676321029663086 nb_pixel_total : 27557 time to create 1 rle with old method : 0.062465667724609375 time for calcul the mask position with numpy : 0.0015411376953125 nb_pixel_total : 3113 time to create 1 rle with old method : 0.006881237030029297 time for calcul the mask position with numpy : 0.0014736652374267578 nb_pixel_total : 13931 time to create 1 rle with old method : 0.029934406280517578 time for calcul the mask position with numpy : 0.0013022422790527344 nb_pixel_total : 713 time to create 1 rle with old method : 0.00160980224609375 time for calcul the mask position with numpy : 0.0013158321380615234 nb_pixel_total : 4292 time to create 1 rle with old method : 0.00896763801574707 time for calcul the mask position with numpy : 0.001425027847290039 nb_pixel_total : 2371 time to create 1 rle with old method : 0.005066394805908203 time for calcul the mask position with numpy : 0.0013420581817626953 nb_pixel_total : 3952 time to create 1 rle with old method : 0.008423328399658203 time for calcul the mask position with numpy : 0.001409769058227539 nb_pixel_total : 1207 time to create 1 rle with old method : 0.002805471420288086 time for calcul the mask position with numpy : 0.0013408660888671875 nb_pixel_total : 8108 time to create 1 rle with old method : 0.01979231834411621 time for calcul the mask position with numpy : 0.001537322998046875 nb_pixel_total : 13111 time to create 1 rle with old method : 0.03385615348815918 time for calcul the mask position with numpy : 0.001714944839477539 nb_pixel_total : 6632 time to create 1 rle with old method : 0.015842914581298828 time for calcul the mask position with numpy : 0.0018613338470458984 nb_pixel_total : 2747 time to create 1 rle with old method : 0.006923675537109375 time for calcul the mask position with numpy : 0.0016944408416748047 nb_pixel_total : 887 time to create 1 rle with old method : 0.002249479293823242 time for calcul the mask position with numpy : 0.002168893814086914 nb_pixel_total : 38066 time to create 1 rle with old method : 0.09494757652282715 time for calcul the mask position with numpy : 0.00612330436706543 nb_pixel_total : 1666 time to create 1 rle with old method : 0.003943443298339844 time for calcul the mask position with numpy : 0.0015556812286376953 nb_pixel_total : 2084 time to create 1 rle with old method : 0.004859209060668945 time for calcul the mask position with numpy : 0.0015780925750732422 nb_pixel_total : 4281 time to create 1 rle with old method : 0.01031804084777832 time for calcul the mask position with numpy : 0.0016515254974365234 nb_pixel_total : 5520 time to create 1 rle with old method : 0.013492345809936523 time for calcul the mask position with numpy : 0.0020253658294677734 nb_pixel_total : 8643 time to create 1 rle with old method : 0.02209186553955078 time for calcul the mask position with numpy : 0.0028285980224609375 nb_pixel_total : 11922 time to create 1 rle with old method : 0.03122258186340332 time for calcul the mask position with numpy : 0.0015225410461425781 nb_pixel_total : 3532 time to create 1 rle with old method : 0.008193492889404297 time for calcul the mask position with numpy : 0.001531839370727539 nb_pixel_total : 7810 time to create 1 rle with old method : 0.01814579963684082 time for calcul the mask position with numpy : 0.0015730857849121094 nb_pixel_total : 12755 time to create 1 rle with old method : 0.03083062171936035 time for calcul the mask position with numpy : 0.001520395278930664 nb_pixel_total : 2447 time to create 1 rle with old method : 0.005835056304931641 time for calcul the mask position with numpy : 0.0015478134155273438 nb_pixel_total : 9891 time to create 1 rle with old method : 0.024957895278930664 time for calcul the mask position with numpy : 0.0016968250274658203 nb_pixel_total : 2267 time to create 1 rle with old method : 0.007549762725830078 time for calcul the mask position with numpy : 0.001708984375 nb_pixel_total : 14652 time to create 1 rle with old method : 0.04707002639770508 time for calcul the mask position with numpy : 0.0016787052154541016 nb_pixel_total : 1022 time to create 1 rle with old method : 0.0034143924713134766 time for calcul the mask position with numpy : 0.001646280288696289 nb_pixel_total : 4134 time to create 1 rle with old method : 0.010993480682373047 time for calcul the mask position with numpy : 0.0015330314636230469 nb_pixel_total : 10626 time to create 1 rle with old method : 0.023746967315673828 time for calcul the mask position with numpy : 0.0014526844024658203 nb_pixel_total : 344 time to create 1 rle with old method : 0.0007879734039306641 time for calcul the mask position with numpy : 0.0014352798461914062 nb_pixel_total : 1252 time to create 1 rle with old method : 0.0028464794158935547 time for calcul the mask position with numpy : 0.001390218734741211 nb_pixel_total : 419 time to create 1 rle with old method : 0.0010402202606201172 time for calcul the mask position with numpy : 0.0013861656188964844 nb_pixel_total : 3859 time to create 1 rle with old method : 0.008875846862792969 time for calcul the mask position with numpy : 0.0014157295227050781 nb_pixel_total : 2421 time to create 1 rle with old method : 0.005708932876586914 time for calcul the mask position with numpy : 0.0014617443084716797 nb_pixel_total : 4178 time to create 1 rle with old method : 0.009691238403320312 time for calcul the mask position with numpy : 0.0015904903411865234 nb_pixel_total : 881 time to create 1 rle with old method : 0.0022263526916503906 time for calcul the mask position with numpy : 0.0015056133270263672 nb_pixel_total : 2393 time to create 1 rle with old method : 0.0056874752044677734 time for calcul the mask position with numpy : 0.001444101333618164 nb_pixel_total : 860 time to create 1 rle with old method : 0.002208232879638672 time for calcul the mask position with numpy : 0.00139617919921875 nb_pixel_total : 2323 time to create 1 rle with old method : 0.005291461944580078 time for calcul the mask position with numpy : 0.0013875961303710938 nb_pixel_total : 596 time to create 1 rle with old method : 0.0014088153839111328 time for calcul the mask position with numpy : 0.0022280216217041016 nb_pixel_total : 2030 time to create 1 rle with old method : 0.008243322372436523 time for calcul the mask position with numpy : 0.0023958683013916016 nb_pixel_total : 577 time to create 1 rle with old method : 0.0013616085052490234 time for calcul the mask position with numpy : 0.0016109943389892578 nb_pixel_total : 2766 time to create 1 rle with old method : 0.006697416305541992 time for calcul the mask position with numpy : 0.0019276142120361328 nb_pixel_total : 692 time to create 1 rle with old method : 0.002358675003051758 time for calcul the mask position with numpy : 0.0014681816101074219 nb_pixel_total : 338 time to create 1 rle with old method : 0.0008478164672851562 time for calcul the mask position with numpy : 0.0013413429260253906 nb_pixel_total : 1673 time to create 1 rle with old method : 0.003846883773803711 time for calcul the mask position with numpy : 0.0013556480407714844 nb_pixel_total : 1075 time to create 1 rle with old method : 0.002493619918823242 time for calcul the mask position with numpy : 0.0014290809631347656 nb_pixel_total : 1206 time to create 1 rle with old method : 0.0027909278869628906 time for calcul the mask position with numpy : 0.0014374256134033203 nb_pixel_total : 1056 time to create 1 rle with old method : 0.0025665760040283203 time for calcul the mask position with numpy : 0.001383066177368164 nb_pixel_total : 585 time to create 1 rle with old method : 0.0014202594757080078 time for calcul the mask position with numpy : 0.0013723373413085938 nb_pixel_total : 8592 time to create 1 rle with old method : 0.019449234008789062 time for calcul the mask position with numpy : 0.0014996528625488281 nb_pixel_total : 1332 time to create 1 rle with old method : 0.003210783004760742 time for calcul the mask position with numpy : 0.0014181137084960938 nb_pixel_total : 864 time to create 1 rle with old method : 0.001895904541015625 time for calcul the mask position with numpy : 0.0014276504516601562 nb_pixel_total : 6031 time to create 1 rle with old method : 0.013883113861083984 time for calcul the mask position with numpy : 0.0013604164123535156 nb_pixel_total : 1738 time to create 1 rle with old method : 0.0036842823028564453 time for calcul the mask position with numpy : 0.001489877700805664 nb_pixel_total : 16728 time to create 1 rle with old method : 0.03719735145568848 time for calcul the mask position with numpy : 0.0013852119445800781 nb_pixel_total : 9081 time to create 1 rle with old method : 0.020715713500976562 time for calcul the mask position with numpy : 0.0013751983642578125 nb_pixel_total : 1520 time to create 1 rle with old method : 0.00394749641418457 time for calcul the mask position with numpy : 0.0014438629150390625 nb_pixel_total : 3164 time to create 1 rle with old method : 0.00696563720703125 time for calcul the mask position with numpy : 0.0014781951904296875 nb_pixel_total : 231 time to create 1 rle with old method : 0.0005636215209960938 time for calcul the mask position with numpy : 0.0014407634735107422 nb_pixel_total : 267 time to create 1 rle with old method : 0.0006606578826904297 time for calcul the mask position with numpy : 0.001497507095336914 nb_pixel_total : 976 time to create 1 rle with old method : 0.002187967300415039 time for calcul the mask position with numpy : 0.001447916030883789 nb_pixel_total : 9496 time to create 1 rle with old method : 0.021235942840576172 time for calcul the mask position with numpy : 0.0013811588287353516 nb_pixel_total : 616 time to create 1 rle with old method : 0.0014693737030029297 time for calcul the mask position with numpy : 0.0013744831085205078 nb_pixel_total : 990 time to create 1 rle with old method : 0.002250194549560547 time for calcul the mask position with numpy : 0.0014271736145019531 nb_pixel_total : 248 time to create 1 rle with old method : 0.0006613731384277344 time for calcul the mask position with numpy : 0.0013587474822998047 nb_pixel_total : 221 time to create 1 rle with old method : 0.0005998611450195312 time for calcul the mask position with numpy : 0.0015628337860107422 nb_pixel_total : 39130 time to create 1 rle with old method : 0.0946512222290039 time for calcul the mask position with numpy : 0.0017058849334716797 nb_pixel_total : 7509 time to create 1 rle with old method : 0.023855924606323242 time for calcul the mask position with numpy : 0.001760244369506836 nb_pixel_total : 5005 time to create 1 rle with old method : 0.016329526901245117 time for calcul the mask position with numpy : 0.001621246337890625 nb_pixel_total : 917 time to create 1 rle with old method : 0.0033135414123535156 time for calcul the mask position with numpy : 0.0015892982482910156 nb_pixel_total : 735 time to create 1 rle with old method : 0.0026197433471679688 time for calcul the mask position with numpy : 0.00159454345703125 nb_pixel_total : 1484 time to create 1 rle with old method : 0.004884481430053711 time for calcul the mask position with numpy : 0.0015959739685058594 nb_pixel_total : 1633 time to create 1 rle with old method : 0.005370140075683594 time for calcul the mask position with numpy : 0.0014448165893554688 nb_pixel_total : 300 time to create 1 rle with old method : 0.0008084774017333984 time for calcul the mask position with numpy : 0.0014574527740478516 nb_pixel_total : 594 time to create 1 rle with old method : 0.0014815330505371094 time for calcul the mask position with numpy : 0.0014634132385253906 nb_pixel_total : 1445 time to create 1 rle with old method : 0.0035092830657958984 time for calcul the mask position with numpy : 0.0014410018920898438 nb_pixel_total : 1124 time to create 1 rle with old method : 0.0025632381439208984 time for calcul the mask position with numpy : 0.0014367103576660156 nb_pixel_total : 860 time to create 1 rle with old method : 0.0020294189453125 time for calcul the mask position with numpy : 0.001338958740234375 nb_pixel_total : 1614 time to create 1 rle with old method : 0.0037736892700195312 time for calcul the mask position with numpy : 0.0015151500701904297 nb_pixel_total : 2199 time to create 1 rle with old method : 0.005455970764160156 time for calcul the mask position with numpy : 0.0013554096221923828 nb_pixel_total : 1317 time to create 1 rle with old method : 0.003136873245239258 time for calcul the mask position with numpy : 0.001468658447265625 nb_pixel_total : 947 time to create 1 rle with old method : 0.002264738082885742 time for calcul the mask position with numpy : 0.001415252685546875 nb_pixel_total : 885 time to create 1 rle with old method : 0.0021347999572753906 time for calcul the mask position with numpy : 0.0014564990997314453 nb_pixel_total : 481 time to create 1 rle with old method : 0.0012278556823730469 time for calcul the mask position with numpy : 0.0014569759368896484 nb_pixel_total : 1416 time to create 1 rle with old method : 0.0035104751586914062 time for calcul the mask position with numpy : 0.0014944076538085938 nb_pixel_total : 1217 time to create 1 rle with old method : 0.002964019775390625 time for calcul the mask position with numpy : 0.0014841556549072266 nb_pixel_total : 830 time to create 1 rle with old method : 0.0020189285278320312 batch 1 Loaded 103 chid ids of type : 4677 Number RLEs to save : 9218 TO DO : save crop sub photo not yet done ! Inside saveOutput : final : True verbose : False saveOutput not yet implemented for datou_step.type : sam we use saveGeneral [1189321094] Looping around the photos to save general results len do output : 1 /1189321094Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4573', None, None, None, None, None, None, None, None) ('4573', None, '1189321094', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 3 time used for this insertion : 0.03675198554992676 save_final save missing photos in datou_result : time spend for datou_step_exec : 13.9079110622406 time spend to save output : 0.03717207908630371 total time spend for step 1 : 13.945083141326904 caffe_path_current : About to save ! 2 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'1189321094': [[, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ], 'temp/1759634508_2282988_1189321094_9626af7f95d010f2a4fd524688d4ea22_76896585.png']} nb_objects detect : 103 ############################### TEST frcnn ################################ Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : frcnn list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.21622133255004883 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:frcnn Sun Oct 5 05:22:02 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step Faster rcnn ! To loadFromThcl() model_param file didn't exist model_name : detection_plaque_valcor_010622 model_type : caffe_faster_rcnn list file need : ['caffemodel', 'test.prototxt'] file exist in s3 : ['caffemodel', 'test.prototxt'] file manque in s3 : [] local folder : /data/models_weight/detection_plaque_valcor_010622 /data/models_weight/detection_plaque_valcor_010622/caffemodel size_local : 349723073 size in s3 : 349723073 create time local : 2022-07-12 14:12:27 create time in s3 : 2022-06-01 15:05:56 caffemodel already exist and didn't need to update /data/models_weight/detection_plaque_valcor_010622/test.prototxt size_local : 7163 size in s3 : 7163 create time local : 2022-07-12 14:12:27 create time in s3 : 2022-06-01 15:05:55 test.prototxt already exist and didn't need to update prototxt : /data/models_weight/detection_plaque_valcor_010622/test.prototxt caffemodel : /data/models_weight/detection_plaque_valcor_010622/caffemodel Loaded network /data/models_weight/detection_plaque_valcor_010622/caffemodel About to compute detect_faster_rcnn : len(args) : 1 Inside frcnn step exec : nb paths : 1 image_path : temp/1759634522_2282988_917754606_35f3c9ae49686a6be16030c6ec25c9ee.jpg image_size (600, 800, 3) [[[ 4 6 6] [ 5 7 7] [ 6 8 8] ... [207 215 214] [206 214 213] [206 214 213]] [[ 4 6 6] [ 5 7 7] [ 6 8 8] ... [207 215 214] [206 214 213] [206 214 213]] [[ 4 6 6] [ 5 7 7] [ 6 8 8] ... [207 215 214] [206 214 213] [206 214 213]] ... [[ 14 16 16] [ 13 15 15] [ 11 13 13] ... [198 206 205] [198 206 205] [198 206 205]] [[ 16 18 18] [ 14 16 16] [ 11 13 13] ... [206 214 213] [206 214 213] [206 214 213]] [[ 13 15 15] [ 12 14 14] [ 9 11 11] ... [210 218 217] [210 218 217] [210 218 217]]] Detection took 0.076s for 300 object proposals len de result frcnn : 1 time spend for datou_step_exec : 2.7306625843048096 time spend to save output : 0.00012183189392089844 total time spend for step 1 : 2.7307844161987305 caffe_path_current : About to save ! 1 Inside saveOutput : final : True verbose : False Inside saveFrcnn : final : True verbose : False threshold to save the result : 0.1 Warning : no hashtag_ids to insert in the database final : True begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.038170576095581055 [917754606] Looping around the photos to save general results len do output : 1 /0 before output type Managing all output in save final without adding information in the mtr_datou_result ('4184', None, None, None, None, None, None, None, None) ('4184', None, '917754606', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.03566741943359375 save_final save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {0: [[(0, 493029425, 4370, 374, 430, 293, 317, 0.06385646, None), (0, 493029425, 4370, 382, 552, 297, 344, 0.05221017, None), (0, 493029425, 4370, 345, 468, 272, 320, 0.012276229, None)], 'temp/1759634522_2282988_917754606_35f3c9ae49686a6be16030c6ec25c9ee.jpg']} ############################### TEST thcl ################################ TEST THCL Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : step 1 thcl is not linked in the step_by_step architecture ! WARNING : step 2 argmax is not linked in the step_by_step architecture ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! DataTypes for each output/input checked ! List Step Type Loaded in datou : thcl, argmax list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.10446786880493164 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 2 step1:thcl Sun Oct 5 05:22:05 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step Thcl ! we are using the classfication for only one thcl 355 time to import caffe and check if the image exist : 0.012495279312133789 time to convert the images to numpy array : 0.0011458396911621094 total time to convert the images to numpy array : 0.013890981674194336 list photo_ids error: [] list photo_ids correct : [916235064] number of photos to traite : 1 try to delete the photos incorrect in DB tagging for thcl : 355 To do loadFromThcl(), then load ParamDescType : thcl355 thcls : [{'id': 355, 'mtr_user_id': 31, 'name': 'car_360_1027', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'c_elysee_1027_gao__port_506302,mokka_1027_gao__port_506374,captur_1027_gao__port_506399,sorento_1027_gao__port_506192,navara_1027_gao__port_506205,xc90_1027_gao__port_506350,saxo_1027_gao__port_506052,trafic_1027_gao__port_506295,punto_evo_1027_gao__port_506066,5_1027_gao__port_506117,250_1027_gao__port_506065,d_max_1027_gao__port_506125,panamera_1027_gao__port_506387,alhambra_1027_gao__port_506381,x6_1027_gao__port_506349,vitara_1027_gao__port_506328,fiesta_1027_gao__port_506377,qashqai_1027_gao__port_506286,147_1027_gao__port_506124,c5_1027_gao__port_506172,q5_1027_gao__port_506206,giulia_1027_gao__port_506178,karl_1027_gao__port_506371,mehari_1027_gao__port_506076,911_1027_gao__port_506114,508_1027_gao__port_506329,idea_1027_gao__port_506122,megane_1027_gao__port_506220,ghibli_1027_gao__port_506174,touareg_1027_gao__port_506224,i10_1027_gao__port_506232,jumper_1027_gao__port_506234,classe_clk_1027_gao__port_506173,kuga_1027_gao__port_506181,ct_1027_gao__port_506323,leon_1027_gao__port_506326,ds5_1027_gao__port_506376,cordoba_1027_gao__port_506048,classe_cla_1027_gao__port_506400,jumpy_1027_gao__port_506179,avensis_1027_gao__port_506311,juke_1027_gao__port_506325,4008_1027_gao__port_506402,190_series_1027_gao__port_506051,serie_3_1027_gao__port_506294,q7_1027_gao__port_506318,glc_1027_gao__port_506303,grand_vitara_1027_gao__port_506175,s40_1027_gao__port_506099,toledo_1027_gao__port_506061,5008_1027_gao__port_506337,continental_1027_gao__port_506250,coupe_1027_gao__port_506082,iq_1027_gao__port_506166,407_1027_gao__port_506133,touran_1027_gao__port_506308,300c_1027_gao__port_506078,classe_gl_1027_gao__port_506340,vivaro_1027_gao__port_506310,sl_1027_gao__port_506100,elise_1027_gao__port_506121,1007_1027_gao__port_506070,i40_1027_gao__port_506218,bipper_tepee_1027_gao__port_506227,focus_1027_gao__port_506272,primera_1027_gao__port_506147,r4_1027_gao__port_506160,a8_1027_gao__port_506265,boxer_1027_gao__port_506202,s5_1027_gao__port_506222,r21_1027_gao__port_506093,c3_1027_gao__port_506257,santa_fe_1027_gao__port_506208,m4_1027_gao__port_506344,safrane_1027_gao__port_506077,classe_gle_1027_gao__port_506395,0_1027_gao__port_506094,ix35_1027_gao__port_506219,carens_1027_gao__port_506298,classe_a_1027_gao__port_506339,ix20_1027_gao__port_506343,note_1027_gao__port_506365,a5_1027_gao__port_506200,sx4_1027_gao__port_506348,sandero_1027_gao__port_506198,3008_1027_gao__port_506385,q50_1027_gao__port_506239,latitude_1027_gao__port_506236,v40_1027_gao__port_506391,xsara_1027_gao__port_506087,grand_c_max_1027_gao__port_506342,swift_1027_gao__port_506149,serie_1_1027_gao__port_506184,xc70_1027_gao__port_506393,master_1027_gao__port_506203,clio_1027_gao__port_506280,duster_1027_gao__port_506216,traveller_1027_gao__port_506403,tipo_1027_gao__port_506355,rav_4_1027_gao__port_506332,coccinelle_1027_gao__port_506259,spacetourer_1027_gao__port_506401,xe_1027_gao__port_506357,ds3_1027_gao__port_506324,mx_5_1027_gao__port_506098,land_cruiser_1027_gao__port_506315,classe_b_1027_gao__port_506335,806_1027_gao__port_506088,rx_8_1027_gao__port_506046,spark_1027_gao__port_506185,6_1027_gao__port_506171,bravo_1027_gao__port_506080,nx_1027_gao__port_506345,sharan_1027_gao__port_506347,x_type_1027_gao__port_506067,jimny_1027_gao__port_506233,wrangler_1027_gao__port_506225,c_crosser_1027_gao__port_506312,v70_1027_gao__port_506278,classe_e_1027_gao__port_506300,classe_v_1027_gao__port_506258,m3_1027_gao__port_506182,abarth_500_1027_gao__port_506226,serie_6_1027_gao__port_506262,modus_1027_gao__port_506146,3_1027_gao__port_506113,405_1027_gao__port_506108,allroad_1027_gao__port_506297,auris_1027_gao__port_506322,galaxy_1027_gao__port_506143,giulietta_1027_gao__port_506363,106_1027_gao__port_506073,classe_m_1027_gao__port_506154,espace_1027_gao__port_506313,panda_1027_gao__port_506189,rcz_1027_gao__port_506197,4007_1027_gao__port_506162,classe_cl_1027_gao__port_506249,leaf_1027_gao__port_506139,octavia_1027_gao__port_506237,ds4_1027_gao__port_506336,freelander_1027_gao__port_506084,evasion_1027_gao__port_506109,punto_1027_gao__port_506106,2cv_1027_gao__port_506045,x4_1027_gao__port_506392,antara_1027_gao__port_506247,murano_1027_gao__port_506316,alto_1027_gao__port_506201,meriva_1027_gao__port_506353,orlando_1027_gao__port_506305,new_beetle_1027_gao__port_506050,306_1027_gao__port_506145,tiguan_1027_gao__port_506362,s_type_1027_gao__port_506101,c1_1027_gao__port_506128,vectra_1027_gao__port_506044,outlander_1027_gao__port_506317,307_1027_gao__port_506074,a6_s6_1027_gao__port_506134,nemo_combi_1027_gao__port_506196,berlingo_1027_gao__port_506194,partner_1027_gao__port_506285,cayenne_1027_gao__port_506177,quattroporte_1027_gao__port_506240,c_max_1027_gao__port_506282,fabia_1027_gao__port_506396,cx_3_1027_gao__port_506281,x_trail_1027_gao__port_506264,scirocco_1027_gao__port_506276,matiz_1027_gao__port_506144,tigra_1027_gao__port_506069,escort_1027_gao__port_506091,c2_1027_gao__port_506081,mini_1027_gao__port_506168,i30_1027_gao__port_506291,picanto_1027_gao__port_506238,mito_1027_gao__port_506072,impreza_1027_gao__port_506085,kangoo_1027_gao__port_506235,a4_1027_gao__port_506193,cayman_1027_gao__port_506268,sportage_1027_gao__port_506148,up_1027_gao__port_506356,optima_1027_gao__port_506386,defender_1027_gao__port_506229,serie_2_1027_gao__port_506256,edge_1027_gao__port_506187,r19_1027_gao__port_506110,jetta_1027_gao__port_506304,eos_1027_gao__port_506115,accord_1027_gao__port_506214,yaris_1027_gao__port_506334,classe_cls_1027_gao__port_506289,polo_1027_gao__port_506361,serie_4_1027_gao__port_506366,mini_cabriolet_1027_gao__port_506204,prius_1027_gao__port_506190,lodgy_1027_gao__port_506188,serie_7_1027_gao__port_506307,c15_1027_gao__port_506055,kadjar_1027_gao__port_506389,insignia_1027_gao__port_506364,308_1027_gao__port_506279,roomster_1027_gao__port_506241,80_1027_gao__port_506057,309_1027_gao__port_506063,tucson_1027_gao__port_506320,x3_1027_gao__port_506212,xf_1027_gao__port_506263,2008_1027_gao__port_506394,passat_1027_gao__port_506306,compass_1027_gao__port_506260,twingo_1027_gao__port_506309,micra_1027_gao__port_506221,golf_1027_gao__port_506155,soul_1027_gao__port_506176,rapid_1027_gao__port_506398,forester_1027_gao__port_506360,slk_1027_gao__port_506210,forfour_1027_gao__port_506341,serie_5_1027_gao__port_506209,xj_1027_gao__port_506170,pajero_1027_gao__port_506097,agila_1027_gao__port_506119,a6_1027_gao__port_506163,fox_1027_gao__port_506092,boxster_1027_gao__port_506267,altea_1027_gao__port_506246,samurai_1027_gao__port_506047,trax_1027_gao__port_506296,getz_1027_gao__port_506058,cherokee_1027_gao__port_506269,koleos_1027_gao__port_506378,z_series_1027_gao__port_506123,ecosport_1027_gao__port_506271,space_star_1027_gao__port_506277,rs3_sportback_1027_gao__port_506207,civic_1027_gao__port_506141,talisman_1027_gao__port_506390,f_pace_1027_gao__port_506314,classe_c_1027_gao__port_506299,tt_1027_gao__port_506075,pathfinder_1027_gao__port_506183,156_1027_gao__port_506157,cx_5_1027_gao__port_506228,scenic_1027_gao__port_506255,yeti_1027_gao__port_506358,mustang_1027_gao__port_506053,stilo_1027_gao__port_506060,ateca_1027_gao__port_506382,fiorino_1027_gao__port_506217,classe_glk_1027_gao__port_506290,fortwo_1027_gao__port_506230,cruze_1027_gao__port_506186,107_1027_gao__port_506213,aygo_1027_gao__port_506248,rx_1027_gao__port_506354,500_1027_gao__port_506245,bora_1027_gao__port_506104,transit_1027_gao__port_506111,pt_cruiser_1027_gao__port_506054,patrol_1027_gao__port_506068,r8_1027_gao__port_506156,xm_1027_gao__port_506102,s60_1027_gao__port_506191,aveo_1027_gao__port_506158,captiva_1027_gao__port_506159,ax_1027_gao__port_506153,rexton_1027_gao__port_506107,camaro_1027_gao__port_506056,ypsilon_1027_gao__port_506131,delta_1027_gao__port_506165,c4_1027_gao__port_506370,zx_1027_gao__port_506161,verso_1027_gao__port_506242,superb_1027_gao__port_506327,r5_1027_gao__port_506253,caddy_1027_gao__port_506330,x5_1027_gao__port_506243,f_type_1027_gao__port_506231,fusion_1027_gao__port_506096,dokker_1027_gao__port_506331,205_1027_gao__port_506062,macan_1027_gao__port_506195,tourneo_1027_gao__port_506369,108_1027_gao__port_506384,9_3_1027_gao__port_506071,mondeo_1027_gao__port_506116,cr_v_1027_gao__port_506164,c30_1027_gao__port_506090,pulsar_1027_gao__port_506397,ibiza_1027_gao__port_506273,a1_1027_gao__port_506338,matrix_1027_gao__port_506140,carnival_1027_gao__port_506136,xantia_1027_gao__port_506086,terrano_1027_gao__port_506083,q3_1027_gao__port_506275,hr_v_1027_gao__port_506283,expert_1027_gao__port_506142,multivan_1027_gao__port_506383,venga_1027_gao__port_506380,scudo_1027_gao__port_506129,laguna_1027_gao__port_506368,vel_satis_1027_gao__port_506130,b_max_1027_gao__port_506367,ignis_1027_gao__port_506292,159_1027_gao__port_506064,grande_punto_1027_gao__port_506138,logan_1027_gao__port_506167,s_max_1027_gao__port_506223,caravelle_1027_gao__port_506351,adam_1027_gao__port_506079,406_1027_gao__port_506132,q30_1027_gao__port_506293,almera_1027_gao__port_506089,corsa_1027_gao__port_506095,corolla_1027_gao__port_506120,xc60_1027_gao__port_506388,viano_1027_gao__port_506211,pro_cee_d_1027_gao__port_506274,a3_1027_gao__port_506321,v50_1027_gao__port_506150,voyager_1027_gao__port_506169,corvette_1027_gao__port_506049,rio_1027_gao__port_506379,jazz_1027_gao__port_506252,200_1027_gao__port_506112,tts_1027_gao__port_506199,zafira_1027_gao__port_506287,asx_1027_gao__port_506266,607_1027_gao__port_506118,207_1027_gao__port_506103,classe_s_1027_gao__port_506301,c6_1027_gao__port_506105,express_1027_gao__port_506137,classe_gla_1027_gao__port_506352,v60_1027_gao__port_506333,ka_1027_gao__port_506180,range_rover_1027_gao__port_506254,discovery_1027_gao__port_506375,classe_r_1027_gao__port_506270,transporter_1027_gao__port_506319,cee_d_1027_gao__port_506288,zoe_1027_gao__port_506244,i20_1027_gao__port_506284,gtv_1027_gao__port_506059,s4_avant_1027_gao__port_506261,x1_1027_gao__port_506372,autres_1027_gao__port_506127,208_1027_gao__port_506359,c8_1027_gao__port_506135,astra_1027_gao__port_506215,2_1027_gao__port_506151,doblo_1027_gao__port_506251,807_1027_gao__port_506152,206_1027_gao__port_506126,a7_1027_gao__port_506373,renegade_1027_gao__port_506346', 'svm_portfolios_learning': '506302,506374,506399,506192,506205,506350,506052,506295,506066,506117,506065,506125,506387,506381,506349,506328,506377,506286,506124,506172,506206,506178,506371,506076,506114,506329,506122,506220,506174,506224,506232,506234,506173,506181,506323,506326,506376,506048,506400,506179,506311,506325,506402,506051,506294,506318,506303,506175,506099,506061,506337,506250,506082,506166,506133,506308,506078,506340,506310,506100,506121,506070,506218,506227,506272,506147,506160,506265,506202,506222,506093,506257,506208,506344,506077,506395,506094,506219,506298,506339,506343,506365,506200,506348,506198,506385,506239,506236,506391,506087,506342,506149,506184,506393,506203,506280,506216,506403,506355,506332,506259,506401,506357,506324,506098,506315,506335,506088,506046,506185,506171,506080,506345,506347,506067,506233,506225,506312,506278,506300,506258,506182,506226,506262,506146,506113,506108,506297,506322,506143,506363,506073,506154,506313,506189,506197,506162,506249,506139,506237,506336,506084,506109,506106,506045,506392,506247,506316,506201,506353,506305,506050,506145,506362,506101,506128,506044,506317,506074,506134,506196,506194,506285,506177,506240,506282,506396,506281,506264,506276,506144,506069,506091,506081,506168,506291,506238,506072,506085,506235,506193,506268,506148,506356,506386,506229,506256,506187,506110,506304,506115,506214,506334,506289,506361,506366,506204,506190,506188,506307,506055,506389,506364,506279,506241,506057,506063,506320,506212,506263,506394,506306,506260,506309,506221,506155,506176,506398,506360,506210,506341,506209,506170,506097,506119,506163,506092,506267,506246,506047,506296,506058,506269,506378,506123,506271,506277,506207,506141,506390,506314,506299,506075,506183,506157,506228,506255,506358,506053,506060,506382,506217,506290,506230,506186,506213,506248,506354,506245,506104,506111,506054,506068,506156,506102,506191,506158,506159,506153,506107,506056,506131,506165,506370,506161,506242,506327,506253,506330,506243,506231,506096,506331,506062,506195,506369,506384,506071,506116,506164,506090,506397,506273,506338,506140,506136,506086,506083,506275,506283,506142,506383,506380,506129,506368,506130,506367,506292,506064,506138,506167,506223,506351,506079,506132,506293,506089,506095,506120,506388,506211,506274,506321,506150,506169,506049,506379,506252,506112,506199,506287,506266,506118,506103,506301,506105,506137,506352,506333,506180,506254,506375,506270,506319,506288,506244,506284,506059,506261,506372,506127,506359,506135,506215,506151,506251,506152,506126,506373,506346', 'photo_hashtag_type': 332, 'photo_desc_type': 3390, 'type_classification': 'caffe', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}] thcl {'id': 355, 'mtr_user_id': 31, 'name': 'car_360_1027', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'c_elysee_1027_gao__port_506302,mokka_1027_gao__port_506374,captur_1027_gao__port_506399,sorento_1027_gao__port_506192,navara_1027_gao__port_506205,xc90_1027_gao__port_506350,saxo_1027_gao__port_506052,trafic_1027_gao__port_506295,punto_evo_1027_gao__port_506066,5_1027_gao__port_506117,250_1027_gao__port_506065,d_max_1027_gao__port_506125,panamera_1027_gao__port_506387,alhambra_1027_gao__port_506381,x6_1027_gao__port_506349,vitara_1027_gao__port_506328,fiesta_1027_gao__port_506377,qashqai_1027_gao__port_506286,147_1027_gao__port_506124,c5_1027_gao__port_506172,q5_1027_gao__port_506206,giulia_1027_gao__port_506178,karl_1027_gao__port_506371,mehari_1027_gao__port_506076,911_1027_gao__port_506114,508_1027_gao__port_506329,idea_1027_gao__port_506122,megane_1027_gao__port_506220,ghibli_1027_gao__port_506174,touareg_1027_gao__port_506224,i10_1027_gao__port_506232,jumper_1027_gao__port_506234,classe_clk_1027_gao__port_506173,kuga_1027_gao__port_506181,ct_1027_gao__port_506323,leon_1027_gao__port_506326,ds5_1027_gao__port_506376,cordoba_1027_gao__port_506048,classe_cla_1027_gao__port_506400,jumpy_1027_gao__port_506179,avensis_1027_gao__port_506311,juke_1027_gao__port_506325,4008_1027_gao__port_506402,190_series_1027_gao__port_506051,serie_3_1027_gao__port_506294,q7_1027_gao__port_506318,glc_1027_gao__port_506303,grand_vitara_1027_gao__port_506175,s40_1027_gao__port_506099,toledo_1027_gao__port_506061,5008_1027_gao__port_506337,continental_1027_gao__port_506250,coupe_1027_gao__port_506082,iq_1027_gao__port_506166,407_1027_gao__port_506133,touran_1027_gao__port_506308,300c_1027_gao__port_506078,classe_gl_1027_gao__port_506340,vivaro_1027_gao__port_506310,sl_1027_gao__port_506100,elise_1027_gao__port_506121,1007_1027_gao__port_506070,i40_1027_gao__port_506218,bipper_tepee_1027_gao__port_506227,focus_1027_gao__port_506272,primera_1027_gao__port_506147,r4_1027_gao__port_506160,a8_1027_gao__port_506265,boxer_1027_gao__port_506202,s5_1027_gao__port_506222,r21_1027_gao__port_506093,c3_1027_gao__port_506257,santa_fe_1027_gao__port_506208,m4_1027_gao__port_506344,safrane_1027_gao__port_506077,classe_gle_1027_gao__port_506395,0_1027_gao__port_506094,ix35_1027_gao__port_506219,carens_1027_gao__port_506298,classe_a_1027_gao__port_506339,ix20_1027_gao__port_506343,note_1027_gao__port_506365,a5_1027_gao__port_506200,sx4_1027_gao__port_506348,sandero_1027_gao__port_506198,3008_1027_gao__port_506385,q50_1027_gao__port_506239,latitude_1027_gao__port_506236,v40_1027_gao__port_506391,xsara_1027_gao__port_506087,grand_c_max_1027_gao__port_506342,swift_1027_gao__port_506149,serie_1_1027_gao__port_506184,xc70_1027_gao__port_506393,master_1027_gao__port_506203,clio_1027_gao__port_506280,duster_1027_gao__port_506216,traveller_1027_gao__port_506403,tipo_1027_gao__port_506355,rav_4_1027_gao__port_506332,coccinelle_1027_gao__port_506259,spacetourer_1027_gao__port_506401,xe_1027_gao__port_506357,ds3_1027_gao__port_506324,mx_5_1027_gao__port_506098,land_cruiser_1027_gao__port_506315,classe_b_1027_gao__port_506335,806_1027_gao__port_506088,rx_8_1027_gao__port_506046,spark_1027_gao__port_506185,6_1027_gao__port_506171,bravo_1027_gao__port_506080,nx_1027_gao__port_506345,sharan_1027_gao__port_506347,x_type_1027_gao__port_506067,jimny_1027_gao__port_506233,wrangler_1027_gao__port_506225,c_crosser_1027_gao__port_506312,v70_1027_gao__port_506278,classe_e_1027_gao__port_506300,classe_v_1027_gao__port_506258,m3_1027_gao__port_506182,abarth_500_1027_gao__port_506226,serie_6_1027_gao__port_506262,modus_1027_gao__port_506146,3_1027_gao__port_506113,405_1027_gao__port_506108,allroad_1027_gao__port_506297,auris_1027_gao__port_506322,galaxy_1027_gao__port_506143,giulietta_1027_gao__port_506363,106_1027_gao__port_506073,classe_m_1027_gao__port_506154,espace_1027_gao__port_506313,panda_1027_gao__port_506189,rcz_1027_gao__port_506197,4007_1027_gao__port_506162,classe_cl_1027_gao__port_506249,leaf_1027_gao__port_506139,octavia_1027_gao__port_506237,ds4_1027_gao__port_506336,freelander_1027_gao__port_506084,evasion_1027_gao__port_506109,punto_1027_gao__port_506106,2cv_1027_gao__port_506045,x4_1027_gao__port_506392,antara_1027_gao__port_506247,murano_1027_gao__port_506316,alto_1027_gao__port_506201,meriva_1027_gao__port_506353,orlando_1027_gao__port_506305,new_beetle_1027_gao__port_506050,306_1027_gao__port_506145,tiguan_1027_gao__port_506362,s_type_1027_gao__port_506101,c1_1027_gao__port_506128,vectra_1027_gao__port_506044,outlander_1027_gao__port_506317,307_1027_gao__port_506074,a6_s6_1027_gao__port_506134,nemo_combi_1027_gao__port_506196,berlingo_1027_gao__port_506194,partner_1027_gao__port_506285,cayenne_1027_gao__port_506177,quattroporte_1027_gao__port_506240,c_max_1027_gao__port_506282,fabia_1027_gao__port_506396,cx_3_1027_gao__port_506281,x_trail_1027_gao__port_506264,scirocco_1027_gao__port_506276,matiz_1027_gao__port_506144,tigra_1027_gao__port_506069,escort_1027_gao__port_506091,c2_1027_gao__port_506081,mini_1027_gao__port_506168,i30_1027_gao__port_506291,picanto_1027_gao__port_506238,mito_1027_gao__port_506072,impreza_1027_gao__port_506085,kangoo_1027_gao__port_506235,a4_1027_gao__port_506193,cayman_1027_gao__port_506268,sportage_1027_gao__port_506148,up_1027_gao__port_506356,optima_1027_gao__port_506386,defender_1027_gao__port_506229,serie_2_1027_gao__port_506256,edge_1027_gao__port_506187,r19_1027_gao__port_506110,jetta_1027_gao__port_506304,eos_1027_gao__port_506115,accord_1027_gao__port_506214,yaris_1027_gao__port_506334,classe_cls_1027_gao__port_506289,polo_1027_gao__port_506361,serie_4_1027_gao__port_506366,mini_cabriolet_1027_gao__port_506204,prius_1027_gao__port_506190,lodgy_1027_gao__port_506188,serie_7_1027_gao__port_506307,c15_1027_gao__port_506055,kadjar_1027_gao__port_506389,insignia_1027_gao__port_506364,308_1027_gao__port_506279,roomster_1027_gao__port_506241,80_1027_gao__port_506057,309_1027_gao__port_506063,tucson_1027_gao__port_506320,x3_1027_gao__port_506212,xf_1027_gao__port_506263,2008_1027_gao__port_506394,passat_1027_gao__port_506306,compass_1027_gao__port_506260,twingo_1027_gao__port_506309,micra_1027_gao__port_506221,golf_1027_gao__port_506155,soul_1027_gao__port_506176,rapid_1027_gao__port_506398,forester_1027_gao__port_506360,slk_1027_gao__port_506210,forfour_1027_gao__port_506341,serie_5_1027_gao__port_506209,xj_1027_gao__port_506170,pajero_1027_gao__port_506097,agila_1027_gao__port_506119,a6_1027_gao__port_506163,fox_1027_gao__port_506092,boxster_1027_gao__port_506267,altea_1027_gao__port_506246,samurai_1027_gao__port_506047,trax_1027_gao__port_506296,getz_1027_gao__port_506058,cherokee_1027_gao__port_506269,koleos_1027_gao__port_506378,z_series_1027_gao__port_506123,ecosport_1027_gao__port_506271,space_star_1027_gao__port_506277,rs3_sportback_1027_gao__port_506207,civic_1027_gao__port_506141,talisman_1027_gao__port_506390,f_pace_1027_gao__port_506314,classe_c_1027_gao__port_506299,tt_1027_gao__port_506075,pathfinder_1027_gao__port_506183,156_1027_gao__port_506157,cx_5_1027_gao__port_506228,scenic_1027_gao__port_506255,yeti_1027_gao__port_506358,mustang_1027_gao__port_506053,stilo_1027_gao__port_506060,ateca_1027_gao__port_506382,fiorino_1027_gao__port_506217,classe_glk_1027_gao__port_506290,fortwo_1027_gao__port_506230,cruze_1027_gao__port_506186,107_1027_gao__port_506213,aygo_1027_gao__port_506248,rx_1027_gao__port_506354,500_1027_gao__port_506245,bora_1027_gao__port_506104,transit_1027_gao__port_506111,pt_cruiser_1027_gao__port_506054,patrol_1027_gao__port_506068,r8_1027_gao__port_506156,xm_1027_gao__port_506102,s60_1027_gao__port_506191,aveo_1027_gao__port_506158,captiva_1027_gao__port_506159,ax_1027_gao__port_506153,rexton_1027_gao__port_506107,camaro_1027_gao__port_506056,ypsilon_1027_gao__port_506131,delta_1027_gao__port_506165,c4_1027_gao__port_506370,zx_1027_gao__port_506161,verso_1027_gao__port_506242,superb_1027_gao__port_506327,r5_1027_gao__port_506253,caddy_1027_gao__port_506330,x5_1027_gao__port_506243,f_type_1027_gao__port_506231,fusion_1027_gao__port_506096,dokker_1027_gao__port_506331,205_1027_gao__port_506062,macan_1027_gao__port_506195,tourneo_1027_gao__port_506369,108_1027_gao__port_506384,9_3_1027_gao__port_506071,mondeo_1027_gao__port_506116,cr_v_1027_gao__port_506164,c30_1027_gao__port_506090,pulsar_1027_gao__port_506397,ibiza_1027_gao__port_506273,a1_1027_gao__port_506338,matrix_1027_gao__port_506140,carnival_1027_gao__port_506136,xantia_1027_gao__port_506086,terrano_1027_gao__port_506083,q3_1027_gao__port_506275,hr_v_1027_gao__port_506283,expert_1027_gao__port_506142,multivan_1027_gao__port_506383,venga_1027_gao__port_506380,scudo_1027_gao__port_506129,laguna_1027_gao__port_506368,vel_satis_1027_gao__port_506130,b_max_1027_gao__port_506367,ignis_1027_gao__port_506292,159_1027_gao__port_506064,grande_punto_1027_gao__port_506138,logan_1027_gao__port_506167,s_max_1027_gao__port_506223,caravelle_1027_gao__port_506351,adam_1027_gao__port_506079,406_1027_gao__port_506132,q30_1027_gao__port_506293,almera_1027_gao__port_506089,corsa_1027_gao__port_506095,corolla_1027_gao__port_506120,xc60_1027_gao__port_506388,viano_1027_gao__port_506211,pro_cee_d_1027_gao__port_506274,a3_1027_gao__port_506321,v50_1027_gao__port_506150,voyager_1027_gao__port_506169,corvette_1027_gao__port_506049,rio_1027_gao__port_506379,jazz_1027_gao__port_506252,200_1027_gao__port_506112,tts_1027_gao__port_506199,zafira_1027_gao__port_506287,asx_1027_gao__port_506266,607_1027_gao__port_506118,207_1027_gao__port_506103,classe_s_1027_gao__port_506301,c6_1027_gao__port_506105,express_1027_gao__port_506137,classe_gla_1027_gao__port_506352,v60_1027_gao__port_506333,ka_1027_gao__port_506180,range_rover_1027_gao__port_506254,discovery_1027_gao__port_506375,classe_r_1027_gao__port_506270,transporter_1027_gao__port_506319,cee_d_1027_gao__port_506288,zoe_1027_gao__port_506244,i20_1027_gao__port_506284,gtv_1027_gao__port_506059,s4_avant_1027_gao__port_506261,x1_1027_gao__port_506372,autres_1027_gao__port_506127,208_1027_gao__port_506359,c8_1027_gao__port_506135,astra_1027_gao__port_506215,2_1027_gao__port_506151,doblo_1027_gao__port_506251,807_1027_gao__port_506152,206_1027_gao__port_506126,a7_1027_gao__port_506373,renegade_1027_gao__port_506346', 'svm_portfolios_learning': '506302,506374,506399,506192,506205,506350,506052,506295,506066,506117,506065,506125,506387,506381,506349,506328,506377,506286,506124,506172,506206,506178,506371,506076,506114,506329,506122,506220,506174,506224,506232,506234,506173,506181,506323,506326,506376,506048,506400,506179,506311,506325,506402,506051,506294,506318,506303,506175,506099,506061,506337,506250,506082,506166,506133,506308,506078,506340,506310,506100,506121,506070,506218,506227,506272,506147,506160,506265,506202,506222,506093,506257,506208,506344,506077,506395,506094,506219,506298,506339,506343,506365,506200,506348,506198,506385,506239,506236,506391,506087,506342,506149,506184,506393,506203,506280,506216,506403,506355,506332,506259,506401,506357,506324,506098,506315,506335,506088,506046,506185,506171,506080,506345,506347,506067,506233,506225,506312,506278,506300,506258,506182,506226,506262,506146,506113,506108,506297,506322,506143,506363,506073,506154,506313,506189,506197,506162,506249,506139,506237,506336,506084,506109,506106,506045,506392,506247,506316,506201,506353,506305,506050,506145,506362,506101,506128,506044,506317,506074,506134,506196,506194,506285,506177,506240,506282,506396,506281,506264,506276,506144,506069,506091,506081,506168,506291,506238,506072,506085,506235,506193,506268,506148,506356,506386,506229,506256,506187,506110,506304,506115,506214,506334,506289,506361,506366,506204,506190,506188,506307,506055,506389,506364,506279,506241,506057,506063,506320,506212,506263,506394,506306,506260,506309,506221,506155,506176,506398,506360,506210,506341,506209,506170,506097,506119,506163,506092,506267,506246,506047,506296,506058,506269,506378,506123,506271,506277,506207,506141,506390,506314,506299,506075,506183,506157,506228,506255,506358,506053,506060,506382,506217,506290,506230,506186,506213,506248,506354,506245,506104,506111,506054,506068,506156,506102,506191,506158,506159,506153,506107,506056,506131,506165,506370,506161,506242,506327,506253,506330,506243,506231,506096,506331,506062,506195,506369,506384,506071,506116,506164,506090,506397,506273,506338,506140,506136,506086,506083,506275,506283,506142,506383,506380,506129,506368,506130,506367,506292,506064,506138,506167,506223,506351,506079,506132,506293,506089,506095,506120,506388,506211,506274,506321,506150,506169,506049,506379,506252,506112,506199,506287,506266,506118,506103,506301,506105,506137,506352,506333,506180,506254,506375,506270,506319,506288,506244,506284,506059,506261,506372,506127,506359,506135,506215,506151,506251,506152,506126,506373,506346', 'photo_hashtag_type': 332, 'photo_desc_type': 3390, 'type_classification': 'caffe', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 3390 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3390, 'car_360_1027', 16384, 25088, 'car_360_1027', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2017, 10, 28, 12, 29, 27), datetime.datetime(2017, 10, 28, 12, 29, 27)) To loadFromThcl() : net_3390 begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 6478 max_wait_temp : 1 max_wait : 0 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3390, 'car_360_1027', 16384, 25088, 'car_360_1027', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2017, 10, 28, 12, 29, 27), datetime.datetime(2017, 10, 28, 12, 29, 27)) None mean_file_type : mean_file_path : prototxt_file_path : model : car_360_1027 Inside get_net Inside get_net before cache_data_model model_param file didn't exist Inside get_net before CDM.load_model_par_type model_name : car_360_1027 model_type : caffe list file need : ['caffemodel', 'deploy_conv_normal.prototxt', 'deploy_fc.prototxt', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file exist in s3 : ['caffemodel', 'deploy_conv_normal.prototxt', 'deploy_fc.prototxt', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file manque in s3 : [] local folder : /data/models_weight/car_360_1027 /data/models_weight/car_360_1027/caffemodel size_local : 542944640 size in s3 : 542944640 create time local : 2021-08-09 05:28:34 create time in s3 : 2021-08-06 17:57:43 caffemodel already exist and didn't need to update /data/models_weight/car_360_1027/deploy_conv_normal.prototxt size_local : 4626 size in s3 : 4626 create time local : 2021-08-09 05:28:34 create time in s3 : 2021-08-06 17:57:42 deploy_conv_normal.prototxt already exist and didn't need to update /data/models_weight/car_360_1027/deploy_fc.prototxt size_local : 1132 size in s3 : 1132 create time local : 2021-08-09 05:28:34 create time in s3 : 2021-08-06 17:57:43 deploy_fc.prototxt already exist and didn't need to update /data/models_weight/car_360_1027/deploy.prototxt size_local : 5654 size in s3 : 5654 create time local : 2021-08-09 05:28:34 create time in s3 : 2021-08-06 17:57:42 deploy.prototxt already exist and didn't need to update /data/models_weight/car_360_1027/mean.npy size_local : 1572944 size in s3 : 1572944 create time local : 2021-08-09 05:28:34 create time in s3 : 2021-08-06 17:57:55 mean.npy already exist and didn't need to update /data/models_weight/car_360_1027/synset_words.txt size_local : 13687 size in s3 : 13687 create time local : 2021-08-09 05:28:34 create time in s3 : 2021-08-06 17:57:43 synset_words.txt already exist and didn't need to update Inside get_net after CDM.load_model_par_type After if not only_with_local_cache: /home/admin/workarea/install/darknet/:/home/admin/workarea/git/Velours/python:/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python:/home/admin/mtr/.credentials:/home/admin/workarea/install/caffe/python:/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools/:/home/admin/workarea/git/fotonowerpip/:/home/admin/workarea/install/segment-anything:/home/admin//workarea/git/pyfvs/ Here before set mode gpu Doing nothing but we could set mode gpu after set mode gpu prototxt_filename : /data/models_weight/car_360_1027/deploy.prototxt caffemodel_filename : /data/models_weight/car_360_1027/caffemodel now we set caffe to gpu mode before predict begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 6478 max_wait_temp : 1 max_wait : 0 dict_keys(['prob', 'pool5']) time used to do the prepocess of the images : 0.01715373992919922 time used to do the prediction : 0.08450031280517578 save descriptor for thcl : 355 time to traite the descriptors : 0.06435251235961914 storage_type for insertDescriptorsMulti : 1 To insert : 916235064 Catched exception ! Connect or reconnect ! time to insert the descriptors : 1.9275903701782227 Inside saveOutput : final : False verbose : False time used to find the portfolios of the photos SAVE THCL : begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 0 time used for this insertion : 1.0013580322265625e-05 save missing photos in datou_result : time spend for datou_step_exec : 7.4894022941589355 time spend to save output : 6.12553334236145 total time spend for step 1 : 13.614935636520386 step2:argmax Sun Oct 5 05:22:19 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step Argmax ! calculate argmax for thcl : 355 Inside saveOutput : final : True verbose : False photo_id : 916235064 output[photo_id] : [('916235064', 'c15_1027_gao__port_506055', 0.01771304, 332, '355'), 'temp/1759634525_2282988_916235064_6293d1bb790dc6902450e7c572b7d10b.jpg'] begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 1 time used for this insertion : 0.03622841835021973 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 1 time used for this insertion : 0.040044307708740234 len list_finale : 1, len picture : 1 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.05557441711425781 saving photo_ids in datou_result photo id not in port begin to insert list_values into mtr_datou_result : length of list_values in save_final : 0 time used for this insertion : 5.9604644775390625e-06 save missing photos in datou_result : time spend for datou_step_exec : 0.0002994537353515625 time spend to save output : 0.13227033615112305 total time spend for step 2 : 0.1325697898864746 caffe_path_current : About to save ! 2 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 2 output : {'916235064': [('916235064', 'c15_1027_gao__port_506055', 0.01771304, 332, '355'), 'temp/1759634525_2282988_916235064_6293d1bb790dc6902450e7c572b7d10b.jpg']} ############################### TEST tfhub2 ################################ TEST TFHUB2 ######################## test with use_multi_inputs=0 ######################## Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : step 12835 tfhub_classification2 is not linked in the step_by_step architecture ! WARNING : step 12836 argmax is not linked in the step_by_step architecture ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! DataTypes for each output/input checked ! List Step Type Loaded in datou : tfhub_classification2, argmax list_input_json : [] origin BBBFFFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 3 ; length of list_pids : 3 ; length of list_args : 3 time to download the photos : 0.30860304832458496 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 2 step1:tfhub_classification2 Sun Oct 5 05:22:19 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step TFHub with tf2 ! we are using the classfication for only one thcl 3609 begin to check gpu status inside check gpu memory 2025-10-05 05:22:23.117371: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-10-05 05:22:23.118063: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:22:23.118137: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:22:23.118181: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:22:23.120730: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:22:23.120803: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:22:23.123767: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:22:23.125233: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:22:23.132688: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:22:23.134141: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:22:23.134968: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-10-05 05:22:23.168600: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz 2025-10-05 05:22:23.170558: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f7214000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-10-05 05:22:23.170591: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-10-05 05:22:23.174308: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x190bd480 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-10-05 05:22:23.174342: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-10-05 05:22:23.175400: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-05 05:22:23.175518: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:22:23.175551: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-05 05:22:23.175641: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-05 05:22:23.175681: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-05 05:22:23.175729: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-05 05:22:23.175781: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-05 05:22:23.175834: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-05 05:22:23.177507: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-05 05:22:23.177588: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-05 05:22:23.177660: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-10-05 05:22:23.177677: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-10-05 05:22:23.177690: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-10-05 05:22:23.179388: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 3096 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) l 3637 free memory gpu now : 6478 max_wait_temp : 1 max_wait : 5 1 Physical GPUs, 1 Logical GPUs tagging for thcl : 3609 To do loadFromThcl(), then load ParamDescType : thcl3609 thcls : [{'id': 3609, 'mtr_user_id': 31, 'name': 'tfhub_19_06_2023', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'jrm,pcm,pcnc,pehd,tapis_vide', 'svm_portfolios_learning': '9336903,9336904,9336905,9336906,9336909', 'photo_hashtag_type': 4674, 'photo_desc_type': 5832, 'type_classification': 'tf_classification2', 'hashtag_id_list': '495916461,560181804,1284539308,628944319,2107748999'}] thcl {'id': 3609, 'mtr_user_id': 31, 'name': 'tfhub_19_06_2023', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'jrm,pcm,pcnc,pehd,tapis_vide', 'svm_portfolios_learning': '9336903,9336904,9336905,9336906,9336909', 'photo_hashtag_type': 4674, 'photo_desc_type': 5832, 'type_classification': 'tf_classification2', 'hashtag_id_list': '495916461,560181804,1284539308,628944319,2107748999'} Update svm_hashtag_type_desc : 5832 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5832, 'tfhub_19_06_2023', 1280, 1280, 'tfhub_19_06_2023', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 3, datetime.datetime(2023, 6, 19, 12, 55, 22), datetime.datetime(2023, 6, 19, 12, 55, 22)) model_name : tfhub_19_06_2023 model_param file didn't exist model_name : tfhub_19_06_2023 model_type : tf_classification2 list file need : ['Confusion_Matrix.png', 'Precision_Recall_jrm.jpg', 'Precision_Recall_pcm.jpg', 'Precision_Recall_pcnc.jpg', 'Precision_Recall_pehd.jpg', 'Precision_Recall_tapis_vide.jpg', 'Result_Summary.txt', 'checkpoint', 'model_checkpoint.ckpt.data-00000-of-00002', 'model_checkpoint.ckpt.data-00001-of-00002', 'model_checkpoint.ckpt.index', 'model_weights.h5'] file exist in s3 : ['Confusion_Matrix.png', 'Precision_Recall_jrm.jpg', 'Precision_Recall_pcm.jpg', 'Precision_Recall_pcnc.jpg', 'Precision_Recall_pehd.jpg', 'Precision_Recall_tapis_vide.jpg', 'Result_Summary.txt', 'checkpoint', 'model_checkpoint.ckpt.data-00000-of-00002', 'model_checkpoint.ckpt.data-00001-of-00002', 'model_checkpoint.ckpt.index', 'model_weights.h5'] file manque in s3 : [] /home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python/../../tools/../lib/rpn/proposal_layer.py:28: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details. layer_params = yaml.load(self.param_str_) local folder : /data/models_weight/tfhub_19_06_2023 /data/models_weight/tfhub_19_06_2023/Confusion_Matrix.png size_local : 57753 size in s3 : 57753 create time local : 2023-06-22 17:09:38 create time in s3 : 2023-06-19 10:55:15 Confusion_Matrix.png already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/Precision_Recall_jrm.jpg size_local : 79724 size in s3 : 79724 create time local : 2023-06-22 17:09:38 create time in s3 : 2023-06-19 10:55:20 Precision_Recall_jrm.jpg already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/Precision_Recall_pcm.jpg size_local : 83556 size in s3 : 83556 create time local : 2023-06-22 17:09:38 create time in s3 : 2023-06-19 10:55:15 Precision_Recall_pcm.jpg already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/Precision_Recall_pcnc.jpg size_local : 74107 size in s3 : 74107 create time local : 2023-06-22 17:09:38 create time in s3 : 2023-06-19 10:55:20 Precision_Recall_pcnc.jpg already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/Precision_Recall_pehd.jpg size_local : 72705 size in s3 : 72705 create time local : 2023-06-22 17:09:39 create time in s3 : 2023-06-19 10:55:20 Precision_Recall_pehd.jpg already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/Precision_Recall_tapis_vide.jpg size_local : 70874 size in s3 : 70874 create time local : 2023-06-22 17:09:39 create time in s3 : 2023-06-19 10:55:15 Precision_Recall_tapis_vide.jpg already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/Result_Summary.txt size_local : 642 size in s3 : 642 create time local : 2023-06-22 17:09:39 create time in s3 : 2023-06-19 10:55:22 Result_Summary.txt already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/checkpoint size_local : 99 size in s3 : 99 create time local : 2023-06-22 17:09:39 create time in s3 : 2023-06-19 10:55:22 checkpoint already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/model_checkpoint.ckpt.data-00000-of-00002 size_local : 216488 size in s3 : 216488 create time local : 2023-06-22 17:09:39 create time in s3 : 2023-06-19 10:55:22 model_checkpoint.ckpt.data-00000-of-00002 already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/model_checkpoint.ckpt.data-00001-of-00002 size_local : 32279708 size in s3 : 32279708 create time local : 2023-06-22 17:09:40 create time in s3 : 2023-06-19 10:55:21 model_checkpoint.ckpt.data-00001-of-00002 already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/model_checkpoint.ckpt.index size_local : 43546 size in s3 : 43546 create time local : 2023-06-22 17:09:40 create time in s3 : 2023-06-19 10:55:22 model_checkpoint.ckpt.index already exist and didn't need to update /data/models_weight/tfhub_19_06_2023/model_weights.h5 size_local : 16499144 size in s3 : 16499144 create time local : 2023-06-22 17:09:40 create time in s3 : 2023-06-19 10:55:15 model_weights.h5 already exist and didn't need to update desc size : 1280 Model: "sequential" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= module (KerasLayer) (None, 1280) 4049564 _________________________________________________________________ tfhub_19_06_2023dense (Dense (None, 5) 6405 ================================================================= Total params: 4,055,969 Trainable params: 6,405 Non-trainable params: 4,049,564 _________________________________________________________________ Loading Weights... time used to create the model : 11.597323656082153 time used to load_weights : 0.1580183506011963 0it [00:00, ?it/s] 3it [00:00, 1142.24it/s]2025-10-05 05:22:37.583228: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 temp/1759634539_2282988_1171252764_29d5179a892cc50aadc9d67245534b59.jpg temp/1759634539_2282988_1171252784_5a3c5d3bb155a7a116f67ded51bffb59.jpg temp/1759634539_2282988_1171252487_5ebdd6b0a6bb39942a3808ed114806de.jpg Found 3 images belonging to 1 classes. begin to do the prediction : time used to do the prediction : 2.9125659465789795 (3,) (3, 5) (3, 1280) shape of features : (3, 1280) shape of new features : (1, 3, 1280) save descriptor for thcl : 3609 time to traite the descriptors : 0.027649402618408203 storage_type for insertDescriptorsMulti : 3 To insert : 1171252764 To insert : 1171252784 To insert : 1171252487 time to insert the descriptors : 1.5878779888153076 Inside saveOutput : final : False verbose : False saveOutput not yet implemented for datou_step.type : tfhub_classification2 we use saveGeneral [1171252764, 1171252784, 1171252487] Looping around the photos to save general results len do output : 3 /1171252764Didn't retrieve data . /1171252784Didn't retrieve data . /1171252487Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4567', None, None, None, None, None, None, None, None) ('4567', None, '1171252764', None, None, None, None, None, None) ('4567', None, None, None, None, None, None, None, None) ('4567', None, '1171252784', None, None, None, None, None, None) ('4567', None, None, None, None, None, None, None, None) ('4567', None, '1171252487', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 6 time used for this insertion : 0.036669015884399414 save_final save missing photos in datou_result : time spend for datou_step_exec : 22.0626540184021 time spend to save output : 0.03700375556945801 total time spend for step 1 : 22.099657773971558 step2:argmax Sun Oct 5 05:22:41 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step Argmax ! calculate argmax for thcl : 3609 Inside saveOutput : final : True verbose : False photo_id : 1171252764 output[photo_id] : [(1171252764, 'jrm', 0.9853587, 4674, '3609'), 'temp/1759634539_2282988_1171252764_29d5179a892cc50aadc9d67245534b59.jpg'] photo_id : 1171252784 output[photo_id] : [(1171252784, 'jrm', 0.9677384, 4674, '3609'), 'temp/1759634539_2282988_1171252784_5a3c5d3bb155a7a116f67ded51bffb59.jpg'] photo_id : 1171252487 output[photo_id] : [(1171252487, 'jrm', 0.9263028, 4674, '3609'), 'temp/1759634539_2282988_1171252487_5ebdd6b0a6bb39942a3808ed114806de.jpg'] begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 3 time used for this insertion : 0.2891805171966553 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 3 time used for this insertion : 0.040025949478149414 len list_finale : 3, len picture : 3 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 3 time used for this insertion : 0.035830020904541016 saving photo_ids in datou_result photo id not in port photo id not in port photo id not in port begin to insert list_values into mtr_datou_result : length of list_values in save_final : 0 time used for this insertion : 4.291534423828125e-06 save missing photos in datou_result : time spend for datou_step_exec : 0.0001747608184814453 time spend to save output : 0.3831644058227539 total time spend for step 2 : 0.38333916664123535 caffe_path_current : About to save ! 2 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 2 output : {'1171252764': [(1171252764, 'jrm', 0.9853587, 4674, '3609'), 'temp/1759634539_2282988_1171252764_29d5179a892cc50aadc9d67245534b59.jpg'], '1171252784': [(1171252784, 'jrm', 0.9677384, 4674, '3609'), 'temp/1759634539_2282988_1171252784_5a3c5d3bb155a7a116f67ded51bffb59.jpg'], '1171252487': [(1171252487, 'jrm', 0.9263028, 4674, '3609'), 'temp/1759634539_2282988_1171252487_5ebdd6b0a6bb39942a3808ed114806de.jpg']} --------------------- test with use_multi_inputs=0 is succeded ------------------- ######################## test with use_multi_inputs=1 ######################## Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : step 12927 tfhub_classification2 is not linked in the step_by_step architecture ! WARNING : step 12928 argmax is not linked in the step_by_step architecture ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! DataTypes for each output/input checked ! List Step Type Loaded in datou : tfhub_classification2, argmax list_input_json : [] origin BBBFFFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 3 ; length of list_pids : 3 ; length of list_args : 3 time to download the photos : 0.3687095642089844 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 2 step1:tfhub_classification2 Sun Oct 5 05:22:42 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step TFHub with tf2 ! we are using the classfication for only one thcl 3655 begin to check gpu status inside check gpu memory inside check gpu memory inside check gpu memory inside check gpu memory inside check gpu memory inside check gpu memory l 3637 free memory gpu now : 2926 max_wait_temp : 6 max_wait : 5 1 Physical GPUs, 1 Logical GPUs tagging for thcl : 3655 To do loadFromThcl(), then load ParamDescType : thcl3655 thcls : [{'id': 3655, 'mtr_user_id': 31, 'name': 'tfhub_18_7_2023', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'pcm,pcnc,jrm,pehd,tapis_vide', 'svm_portfolios_learning': '9336904,9336905,9336903,9336906,9336909', 'photo_hashtag_type': 4723, 'photo_desc_type': 5862, 'type_classification': 'tf_classification2', 'hashtag_id_list': '560181804,1284539308,495916461,628944319,2107748999'}] thcl {'id': 3655, 'mtr_user_id': 31, 'name': 'tfhub_18_7_2023', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'pcm,pcnc,jrm,pehd,tapis_vide', 'svm_portfolios_learning': '9336904,9336905,9336903,9336906,9336909', 'photo_hashtag_type': 4723, 'photo_desc_type': 5862, 'type_classification': 'tf_classification2', 'hashtag_id_list': '560181804,1284539308,495916461,628944319,2107748999'} Update svm_hashtag_type_desc : 5862 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5862, 'tfhub_18_7_2023', 1280, 1280, 'tfhub_18_7_2023', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 3, datetime.datetime(2023, 7, 18, 22, 46, 29), datetime.datetime(2023, 7, 18, 22, 46, 29)) model_name : tfhub_18_7_2023 model_param file didn't exist model_name : tfhub_18_7_2023 model_type : tf_classification2 list file need : ['Confusion_Matrix.png', 'Precision_Recall_jrm.jpg', 'Precision_Recall_pcm.jpg', 'Precision_Recall_pcnc.jpg', 'Precision_Recall_pehd.jpg', 'Precision_Recall_tapis_vide.jpg', 'Result_Summary.txt', 'checkpoint', 'model_checkpoint.ckpt.data-00000-of-00002', 'model_checkpoint.ckpt.data-00001-of-00002', 'model_checkpoint.ckpt.index', 'model_weights.h5'] file exist in s3 : ['Confusion_Matrix.png', 'Precision_Recall_jrm.jpg', 'Precision_Recall_pcm.jpg', 'Precision_Recall_pcnc.jpg', 'Precision_Recall_pehd.jpg', 'Precision_Recall_tapis_vide.jpg', 'Result_Summary.txt', 'checkpoint', 'model_checkpoint.ckpt.data-00000-of-00002', 'model_checkpoint.ckpt.data-00001-of-00002', 'model_checkpoint.ckpt.index', 'model_weights.h5'] file manque in s3 : [] local folder : /data/models_weight/tfhub_18_7_2023 /data/models_weight/tfhub_18_7_2023/Confusion_Matrix.png size_local : 54360 size in s3 : 54360 create time local : 2023-08-11 11:22:56 create time in s3 : 2023-07-18 20:46:28 Confusion_Matrix.png already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/Precision_Recall_jrm.jpg size_local : 72583 size in s3 : 72583 create time local : 2023-08-11 11:22:56 create time in s3 : 2023-07-18 20:46:23 Precision_Recall_jrm.jpg already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/Precision_Recall_pcm.jpg size_local : 81681 size in s3 : 81681 create time local : 2023-08-11 11:22:56 create time in s3 : 2023-07-18 20:46:17 Precision_Recall_pcm.jpg already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/Precision_Recall_pcnc.jpg size_local : 79510 size in s3 : 79510 create time local : 2023-08-11 11:22:56 create time in s3 : 2023-07-18 20:46:23 Precision_Recall_pcnc.jpg already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/Precision_Recall_pehd.jpg size_local : 59936 size in s3 : 59936 create time local : 2023-08-11 11:22:57 create time in s3 : 2023-07-18 20:46:23 Precision_Recall_pehd.jpg already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/Precision_Recall_tapis_vide.jpg size_local : 78974 size in s3 : 78974 create time local : 2023-08-11 11:22:57 create time in s3 : 2023-07-18 20:46:17 Precision_Recall_tapis_vide.jpg already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/Result_Summary.txt size_local : 642 size in s3 : 642 create time local : 2023-08-11 11:22:57 create time in s3 : 2023-07-18 20:46:23 Result_Summary.txt already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/checkpoint size_local : 99 size in s3 : 99 create time local : 2023-08-11 11:22:57 create time in s3 : 2023-07-18 20:46:23 checkpoint already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/model_checkpoint.ckpt.data-00000-of-00002 size_local : 216529 size in s3 : 216529 create time local : 2023-08-11 11:22:57 create time in s3 : 2023-07-18 20:46:17 model_checkpoint.ckpt.data-00000-of-00002 already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/model_checkpoint.ckpt.data-00001-of-00002 size_local : 32279748 size in s3 : 32279748 create time local : 2023-08-11 11:22:58 create time in s3 : 2023-07-18 20:46:19 model_checkpoint.ckpt.data-00001-of-00002 already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/model_checkpoint.ckpt.index size_local : 43546 size in s3 : 43546 create time local : 2023-08-11 11:22:58 create time in s3 : 2023-07-18 20:46:19 model_checkpoint.ckpt.index already exist and didn't need to update /data/models_weight/tfhub_18_7_2023/model_weights.h5 size_local : 16500868 size in s3 : 16500868 create time local : 2023-08-11 11:22:58 create time in s3 : 2023-07-18 20:46:18 model_weights.h5 already exist and didn't need to update desc size : 1280 Model: "model" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 224, 224, 3) 0 __________________________________________________________________________________________________ input_2 (InputLayer) [(None, 1)] 0 __________________________________________________________________________________________________ module (KerasLayer) (None, 1280) 4049564 input_1[0][0] __________________________________________________________________________________________________ concatenate (Concatenate) (None, 1281) 0 input_2[0][0] module[0][0] __________________________________________________________________________________________________ tfhub_18_7_2023dense (Dense) (None, 5) 6410 concatenate[0][0] ================================================================================================== Total params: 4,055,974 Trainable params: 0 Non-trainable params: 4,055,974 __________________________________________________________________________________________________ Loading Weights... time used to create the model : 9.014185428619385 time used to load_weights : 0.3751182556152344 found 3 data found 0 labels begin to do the prediction : time used to do the prediction : 1.0235555171966553 (3,) (3, 5) (3, 1280) shape of features : (3, 1280) shape of new features : (1, 3, 1280) save descriptor for thcl : 3655 time to traite the descriptors : 0.03921937942504883 storage_type for insertDescriptorsMulti : 3 To insert : 1171275314 To insert : 1171291875 To insert : 1171275372 time to insert the descriptors : 1.503662109375 Inside saveOutput : final : False verbose : False saveOutput not yet implemented for datou_step.type : tfhub_classification2 we use saveGeneral [1171275314, 1171291875, 1171275372] Looping around the photos to save general results len do output : 3 /1171275314Didn't retrieve data . /1171291875Didn't retrieve data . /1171275372Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4621', None, None, None, None, None, None, None, None) ('4621', None, '1171275314', None, None, None, None, None, None) ('4621', None, None, None, None, None, None, None, None) ('4621', None, '1171291875', None, None, None, None, None, None) ('4621', None, None, None, None, None, None, None, None) ('4621', None, '1171275372', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 6 time used for this insertion : 0.03753352165222168 save_final save missing photos in datou_result : time spend for datou_step_exec : 21.27155113220215 time spend to save output : 0.037822723388671875 total time spend for step 1 : 21.30937385559082 step2:argmax Sun Oct 5 05:23:04 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step Argmax ! calculate argmax for thcl : 3655 Inside saveOutput : final : True verbose : False photo_id : 1171275314 output[photo_id] : [(1171275314, 'tapis_vide', 0.96519727, 4723, '3655'), 'temp/1759634562_2282988_1171275314_6e0a72c8fa00d5e4b018bd689b547133.jpg'] photo_id : 1171291875 output[photo_id] : [(1171291875, 'tapis_vide', 0.9706545, 4723, '3655'), 'temp/1759634562_2282988_1171291875_b62cd9e0d976b143f86fe82d072798c0.jpg'] photo_id : 1171275372 output[photo_id] : [(1171275372, 'tapis_vide', 0.96743417, 4723, '3655'), 'temp/1759634562_2282988_1171275372_76d81364ff7df843bff095f45c07ba35.jpg'] begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 3 time used for this insertion : 0.03558015823364258 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 3 time used for this insertion : 0.03646206855773926 len list_finale : 3, len picture : 3 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 3 time used for this insertion : 0.03665614128112793 saving photo_ids in datou_result photo id not in port photo id not in port photo id not in port begin to insert list_values into mtr_datou_result : length of list_values in save_final : 0 time used for this insertion : 4.76837158203125e-06 save missing photos in datou_result : time spend for datou_step_exec : 0.00017547607421875 time spend to save output : 0.1261913776397705 total time spend for step 2 : 0.12636685371398926 caffe_path_current : About to save ! 2 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 2 output : {'1171275314': [(1171275314, 'tapis_vide', 0.96519727, 4723, '3655'), 'temp/1759634562_2282988_1171275314_6e0a72c8fa00d5e4b018bd689b547133.jpg'], '1171291875': [(1171291875, 'tapis_vide', 0.9706545, 4723, '3655'), 'temp/1759634562_2282988_1171291875_b62cd9e0d976b143f86fe82d072798c0.jpg'], '1171275372': [(1171275372, 'tapis_vide', 0.96743417, 4723, '3655'), 'temp/1759634562_2282988_1171275372_76d81364ff7df843bff095f45c07ba35.jpg']} --------------------- test with use_multi_inputs=1 is succeded ------------------- ############################### TEST ordonner ################################ To do loadFromThcl(), then load ParamDescType : thcl358 thcls : [{'id': 358, 'mtr_user_id': 31, 'name': 'car_orientation_0111', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'FirstUploadExperveo_vignette__port_505674,CAR_EXTERIEUR_Roue__port_503398,FirstUploadExperveo_carrosseriegrosplan_VIndanslamoquette__port_506486,FirstUploadExperveo_carrosseriegrosplan_siegegrosplan__port_506485,CAR_EXTERIEUR_Cote_droit_axe_avant__port_504465,CAR_EXTERIEUR_Cote_gauche_axe_arriere__port_504198,CAR_EXTERIEUR_Face_avant_axe_droit__port_504451,CAR_EXTERIEUR_angle_avant_gauche_axe_avant__port_504235,FirstUploadExperveo_vin__port_505675,CAR_EXTERIEUR_cote_droite__port_504108,CAR_INTERIEUR_avant_volant_class_6_levierdevitesse__port_506565,FirstUploadExperveo_carrosseriegrosplan_carrosserie__port_506483,CAR_EXTERIEUR_Angle_arriere_gauche_axe_arriere__port_504201,cartegrise_orientation__port_505064,CAR_EXTERIEUR_Angle_arriere_droit_axe_arriere__port_504217,CAR_INTERIEUR_avant_vue-arriere_class_1__port_506531,CAR_EXTERIEUR_Face_arriere_axe_droit__port_504218,CAR_EXTERIEUR_Cote_droit_axe_arriere__port_504214,CAR_EXTERIEUR_Angle_avant_droit__port_504087,FirstUploadExperveo_carrosseriegrosplan_morceauderoue__port_506484,CAR_INTERIEUR_avant_volant_class_6_class_2__port_506563,CAR_EXTERIEUR_Angle_arriere_droit__port_504160,CAR_EXTERIEUR_arriere__port_504184,CAR_INTERIEUR_avant_volant_class_6_boutonrond__port_506562,INTERIEUR_Compteur_kilometrique__port_503644,CAR_INTERIEUR_avant_vue_gauche_habitacle_class_1__port_506494,CAR_EXTERIEUR_Angle_arriere_gauche__port_504170,CAR_EXTERIEUR_Angle_avant_droit_axe_arriere__port_504226,CAR_EXTERIEUR_Face_arriere_axe_gauche__port_504202,CAR_EXTERIEUR_moteur__port_503704,FirstUploadExperveo_carrosseriegrosplan_class_6__port_506487,CAR_INTERIEUR_siege_arriere_class_1__port_506551,CAR_EXTERIEUR_avant__port_504146,CAR_EXTERIEUR_Angle_arriere_droit_axe_droit__port_504215,CAR_EXTERIEUR_Angle_avant_droit_axe_droit__port_504225,CAR_INTERIEUR_avant_volant_class_6_ecrangrosplan__port_506564,FirstUploadExperveo_carrosseriegrosplan_moteurgrosplanetdegat__port_506482,CAR_INTERIEUR_coffre__port_503412,FirstUploadExperveo_rouetranche__port_505677,UploadPhotoImmatBest_class_1__port_505051,CAR_INTERIEUR_avant_vue-arriere_class_2__port_506532,CAR_EXTERIEUR_angle_avant_gauche__port_504098,CAR_EXTERIEUR_face_avant_axe_gauche__port_504236,CAR_INTERIEUR_avant_vue_droite_habitacle_class_1__port_506540,CAR_EXTERIEUR_cote_gauche_axe_avant__port_504233,CAR_EXTERIEUR_roue_de_secour__port_503763,CAR_EXTERIEUR_Angle_arriere_gauche_axe_gauche__port_504199,CAR_EXTERIEUR_cote_gauche__port_504017,CAR_INTERIEUR_avant_volant_class_1__port_506503,CAR_INTERIEUR_avant_volant_class_2__port_506504,CAR_EXTERIEUR_angle_avant_gauche_axe_gauche__port_504234', 'svm_portfolios_learning': '505674,503398,506486,506485,504465,504198,504451,504235,505675,504108,506565,506483,504201,505064,504217,506531,504218,504214,504087,506484,506563,504160,504184,506562,503644,506494,504170,504226,504202,503704,506487,506551,504146,504215,504225,506564,506482,503412,505677,505051,506532,504098,504236,506540,504233,503763,504199,504017,506503,506504,504234', 'photo_hashtag_type': 337, 'photo_desc_type': 3392, 'type_classification': 'caffe', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}] thcl {'id': 358, 'mtr_user_id': 31, 'name': 'car_orientation_0111', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'FirstUploadExperveo_vignette__port_505674,CAR_EXTERIEUR_Roue__port_503398,FirstUploadExperveo_carrosseriegrosplan_VIndanslamoquette__port_506486,FirstUploadExperveo_carrosseriegrosplan_siegegrosplan__port_506485,CAR_EXTERIEUR_Cote_droit_axe_avant__port_504465,CAR_EXTERIEUR_Cote_gauche_axe_arriere__port_504198,CAR_EXTERIEUR_Face_avant_axe_droit__port_504451,CAR_EXTERIEUR_angle_avant_gauche_axe_avant__port_504235,FirstUploadExperveo_vin__port_505675,CAR_EXTERIEUR_cote_droite__port_504108,CAR_INTERIEUR_avant_volant_class_6_levierdevitesse__port_506565,FirstUploadExperveo_carrosseriegrosplan_carrosserie__port_506483,CAR_EXTERIEUR_Angle_arriere_gauche_axe_arriere__port_504201,cartegrise_orientation__port_505064,CAR_EXTERIEUR_Angle_arriere_droit_axe_arriere__port_504217,CAR_INTERIEUR_avant_vue-arriere_class_1__port_506531,CAR_EXTERIEUR_Face_arriere_axe_droit__port_504218,CAR_EXTERIEUR_Cote_droit_axe_arriere__port_504214,CAR_EXTERIEUR_Angle_avant_droit__port_504087,FirstUploadExperveo_carrosseriegrosplan_morceauderoue__port_506484,CAR_INTERIEUR_avant_volant_class_6_class_2__port_506563,CAR_EXTERIEUR_Angle_arriere_droit__port_504160,CAR_EXTERIEUR_arriere__port_504184,CAR_INTERIEUR_avant_volant_class_6_boutonrond__port_506562,INTERIEUR_Compteur_kilometrique__port_503644,CAR_INTERIEUR_avant_vue_gauche_habitacle_class_1__port_506494,CAR_EXTERIEUR_Angle_arriere_gauche__port_504170,CAR_EXTERIEUR_Angle_avant_droit_axe_arriere__port_504226,CAR_EXTERIEUR_Face_arriere_axe_gauche__port_504202,CAR_EXTERIEUR_moteur__port_503704,FirstUploadExperveo_carrosseriegrosplan_class_6__port_506487,CAR_INTERIEUR_siege_arriere_class_1__port_506551,CAR_EXTERIEUR_avant__port_504146,CAR_EXTERIEUR_Angle_arriere_droit_axe_droit__port_504215,CAR_EXTERIEUR_Angle_avant_droit_axe_droit__port_504225,CAR_INTERIEUR_avant_volant_class_6_ecrangrosplan__port_506564,FirstUploadExperveo_carrosseriegrosplan_moteurgrosplanetdegat__port_506482,CAR_INTERIEUR_coffre__port_503412,FirstUploadExperveo_rouetranche__port_505677,UploadPhotoImmatBest_class_1__port_505051,CAR_INTERIEUR_avant_vue-arriere_class_2__port_506532,CAR_EXTERIEUR_angle_avant_gauche__port_504098,CAR_EXTERIEUR_face_avant_axe_gauche__port_504236,CAR_INTERIEUR_avant_vue_droite_habitacle_class_1__port_506540,CAR_EXTERIEUR_cote_gauche_axe_avant__port_504233,CAR_EXTERIEUR_roue_de_secour__port_503763,CAR_EXTERIEUR_Angle_arriere_gauche_axe_gauche__port_504199,CAR_EXTERIEUR_cote_gauche__port_504017,CAR_INTERIEUR_avant_volant_class_1__port_506503,CAR_INTERIEUR_avant_volant_class_2__port_506504,CAR_EXTERIEUR_angle_avant_gauche_axe_gauche__port_504234', 'svm_portfolios_learning': '505674,503398,506486,506485,504465,504198,504451,504235,505675,504108,506565,506483,504201,505064,504217,506531,504218,504214,504087,506484,506563,504160,504184,506562,503644,506494,504170,504226,504202,503704,506487,506551,504146,504215,504225,506564,506482,503412,505677,505051,506532,504098,504236,506540,504233,503763,504199,504017,506503,506504,504234', 'photo_hashtag_type': 337, 'photo_desc_type': 3392, 'type_classification': 'caffe', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 3392 ['FirstUploadExperveo_vignette__port_505674', 'CAR_EXTERIEUR_Roue__port_503398', 'FirstUploadExperveo_carrosseriegrosplan_VIndanslamoquette__port_506486', 'FirstUploadExperveo_carrosseriegrosplan_siegegrosplan__port_506485', 'CAR_EXTERIEUR_Cote_droit_axe_avant__port_504465', 'CAR_EXTERIEUR_Cote_gauche_axe_arriere__port_504198', 'CAR_EXTERIEUR_Face_avant_axe_droit__port_504451', 'CAR_EXTERIEUR_angle_avant_gauche_axe_avant__port_504235', 'FirstUploadExperveo_vin__port_505675', 'CAR_EXTERIEUR_cote_droite__port_504108', 'CAR_INTERIEUR_avant_volant_class_6_levierdevitesse__port_506565', 'FirstUploadExperveo_carrosseriegrosplan_carrosserie__port_506483', 'CAR_EXTERIEUR_Angle_arriere_gauche_axe_arriere__port_504201', 'cartegrise_orientation__port_505064', 'CAR_EXTERIEUR_Angle_arriere_droit_axe_arriere__port_504217', 'CAR_INTERIEUR_avant_vue-arriere_class_1__port_506531', 'CAR_EXTERIEUR_Face_arriere_axe_droit__port_504218', 'CAR_EXTERIEUR_Cote_droit_axe_arriere__port_504214', 'CAR_EXTERIEUR_Angle_avant_droit__port_504087', 'FirstUploadExperveo_carrosseriegrosplan_morceauderoue__port_506484', 'CAR_INTERIEUR_avant_volant_class_6_class_2__port_506563', 'CAR_EXTERIEUR_Angle_arriere_droit__port_504160', 'CAR_EXTERIEUR_arriere__port_504184', 'CAR_INTERIEUR_avant_volant_class_6_boutonrond__port_506562', 'INTERIEUR_Compteur_kilometrique__port_503644', 'CAR_INTERIEUR_avant_vue_gauche_habitacle_class_1__port_506494', 'CAR_EXTERIEUR_Angle_arriere_gauche__port_504170', 'CAR_EXTERIEUR_Angle_avant_droit_axe_arriere__port_504226', 'CAR_EXTERIEUR_Face_arriere_axe_gauche__port_504202', 'CAR_EXTERIEUR_moteur__port_503704', 'FirstUploadExperveo_carrosseriegrosplan_class_6__port_506487', 'CAR_INTERIEUR_siege_arriere_class_1__port_506551', 'CAR_EXTERIEUR_avant__port_504146', 'CAR_EXTERIEUR_Angle_arriere_droit_axe_droit__port_504215', 'CAR_EXTERIEUR_Angle_avant_droit_axe_droit__port_504225', 'CAR_INTERIEUR_avant_volant_class_6_ecrangrosplan__port_506564', 'FirstUploadExperveo_carrosseriegrosplan_moteurgrosplanetdegat__port_506482', 'CAR_INTERIEUR_coffre__port_503412', 'FirstUploadExperveo_rouetranche__port_505677', 'UploadPhotoImmatBest_class_1__port_505051', 'CAR_INTERIEUR_avant_vue-arriere_class_2__port_506532', 'CAR_EXTERIEUR_angle_avant_gauche__port_504098', 'CAR_EXTERIEUR_face_avant_axe_gauche__port_504236', 'CAR_INTERIEUR_avant_vue_droite_habitacle_class_1__port_506540', 'CAR_EXTERIEUR_cote_gauche_axe_avant__port_504233', 'CAR_EXTERIEUR_roue_de_secour__port_503763', 'CAR_EXTERIEUR_Angle_arriere_gauche_axe_gauche__port_504199', 'CAR_EXTERIEUR_cote_gauche__port_504017', 'CAR_INTERIEUR_avant_volant_class_1__port_506503', 'CAR_INTERIEUR_avant_volant_class_2__port_506504', 'CAR_EXTERIEUR_angle_avant_gauche_axe_gauche__port_504234'] 51 51 thcl : 358 photo_hashtag_type : 337 ############################### TEST rotate ################################ test rotate only Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : rotate list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.23195481300354004 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:rotate Sun Oct 5 05:23:06 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step_rotate ! We are in a linear step without datou_depend ! rotate photos of 90,180,270 degres batch 1 Loaded 0 chid ids of type : 0 map_chi of length : 0 Needs to change image size ! Needs to change image size ! Needs to change image size ! About to upload 3 photos upload in portfolio : 551782 init cache_photo without model_param we have 3 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1759634586_2282988 INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634587), 0.0, 0.0, 14, '', 0, 0, '1759634585_2282988_917849322_2bd260e91e91df8378dde8bb8b8c454890.jpg', 0, 2448, 3264, 0, 1759634587,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634587), 0.0, 0.0, 14, '', 0, 0, '1759634585_2282988_917849322_2bd260e91e91df8378dde8bb8b8c4548180.jpg', 0, 3264, 2448, 0, 1759634587,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634587), 0.0, 0.0, 14, '', 0, 0, '1759634585_2282988_917849322_2bd260e91e91df8378dde8bb8b8c4548270.jpg', 0, 2448, 3264, 0, 1759634587,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! we have uploaded 3 photos in the portfolio 551782 time of upload the photos Elapsed time : 1.5211906433105469 Len new_chis : 3 Len list_new_chi_with_photo_id : 0 of type : 0 time spend for datou_step_exec : 1.74526047706604 time spend to save output : 2.09808349609375e-05 total time spend for step 1 : 1.745281457901001 caffe_path_current : About to save ! 1 Inside saveOutput : final : True verbose : False saveOutput not yet implemented for datou_step.type : rotate we use saveGeneral [917849322] Looping around the photos to save general results len do output : 3 /1387735203Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735204Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735205Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('230', None, None, None, None, None, None, None, None) ('230', None, '917849322', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 10 time used for this insertion : 0.036665916442871094 save_final save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {1387735203: ['917849322', 'temp/1759634585_2282988_917849322_2bd260e91e91df8378dde8bb8b8c454890.jpg', []], 1387735204: ['917849322', 'temp/1759634585_2282988_917849322_2bd260e91e91df8378dde8bb8b8c4548180.jpg', []], 1387735205: ['917849322', 'temp/1759634585_2282988_917849322_2bd260e91e91df8378dde8bb8b8c4548270.jpg', []]} test rotate only is a success ! test rotate conditionnel Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! DataTypes for each output/input checked ! List Step Type Loaded in datou : thcl, argmax, rotate list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.16016864776611328 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 3 step1:thcl Sun Oct 5 05:23:08 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step Thcl ! we are using the classfication for only one thcl 500 time to import caffe and check if the image exist : 0.0002486705780029297 time to convert the images to numpy array : 0.6053369045257568 total time to convert the images to numpy array : 0.6059696674346924 list photo_ids error: [] list photo_ids correct : [917849322] number of photos to traite : 1 try to delete the photos incorrect in DB tagging for thcl : 500 To do loadFromThcl(), then load ParamDescType : thcl500 thcls : [{'id': 500, 'mtr_user_id': 31, 'name': 'orientation_carte_grise_all_2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'carteGrisesVerticales__port_549774,cartegrise_90deg__port_550987,cartesGrisesEnvers__port_549765,portfolio_270deg__port_550988', 'svm_portfolios_learning': '549774,550987,549765,550988', 'photo_hashtag_type': 507, 'photo_desc_type': 3517, 'type_classification': 'caffe', 'hashtag_id_list': '0,0,0,0'}] thcl {'id': 500, 'mtr_user_id': 31, 'name': 'orientation_carte_grise_all_2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'carteGrisesVerticales__port_549774,cartegrise_90deg__port_550987,cartesGrisesEnvers__port_549765,portfolio_270deg__port_550988', 'svm_portfolios_learning': '549774,550987,549765,550988', 'photo_hashtag_type': 507, 'photo_desc_type': 3517, 'type_classification': 'caffe', 'hashtag_id_list': '0,0,0,0'} Update svm_hashtag_type_desc : 3517 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3517, 'orientation_carte_grise_all_2', 16384, 25088, 'orientation_carte_grise_all_2', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 4, 18, 20, 4, 34), datetime.datetime(2018, 4, 18, 20, 4, 34)) To loadFromThcl() : net_3517 begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 2926 max_wait_temp : 1 max_wait : 0 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3517, 'orientation_carte_grise_all_2', 16384, 25088, 'orientation_carte_grise_all_2', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 4, 18, 20, 4, 34), datetime.datetime(2018, 4, 18, 20, 4, 34)) None mean_file_type : mean_file_path : prototxt_file_path : model : orientation_carte_grise_all_2 Inside get_net Inside get_net before cache_data_model model_param file didn't exist Inside get_net before CDM.load_model_par_type model_name : orientation_carte_grise_all_2 model_type : caffe list file need : ['caffemodel', 'deploy_conv_normal.prototxt', 'deploy_fc.prototxt', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file exist in s3 : ['caffemodel', 'deploy_conv_normal.prototxt', 'deploy_fc.prototxt', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file manque in s3 : [] local folder : /data/models_weight/orientation_carte_grise_all_2 /data/models_weight/orientation_carte_grise_all_2/caffemodel size_local : 537110520 size in s3 : 537110520 create time local : 2021-08-09 05:29:00 create time in s3 : 2021-08-06 20:07:17 caffemodel already exist and didn't need to update /data/models_weight/orientation_carte_grise_all_2/deploy_conv_normal.prototxt size_local : 4626 size in s3 : 4626 create time local : 2021-08-09 05:29:00 create time in s3 : 2021-08-06 20:07:16 deploy_conv_normal.prototxt already exist and didn't need to update /data/models_weight/orientation_carte_grise_all_2/deploy_fc.prototxt size_local : 1130 size in s3 : 1130 create time local : 2021-08-09 05:29:00 create time in s3 : 2021-08-06 20:07:16 deploy_fc.prototxt already exist and didn't need to update /data/models_weight/orientation_carte_grise_all_2/deploy.prototxt size_local : 5653 size in s3 : 5653 create time local : 2021-08-09 05:29:00 create time in s3 : 2021-08-06 20:07:16 deploy.prototxt already exist and didn't need to update /data/models_weight/orientation_carte_grise_all_2/mean.npy size_local : 1572992 size in s3 : 1572992 create time local : 2021-08-09 05:29:00 create time in s3 : 2021-08-06 20:07:31 mean.npy already exist and didn't need to update /data/models_weight/orientation_carte_grise_all_2/synset_words.txt size_local : 159 size in s3 : 159 create time local : 2021-08-09 05:29:00 create time in s3 : 2021-08-06 20:07:16 synset_words.txt already exist and didn't need to update Inside get_net after CDM.load_model_par_type After if not only_with_local_cache: /home/admin/workarea/install/darknet/:/home/admin/workarea/git/Velours/python:/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python:/home/admin/mtr/.credentials:/home/admin/workarea/install/caffe/python:/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools/:/home/admin/workarea/git/fotonowerpip/:/home/admin/workarea/install/segment-anything:/home/admin//workarea/git/pyfvs/ Here before set mode gpu Doing nothing but we could set mode gpu after set mode gpu prototxt_filename : /data/models_weight/orientation_carte_grise_all_2/deploy.prototxt caffemodel_filename : /data/models_weight/orientation_carte_grise_all_2/caffemodel now we set caffe to gpu mode before predict begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 2926 max_wait_temp : 1 max_wait : 0 dict_keys(['prob', 'pool5']) time used to do the prepocess of the images : 2.0348613262176514 time used to do the prediction : 0.11329436302185059 save descriptor for thcl : 500 time to traite the descriptors : 0.06569337844848633 storage_type for insertDescriptorsMulti : 1 To insert : 917849322 time to insert the descriptors : 0.7284867763519287 time spend for datou_step_exec : 9.503950119018555 time spend to save output : 3.647804260253906e-05 total time spend for step 1 : 9.503986597061157 step2:argmax Sun Oct 5 05:23:17 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou_step Argmax ! calculate argmax for thcl : 500 time spend for datou_step_exec : 0.0001385211944580078 time spend to save output : 3.0279159545898438e-05 total time spend for step 2 : 0.00016880035400390625 step3:rotate Sun Oct 5 05:23:17 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou_step_rotate ! We are in a datou with depends ! angle_condi : {'carteGrisesVerticales__port_549774': 0, 'cartegrise_90deg__port_550987': 270, 'portfolio_270deg__port_550988': 90, 'cartesGrisesEnvers__port_549765': 180} rotate photos for hashtag carteGrisesVerticales__port_549774 of 0 degres 1 photos founded : [917849322] batch 1 Loaded 0 chid ids of type : 0 map_chi of length : 0 Needs to change image size ! About to upload 1 photos upload in portfolio : 551782 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1759634598_2282988 we have uploaded 1 photos in the portfolio 551782 time of upload the photos Elapsed time : 0.7103819847106934 Len new_chis : 1 Len list_new_chi_with_photo_id : 0 of type : 0 rotate photos for hashtag cartegrise_90deg__port_550987 of 270 degres 0 photos founded : [] rotate photos for hashtag portfolio_270deg__port_550988 of 90 degres 0 photos founded : [] rotate photos for hashtag cartesGrisesEnvers__port_549765 of 180 degres 0 photos founded : [] time spend for datou_step_exec : 1.284162998199463 time spend to save output : 5.2928924560546875e-05 total time spend for step 3 : 1.2842159271240234 caffe_path_current : About to save ! 1 Inside saveOutput : final : True verbose : False saveOutput not yet implemented for datou_step.type : rotate we use saveGeneral [917849322] Looping around the photos to save general results len do output : 1 /0Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('233', None, None, None, None, None, None, None, None) ('233', None, '917849322', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 4 time used for this insertion : 0.03670215606689453 save_final save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 3 output : {0: ['917849322', 'temp/1759634588_2282988_917849322_2bd260e91e91df8378dde8bb8b8c45480.jpg', []]} ############################### TEST data_augmentation_ellipse_varroa_tile_rotate ################################ # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : step 316 crop is not linked in the step_by_step architecture ! Step 318 rotate have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 318 rotate have less outputs used (0) than in the step definition (3) : some outputs may be not used ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! DataTypes for each output/input checked ! Unexpected type seems boolean for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : DATA AUGMENTATION ELLIPSE VARROA TILE ROTATE Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : step 316 crop is not linked in the step_by_step architecture ! Step 318 rotate have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 318 rotate have less outputs used (0) than in the step definition (3) : some outputs may be not used ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! DataTypes for each output/input checked ! List Step Type Loaded in datou : crop, tile, rotate list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.14966583251953125 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 3 step1:crop Sun Oct 5 05:23:19 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step Crop ! param_json : {'hashtag_id_ellipse': 2087736828, 'photo_hashtag_type_from_ellipse': 520, 'token': '78d09a0790ec6ecbf119343125a81fdc', 'portfolio_name': 'crop_detect_varroa', 'photo_hashtag_type': 407, 'feed_id_new_photos_not_used': 549103, 'host': 'www.fotonower.com', 'margin': 8, 'upload_type': 'python'} margin_type : margin margin_value : [8, 8, 8, 8] Loading chi in step crop with photo_hashtag_type : 407 Loading chi in step crop for list_pids : 1 ! batch 1 Loaded 4 chid ids of type : 407 +WARNING : Unexpected points, we should remove this data for chi_id : 8165075, for now we just ignore these empty polygon points +WARNING : Unexpected points, we should remove this data for chi_id : 8165076, for now we just ignore these empty polygon points +WARNING : Unexpected points, we should remove this data for chi_id : 8165077, for now we just ignore these empty polygon points +WARNING : Unexpected points, we should remove this data for chi_id : 8165078, for now we just ignore these empty polygon points WARNING : margin is only used for type bib ! map_result returned by crop_photo_return_map_crop : length : 4 Here we crop with rles About to insert : list_path_to_insert length 4 new photo from crops ! About to upload 4 photos upload in portfolio : 27518746 init cache_photo without model_param we have 4 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1759634601_2282988 INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634602), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165075_0.jpg', 0, 57, 51, 0, 1759634602,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634602), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165076_0.jpg', 0, 50, 45, 0, 1759634602,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634602), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165077_0.jpg', 0, 51, 54, 0, 1759634602,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634602), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165078_0.jpg', 0, 43, 52, 0, 1759634602,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! we have uploaded 4 photos in the portfolio 27518746 time of upload the photos Elapsed time : 3.491410493850708 Now we prepare data that will be used for ellipse search ! About to compute ellipse and record with type : 520 score : 5120 strategy_opt : 5| | arg_min : 1.9500000000000002 min_score : 2311 | arg_min : -30.0 min_score : 1968 | arg_min : 17.8125 min_score : 1614 | arg_min : 31.875 min_score : 1105 | arg_min : 28.5 min_score : 1105 arg_min : 1.9500000000000002 min_score : 1105 arg_min : 25.0 min_score : 1088 arg_min : 24.9375 min_score : 979 arg_min : 31.875 min_score : 979 arg_min : 28.5 min_score : 979 yc : 31.875 xc : 24.9375 angle : 25.0 radius : 28.5 excentricity : 1.9500000000000002 yc : 31.875 xc : 24.9375 angle : 25.0 radius : 28.5 excentricity : 1.9500000000000002 Now saving polygons points : 1| batch 1 Loaded 1 chid ids of type : 520 CHI and polygons saved ! score : 5362 strategy_opt : 5| | arg_min : 1.9500000000000002 min_score : 2281 | arg_min : -10.0 min_score : 2127 | arg_min : 25.0 min_score : 2127 | arg_min : 30.9375 min_score : 714 | arg_min : 25.0 min_score : 714 arg_min : 1.9500000000000002 min_score : 714 arg_min : -5.0 min_score : 668 arg_min : 23.4375 min_score : 655 arg_min : 29.53125 min_score : 631 arg_min : 25.0 min_score : 631 yc : 29.53125 xc : 23.4375 angle : -5.0 radius : 25.0 excentricity : 1.9500000000000002 yc : 29.53125 xc : 23.4375 angle : -5.0 radius : 25.0 excentricity : 1.9500000000000002 Now saving polygons points : 1| batch 1 Loaded 2 chid ids of type : 520 + CHI and polygons saved ! score : 4603 strategy_opt : 5| | arg_min : 1.85 min_score : 2981 | arg_min : -50.0 min_score : 1356 | arg_min : 30.28125 min_score : 1079 | arg_min : 23.625 min_score : 995 | arg_min : 27.0 min_score : 995 arg_min : 1.6500000000000001 min_score : 961 arg_min : -70.0 min_score : 852 arg_min : 28.6875 min_score : 847 arg_min : 23.625 min_score : 847 arg_min : 27.0 min_score : 847 yc : 23.625 xc : 28.6875 angle : -70.0 radius : 27.0 excentricity : 1.6500000000000001 yc : 23.625 xc : 28.6875 angle : -70.0 radius : 27.0 excentricity : 1.6500000000000001 Now saving polygons points : 1| batch 1 Loaded 3 chid ids of type : 520 ++ CHI and polygons saved ! score : 7970 strategy_opt : 5| | arg_min : 1.9500000000000002 min_score : 1576 | arg_min : 40.0 min_score : 632 | arg_min : 20.15625 min_score : 561 | arg_min : 26.0 min_score : 561 | arg_min : 26.0 min_score : 561 arg_min : 1.8 min_score : 520 arg_min : 40.0 min_score : 520 arg_min : 18.8125 min_score : 494 arg_min : 26.0 min_score : 494 arg_min : 26.0 min_score : 494 yc : 26.0 xc : 18.8125 angle : 40.0 radius : 26.0 excentricity : 1.8 yc : 26.0 xc : 18.8125 angle : 40.0 radius : 26.0 excentricity : 1.8 Now saving polygons points : 1| batch 1 Loaded 4 chid ids of type : 520 +++ CHI and polygons saved ! ['temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165075_0_ellipsebest.jpg', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165075_0_varroa_with_ellipsebest.jpg', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165076_0_ellipsebest.jpg', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165076_0_varroa_with_ellipsebest.jpg', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165077_0_ellipsebest.jpg', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165077_0_varroa_with_ellipsebest.jpg', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165078_0_ellipsebest.jpg', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_bib_crop_8165078_0_varroa_with_ellipsebest.jpg'] About to upload 8 photos upload in portfolio : 27518747 Result OK ! uploaded one batch 0 Elapsed time : 15.374150276184082 time spend for datou_step_exec : 23.05627727508545 time spend to save output : 1.4781951904296875e-05 total time spend for step 1 : 23.056292057037354 step2:tile Sun Oct 5 05:23:42 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure verbose : False param_json : {'photo_tile_type': 17, 'whiten': True, 'remove_crop_border': True, 'minimal_size_crop_border': 900, 'stride': 240, 'crop_hashtag_type_tiled': 521, 'ETA': 86400, 'new_width': 480, 'new_height': 480, 'token': '78d09a0790ec6ecbf119343125a81fdc', 'portfolio_name': 'tile_taggage_varroa', 'crop_hashtag_type': 520, 'host': 'www.fotonower.com', 'arg_aux_upload': {'type_upload': 'python'}} type(crop_hashtag_type) : type(crop_hashtag_type_tiled) : We consider crop_hashtag_type is an integer ! map_chi_type_to_chi_type_cropped : {520: 521} TO DEPRECATE VR 14-6-18 map_filenames : {937852786: 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67.jpg'} list_pids : 1 list_pids : 2 list_subpids to replace list_pids : 0 batch 1 Loaded 4 chid ids of type : 520 ++++ created feed_id_new_photos : 27518748 with name tile_taggage_varroa feed_id_new_photos : 27518748 filename : temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67.jpg photo_id : 937852786 height_image_input : 480 width_image_input : 480 new_width : 480 new_height : 480 stride : 240 stride_relative : 0.1 chi to copy from the main photo to the tiled photo input_chi_for_this_image_as_chi : 4 list_bib_to_crops : 1 [(0, 480, 0, 480, 0)] new_crops_tiles : 1 crop_transformed : 4 batch 1 Loaded 1 chid ids of type : 17 treat the image : temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67.jpg , 0 before upload mediasElapsed time : 0.2267436981201172 on upload les photos avec python init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1759634629_2282988 INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634629), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0.jpg', 0, 480, 480, 0, 1759634629,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! we have uploaded 1 photos in the portfolio 27518748 Importing ! upload mediasElapsed time : 0.8661441802978516 , 0Saving 4 CHIs. batch 1 Loaded 4 chid ids of type : 521 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! end of tileElapsed time : 1.0806243419647217 time spend for datou_step_exec : 7.477764844894409 time spend to save output : 2.1219253540039062e-05 total time spend for step 2 : 7.477786064147949 step3:rotate Sun Oct 5 05:23:50 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou_step_rotate ! Warning, new_feed_id is empty ! We are in a datou with depends ! rotate photos of 0,15,30,45,60,75,90,105,120,135,150,165,180,195,210,225,240,255,270,285,300,315,330,345 degres batch 1 Loaded 4 chid ids of type : 521 ++++++++ map_chi of length : 1 feed_id_new_photos : 27518749 Needs to change image size ! time for calcul the mask position with numpy : 0.0004279613494873047 nb_pixel_total : 1389 time to create 1 rle with old method : 0.003026723861694336 .time for calcul the mask position with numpy : 0.00034236907958984375 nb_pixel_total : 1157 time to create 1 rle with old method : 0.002543926239013672 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.0003998279571533203 nb_pixel_total : 694 time to create 1 rle with old method : 0.00159454345703125 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.00034165382385253906 nb_pixel_total : 1162 time to create 1 rle with old method : 0.002607107162475586 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.00035881996154785156 nb_pixel_total : 221 time to create 1 rle with old method : 0.0005383491516113281 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003581047058105469 nb_pixel_total : 1155 time to create 1 rle with old method : 0.0026030540466308594 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.00037789344787597656 nb_pixel_total : 143 time to create 1 rle with old method : 0.0003597736358642578 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.00035119056701660156 nb_pixel_total : 1161 time to create 1 rle with old method : 0.0026247501373291016 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.00038623809814453125 nb_pixel_total : 414 time to create 1 rle with old method : 0.0009930133819580078 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003437995910644531 nb_pixel_total : 1159 time to create 1 rle with old method : 0.0026290416717529297 . crop are not in the shrunk photo ! On the border Smaller than minimal size ! Needs to change image size ! time for calcul the mask position with numpy : 0.0004017353057861328 nb_pixel_total : 1204 time to create 1 rle with old method : 0.0028417110443115234 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003294944763183594 nb_pixel_total : 1157 time to create 1 rle with old method : 0.002576589584350586 . crop are not in the shrunk photo ! time for calcul the mask position with numpy : 0.00032639503479003906 nb_pixel_total : 264 time to create 1 rle with old method : 0.0006594657897949219 On the border Smaller than minimal size ! Needs to change image size ! time for calcul the mask position with numpy : 0.0003952980041503906 nb_pixel_total : 1389 time to create 1 rle with old method : 0.003080129623413086 .time for calcul the mask position with numpy : 0.00034332275390625 nb_pixel_total : 1157 time to create 1 rle with old method : 0.002601146697998047 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.00038623809814453125 nb_pixel_total : 694 time to create 1 rle with old method : 0.0016040802001953125 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.00034165382385253906 nb_pixel_total : 1162 time to create 1 rle with old method : 0.0025544166564941406 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.000377655029296875 nb_pixel_total : 221 time to create 1 rle with old method : 0.0005373954772949219 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.00033926963806152344 nb_pixel_total : 1155 time to create 1 rle with old method : 0.002559185028076172 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.0003714561462402344 nb_pixel_total : 143 time to create 1 rle with old method : 0.0003578662872314453 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.00034356117248535156 nb_pixel_total : 1160 time to create 1 rle with old method : 0.0026183128356933594 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.00038504600524902344 nb_pixel_total : 414 time to create 1 rle with old method : 0.0009708404541015625 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003440380096435547 nb_pixel_total : 1159 time to create 1 rle with old method : 0.0025577545166015625 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! time for calcul the mask position with numpy : 0.0003199577331542969 nb_pixel_total : 1 time to create 1 rle with old method : 2.4557113647460938e-05 Needs to change image size ! time for calcul the mask position with numpy : 0.00039696693420410156 nb_pixel_total : 1204 time to create 1 rle with old method : 0.0027642250061035156 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.00033092498779296875 nb_pixel_total : 1158 time to create 1 rle with old method : 0.002566814422607422 . crop are not in the shrunk photo ! time for calcul the mask position with numpy : 0.00032448768615722656 nb_pixel_total : 264 time to create 1 rle with old method : 0.0006170272827148438 On the border Smaller than minimal size ! Needs to change image size ! time for calcul the mask position with numpy : 0.0003948211669921875 nb_pixel_total : 1389 time to create 1 rle with old method : 0.0030722618103027344 .time for calcul the mask position with numpy : 0.0003325939178466797 nb_pixel_total : 1157 time to create 1 rle with old method : 0.002629995346069336 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.0003943443298339844 nb_pixel_total : 727 time to create 1 rle with old method : 0.0016491413116455078 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003383159637451172 nb_pixel_total : 1162 time to create 1 rle with old method : 0.002658843994140625 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.0003781318664550781 nb_pixel_total : 250 time to create 1 rle with old method : 0.0006089210510253906 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.00034046173095703125 nb_pixel_total : 1155 time to create 1 rle with old method : 0.0026187896728515625 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.0003707408905029297 nb_pixel_total : 169 time to create 1 rle with old method : 0.0004405975341796875 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003418922424316406 nb_pixel_total : 1161 time to create 1 rle with old method : 0.002609729766845703 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.0003819465637207031 nb_pixel_total : 450 time to create 1 rle with old method : 0.0011096000671386719 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.00034236907958984375 nb_pixel_total : 1159 time to create 1 rle with old method : 0.002548694610595703 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! time for calcul the mask position with numpy : 0.0003170967102050781 nb_pixel_total : 1 time to create 1 rle with old method : 2.47955322265625e-05 Needs to change image size ! time for calcul the mask position with numpy : 0.00040268898010253906 nb_pixel_total : 1237 time to create 1 rle with old method : 0.002810955047607422 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003299713134765625 nb_pixel_total : 1158 time to create 1 rle with old method : 0.002516031265258789 . crop are not in the shrunk photo ! time for calcul the mask position with numpy : 0.0003230571746826172 nb_pixel_total : 234 time to create 1 rle with old method : 0.0005755424499511719 On the border Smaller than minimal size ! Needs to change image size ! time for calcul the mask position with numpy : 0.0003905296325683594 nb_pixel_total : 1389 time to create 1 rle with old method : 0.003065347671508789 .time for calcul the mask position with numpy : 0.000339508056640625 nb_pixel_total : 1157 time to create 1 rle with old method : 0.002523183822631836 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.000484466552734375 nb_pixel_total : 727 time to create 1 rle with old method : 0.0017075538635253906 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003719329833984375 nb_pixel_total : 1162 time to create 1 rle with old method : 0.0026481151580810547 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.00045609474182128906 nb_pixel_total : 250 time to create 1 rle with old method : 0.0006601810455322266 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003821849822998047 nb_pixel_total : 1155 time to create 1 rle with old method : 0.002645730972290039 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.00046706199645996094 nb_pixel_total : 169 time to create 1 rle with old method : 0.00047516822814941406 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.000370025634765625 nb_pixel_total : 1161 time to create 1 rle with old method : 0.0026078224182128906 . crop are not in the shrunk photo ! crop are not in the shrunk photo ! Needs to change image size ! time for calcul the mask position with numpy : 0.0004696846008300781 nb_pixel_total : 450 time to create 1 rle with old method : 0.0011053085327148438 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.0003693103790283203 nb_pixel_total : 1159 time to create 1 rle with old method : 0.002691030502319336 . crop are not in the shrunk photo ! On the border Smaller than minimal size ! Needs to change image size ! time for calcul the mask position with numpy : 0.0004913806915283203 nb_pixel_total : 1237 time to create 1 rle with old method : 0.0027730464935302734 On the border Smaller than minimal size ! time for calcul the mask position with numpy : 0.00036597251892089844 nb_pixel_total : 1157 time to create 1 rle with old method : 0.01510930061340332 . crop are not in the shrunk photo ! time for calcul the mask position with numpy : 0.0003504753112792969 nb_pixel_total : 234 time to create 1 rle with old method : 0.0006632804870605469 On the border Smaller than minimal size ! About to upload 24 photos upload in portfolio : 27518749 init cache_photo without model_param we have 24 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1759634632_2282988 INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_00.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_015.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_030.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_045.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_060.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_075.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_090.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0105.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0120.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0135.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0150.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0165.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0180.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0195.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0210.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0225.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0240.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0255.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0270.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0285.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0300.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0315.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0330.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634637), 0.0, 0.0, 14, '', 0, 0, '1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0345.jpg', 0, 320, 320, 0, 1759634637,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! we have uploaded 24 photos in the portfolio 27518749 time of upload the photos Elapsed time : 8.498385906219482 Len new_chis : 24 Len list_new_chi_with_photo_id : 28 of type : 529 batch 1 Loaded 28 chid ids of type : 529 Number RLEs to save : 1197 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! batch 1 Loaded 28 chid ids of type : 529 ++++++++++++++++++++++++++++Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! time spend for datou_step_exec : 13.88261342048645 time spend to save output : 6.0558319091796875e-05 total time spend for step 3 : 13.882673978805542 caffe_path_current : About to save ! 1 Inside saveOutput : final : True verbose : False saveOutput not yet implemented for datou_step.type : rotate we use saveGeneral [937852786, 937852786, '1387735233'] Looping around the photos to save general results len do output : 24 /1387735235Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735236Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735237Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735238Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735239Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735240Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735241Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735242Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735243Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735244Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735245Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735246Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735247Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735248Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735249Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735250Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735251Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735252Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735253Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735254Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735255Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735256Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735257Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735258Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('243', None, None, None, None, None, None, None, None) ('243', None, '937852786', None, None, None, None, None, None) ('243', None, None, None, None, None, None, None, None) ('243', None, '937852786', None, None, None, None, None, None) ('243', None, None, None, None, None, None, None, None) ('243', None, '1387735233', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 75 time used for this insertion : 0.041008710861206055 save_final save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 3 output : {1387735235: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_00.jpg', [, ]], 1387735236: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_015.jpg', []], 1387735237: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_030.jpg', []], 1387735238: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_045.jpg', []], 1387735239: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_060.jpg', []], 1387735240: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_075.jpg', []], 1387735241: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_090.jpg', [, ]], 1387735242: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0105.jpg', []], 1387735243: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0120.jpg', []], 1387735244: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0135.jpg', []], 1387735245: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0150.jpg', []], 1387735246: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0165.jpg', []], 1387735247: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0180.jpg', [, ]], 1387735248: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0195.jpg', []], 1387735249: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0210.jpg', []], 1387735250: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0225.jpg', []], 1387735251: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0240.jpg', []], 1387735252: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0255.jpg', []], 1387735253: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0270.jpg', [, ]], 1387735254: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0285.jpg', []], 1387735255: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0300.jpg', []], 1387735256: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0315.jpg', []], 1387735257: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0330.jpg', []], 1387735258: ['937852786', 'temp/1759634599_2282988_937852786_7d9a231a08a1c63d0868e56a5361bf67_0345.jpg', []]} list chi : [[, ], [], [], [], [], [], [, ], [], [], [], [], [], [, ], [], [], [], [], [], [, ], [], [], [], [], []] ############################### TEST flip ################################ t Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : flip list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.14383172988891602 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:flip Sun Oct 5 05:24:04 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step_flip ! We are in a linear step without datou_depend ! batch 1 Loaded 6 chid ids of type : 741 +++++WARNING : Unexpected points, we should remove this data for chi_id : 18344210, for now we just ignore these empty polygon points + map_chi_objs of length : 1 photo_id in download_rotate_and_save : 911785586 list_chi_loc : 6 Vertical flip of photo 911785586 Horizontal flip of photo 911785586 About to upload 2 photos upload in portfolio : 1090565 init cache_photo without model_param we have 2 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1759634645_2282988 INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634645), 0.0, 0.0, 14, '', 0, 0, '1759634644_2282988_911785586_d8582feabcd359151ff718b5832248c7-big_flip_vert.jpg', 0, 640, 640, 0, 1759634645,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759634645), 0.0, 0.0, 14, '', 0, 0, '1759634644_2282988_911785586_d8582feabcd359151ff718b5832248c7-big_flip_hori.jpg', 0, 640, 640, 0, 1759634645,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! we have uploaded 2 photos in the portfolio 1090565 time of upload the photos Elapsed time : 1.0416455268859863 Len new_chis : 12 Len list_new_chi_with_photo_id : 12 of type : 741 batch 1 Loaded 12 chid ids of type : 741 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! time spend for datou_step_exec : 1.3126797676086426 time spend to save output : 2.8371810913085938e-05 total time spend for step 1 : 1.3127081394195557 caffe_path_current : About to save ! 1 Inside saveOutput : final : True verbose : False saveOutput not yet implemented for datou_step.type : flip we use saveGeneral [911785586] Looping around the photos to save general results len do output : 2 /1387735259 /1387735260 before output type Managing all output in save final without adding information in the mtr_datou_result ('571', None, None, None, None, None, None, None, None) ('571', None, '911785586', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.03667163848876953 save_final save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'1387735259': ['911785586', 'temp/1759634644_2282988_911785586_d8582feabcd359151ff718b5832248c7-big_flip_vert.jpg', [, , , , , ]], '1387735260': ['911785586', 'temp/1759634644_2282988_911785586_d8582feabcd359151ff718b5832248c7-big_flip_hori.jpg', [, , , , , ]]} ############################### TEST crop_rles ################################ # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! Unexpected type seems boolean for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : TEST CROP RLES Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : crop list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.17364835739135742 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:crop Sun Oct 5 05:24:06 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step Crop ! param_json : {'photo_hashtag_type': 755, 'token': '78d09a0790ec6ecbf119343125a81fdc', 'feed_id_new_photos': 0, 'host': 'www.fotonower.com', 'crop_type': 'rle', 'margin_relative': 0.1, 'min_score': 0.3, 'upload,type': 'python'} margin_type : margin_relative margin_value : [0.1, 0.1, 0.1, 0.1] Loading chi in step crop with photo_hashtag_type : 755 Loading chi in step crop for list_pids : 1 ! batch 1 Loaded 8 chid ids of type : 755 ++++++++WARNING : margin is only used for type bib ! we have both polygon and rles we have both polygon and rles we have both polygon and rles we have both polygon and rles we have both polygon and rles we have both polygon and rles we have both polygon and rles we have both polygon and rles map_result returned by crop_photo_return_map_crop : length : 8 Here we crop with rles About to insert : list_path_to_insert length 8 new photo from crops ! About to upload 8 photos upload in portfolio : 27518750 Result OK ! uploaded one batch 0 Elapsed time : 14.534467935562134 Now we prepare data that will be used for ellipse search ! time spend for datou_step_exec : 14.687753438949585 time spend to save output : 2.6226043701171875e-05 total time spend for step 1 : 14.687779664993286 caffe_path_current : About to save ! 1 Inside saveOutput : final : True verbose : False saveOutput not yet implemented for datou_step.type : crop we use saveGeneral [950103132] Looping around the photos to save general results len do output : 8 /1387735261Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735262Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735263Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735264Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735265Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735266Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735267Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387735268Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('686', None, None, None, None, None, None, None, None) ('686', None, '950103132', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 25 time used for this insertion : 0.0387263298034668 save_final save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'1387735261': ['950103132', 'temp/1759634646_2282988_950103132_4f47bd527301396b0a701a1b4183ba00_rle_crop_1947670931_0.jpg', (183, 199, 15, 41)], '1387735262': ['950103132', 'temp/1759634646_2282988_950103132_4f47bd527301396b0a701a1b4183ba00_rle_crop_1947670932_0.jpg', (38, 85, 113, 140)], '1387735263': ['950103132', 'temp/1759634646_2282988_950103132_4f47bd527301396b0a701a1b4183ba00_rle_crop_1947670933_0.jpg', (168, 194, 141, 151)], '1387735264': ['950103132', 'temp/1759634646_2282988_950103132_4f47bd527301396b0a701a1b4183ba00_rle_crop_1947670934_0.jpg', (47, 101, 16, 110)], '1387735265': ['950103132', 'temp/1759634646_2282988_950103132_4f47bd527301396b0a701a1b4183ba00_rle_crop_1947670935_0.jpg', (175, 199, 104, 111)], '1387735266': ['950103132', 'temp/1759634646_2282988_950103132_4f47bd527301396b0a701a1b4183ba00_rle_crop_1947670936_0.jpg', (86, 130, 184, 196)], '1387735267': ['950103132', 'temp/1759634646_2282988_950103132_4f47bd527301396b0a701a1b4183ba00_rle_crop_1947670937_0.jpg', (79, 195, 0, 61)], '1387735268': ['950103132', 'temp/1759634646_2282988_950103132_4f47bd527301396b0a701a1b4183ba00_rle_crop_1947670938_0.jpg', (131, 155, 181, 195)]} 8 ############################### TEST angular_coeff ################################ t Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : angular_coeff list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.28864073753356934 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:angular_coeff Sun Oct 5 05:24:21 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec beginning of step detection filter param_json : {'input_type': 846, 'output_type': -1, 'orientation_type': 872, 'ref_crop_type': 846, 'condition_crop': 'car', 'criteria_crop': 'center_rect', 'crops_coeffs': {'CAR_EXTERIEUR_angle_avant_droit.*': {'aile-avant': [[15, 0.0], [240, 0.0], [285, 1.0], [345, 1.0]], 'capot': [[45, 1.0], [60, 0.5], [270, 0.0], [315, 1.0], [360, 1.0]]}}} angular_coefficients_to_crops batch 1 Loaded 19 chid ids of type : 846 treating photo 932296368 time spend for datou_step_exec : 0.2193317413330078 time spend to save output : 3.838539123535156e-05 total time spend for step 1 : 0.21937012672424316 caffe_path_current : About to save ! 0 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {932296368: ([(932296368, 2106233860, 846, 1066, 1277, 93, 340, 0.31964028378983567, 0, []), (932296368, 2106233860, 846, 434, 690, 218, 498, 0.7170410105787726, 0, []), (932296368, 503548896, 846, 902, 1111, 466, 576, 0.31724966, 769189715, []), (932296368, 599722655, 846, 523, 1100, 152, 337, 0.98039776, 0, []), (932296368, 492601069, 846, 143, 1190, 90, 695, 0.9696157, 769189717, []), (932296368, 492601069, 846, 0, 408, 246, 719, 0.9431181, 769189718, []), (932296368, 2096875722, 846, 567, 964, 162, 215, 0.55490255, 769189721, []), (932296368, 2096875709, 846, 437, 939, 24, 198, 0.9983077, 769189723, []), (932296368, 2096875709, 846, 1004, 1263, 28, 144, 0.9485744, 769189724, []), (932296368, 624624117, 846, 595, 1122, 331, 640, 0.99100167, 769189725, []), (932296368, 492624020, 846, 585, 874, 308, 393, 0.78697366, 769189727, []), (932296368, 2096875719, 846, 943, 1100, 428, 547, 0.96733797, 769189729, []), (932296368, 492654799, 846, 253, 467, 35, 441, 0.99621326, 769189730, []), (932296368, 492689227, 846, 1118, 1264, 270, 438, 0.9901647, 769189732, []), (932296368, 492689227, 846, 486, 671, 378, 690, 0.98789483, 769189733, []), (932296368, 492689227, 846, 161, 255, 229, 409, 0.70801014, 769189734, []), (932296368, 492925064, 846, 261, 421, 27, 193, 0.92215157, 769189737, []), (932296368, 492925064, 846, 873, 1045, 46, 156, 0.7535122, 769189738, []), (932296368, 492925064, 846, 1090, 1279, 20, 107, 0.45259848, 769189739, [])],)} test angular coeff is a success ! ############################### TEST detection_filter_by_crop ################################ t Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : detection_filter_by_crop list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.1816999912261963 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:detection_filter_by_crop Sun Oct 5 05:24:22 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec beginning of step detection filter param_json : {'input_type': 631, 'output_type': -1, 'condition_type': 445, 'condition_crop': 'car', 'criteria_crop': 'center_rect', 'min_surface_ratio': 0.7} conditional_crop_copy batch 1 Loaded 3 chid ids of type : 445 +++batch 1 Loaded 35 chid ids of type : 631 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++batch 1 Loaded 3 chid ids of type : 445 +++ treating photo 946711423 time spend for datou_step_exec : 0.3089308738708496 time spend to save output : 9.131431579589844e-05 total time spend for step 1 : 0.3090221881866455 caffe_path_current : About to save ! 0 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {946711423: ([(946711423, 624624117, 631, 226, 569, 252, 425, 0.99812776, 1947740368, ['395,419,341,419,340,418,316,418,315,417,306,417,305,416,293,415,290,413,284,412,283,411,280,411,272,407,264,405,258,400,254,398,250,394,244,391,242,389,242,386,239,380,240,368,239,367,239,347,238,346,238,331,237,330,237,327,238,326,237,314,239,311,239,308,237,304,238,302,243,298,244,296,244,292,246,291,250,291,251,290,259,290,260,289,264,289,265,288,269,288,271,290,273,294,278,299,280,300,285,300,286,301,293,301,294,302,302,304,305,307,309,308,312,310,314,310,317,312,335,312,336,313,343,313,344,314,370,314,371,315,381,315,382,314,389,313,393,311,405,309,406,308,408,308,412,306,414,304,417,304,421,307,426,308,427,309,433,309,434,310,464,309,467,306,471,304,476,304,477,303,489,303,490,302,494,302,495,301,500,301,501,300,515,300,516,299,519,298,522,292,525,290,533,290,534,291,540,291,541,290,543,290,547,288,550,285,550,285,552,289,552,291,553,292,553,313,552,314,552,324,550,328,550,333,549,334,549,336,544,346,543,353,539,361,532,368,531,368,527,372,519,374,509,379,503,384,499,385,498,386,496,386,492,388,490,390,486,392,484,392,479,396,475,397,474,398,472,398,471,399,469,399,462,403,460,403,459,404,457,404,456,405,454,405,450,407,448,407,443,410,425,413,424,414,422,414,416,417,404,417,403,418,396,418']), (946711423, 492689227, 631, 162, 245, 233, 396, 0.99702626, 1947740369, ['215,393,206,393,202,390,200,390,192,383,191,380,187,375,184,369,184,367,180,360,180,358,179,357,177,349,175,347,174,339,172,336,171,330,170,329,169,324,168,323,168,313,167,312,167,304,166,303,166,298,165,297,165,288,164,287,165,286,165,272,166,271,166,268,167,267,167,263,168,262,169,254,173,249,177,247,178,247,181,251,184,251,184,252,187,255,189,255,193,259,193,261,195,263,195,264,201,270,203,278,207,282,208,289,211,293,211,296,213,299,214,304,215,305,216,312,219,316,219,319,220,320,220,325,222,329,222,335,223,336,223,338,225,342,225,349,226,350,226,359,227,360,227,366,228,367,228,371,231,375,231,382,227,385,226,388,225,389,223,388,219,392,216,392']), (946711423, 492654799, 631, 96, 172, 39, 261, 0.9928518, 1947740370, ['143,252,143,249,141,246,140,246,138,248,138,251,137,250,137,248,135,246,134,246,132,248,127,244,124,244,122,241,122,236,121,235,121,232,118,229,117,225,116,224,116,212,113,209,115,207,116,201,111,194,110,184,106,178,107,154,108,152,112,148,113,144,112,143,112,138,110,136,108,136,107,135,103,128,103,124,102,123,102,121,103,120,103,118,106,115,106,106,107,105,110,104,113,101,117,93,117,71,114,65,116,61,116,59,117,58,117,55,118,54,119,49,122,45,122,44,124,42,150,42,151,43,153,43,153,47,152,48,152,50,154,52,155,56,156,57,156,85,155,86,155,95,154,96,154,98,155,99,155,105,156,106,155,107,155,116,157,120,159,121,159,123,156,127,156,134,157,135,157,138,156,139,156,141,154,145,152,147,150,151,149,159,148,160,148,164,149,165,149,174,148,175,148,197,149,198,149,215,150,216,150,241,149,242,149,245,148,247,146,245,144,247', '122,147,121,138,120,141,119,142,119,144,118,145,121,148']), (946711423, 2096875719, 631, 468, 555, 292, 365, 0.9830025, 1947740372, ['491,350,489,350,488,349,487,350,483,350,480,348,480,341,482,339,482,337,485,334,487,334,491,330,494,330,495,328,498,326,501,326,503,324,507,325,509,323,514,321,516,319,518,321,520,321,521,319,522,319,524,321,527,321,530,317,530,315,531,314,535,313,540,309,543,310,544,311,542,313,542,314,544,316,541,318,541,322,536,322,535,323,533,323,532,322,528,322,527,321,524,321,522,323,518,322,516,324,517,327,516,328,512,327,510,329,512,332,513,332,515,330,516,331,516,333,514,332,511,333,511,336,514,337,516,336,516,339,515,339,513,338,511,340,512,341,512,342,510,343,507,343,502,347,500,347,497,349,492,349', '514,325,515,324,513,322,512,322,511,325,512,326', '522,327,521,327,521,326,522,325']), (946711423, 599722655, 631, 176, 535, 138, 264, 0.9818268, 1947740373, ['453,253,413,253,412,252,387,252,386,250,386,248,383,246,379,245,376,243,361,243,361,240,362,239,359,238,358,237,356,237,355,236,352,236,351,235,333,235,332,234,329,234,329,233,331,231,331,229,329,228,328,224,330,222,330,221,324,218,308,219,307,218,302,218,298,216,288,217,287,218,285,218,283,220,283,221,287,224,295,225,295,225,294,226,289,226,288,227,283,227,282,228,273,228,272,229,271,228,259,228,258,227,254,227,253,226,247,225,247,225,251,221,248,218,243,216,247,213,248,213,249,212,248,211,246,211,245,210,241,210,240,209,237,209,236,208,231,207,230,206,228,202,224,201,223,200,221,200,220,199,214,198,213,195,211,193,208,193,203,189,203,184,201,181,201,176,198,171,199,170,199,158,203,154,205,153,205,151,206,149,209,149,210,148,225,148,226,147,283,147,284,148,287,148,288,147,305,147,306,148,312,148,313,147,354,147,355,146,428,146,429,147,433,147,434,148,437,148,438,149,451,149,457,156,459,162,462,165,464,166,471,166,472,165,477,165,480,167,480,171,486,175,488,175,489,176,502,176,503,178,503,180,509,185,509,189,512,193,512,199,513,200,513,203,514,204,514,210,513,211,514,217,512,221,513,222,513,225,510,229,510,235,507,237,504,238,502,243,490,243,489,244,485,244,484,245,480,245,479,246,463,246,462,247,460,247,458,249,457,252,454,252', '528,212,528,207,526,206,524,203,526,203,527,202,528,202', '299,215,302,212,299,211,298,210,291,210,290,211,281,212,286,215,290,215,291,216', '375,242,376,240,375,238,363,239,368,242,371,242,372,243']), (946711423, 492844413, 631, 89, 163, 93, 144, 0.9772748, 1947740375, ['159,142,153,141,151,139,148,138,145,135,141,133,139,133,138,132,131,132,130,131,125,131,124,130,121,130,120,129,116,129,115,128,112,128,108,126,106,126,100,123,98,121,94,113,94,104,97,101,103,98,105,98,106,97,110,97,111,96,116,96,117,95,132,95,133,96,139,97,141,99,144,100,149,105,150,107,154,108,155,113,157,115,158,115,160,118,160,120,161,121,161,133,160,134,160,140']), (946711423, 2096875709, 631, 185, 431, 39, 136, 0.97171515, 1947740377, ['331,134,287,134,286,133,284,133,283,134,272,134,271,133,264,133,263,134,258,134,257,133,254,133,253,132,236,132,235,131,225,131,224,132,223,131,213,131,212,130,208,130,207,129,204,129,203,128,199,127,193,121,192,117,189,113,189,110,188,109,187,93,186,92,187,91,187,89,186,88,186,65,185,64,186,63,186,61,185,60,185,48,186,47,186,42,187,40,232,40,233,41,248,41,249,42,281,43,282,44,290,44,291,45,300,45,301,46,308,46,309,47,314,47,315,48,322,49,328,53,334,54,336,56,339,57,344,62,349,64,351,66,353,67,356,67,358,69,359,72,363,76,367,78,369,80,379,91,380,93,383,94,390,100,393,101,395,103,396,106,399,109,402,110,406,115,408,115,410,117,410,120,412,123,411,127,409,129,399,129,398,130,395,130,394,131,378,131,377,132,368,132,367,131,346,131,345,132,342,132,341,133,332,133']), (946711423, 2096875722, 631, 198, 395, 118, 142, 0.9699756, 1947740378, ['328,137,251,137,250,136,249,137,241,137,240,136,219,136,218,135,213,135,212,134,206,133,205,132,201,131,200,130,200,122,201,121,205,121,206,122,222,122,226,124,239,124,240,125,369,125,370,124,371,125,389,125,391,127,391,133,390,134,386,134,385,135,380,135,379,134,375,134,374,135,341,135,340,136,329,136']), (946711423, 499500794, 631, 93, 107, 127, 146, 0.9574813, 1947740379, ['101,143,98,143,95,139,95,131,97,129,100,129,101,133,102,134,102,136,103,137,103,140']), (946711423, 492925064, 631, 71, 125, 36, 95, 0.95296955, 1947740380, ['104,92,96,92,93,90,91,90,86,86,83,85,83,84,81,82,80,82,75,77,75,75,74,74,74,66,75,65,75,62,77,60,77,58,80,55,80,54,83,51,83,50,88,45,94,44,95,43,99,43,100,42,113,42,117,45,117,47,116,48,116,51,115,52,114,59,113,60,112,65,111,66,111,69,110,70,110,75,109,76,109,83,108,84,108,86,109,87,108,89']), (946711423, 492925064, 631, 101, 167, 38, 127, 0.9508439, 1947740381, ['154,117,152,115,152,112,150,110,148,106,148,104,145,101,143,100,138,100,137,99,135,99,133,95,131,95,126,93,126,91,128,88,128,83,129,82,129,70,127,68,127,66,128,65,125,61,127,59,127,56,129,52,129,49,130,47,135,42,144,42,148,45,151,49,151,60,152,61,152,75,153,76,153,80,155,83,155,87,156,88,156,105,155,106,156,107,156,110,154,112,156,116', '109,100,108,100,107,99,109,97']), (946711423, 492624020, 631, 249, 400, 219, 316, 0.8792459, 1947740382, ['395,313,390,313,386,311,384,312,381,312,376,309,358,308,357,307,354,307,353,306,350,306,349,307,345,305,343,303,341,304,337,304,334,302,325,302,324,301,315,300,313,298,313,297,310,295,304,295,300,293,295,293,291,288,289,287,283,287,281,285,281,283,278,280,274,280,272,279,272,276,270,273,270,270,269,268,266,265,265,265,264,264,264,262,261,260,260,258,260,256,261,255,259,252,259,248,258,246,255,244,256,241,252,239,251,238,251,226,265,226,266,227,268,227,272,232,276,232,277,233,279,233,281,234,283,237,285,238,290,239,293,241,296,241,304,246,312,247,316,251,318,252,320,252,326,255,328,255,337,260,342,260,343,261,345,261,349,264,351,264,355,266,357,268,364,271,366,273,370,275,374,275,376,276,377,279,379,281,383,282,384,283,386,283,387,286,390,289,390,290,394,294,396,294,398,296,398,308,397,309,397,311']), (946711423, 503548896, 631, 302, 540, 339, 403, 0.7406652, 1947740386, ['442,401,372,401,372,397,370,395,369,392,366,390,367,389,366,386,357,386,354,384,350,384,349,383,320,383,319,382,320,378,318,376,318,374,314,370,309,370,308,369,306,369,305,363,305,357,306,356,306,353,307,353,308,354,313,354,314,355,315,354,320,354,321,353,331,353,332,354,335,354,336,355,339,355,340,356,379,356,380,357,406,357,407,356,409,356,410,357,474,357,475,356,482,356,484,357,485,356,488,356,493,353,501,354,502,353,506,353,507,352,517,352,518,351,522,351,525,347,527,346,530,347,530,349,533,351,530,355,528,355,527,356,515,356,509,359,508,361,505,362,503,365,497,368,494,372,490,373,489,374,492,376,495,376,493,377,488,378,490,380,495,380,497,381,497,381,487,382,485,385,476,387,469,392,466,392,465,393,460,393,456,396,453,397,451,399,443,400', '519,353,518,352,517,353,518,354']), (946711423, 2106233860, 631, 53, 85, 75, 182, 0.73015845, 1947740387, ['70,147,68,145,65,139,65,137,62,132,61,128,57,126,56,124,56,121,54,119,54,110,56,108,56,103,59,100,60,101,61,100,61,96,63,93,63,89,66,84,65,83,65,80,66,78,68,78,68,79,70,80,70,83,74,87,74,90,75,91,75,100,77,102,77,105,75,106,75,125,76,126,77,125,77,125,77,128,76,129,77,131,77,136,75,139,75,143,78,145,76,146,71,146', '61,107,60,106,59,107,60,108', '77,134,76,131,75,134,76,135']), (946711423, 2096875717, 631, 477, 510, 220, 243, 0.69028217, 1947740388, ['501,241,493,241,489,239,488,237,487,237,480,232,479,230,479,226,484,222,487,222,488,223,492,224,496,228,497,228,497,229,502,234,502,235,504,236,504,240,502,240']), (946711423, 2096875712, 631, 309, 326, 382, 404, 0.6633776, 1947740390, ['309,383,309,382,311,382', '325,385,324,383,319,383,318,382,325,382', '320,400,311,400,309,398,309,385,310,386,311,385,315,385,316,384,318,384,319,385,322,385,325,387,325,398,323,398']), (946711423, 2096875719, 631, 427, 553, 258, 315, 0.6446218, 1947740391, ['531,284,526,284,525,283,525,281,523,279,522,280,519,281,521,282,521,283,519,284,518,283,519,281,515,279,516,277,515,276,513,276,512,277,513,279,511,280,507,279,505,276,504,279,497,279,496,278,495,279,485,279,484,280,480,280,479,281,481,283,482,283,482,283,469,283,468,284,438,284,436,283,440,279,446,279,447,278,456,278,458,276,457,275,457,275,467,275,468,274,490,274,491,273,494,273,496,271,496,269,499,268,501,266,503,265,506,265,510,263,514,263,516,261,520,259,544,259,544,259,543,260,545,262,548,262,550,263,550,276,549,277,547,277,540,282,538,282,537,283,532,283']), (946711423, 2106233861, 631, 144, 267, 181, 307, 0.63958377, 1947740392, ['212,251,209,251,208,250,203,251,201,250,201,249,195,243,189,242,188,241,185,241,184,240,182,236,180,236,179,235,173,235,172,234,170,235,164,235,163,234,163,232,162,231,163,217,166,217,168,218,170,215,171,210,172,209,173,209,176,212,178,210,178,208,181,203,186,203,188,201,193,201,194,202,195,201,201,201,202,202,204,202,205,201,209,201,210,202,212,202,215,200,217,200,220,202,221,201,227,201,231,205,231,206,234,209,235,209,235,210,238,213,238,224,234,228,235,232,234,233,228,234,225,237,224,241,222,242,216,242,209,246,211,248,212,248,213,250', '221,228,220,227,219,228,220,229', '224,238,224,237,221,235,217,237,219,239']), (946711423, 2096875712, 631, 285, 433, 343, 377, 0.61493844, 1947740393, ['431,376,286,376,285,375,285,368,286,367,286,362,287,361,287,359,291,359,297,362,306,363,307,364,312,364,313,365,322,366,323,367,331,366,332,368,334,368,335,369,338,368,337,366,336,366,337,365,336,364,327,364,325,361,319,361,317,357,317,356,318,355,325,355,326,353,331,353,333,351,332,350,330,350,328,348,326,348,325,347,319,347,315,345,306,345,305,344,297,344,299,344,300,343,431,343,432,344,432,353,431,354,431,358,430,359,430,365,427,365,425,363,424,363,422,364,421,366,418,366,413,369,404,370,409,371,409,371,399,371,398,372,395,373,419,374,420,373,426,372,428,370,428,367,429,367,430,368,429,369,430,370,430,373,431,374', '381,373,378,372,377,371,356,371,356,371,359,370,347,369,345,367,343,367,342,368,343,369,341,370,354,371,354,371,352,372,353,373,359,373,360,374']), (946711423, 2106233860, 631, 146, 287, 140, 311, 0.54784286, 1947740394, ['234,254,227,254,221,251,219,248,215,253,212,253,210,252,206,247,203,247,198,243,197,243,194,239,189,238,186,236,182,236,181,235,167,235,164,233,164,228,159,227,158,226,158,219,159,218,159,213,162,207,162,205,169,192,169,186,170,185,172,185,177,179,175,175,173,173,177,171,181,171,182,170,184,170,187,167,187,164,188,163,188,161,199,161,202,164,205,165,207,167,209,167,212,165,215,165,216,168,218,170,219,170,221,168,221,164,220,163,220,161,222,161,223,160,230,160,231,159,242,159,244,158,247,161,248,161,247,162,246,168,248,172,248,174,253,176,254,180,253,182,249,182,247,185,249,188,253,188,254,189,254,194,249,194,247,196,247,198,249,200,252,200,253,199,255,202,255,205,254,206,254,208,250,207,249,206,246,209,246,210,249,214,252,212,254,212,254,214,255,215,255,217,254,218,254,221,252,221,249,219,247,221,247,225,249,228,250,228,252,226,253,224,253,224,253,229,252,229,251,228,249,228,247,230,247,233,246,234,246,237,245,238,245,240,243,244,243,247,239,251,237,251', '230,167,229,166,227,167,228,168']), (946711423, 495920967, 631, 202, 524, 112, 333, 0.45109355, 1947740396, ['483,289,483,286,482,285,482,283,480,279,480,274,476,270,472,268,465,268,464,269,459,269,458,268,454,268,453,267,437,267,436,268,428,268,427,269,418,269,417,270,414,270,410,266,410,265,416,262,418,262,421,260,423,260,425,259,426,257,424,255,422,255,419,253,417,253,416,252,412,251,410,250,410,249,412,249,413,248,415,248,416,247,422,246,428,243,429,242,428,241,424,240,423,239,420,239,419,238,390,238,389,237,386,237,385,236,369,236,368,235,363,234,363,233,364,232,364,230,366,226,365,225,357,220,344,220,341,218,339,218,339,218,342,212,342,210,336,207,327,207,326,206,319,206,318,205,314,205,313,204,297,204,291,207,288,210,288,212,291,217,290,220,288,222,284,224,282,224,278,227,273,228,271,230,270,235,265,239,262,236,261,232,263,228,266,226,261,224,256,219,256,210,249,206,242,205,237,202,234,195,226,186,227,184,227,180,228,179,225,175,225,174,222,171,225,165,227,163,229,158,230,157,232,156,235,156,236,155,239,155,240,154,245,154,246,155,254,155,255,156,258,156,259,157,268,157,269,156,272,156,273,155,280,155,281,156,298,156,300,155,301,156,307,156,308,157,311,157,318,152,322,151,323,150,333,150,338,146,339,146,342,143,343,143,346,140,357,140,362,136,366,134,368,134,369,133,373,133,374,132,377,132,378,131,388,131,389,130,410,130,411,131,417,131,428,140,432,142,434,142,435,143,443,145,446,147,448,147,451,154,453,156,457,158,462,159,463,160,466,160,467,161,472,162,474,163,474,164,481,171,489,175,491,175,492,176,494,176,495,177,499,178,500,179,502,184,507,189,517,194,518,195,514,201,514,203,518,207,519,209,518,214,515,218,517,227,515,229,515,231,514,232,515,236,518,239,518,252,519,253,519,263,518,264,518,267,517,269,514,272,512,273,512,274,506,280,500,277,498,273,496,272,493,272,491,274,491,278,490,279,490,281', '312,179,311,178,308,179,309,180', '268,269,264,269,259,266,259,262,261,258,261,250,265,245,269,250,270,257,274,260,278,265,275,267,269,268', '414,281,401,281,414,281']), (946711423, 2096875722, 631, 433, 558, 248, 286, 0.44133398, 1947740397, ['492,272,474,272,473,271,468,271,465,269,460,269,460,268,465,266,467,266,468,265,470,265,471,264,475,264,476,263,479,263,480,262,486,262,487,261,491,261,492,260,495,260,496,259,502,259,506,257,510,257,514,255,517,255,518,254,530,253,531,252,535,252,536,251,538,251,539,252,543,252,544,253,547,253,549,251,553,251,555,253,555,267,552,270,550,270,550,269,548,267,547,267,547,267,548,266,547,265,545,266,540,266,539,264,530,264,529,263,524,263,519,266,513,266,510,268,507,268,506,269,499,270,498,271,493,271', '438,279,435,279,435,273,436,272,448,271,449,272,448,274,443,274,440,277,440,278']), (946711423, 492654799, 631, 399, 569, 68, 251, 0.41876298, 1947740399, []), (946711423, 492624020, 631, 420, 552, 244, 293, 0.35962066, 1947740400, ['474,289,453,289,452,288,439,288,437,286,431,286,427,284,423,284,422,283,422,275,427,275,428,273,430,272,435,272,436,271,438,271,442,269,447,269,450,267,454,267,460,264,464,264,467,262,483,261,484,260,488,260,489,259,494,259,495,258,502,258,503,257,505,257,509,255,512,255,516,252,520,252,521,251,526,250,530,248,534,248,535,247,546,247,547,248,549,248,549,250,550,251,550,266,551,267,551,275,550,276,550,278,549,279,549,281,537,282,535,284,528,284,527,285,504,285,503,286,495,286,492,288,488,287,487,288,475,288']), (946711423, 503548896, 631, 301, 540, 339, 403, 0.740756, 3140491551, ['442,401,371,401,371,397,366,390,365,386,356,386,353,384,348,383,319,383,319,378,314,370,310,370,305,368,304,357,305,353,330,353,339,356,378,356,379,357,474,357,475,356,488,356,493,353,501,354,507,352,517,352,522,351,527,346,530,347,533,351,530,355,527,356,515,356,505,362,503,365,497,368,494,372,489,374,492,376,488,378,490,380,495,380,487,382,485,385,476,387,469,392,461,393,456,395,451,399,447,399', '519,353,518,352,517,353,518,354'])],)} test detection filter by crop is a success ! ############################### TEST detection_filter_by_classif ################################ t Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : detection_filter_by_classif list_input_json : [] origin we have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB time to download the photos : 0.016968965530395508 About to test input to load Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:detection_filter_by_classif Sun Oct 5 05:24:22 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec beginning of step detection filter with classification results param_json : {'input_type': 631, 'output_type': 816, 'condition_type': 872, 'crops_ok': {'CAR_DOCUMENT.*': {}, 'CAR_INTERIEUR.*': {}, 'CAR_EXTERIEUR_angle_avant_droit.*': {'Retroviseur': 2, 'Roue': 2, 'Capot': 1, 'Pare-brise': 1, 'vitre': 10, 'phare': 2, 'Feu-antibrouillard': 2, 'poignee': 2, 'porte': 2, 'calandre': 1, 'logo-marque': 1, 'Plaque-immatriculation': 1, 'Essuie-glace': 1, 'pare-choc': 1, 'toit': 1, 'logo-roue': 1, 'aile-avant': 1}}, 'separation': {'CAR_EXTERIEUR_avant.*': {'pare-choc': ['pare-chocs-avant'], 'phare': ['phare-gauche', 'a-droite-de', 'phare-droit']}, 'CAR_EXTERIEUR_angle_avant_droit.*': {'pare-choc': ['pare-chocs-avant'], 'phare': ['phare-droite', 'a-gauche-de', 'phare-gauche'], 'porte': ['porte-avant', 'a-droite-de', 'porte-arriere']}}} conditional_crop_by_classif_copy batch 1 Loaded 35 chid ids of type : 631 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ treating photo 946711423 batch 1 Loaded 0 chid ids of type : 0 batch 1 Loaded 23 chid ids of type : 816 Number RLEs to save : 1600 TO DO : save crop sub photo not yet done ! time spend for datou_step_exec : 0.9948587417602539 time spend to save output : 9.608268737792969e-05 total time spend for step 1 : 0.9949548244476318 caffe_path_current : About to save ! 0 After save, about to update current ! test detection filter by classif is a success ! ############################### TEST blur_detection ################################ t Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : blur_detection list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.16755247116088867 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:blur_detection Sun Oct 5 05:24:23 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec inside step blur_detection methode: ratio et variance treat image : temp/1759634663_2282988_930729675_b2d2beaaee733d521cbb0c9800a29073.jpg resize: (600, 800) 930729675 12.961859636534896 time spend for datou_step_exec : 0.3079221248626709 time spend to save output : 3.7670135498046875e-05 total time spend for step 1 : 0.30795979499816895 caffe_path_current : About to save ! 0 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {930729675: [(930729675, 12.961859636534896, 492688767)]} {930729675: [(930729675, 12.961859636534896, 492688767)]} ############################### TEST detect_point_224x224 ################################ test_detect_point_224x224 Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : step 4589 thcl is not linked in the step_by_step architecture ! WARNING : step 4590 argmax is not linked in the step_by_step architecture ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! DataTypes for each output/input checked ! List Step Type Loaded in datou : thcl, argmax list_input_json : [] origin maybe url didn't existe for photo_id 987515175 , please check it BBBBBFBFBFBFBFBFBFBFBFBFBFBFBFFBBFBFBFmaybe url didn't existe for photo_id 987515212 , please check it BFBFBFBFBFBFmaybe url didn't existe for photo_id 987515195 , please check it BFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFmaybe url didn't existe for photo_id 987515248 , please check it BFBFBFBFBFBFBFBFBFBFFFFFBFwe have missing 4 photos in the step downloads : photo missing : [987515248, 987515195, 987515175, 987515212] try to delete the photos missing in DB HTTP Error 500: Internal Server Error HTTP Error 500: Internal Server Error HTTP Error 500: Internal Server Error HTTP Error 500: Internal Server Error length of list_filenames : 60 ; length of list_pids : 60 ; length of list_args : 60 time to download the photos : 123.00380253791809 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 2 step1:thcl Sun Oct 5 05:26:27 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step Thcl ! we are using the classfication for only one thcl 1528 time to import caffe and check if the image exist : 0.0043337345123291016 time to convert the images to numpy array : 0.04674124717712402 time to import caffe and check if the image exist : 0.007538318634033203 time to convert the images to numpy array : 0.047496795654296875 time to import caffe and check if the image exist : 0.008508920669555664 time to convert the images to numpy array : 0.04770207405090332 time to import caffe and check if the image exist : 0.013080120086669922 time to convert the images to numpy array : 0.04655957221984863 time to import caffe and check if the image exist : 0.005977630615234375 time to convert the images to numpy array : 0.05446934700012207 time to import caffe and check if the image exist : 0.006982326507568359 time to convert the images to numpy array : 0.0557706356048584 time to import caffe and check if the image exist : 0.010122060775756836 time to convert the images to numpy array : 0.052568912506103516 time to import caffe and check if the image exist : 0.004669666290283203 time to convert the images to numpy array : 0.05502772331237793 time to import caffe and check if the image exist : 0.01469731330871582 time to convert the images to numpy array : 0.04434394836425781 time to import caffe and check if the image exist : 0.017818212509155273 time to convert the images to numpy array : 0.04306149482727051 total time to convert the images to numpy array : 0.06624317169189453 list photo_ids error: [] list photo_ids correct : [987515189, 987515190, 987515192, 987515193, 987515196, 987515198, 987515200, 987515201, 987515202, 987515204, 987515205, 987515176, 987515177, 987515178, 987515179, 987515180, 987515181, 987515182, 987515208, 987515209, 987515211, 987515213, 987515215, 987515216, 987515183, 987515184, 987515185, 987515186, 987515187, 987515207, 987515239, 987515240, 987515241, 987515242, 987515243, 987515244, 987515245, 987515246, 987515247, 987515249, 987515250, 987515188, 987515217, 987515219, 987515220, 987515222, 987515223, 987515224, 987515233, 987515234, 987515235, 987515236, 987515237, 987515238, 987515226, 987515227, 987515228, 987515230, 987515231, 987515232] number of photos to traite : 60 try to delete the photos incorrect in DB tagging for thcl : 1528 To do loadFromThcl(), then load ParamDescType : thcl1528 thcls : [{'id': 1528, 'mtr_user_id': 31, 'name': 'learn_refus_upm_blanches_1924', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Autre_Environement,Carton,Kraft,Lointain_Papier_Magazine,Metal,Papier_Magazine,Plastique,Sol_Environement,Teint_Dans_La_Masse,autre_refus', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 1927, 'photo_desc_type': 4421, 'type_classification': 'caffe', 'hashtag_id_list': '2107752388,492774966,493202403,2107752389,492628673,2107752386,492725882,2107752387,2107752385,2107752406'}] thcl {'id': 1528, 'mtr_user_id': 31, 'name': 'learn_refus_upm_blanches_1924', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Autre_Environement,Carton,Kraft,Lointain_Papier_Magazine,Metal,Papier_Magazine,Plastique,Sol_Environement,Teint_Dans_La_Masse,autre_refus', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 1927, 'photo_desc_type': 4421, 'type_classification': 'caffe', 'hashtag_id_list': '2107752388,492774966,493202403,2107752389,492628673,2107752386,492725882,2107752387,2107752385,2107752406'} Update svm_hashtag_type_desc : 4421 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (4421, 'learn_refus_upm_blanches_1924', 16384, 25088, 'learn_refus_upm_blanches_1924', 'res5b', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2019, 10, 22, 17, 39, 25), datetime.datetime(2019, 10, 22, 17, 39, 25)) To loadFromThcl() : net_4421 begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 2926 max_wait_temp : 1 max_wait : 0 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (4421, 'learn_refus_upm_blanches_1924', 16384, 25088, 'learn_refus_upm_blanches_1924', 'res5b', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2019, 10, 22, 17, 39, 25), datetime.datetime(2019, 10, 22, 17, 39, 25)) None mean_file_type : mean_file_path : prototxt_file_path : model : learn_refus_upm_blanches_1924 Inside get_net Inside get_net before cache_data_model model_param file didn't exist Inside get_net before CDM.load_model_par_type model_name : learn_refus_upm_blanches_1924 model_type : caffe list file need : ['caffemodel', 'deploy_conv_normal.prototxt', 'deploy_fc.prototxt', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file exist in s3 : ['caffemodel', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file manque in s3 : ['deploy_conv_normal.prototxt', 'deploy_fc.prototxt'] local folder : /data/models_weight/learn_refus_upm_blanches_1924 /data/models_weight/learn_refus_upm_blanches_1924/caffemodel size_local : 45774543 size in s3 : 45774543 create time local : 2021-08-09 05:29:53 create time in s3 : 2021-08-06 19:36:04 caffemodel already exist and didn't need to update /data/models_weight/learn_refus_upm_blanches_1924/deploy.prototxt size_local : 17312 size in s3 : 17312 create time local : 2021-08-09 05:29:53 create time in s3 : 2021-08-06 19:36:03 deploy.prototxt already exist and didn't need to update /data/models_weight/learn_refus_upm_blanches_1924/mean.npy size_local : 1572992 size in s3 : 1572992 create time local : 2021-08-09 05:29:53 create time in s3 : 2021-08-06 19:36:05 mean.npy already exist and didn't need to update /data/models_weight/learn_refus_upm_blanches_1924/synset_words.txt size_local : 218 size in s3 : 218 create time local : 2021-08-09 05:29:53 create time in s3 : 2021-08-06 19:36:04 synset_words.txt already exist and didn't need to update Inside get_net after CDM.load_model_par_type After if not only_with_local_cache: /home/admin/workarea/install/darknet/:/home/admin/workarea/git/Velours/python:/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python:/home/admin/mtr/.credentials:/home/admin/workarea/install/caffe/python:/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools/:/home/admin/workarea/git/fotonowerpip/:/home/admin/workarea/install/segment-anything:/home/admin//workarea/git/pyfvs/ Here before set mode gpu Doing nothing but we could set mode gpu after set mode gpu prototxt_filename : /data/models_weight/learn_refus_upm_blanches_1924/deploy.prototxt caffemodel_filename : /data/models_weight/learn_refus_upm_blanches_1924/caffemodel now we set caffe to gpu mode before predict begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 2926 max_wait_temp : 1 max_wait : 0 dict_keys(['prob', 'res5b']) time used to do the prepocess of the images : 0.0640110969543457 time used to do the prediction : 0.14201998710632324 save descriptor for thcl : 1528 time to traite the descriptors : 3.9659266471862793 storage_type for insertDescriptorsMulti : 1 To insert : 987515189 To insert : 987515190 To insert : 987515192 To insert : 987515193 To insert : 987515196 To insert : 987515198 To insert : 987515200 To insert : 987515201 To insert : 987515202 To insert : 987515204 To insert : 987515205 To insert : 987515176 To insert : 987515177 To insert : 987515178 To insert : 987515179 To insert : 987515180 To insert : 987515181 To insert : 987515182 To insert : 987515208 To insert : 987515209 To insert : 987515211 To insert : 987515213 To insert : 987515215 To insert : 987515216 To insert : 987515183 To insert : 987515184 To insert : 987515185 To insert : 987515186 To insert : 987515187 To insert : 987515207 To insert : 987515239 To insert : 987515240 To insert : 987515241 To insert : 987515242 To insert : 987515243 To insert : 987515244 To insert : 987515245 To insert : 987515246 To insert : 987515247 To insert : 987515249 To insert : 987515250 To insert : 987515188 To insert : 987515217 To insert : 987515219 To insert : 987515220 To insert : 987515222 To insert : 987515223 To insert : 987515224 To insert : 987515233 To insert : 987515234 To insert : 987515235 To insert : 987515236 To insert : 987515237 To insert : 987515238 To insert : 987515226 To insert : 987515227 To insert : 987515228 To insert : 987515230 To insert : 987515231 To insert : 987515232 time to insert the descriptors : 16.761760234832764 time spend for datou_step_exec : 24.70028328895569 time spend to save output : 7.987022399902344e-05 total time spend for step 1 : 24.700363159179688 step2:argmax Sun Oct 5 05:26:52 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou_step Argmax ! calculate argmax for thcl : 1528 time spend for datou_step_exec : 0.0009367465972900391 time spend to save output : 1.049041748046875e-05 total time spend for step 2 : 0.0009472370147705078 caffe_path_current : About to save ! 0 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 2 output : {'987515189': [('987515189', 'Carton', 0.9977913, 1927, '1528'), 'temp/1759634664_2282988_987515189_8e8590a26f72249d4c2116dffd0cf668.jpg'], '987515190': [('987515190', 'Carton', 0.9763833, 1927, '1528'), 'temp/1759634664_2282988_987515190_d56932bfc6ba2a8c974c691108755017.jpg'], '987515192': [('987515192', 'Papier_Magazine', 0.9999114, 1927, '1528'), 'temp/1759634664_2282988_987515192_b661073b218f5f056833d6af1c617153.jpg'], '987515193': [('987515193', 'Papier_Magazine', 0.99939716, 1927, '1528'), 'temp/1759634664_2282988_987515193_1a97fceb4dcbf5821d783b2e00b52fe6.jpg'], '987515196': [('987515196', 'Carton', 0.9847227, 1927, '1528'), 'temp/1759634664_2282988_987515196_30ccb89dfe410c445878a7f2819ddc36.jpg'], '987515198': [('987515198', 'Carton', 0.9659588, 1927, '1528'), 'temp/1759634664_2282988_987515198_599e80f444c876f407e94b533c89360b.jpg'], '987515200': [('987515200', 'Carton', 0.9859773, 1927, '1528'), 'temp/1759634664_2282988_987515200_978964436b5d5fb0eeda17e3bfafe889.jpg'], '987515201': [('987515201', 'Carton', 0.9954665, 1927, '1528'), 'temp/1759634664_2282988_987515201_b224d2acdc7fa2bbb134c09db6bca7ce.jpg'], '987515202': [('987515202', 'Carton', 0.9910931, 1927, '1528'), 'temp/1759634664_2282988_987515202_3314bd90d1404f31b827d8925abf2d62.jpg'], '987515204': [('987515204', 'Papier_Magazine', 0.9950717, 1927, '1528'), 'temp/1759634664_2282988_987515204_9779c4f9d44360a9c80499e3b01e8a09.jpg'], '987515205': [('987515205', 'Papier_Magazine', 0.99085873, 1927, '1528'), 'temp/1759634664_2282988_987515205_fd4b136d0b3a9a1a347942d7191f6fea.jpg'], '987515176': [('987515176', 'Papier_Magazine', 0.9998141, 1927, '1528'), 'temp/1759634664_2282988_987515176_8b398cba2f448622cd9657f5eb3f9796.jpg'], '987515177': [('987515177', 'Papier_Magazine', 0.97709596, 1927, '1528'), 'temp/1759634664_2282988_987515177_4a54e9967227806219ddf45d256539d8.jpg'], '987515178': [('987515178', 'Carton', 0.8572532, 1927, '1528'), 'temp/1759634664_2282988_987515178_298b3d2bfe0fda6787b59a78e2e68867.jpg'], '987515179': [('987515179', 'Carton', 0.9269205, 1927, '1528'), 'temp/1759634664_2282988_987515179_f7d4d1757a470f4c96dc3541eac88b9e.jpg'], '987515180': [('987515180', 'Carton', 0.9899734, 1927, '1528'), 'temp/1759634664_2282988_987515180_776a5d7d8486ee2961bbe3a0d90f95b5.jpg'], '987515181': [('987515181', 'Carton', 0.997781, 1927, '1528'), 'temp/1759634664_2282988_987515181_1738c2798fb31152809ecb443ac286d6.jpg'], '987515182': [('987515182', 'Carton', 0.9924264, 1927, '1528'), 'temp/1759634664_2282988_987515182_fe7f29bf6d13e08c3e985f91b5232178.jpg'], '987515208': [('987515208', 'Carton', 0.9917282, 1927, '1528'), 'temp/1759634664_2282988_987515208_a2b90cb74908aa64bbc4aae58f0c5ae8.jpg'], '987515209': [('987515209', 'Carton', 0.9677462, 1927, '1528'), 'temp/1759634664_2282988_987515209_02dfe1ae39f51994652f4a8538844aea.jpg'], '987515211': [('987515211', 'Carton', 0.97340405, 1927, '1528'), 'temp/1759634664_2282988_987515211_72cc7664d45bd40477351b9b764f1500.jpg'], '987515213': [('987515213', 'Carton', 0.98688656, 1927, '1528'), 'temp/1759634664_2282988_987515213_b0a038fcb9678ebfd60d9b1f6ec1fc17.jpg'], '987515215': [('987515215', 'Papier_Magazine', 0.9939241, 1927, '1528'), 'temp/1759634664_2282988_987515215_902ef348a7eebb9a8b87f42927347936.jpg'], '987515216': [('987515216', 'Papier_Magazine', 0.97744375, 1927, '1528'), 'temp/1759634664_2282988_987515216_4f7dc21f1d2cd3fcabadc4a6755921e1.jpg'], '987515183': [('987515183', 'Papier_Magazine', 0.99999213, 1927, '1528'), 'temp/1759634664_2282988_987515183_6aab9ca0421398b4899892c10c2594c6.jpg'], '987515184': [('987515184', 'Papier_Magazine', 0.99973196, 1927, '1528'), 'temp/1759634664_2282988_987515184_19c8c2177209a285df6014d95fe53f2c.jpg'], '987515185': [('987515185', 'Papier_Magazine', 0.7984763, 1927, '1528'), 'temp/1759634664_2282988_987515185_e172d54457cabee9d7f02ee1300f3ae9.jpg'], '987515186': [('987515186', 'Carton', 0.98477143, 1927, '1528'), 'temp/1759634664_2282988_987515186_797def426440b544aa80dbd63a19234a.jpg'], '987515187': [('987515187', 'Carton', 0.9810558, 1927, '1528'), 'temp/1759634664_2282988_987515187_9f62f98efd3caca0b9c17d27f5c70440.jpg'], '987515207': [('987515207', 'Papier_Magazine', 0.87404877, 1927, '1528'), 'temp/1759634664_2282988_987515207_de216ddb041e249524b0fb2b949064a5.jpg'], '987515239': [('987515239', 'Carton', 0.999783, 1927, '1528'), 'temp/1759634664_2282988_987515239_b3fa6f29636080b5138c8d8c33fea309.jpg'], '987515240': [('987515240', 'Carton', 0.9995198, 1927, '1528'), 'temp/1759634664_2282988_987515240_7829b9b15f1bf128ea4e2c1a39b9f0dd.jpg'], '987515241': [('987515241', 'Carton', 0.98213357, 1927, '1528'), 'temp/1759634664_2282988_987515241_073420d938f5f010ffd5b4353c064e09.jpg'], '987515242': [('987515242', 'Carton', 0.9359004, 1927, '1528'), 'temp/1759634664_2282988_987515242_327abb5215d6fd1f0aad51f53ed8c324.jpg'], '987515243': [('987515243', 'Papier_Magazine', 0.87432617, 1927, '1528'), 'temp/1759634664_2282988_987515243_4375283f3bc5cdaa431c2fc6f17f53a4.jpg'], '987515244': [('987515244', 'Papier_Magazine', 0.81671196, 1927, '1528'), 'temp/1759634664_2282988_987515244_419530eaef5ef868f75c758b94eea4b4.jpg'], '987515245': [('987515245', 'Carton', 0.8659148, 1927, '1528'), 'temp/1759634664_2282988_987515245_757d9d208d5bd4375c5f21f68b699148.jpg'], '987515246': [('987515246', 'Carton', 0.99923396, 1927, '1528'), 'temp/1759634664_2282988_987515246_671a708f67f2efa19004b8257fc7b9c8.jpg'], '987515247': [('987515247', 'Carton', 0.99966896, 1927, '1528'), 'temp/1759634664_2282988_987515247_e47b65403df916ba909bc9c439b0af73.jpg'], '987515249': [('987515249', 'Carton', 0.9812608, 1927, '1528'), 'temp/1759634664_2282988_987515249_a70ad88462a22fb62a120721a42b2d42.jpg'], '987515250': [('987515250', 'Carton', 0.98080117, 1927, '1528'), 'temp/1759634664_2282988_987515250_b2827c9639df69656f23abcc7f2f82d9.jpg'], '987515188': [('987515188', 'Carton', 0.99567056, 1927, '1528'), 'temp/1759634664_2282988_987515188_4116f9906657a69bb76c2fda982037b9.jpg'], '987515217': [('987515217', 'Carton', 0.52929336, 1927, '1528'), 'temp/1759634664_2282988_987515217_78877bb2c5760be28518d17f77d1c609.jpg'], '987515219': [('987515219', 'Carton', 0.9993698, 1927, '1528'), 'temp/1759634664_2282988_987515219_c2d417a5ba6ccf7c84527636f8d5eef9.jpg'], '987515220': [('987515220', 'Carton', 0.99637836, 1927, '1528'), 'temp/1759634664_2282988_987515220_e729f316c4c3b32049adfbaaa336d95c.jpg'], '987515222': [('987515222', 'Carton', 0.9974583, 1927, '1528'), 'temp/1759634664_2282988_987515222_067a027bc7402f969b6277d0dcb47eaa.jpg'], '987515223': [('987515223', 'Carton', 0.9920736, 1927, '1528'), 'temp/1759634664_2282988_987515223_ebb57f09941cd11d7ee45a9368a883c1.jpg'], '987515224': [('987515224', 'Carton', 0.90839016, 1927, '1528'), 'temp/1759634664_2282988_987515224_e8747b400e713ecbd08d5b75db4d7568.jpg'], '987515233': [('987515233', 'Carton', 0.98347616, 1927, '1528'), 'temp/1759634664_2282988_987515233_a92514bed0e8c5724f2d032d3ab1e2ad.jpg'], '987515234': [('987515234', 'Carton', 0.94490683, 1927, '1528'), 'temp/1759634664_2282988_987515234_2eca3480aed0f8b876242675ad99b666.jpg'], '987515235': [('987515235', 'Papier_Magazine', 0.8920974, 1927, '1528'), 'temp/1759634664_2282988_987515235_87075955a2f76b3948b47ffe1825ecd9.jpg'], '987515236': [('987515236', 'Papier_Magazine', 0.5370348, 1927, '1528'), 'temp/1759634664_2282988_987515236_8b44a98b1aceadad73ed000d65836a9a.jpg'], '987515237': [('987515237', 'Carton', 0.77036667, 1927, '1528'), 'temp/1759634664_2282988_987515237_1183dfa371a457f11ce2b622c7cf9467.jpg'], '987515238': [('987515238', 'Carton', 0.99957436, 1927, '1528'), 'temp/1759634664_2282988_987515238_e6292cb81e05894cfeb4b99f21a1d3f8.jpg'], '987515226': [('987515226', 'Papier_Magazine', 0.9869839, 1927, '1528'), 'temp/1759634664_2282988_987515226_a18048dca1a77ae086b62cf07759f704.jpg'], '987515227': [('987515227', 'Papier_Magazine', 0.9004452, 1927, '1528'), 'temp/1759634664_2282988_987515227_e9c45a0e576ec9e44c1379c3fc5fec7c.jpg'], '987515228': [('987515228', 'Papier_Magazine', 0.52154404, 1927, '1528'), 'temp/1759634664_2282988_987515228_9f1759f20c9e603bccb9f9879d2f0d54.jpg'], '987515230': [('987515230', 'Carton', 0.9994066, 1927, '1528'), 'temp/1759634664_2282988_987515230_846ad925884264181565c81d152a2e94.jpg'], '987515231': [('987515231', 'Carton', 0.9994211, 1927, '1528'), 'temp/1759634664_2282988_987515231_dbf4cafa71b6db4771c5c8f0c25e9cda.jpg'], '987515232': [('987515232', 'Carton', 0.9992467, 1927, '1528'), 'temp/1759634664_2282988_987515232_38db7950cdb3c674ee0ad65915b021f3.jpg']} Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : detect_points list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.1293177604675293 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:detect_points Sun Oct 5 05:26:52 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step predict points ! Inside try reload ! gpu_mode in detect_points : 1 To load net FromThcl() model_param file didn't exist model_name : learn_refus_upm_blanches_1924 model_type : caffe list file need : ['caffemodel', 'deploy_conv_normal.prototxt', 'deploy_fc.prototxt', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file exist in s3 : ['caffemodel', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file manque in s3 : ['deploy_conv_normal.prototxt', 'deploy_fc.prototxt'] local folder : /data/models_weight/learn_refus_upm_blanches_1924 /data/models_weight/learn_refus_upm_blanches_1924/caffemodel size_local : 45774543 size in s3 : 45774543 create time local : 2021-08-09 05:29:53 create time in s3 : 2021-08-06 19:36:04 caffemodel already exist and didn't need to update /data/models_weight/learn_refus_upm_blanches_1924/deploy.prototxt size_local : 17312 size in s3 : 17312 create time local : 2021-08-09 05:29:53 create time in s3 : 2021-08-06 19:36:03 deploy.prototxt already exist and didn't need to update /data/models_weight/learn_refus_upm_blanches_1924/mean.npy size_local : 1572992 size in s3 : 1572992 create time local : 2021-08-09 05:29:53 create time in s3 : 2021-08-06 19:36:05 mean.npy already exist and didn't need to update /data/models_weight/learn_refus_upm_blanches_1924/synset_words.txt size_local : 218 size in s3 : 218 create time local : 2021-08-09 05:29:53 create time in s3 : 2021-08-06 19:36:04 synset_words.txt already exist and didn't need to update reshape net's input to : (224, 224) origin shape : (10, 3, 224, 224) after reshape : (1, 3, 224, 224) [('data', (1, 3, 224, 224)), ('conv1', (1, 64, 112, 112)), ('pool1', (1, 64, 56, 56)), ('pool1_pool1_0_split_0', (1, 64, 56, 56)), ('pool1_pool1_0_split_1', (1, 64, 56, 56)), ('res2a_branch1', (1, 64, 56, 56)), ('res2a_branch2a', (1, 64, 56, 56)), ('res2a_branch2b', (1, 64, 56, 56)), ('res2a', (1, 64, 56, 56)), ('res2a_res2a_relu_0_split_0', (1, 64, 56, 56)), ('res2a_res2a_relu_0_split_1', (1, 64, 56, 56)), ('res2b_branch2a', (1, 64, 56, 56)), ('res2b_branch2b', (1, 64, 56, 56)), ('res2b', (1, 64, 56, 56)), ('res2b_res2b_relu_0_split_0', (1, 64, 56, 56)), ('res2b_res2b_relu_0_split_1', (1, 64, 56, 56)), ('res3a_branch1', (1, 128, 28, 28)), ('res3a_branch2a', (1, 128, 28, 28)), ('res3a_branch2b', (1, 128, 28, 28)), ('res3a', (1, 128, 28, 28)), ('res3a_res3a_relu_0_split_0', (1, 128, 28, 28)), ('res3a_res3a_relu_0_split_1', (1, 128, 28, 28)), ('res3b_branch2a', (1, 128, 28, 28)), ('res3b_branch2b', (1, 128, 28, 28)), ('res3b', (1, 128, 28, 28)), ('res3b_res3b_relu_0_split_0', (1, 128, 28, 28)), ('res3b_res3b_relu_0_split_1', (1, 128, 28, 28)), ('res4a_branch1', (1, 256, 14, 14)), ('res4a_branch2a', (1, 256, 14, 14)), ('res4a_branch2b', (1, 256, 14, 14)), ('res4a', (1, 256, 14, 14)), ('res4a_res4a_relu_0_split_0', (1, 256, 14, 14)), ('res4a_res4a_relu_0_split_1', (1, 256, 14, 14)), ('res4b_branch2a', (1, 256, 14, 14)), ('res4b_branch2b', (1, 256, 14, 14)), ('res4b', (1, 256, 14, 14)), ('res4b_res4b_relu_0_split_0', (1, 256, 14, 14)), ('res4b_res4b_relu_0_split_1', (1, 256, 14, 14)), ('res5a_branch1', (1, 512, 7, 7)), ('res5a_branch2a', (1, 512, 7, 7)), ('res5a_branch2b', (1, 512, 7, 7)), ('res5a', (1, 512, 7, 7)), ('res5a_res5a_relu_0_split_0', (1, 512, 7, 7)), ('res5a_res5a_relu_0_split_1', (1, 512, 7, 7)), ('res5b_branch2a', (1, 512, 7, 7)), ('res5b_branch2b', (1, 512, 7, 7)), ('res5b', (1, 512, 7, 7)), ('fc2019-10-22_15-02-46', (1, 10, 1, 1)), ('prob', (1, 10, 1, 1))] set image transformer : About to compute detect the points : len(args) : 1 Inside predict_points step exec : nb paths : 1 treate image : temp/1759634812_2282988_987515173_91fa471b1a04f95b356afdbaf021f623.jpg size of numpy array img : 2408584 scale method : caffe/skimage size of numpy array img_scale : 2408584 (448, 448, 3) nb_h 8 nb_w 8 size of sub images : (224, 224, 3) size of caffe_input : 38535320 (64, 3, 224, 224) time to do the preprocess : 0.04147958755493164 time to do a prediction : 0.2418215274810791 dict_keys(['prob']) shape of output (64, 10, 1, 1) shape of the out_put heatmap (10, 8, 8) number of sub_photos vertical and horizon 8 8 size of heatmap : (8,8) size of heatmap : (8,8) size of heatmap : (8,8) size of heatmap : (8,8) size of heatmap : (8,8) size of heatmap : (8,8) size of heatmap : (8,8) size of heatmap : (8,8) size of heatmap : (8,8) size of heatmap : (8,8) time spend for datou_step_exec : 1.6646842956542969 time spend to save output : 3.3855438232421875e-05 total time spend for step 1 : 1.6647181510925293 caffe_path_current : About to save ! 0 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {987515173: [(987515173, 1982, 'Autre_Environement', 112, -1, 112, -1, 6.2738729142419736e-12), (987515173, 1982, 'Autre_Environement', 144, -1, 112, -1, 2.4550051228033887e-11), (987515173, 1982, 'Autre_Environement', 176, -1, 112, -1, 1.0628511404320307e-08), (987515173, 1982, 'Autre_Environement', 208, -1, 112, -1, 4.4457962644628424e-07), (987515173, 1982, 'Autre_Environement', 240, -1, 112, -1, 1.9179133232682943e-06), (987515173, 1982, 'Autre_Environement', 272, -1, 112, -1, 3.771615229197778e-05), (987515173, 1982, 'Autre_Environement', 304, -1, 112, -1, 0.00012228268315084279), (987515173, 1982, 'Autre_Environement', 336, -1, 112, -1, 2.9449340217979625e-05), (987515173, 1982, 'Autre_Environement', 112, -1, 144, -1, 2.3751319133680227e-08), (987515173, 1982, 'Autre_Environement', 144, -1, 144, -1, 2.2095473894978568e-08), (987515173, 1982, 'Autre_Environement', 176, -1, 144, -1, 1.381740304395862e-07), (987515173, 1982, 'Autre_Environement', 208, -1, 144, -1, 1.4723466392752016e-06), (987515173, 1982, 'Autre_Environement', 240, -1, 144, -1, 1.130070086219348e-05), (987515173, 1982, 'Autre_Environement', 272, -1, 144, -1, 0.0001580465614097193), (987515173, 1982, 'Autre_Environement', 304, -1, 144, -1, 0.0004436133895069361), (987515173, 1982, 'Autre_Environement', 336, -1, 144, -1, 6.54067043797113e-05), (987515173, 1982, 'Autre_Environement', 112, -1, 176, -1, 1.3317809361979016e-06), (987515173, 1982, 'Autre_Environement', 144, -1, 176, -1, 1.6290135818053386e-06), (987515173, 1982, 'Autre_Environement', 176, -1, 176, -1, 2.5289571112807607e-06), (987515173, 1982, 'Autre_Environement', 208, -1, 176, -1, 1.6085141396615654e-06), (987515173, 1982, 'Autre_Environement', 240, -1, 176, -1, 6.238918558665318e-06), (987515173, 1982, 'Autre_Environement', 272, -1, 176, -1, 8.626116323284805e-05), (987515173, 1982, 'Autre_Environement', 304, -1, 176, -1, 0.0003253525937907398), (987515173, 1982, 'Autre_Environement', 336, -1, 176, -1, 0.0003067129582632333), (987515173, 1982, 'Autre_Environement', 112, -1, 208, -1, 1.860252632468473e-05), (987515173, 1982, 'Autre_Environement', 144, -1, 208, -1, 7.876939889683854e-06), (987515173, 1982, 'Autre_Environement', 176, -1, 208, -1, 2.7082936867373064e-05), (987515173, 1982, 'Autre_Environement', 208, -1, 208, -1, 1.804059684218373e-05), (987515173, 1982, 'Autre_Environement', 240, -1, 208, -1, 2.3557642634841613e-05), (987515173, 1982, 'Autre_Environement', 272, -1, 208, -1, 1.6999989384203218e-05), (987515173, 1982, 'Autre_Environement', 304, -1, 208, -1, 4.547386652120622e-06), (987515173, 1982, 'Autre_Environement', 336, -1, 208, -1, 8.776374670560472e-06), (987515173, 1982, 'Autre_Environement', 112, -1, 240, -1, 6.099151050875662e-06), (987515173, 1982, 'Autre_Environement', 144, -1, 240, -1, 1.648167767598352e-06), (987515173, 1982, 'Autre_Environement', 176, -1, 240, -1, 1.9642907318484504e-06), (987515173, 1982, 'Autre_Environement', 208, -1, 240, -1, 1.4367076346388785e-06), (987515173, 1982, 'Autre_Environement', 240, -1, 240, -1, 7.831054972484708e-06), (987515173, 1982, 'Autre_Environement', 272, -1, 240, -1, 1.2892345694126561e-05), (987515173, 1982, 'Autre_Environement', 304, -1, 240, -1, 9.287549801229034e-06), (987515173, 1982, 'Autre_Environement', 336, -1, 240, -1, 2.1690337234758772e-05), (987515173, 1982, 'Autre_Environement', 112, -1, 272, -1, 3.826833562925458e-06), (987515173, 1982, 'Autre_Environement', 144, -1, 272, -1, 2.54541600952507e-06), (987515173, 1982, 'Autre_Environement', 176, -1, 272, -1, 2.965051180581213e-06), (987515173, 1982, 'Autre_Environement', 208, -1, 272, -1, 2.7526555186341284e-06), (987515173, 1982, 'Autre_Environement', 240, -1, 272, -1, 4.317631464800797e-06), (987515173, 1982, 'Autre_Environement', 272, -1, 272, -1, 8.18494208942866e-06), (987515173, 1982, 'Autre_Environement', 304, -1, 272, -1, 1.1524578440003097e-05), (987515173, 1982, 'Autre_Environement', 336, -1, 272, -1, 3.924907650798559e-05), (987515173, 1982, 'Autre_Environement', 112, -1, 304, -1, 1.2308129043958616e-05), (987515173, 1982, 'Autre_Environement', 144, -1, 304, -1, 1.5689185602241196e-05), (987515173, 1982, 'Autre_Environement', 176, -1, 304, -1, 3.339910108479671e-05), (987515173, 1982, 'Autre_Environement', 208, -1, 304, -1, 0.00015466778131667525), (987515173, 1982, 'Autre_Environement', 240, -1, 304, -1, 0.00025957811158150434), (987515173, 1982, 'Autre_Environement', 272, -1, 304, -1, 0.00018865568563342094), (987515173, 1982, 'Autre_Environement', 304, -1, 304, -1, 0.00021353580814320594), (987515173, 1982, 'Autre_Environement', 336, -1, 304, -1, 0.00016418272571172565), (987515173, 1982, 'Autre_Environement', 112, -1, 336, -1, 4.556975000014063e-06), (987515173, 1982, 'Autre_Environement', 144, -1, 336, -1, 1.7327407476841472e-05), (987515173, 1982, 'Autre_Environement', 176, -1, 336, -1, 4.9286740249954164e-05), (987515173, 1982, 'Autre_Environement', 208, -1, 336, -1, 0.00012142434570705518), (987515173, 1982, 'Autre_Environement', 240, -1, 336, -1, 0.00019638086087070405), (987515173, 1982, 'Autre_Environement', 272, -1, 336, -1, 0.0001880270428955555), (987515173, 1982, 'Autre_Environement', 304, -1, 336, -1, 0.00012352326302789152), (987515173, 1982, 'Autre_Environement', 336, -1, 336, -1, 0.0002723882207646966), (987515173, 1982, 'Carton', 112, -1, 112, -1, 1.5763404803692538e-07), (987515173, 1982, 'Carton', 144, -1, 112, -1, 4.05132914238493e-06), (987515173, 1982, 'Carton', 176, -1, 112, -1, 6.990682322793873e-06), (987515173, 1982, 'Carton', 208, -1, 112, -1, 0.0008727443055249751), (987515173, 1982, 'Carton', 240, -1, 112, -1, 0.0026408799458295107), (987515173, 1982, 'Carton', 272, -1, 112, -1, 0.003375329077243805), (987515173, 1982, 'Carton', 304, -1, 112, -1, 0.03133251518011093), (987515173, 1982, 'Carton', 336, -1, 112, -1, 0.055769383907318115), (987515173, 1982, 'Carton', 112, -1, 144, -1, 0.00012456874537747353), (987515173, 1982, 'Carton', 144, -1, 144, -1, 0.00020860486256424338), (987515173, 1982, 'Carton', 176, -1, 144, -1, 0.00036968212225474417), (987515173, 1982, 'Carton', 208, -1, 144, -1, 0.006833076011389494), (987515173, 1982, 'Carton', 240, -1, 144, -1, 0.01589813083410263), (987515173, 1982, 'Carton', 272, -1, 144, -1, 0.009402163326740265), (987515173, 1982, 'Carton', 304, -1, 144, -1, 0.009782025590538979), (987515173, 1982, 'Carton', 336, -1, 144, -1, 0.0220881849527359), (987515173, 1982, 'Carton', 112, -1, 176, -1, 0.021936502307653427), (987515173, 1982, 'Carton', 144, -1, 176, -1, 0.1946984827518463), (987515173, 1982, 'Carton', 176, -1, 176, -1, 0.09662488102912903), (987515173, 1982, 'Carton', 208, -1, 176, -1, 0.12327710539102554), (987515173, 1982, 'Carton', 240, -1, 176, -1, 0.5323872566223145), (987515173, 1982, 'Carton', 272, -1, 176, -1, 0.4602355659008026), (987515173, 1982, 'Carton', 304, -1, 176, -1, 0.7712738513946533), (987515173, 1982, 'Carton', 336, -1, 176, -1, 0.8661515712738037), (987515173, 1982, 'Carton', 112, -1, 208, -1, 0.849798321723938), (987515173, 1982, 'Carton', 144, -1, 208, -1, 0.9842603802680969), (987515173, 1982, 'Carton', 176, -1, 208, -1, 0.9847458600997925), (987515173, 1982, 'Carton', 208, -1, 208, -1, 0.9919501543045044), (987515173, 1982, 'Carton', 240, -1, 208, -1, 0.9993775486946106), (987515173, 1982, 'Carton', 272, -1, 208, -1, 0.9994118213653564), (987515173, 1982, 'Carton', 304, -1, 208, -1, 0.9995879530906677), (987515173, 1982, 'Carton', 336, -1, 208, -1, 0.99922776222229), (987515173, 1982, 'Carton', 112, -1, 240, -1, 0.9275565147399902), (987515173, 1982, 'Carton', 144, -1, 240, -1, 0.9811466932296753), (987515173, 1982, 'Carton', 176, -1, 240, -1, 0.9661900401115417), (987515173, 1982, 'Carton', 208, -1, 240, -1, 0.9677619338035583), (987515173, 1982, 'Carton', 240, -1, 240, -1, 0.9964058995246887), (987515173, 1982, 'Carton', 272, -1, 240, -1, 0.9994204044342041), (987515173, 1982, 'Carton', 304, -1, 240, -1, 0.9997864365577698), (987515173, 1982, 'Carton', 336, -1, 240, -1, 0.999669075012207), (987515173, 1982, 'Carton', 112, -1, 272, -1, 0.9895122647285461), (987515173, 1982, 'Carton', 144, -1, 272, -1, 0.9954647421836853), (987515173, 1982, 'Carton', 176, -1, 272, -1, 0.985528290271759), (987515173, 1982, 'Carton', 208, -1, 272, -1, 0.9732157588005066), (987515173, 1982, 'Carton', 240, -1, 272, -1, 0.9974708557128906), (987515173, 1982, 'Carton', 272, -1, 272, -1, 0.9992035031318665), (987515173, 1982, 'Carton', 304, -1, 272, -1, 0.9995160102844238), (987515173, 1982, 'Carton', 336, -1, 272, -1, 0.9991282820701599), (987515173, 1982, 'Carton', 112, -1, 304, -1, 0.9977686405181885), (987515173, 1982, 'Carton', 144, -1, 304, -1, 0.9977625608444214), (987515173, 1982, 'Carton', 176, -1, 304, -1, 0.9955434203147888), (987515173, 1982, 'Carton', 208, -1, 304, -1, 0.9927611351013184), (987515173, 1982, 'Carton', 240, -1, 304, -1, 0.9920235872268677), (987515173, 1982, 'Carton', 272, -1, 304, -1, 0.9835695624351501), (987515173, 1982, 'Carton', 304, -1, 304, -1, 0.9820236563682556), (987515173, 1982, 'Carton', 336, -1, 304, -1, 0.9808681607246399), (987515173, 1982, 'Carton', 112, -1, 336, -1, 0.992424726486206), (987515173, 1982, 'Carton', 144, -1, 336, -1, 0.9762069582939148), (987515173, 1982, 'Carton', 176, -1, 336, -1, 0.9911613464355469), (987515173, 1982, 'Carton', 208, -1, 336, -1, 0.9869142174720764), (987515173, 1982, 'Carton', 240, -1, 336, -1, 0.9089488387107849), (987515173, 1982, 'Carton', 272, -1, 336, -1, 0.9452250599861145), (987515173, 1982, 'Carton', 304, -1, 336, -1, 0.9366644024848938), (987515173, 1982, 'Carton', 336, -1, 336, -1, 0.9808259606361389), (987515173, 1982, 'Kraft', 112, -1, 112, -1, 1.9627175440461997e-09), (987515173, 1982, 'Kraft', 144, -1, 112, -1, 1.718169961861804e-08), (987515173, 1982, 'Kraft', 176, -1, 112, -1, 9.591618663762347e-07), (987515173, 1982, 'Kraft', 208, -1, 112, -1, 3.130649565719068e-05), (987515173, 1982, 'Kraft', 240, -1, 112, -1, 4.4244134187465534e-05), (987515173, 1982, 'Kraft', 272, -1, 112, -1, 0.00020572681387420744), (987515173, 1982, 'Kraft', 304, -1, 112, -1, 0.0010789006482809782), (987515173, 1982, 'Kraft', 336, -1, 112, -1, 0.0008283101487904787), (987515173, 1982, 'Kraft', 112, -1, 144, -1, 2.6555951990303583e-05), (987515173, 1982, 'Kraft', 144, -1, 144, -1, 7.005025508988183e-06), (987515173, 1982, 'Kraft', 176, -1, 144, -1, 3.6302169519331073e-06), (987515173, 1982, 'Kraft', 208, -1, 144, -1, 3.561826088116504e-05), (987515173, 1982, 'Kraft', 240, -1, 144, -1, 6.703906547045335e-05), (987515173, 1982, 'Kraft', 272, -1, 144, -1, 8.694737334735692e-05), (987515173, 1982, 'Kraft', 304, -1, 144, -1, 0.0001221822021761909), (987515173, 1982, 'Kraft', 336, -1, 144, -1, 0.0001124453337979503), (987515173, 1982, 'Kraft', 112, -1, 176, -1, 0.0005001540412195027), (987515173, 1982, 'Kraft', 144, -1, 176, -1, 0.0001244097074959427), (987515173, 1982, 'Kraft', 176, -1, 176, -1, 9.107824007514864e-05), (987515173, 1982, 'Kraft', 208, -1, 176, -1, 5.173912359168753e-05), (987515173, 1982, 'Kraft', 240, -1, 176, -1, 0.00011520022962940857), (987515173, 1982, 'Kraft', 272, -1, 176, -1, 0.0004362784093245864), (987515173, 1982, 'Kraft', 304, -1, 176, -1, 0.0009182795183733106), (987515173, 1982, 'Kraft', 336, -1, 176, -1, 0.0014294983120635152), (987515173, 1982, 'Kraft', 112, -1, 208, -1, 6.89085281919688e-05), (987515173, 1982, 'Kraft', 144, -1, 208, -1, 1.837030504248105e-05), (987515173, 1982, 'Kraft', 176, -1, 208, -1, 2.5922940039890818e-05), (987515173, 1982, 'Kraft', 208, -1, 208, -1, 3.537412703735754e-05), (987515173, 1982, 'Kraft', 240, -1, 208, -1, 3.7442106986418366e-05), (987515173, 1982, 'Kraft', 272, -1, 208, -1, 8.668832015246153e-05), (987515173, 1982, 'Kraft', 304, -1, 208, -1, 0.00012369209434837103), (987515173, 1982, 'Kraft', 336, -1, 208, -1, 0.0003906854835804552), (987515173, 1982, 'Kraft', 112, -1, 240, -1, 0.00030792219331488013), (987515173, 1982, 'Kraft', 144, -1, 240, -1, 4.1544197301846e-05), (987515173, 1982, 'Kraft', 176, -1, 240, -1, 1.2247569429746363e-05), (987515173, 1982, 'Kraft', 208, -1, 240, -1, 7.357524737017229e-06), (987515173, 1982, 'Kraft', 240, -1, 240, -1, 2.2969186829868704e-05), (987515173, 1982, 'Kraft', 272, -1, 240, -1, 5.880299795535393e-05), (987515173, 1982, 'Kraft', 304, -1, 240, -1, 6.558185123139992e-05), (987515173, 1982, 'Kraft', 336, -1, 240, -1, 0.00018698872008826584), (987515173, 1982, 'Kraft', 112, -1, 272, -1, 0.0014622012386098504), (987515173, 1982, 'Kraft', 144, -1, 272, -1, 0.0006890835938975215), (987515173, 1982, 'Kraft', 176, -1, 272, -1, 0.0002739106130320579), (987515173, 1982, 'Kraft', 208, -1, 272, -1, 4.355998316896148e-05), (987515173, 1982, 'Kraft', 240, -1, 272, -1, 3.345709410496056e-05), (987515173, 1982, 'Kraft', 272, -1, 272, -1, 8.336937025887892e-05), (987515173, 1982, 'Kraft', 304, -1, 272, -1, 0.00011194613034604117), (987515173, 1982, 'Kraft', 336, -1, 272, -1, 0.0004234245861880481), (987515173, 1982, 'Kraft', 112, -1, 304, -1, 0.001015371410176158), (987515173, 1982, 'Kraft', 144, -1, 304, -1, 0.0009032521047629416), (987515173, 1982, 'Kraft', 176, -1, 304, -1, 0.0006167780957184732), (987515173, 1982, 'Kraft', 208, -1, 304, -1, 0.0010905977105721831), (987515173, 1982, 'Kraft', 240, -1, 304, -1, 0.0017836177721619606), (987515173, 1982, 'Kraft', 272, -1, 304, -1, 0.004686745814979076), (987515173, 1982, 'Kraft', 304, -1, 304, -1, 0.00468348478898406), (987515173, 1982, 'Kraft', 336, -1, 304, -1, 0.01250129658728838), (987515173, 1982, 'Kraft', 112, -1, 336, -1, 0.0021809651516377926), (987515173, 1982, 'Kraft', 144, -1, 336, -1, 0.0057175904512405396), (987515173, 1982, 'Kraft', 176, -1, 336, -1, 0.000830701959785074), (987515173, 1982, 'Kraft', 208, -1, 336, -1, 0.0012632566504180431), (987515173, 1982, 'Kraft', 240, -1, 336, -1, 0.007757127750664949), (987515173, 1982, 'Kraft', 272, -1, 336, -1, 0.012540942057967186), (987515173, 1982, 'Kraft', 304, -1, 336, -1, 0.017895614728331566), (987515173, 1982, 'Kraft', 336, -1, 336, -1, 0.007762879598885775), (987515173, 1982, 'Lointain_Papier_Magazine', 112, -1, 112, -1, 1.4993487007508577e-10), (987515173, 1982, 'Lointain_Papier_Magazine', 144, -1, 112, -1, 8.27888246845987e-09), (987515173, 1982, 'Lointain_Papier_Magazine', 176, -1, 112, -1, 5.523119739336835e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 208, -1, 112, -1, 5.514185431820806e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 240, -1, 112, -1, 8.230977073253598e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 272, -1, 112, -1, 3.556802403181791e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 304, -1, 112, -1, 8.13290462247096e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 336, -1, 112, -1, 4.219042602926493e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 112, -1, 144, -1, 3.641560226697038e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 144, -1, 144, -1, 1.309831532125827e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 176, -1, 144, -1, 2.4755788672337076e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 208, -1, 144, -1, 3.46073920809431e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 240, -1, 144, -1, 5.718031843571225e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 272, -1, 144, -1, 2.0651199520216323e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 304, -1, 144, -1, 3.902939715771936e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 336, -1, 144, -1, 1.3672386558027938e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 112, -1, 176, -1, 3.903657841419772e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 144, -1, 176, -1, 2.5594224553060485e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 176, -1, 176, -1, 1.20317736218567e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 208, -1, 176, -1, 7.790682502673008e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 240, -1, 176, -1, 6.346535428747302e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 272, -1, 176, -1, 2.6438250642968342e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 304, -1, 176, -1, 3.9037597161950544e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 336, -1, 176, -1, 1.8522048776503652e-05), (987515173, 1982, 'Lointain_Papier_Magazine', 112, -1, 208, -1, 1.8059379272017395e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 144, -1, 208, -1, 5.677164267581247e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 176, -1, 208, -1, 2.2004298898536945e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 208, -1, 208, -1, 1.7341538978143944e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 240, -1, 208, -1, 4.005448488442198e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 272, -1, 208, -1, 2.0307260228946689e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 304, -1, 208, -1, 9.874829487444003e-08), (987515173, 1982, 'Lointain_Papier_Magazine', 336, -1, 208, -1, 8.660671113602803e-08), (987515173, 1982, 'Lointain_Papier_Magazine', 112, -1, 240, -1, 1.8226204474558472e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 144, -1, 240, -1, 6.313660492196504e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 176, -1, 240, -1, 7.825018428775365e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 208, -1, 240, -1, 6.402892154255824e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 240, -1, 240, -1, 7.68767392855807e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 272, -1, 240, -1, 5.206512696531718e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 304, -1, 240, -1, 1.771176698639465e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 336, -1, 240, -1, 1.4431732608954917e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 112, -1, 272, -1, 4.1842156406346476e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 144, -1, 272, -1, 3.166678084198793e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 176, -1, 272, -1, 5.087516683488502e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 208, -1, 272, -1, 7.355315574386623e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 240, -1, 272, -1, 4.019340451577591e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 272, -1, 272, -1, 3.2205448974309547e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 304, -1, 272, -1, 2.1020976248564693e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 336, -1, 272, -1, 3.4837708540180756e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 112, -1, 304, -1, 3.3167515312015894e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 144, -1, 304, -1, 3.251780356094969e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 176, -1, 304, -1, 8.383913723264413e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 208, -1, 304, -1, 3.29513568431139e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 240, -1, 304, -1, 4.426457508088788e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 272, -1, 304, -1, 5.041595613874961e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 304, -1, 304, -1, 8.190505468519405e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 336, -1, 304, -1, 5.571561814576853e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 112, -1, 336, -1, 4.197459588795027e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 144, -1, 336, -1, 8.220947620429797e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 176, -1, 336, -1, 7.151665499804949e-07), (987515173, 1982, 'Lointain_Papier_Magazine', 208, -1, 336, -1, 1.9526198684616247e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 240, -1, 336, -1, 3.915576144208899e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 272, -1, 336, -1, 2.456168658682145e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 304, -1, 336, -1, 3.4308084195799893e-06), (987515173, 1982, 'Lointain_Papier_Magazine', 336, -1, 336, -1, 4.852242909691995e-06), (987515173, 1982, 'Metal', 112, -1, 112, -1, 7.43668390756902e-11), (987515173, 1982, 'Metal', 144, -1, 112, -1, 1.937712434951777e-09), (987515173, 1982, 'Metal', 176, -1, 112, -1, 7.275363032022142e-07), (987515173, 1982, 'Metal', 208, -1, 112, -1, 9.442700502404477e-06), (987515173, 1982, 'Metal', 240, -1, 112, -1, 2.6482319299248047e-05), (987515173, 1982, 'Metal', 272, -1, 112, -1, 0.00016899804177228361), (987515173, 1982, 'Metal', 304, -1, 112, -1, 0.0007525069522671402), (987515173, 1982, 'Metal', 336, -1, 112, -1, 0.00030306243570521474), (987515173, 1982, 'Metal', 112, -1, 144, -1, 5.377669936024176e-07), (987515173, 1982, 'Metal', 144, -1, 144, -1, 4.382940801406221e-07), (987515173, 1982, 'Metal', 176, -1, 144, -1, 3.1853305699769408e-06), (987515173, 1982, 'Metal', 208, -1, 144, -1, 9.589394721842837e-06), (987515173, 1982, 'Metal', 240, -1, 144, -1, 1.658510518609546e-05), (987515173, 1982, 'Metal', 272, -1, 144, -1, 7.67492238082923e-05), (987515173, 1982, 'Metal', 304, -1, 144, -1, 0.00013125737314112484), (987515173, 1982, 'Metal', 336, -1, 144, -1, 4.670850466936827e-05), (987515173, 1982, 'Metal', 112, -1, 176, -1, 5.901713393541286e-06), (987515173, 1982, 'Metal', 144, -1, 176, -1, 4.059038928971859e-06), (987515173, 1982, 'Metal', 176, -1, 176, -1, 1.0151354217668995e-05), (987515173, 1982, 'Metal', 208, -1, 176, -1, 4.152178462391021e-06), (987515173, 1982, 'Metal', 240, -1, 176, -1, 6.6467559918237384e-06), (987515173, 1982, 'Metal', 272, -1, 176, -1, 5.4515068768523633e-05), (987515173, 1982, 'Metal', 304, -1, 176, -1, 0.00012845745368395), (987515173, 1982, 'Metal', 336, -1, 176, -1, 6.582926289411262e-05), (987515173, 1982, 'Metal', 112, -1, 208, -1, 8.156595868058503e-06), (987515173, 1982, 'Metal', 144, -1, 208, -1, 2.244794586658827e-06), (987515173, 1982, 'Metal', 176, -1, 208, -1, 6.6600255195226055e-06), (987515173, 1982, 'Metal', 208, -1, 208, -1, 4.597271527018165e-06), (987515173, 1982, 'Metal', 240, -1, 208, -1, 1.7205046560775372e-06), (987515173, 1982, 'Metal', 272, -1, 208, -1, 1.3715972499994677e-06), (987515173, 1982, 'Metal', 304, -1, 208, -1, 5.792628599010641e-07), (987515173, 1982, 'Metal', 336, -1, 208, -1, 8.319934750034008e-07), (987515173, 1982, 'Metal', 112, -1, 240, -1, 2.9283198728080606e-06), (987515173, 1982, 'Metal', 144, -1, 240, -1, 7.381804039141571e-07), (987515173, 1982, 'Metal', 176, -1, 240, -1, 6.082773893467674e-07), (987515173, 1982, 'Metal', 208, -1, 240, -1, 6.242793801902735e-07), (987515173, 1982, 'Metal', 240, -1, 240, -1, 1.1946610811719438e-06), (987515173, 1982, 'Metal', 272, -1, 240, -1, 7.012953915364051e-07), (987515173, 1982, 'Metal', 304, -1, 240, -1, 3.141718707411201e-07), (987515173, 1982, 'Metal', 336, -1, 240, -1, 4.115262299819733e-07), (987515173, 1982, 'Metal', 112, -1, 272, -1, 2.621364728838671e-06), (987515173, 1982, 'Metal', 144, -1, 272, -1, 8.042811145969608e-07), (987515173, 1982, 'Metal', 176, -1, 272, -1, 8.221299481192545e-07), (987515173, 1982, 'Metal', 208, -1, 272, -1, 4.015674335278163e-07), (987515173, 1982, 'Metal', 240, -1, 272, -1, 2.7430286309026997e-07), (987515173, 1982, 'Metal', 272, -1, 272, -1, 3.312202636607253e-07), (987515173, 1982, 'Metal', 304, -1, 272, -1, 3.4383205615995394e-07), (987515173, 1982, 'Metal', 336, -1, 272, -1, 1.477982095821062e-06), (987515173, 1982, 'Metal', 112, -1, 304, -1, 2.3452557798009366e-06), (987515173, 1982, 'Metal', 144, -1, 304, -1, 1.949135821632808e-06), (987515173, 1982, 'Metal', 176, -1, 304, -1, 4.527315468294546e-06), (987515173, 1982, 'Metal', 208, -1, 304, -1, 1.3398477676673792e-05), (987515173, 1982, 'Metal', 240, -1, 304, -1, 5.651891115121543e-06), (987515173, 1982, 'Metal', 272, -1, 304, -1, 5.684566531272139e-06), (987515173, 1982, 'Metal', 304, -1, 304, -1, 6.3615007093176246e-06), (987515173, 1982, 'Metal', 336, -1, 304, -1, 7.372220807155827e-06), (987515173, 1982, 'Metal', 112, -1, 336, -1, 7.145749805204105e-06), (987515173, 1982, 'Metal', 144, -1, 336, -1, 3.309639578219503e-05), (987515173, 1982, 'Metal', 176, -1, 336, -1, 4.013393117929809e-05), (987515173, 1982, 'Metal', 208, -1, 336, -1, 7.598577212775126e-05), (987515173, 1982, 'Metal', 240, -1, 336, -1, 0.00012133002746850252), (987515173, 1982, 'Metal', 272, -1, 336, -1, 3.716520222951658e-05), (987515173, 1982, 'Metal', 304, -1, 336, -1, 2.9063032343401574e-05), (987515173, 1982, 'Metal', 336, -1, 336, -1, 2.811024387483485e-05), (987515173, 1982, 'Papier_Magazine', 112, -1, 112, -1, 0.999995231628418), (987515173, 1982, 'Papier_Magazine', 144, -1, 112, -1, 0.9999926090240479), (987515173, 1982, 'Papier_Magazine', 176, -1, 112, -1, 0.9999086856842041), (987515173, 1982, 'Papier_Magazine', 208, -1, 112, -1, 0.994962215423584), (987515173, 1982, 'Papier_Magazine', 240, -1, 112, -1, 0.9937800765037537), (987515173, 1982, 'Papier_Magazine', 272, -1, 112, -1, 0.9867060780525208), (987515173, 1982, 'Papier_Magazine', 304, -1, 112, -1, 0.8925097584724426), (987515173, 1982, 'Papier_Magazine', 336, -1, 112, -1, 0.8722216486930847), (987515173, 1982, 'Papier_Magazine', 112, -1, 144, -1, 0.9998132586479187), (987515173, 1982, 'Papier_Magazine', 144, -1, 144, -1, 0.9997429251670837), (987515173, 1982, 'Papier_Magazine', 176, -1, 144, -1, 0.9994120597839355), (987515173, 1982, 'Papier_Magazine', 208, -1, 144, -1, 0.9912009239196777), (987515173, 1982, 'Papier_Magazine', 240, -1, 144, -1, 0.9781122803688049), (987515173, 1982, 'Papier_Magazine', 272, -1, 144, -1, 0.8997899889945984), (987515173, 1982, 'Papier_Magazine', 304, -1, 144, -1, 0.5358294248580933), (987515173, 1982, 'Papier_Magazine', 336, -1, 144, -1, 0.8109337091445923), (987515173, 1982, 'Papier_Magazine', 112, -1, 176, -1, 0.9775210618972778), (987515173, 1982, 'Papier_Magazine', 144, -1, 176, -1, 0.8049374222755432), (987515173, 1982, 'Papier_Magazine', 176, -1, 176, -1, 0.9026542901992798), (987515173, 1982, 'Papier_Magazine', 208, -1, 176, -1, 0.8754345178604126), (987515173, 1982, 'Papier_Magazine', 240, -1, 176, -1, 0.46616387367248535), (987515173, 1982, 'Papier_Magazine', 272, -1, 176, -1, 0.5258055329322815), (987515173, 1982, 'Papier_Magazine', 304, -1, 176, -1, 0.1667405664920807), (987515173, 1982, 'Papier_Magazine', 336, -1, 176, -1, 0.11059257388114929), (987515173, 1982, 'Papier_Magazine', 112, -1, 208, -1, 0.14991986751556396), (987515173, 1982, 'Papier_Magazine', 144, -1, 208, -1, 0.015482528135180473), (987515173, 1982, 'Papier_Magazine', 176, -1, 208, -1, 0.01478438638150692), (987515173, 1982, 'Papier_Magazine', 208, -1, 208, -1, 0.007589730899780989), (987515173, 1982, 'Papier_Magazine', 240, -1, 208, -1, 0.0003478124854154885), (987515173, 1982, 'Papier_Magazine', 272, -1, 208, -1, 0.00017103862774092704), (987515173, 1982, 'Papier_Magazine', 304, -1, 208, -1, 5.368834172259085e-05), (987515173, 1982, 'Papier_Magazine', 336, -1, 208, -1, 6.245652184588835e-05), (987515173, 1982, 'Papier_Magazine', 112, -1, 240, -1, 0.07169905304908752), (987515173, 1982, 'Papier_Magazine', 144, -1, 240, -1, 0.018659375607967377), (987515173, 1982, 'Papier_Magazine', 176, -1, 240, -1, 0.03370406851172447), (987515173, 1982, 'Papier_Magazine', 208, -1, 240, -1, 0.03219267353415489), (987515173, 1982, 'Papier_Magazine', 240, -1, 240, -1, 0.0034774017985910177), (987515173, 1982, 'Papier_Magazine', 272, -1, 240, -1, 0.00034121397766284645), (987515173, 1982, 'Papier_Magazine', 304, -1, 240, -1, 2.9651477234438062e-05), (987515173, 1982, 'Papier_Magazine', 336, -1, 240, -1, 1.6363093891413882e-05), (987515173, 1982, 'Papier_Magazine', 112, -1, 272, -1, 0.008512412197887897), (987515173, 1982, 'Papier_Magazine', 144, -1, 272, -1, 0.0036519591230899096), (987515173, 1982, 'Papier_Magazine', 176, -1, 272, -1, 0.014011409133672714), (987515173, 1982, 'Papier_Magazine', 208, -1, 272, -1, 0.026665475219488144), (987515173, 1982, 'Papier_Magazine', 240, -1, 272, -1, 0.0024535846896469593), (987515173, 1982, 'Papier_Magazine', 272, -1, 272, -1, 0.0006450231303460896), (987515173, 1982, 'Papier_Magazine', 304, -1, 272, -1, 0.0002753009321168065), (987515173, 1982, 'Papier_Magazine', 336, -1, 272, -1, 0.00015959444863256067), (987515173, 1982, 'Papier_Magazine', 112, -1, 304, -1, 0.0009835998062044382), (987515173, 1982, 'Papier_Magazine', 144, -1, 304, -1, 0.0009857217082753778), (987515173, 1982, 'Papier_Magazine', 176, -1, 304, -1, 0.0032105897553265095), (987515173, 1982, 'Papier_Magazine', 208, -1, 304, -1, 0.00499159237369895), (987515173, 1982, 'Papier_Magazine', 240, -1, 304, -1, 0.003504862543195486), (987515173, 1982, 'Papier_Magazine', 272, -1, 304, -1, 0.004065932240337133), (987515173, 1982, 'Papier_Magazine', 304, -1, 304, -1, 0.007200407795608044), (987515173, 1982, 'Papier_Magazine', 336, -1, 304, -1, 0.0019593823235481977), (987515173, 1982, 'Papier_Magazine', 112, -1, 336, -1, 0.004877334460616112), (987515173, 1982, 'Papier_Magazine', 144, -1, 336, -1, 0.015495209954679012), (987515173, 1982, 'Papier_Magazine', 176, -1, 336, -1, 0.006797932554036379), (987515173, 1982, 'Papier_Magazine', 208, -1, 336, -1, 0.008006147108972073), (987515173, 1982, 'Papier_Magazine', 240, -1, 336, -1, 0.01907145045697689), (987515173, 1982, 'Papier_Magazine', 272, -1, 336, -1, 0.0036614700220525265), (987515173, 1982, 'Papier_Magazine', 304, -1, 336, -1, 0.006303672678768635), (987515173, 1982, 'Papier_Magazine', 336, -1, 336, -1, 0.004444703925400972), (987515173, 1982, 'Plastique', 112, -1, 112, -1, 5.6171856499531714e-08), (987515173, 1982, 'Plastique', 144, -1, 112, -1, 8.205748827094794e-07), (987515173, 1982, 'Plastique', 176, -1, 112, -1, 6.251792365219444e-05), (987515173, 1982, 'Plastique', 208, -1, 112, -1, 0.0035411433782428503), (987515173, 1982, 'Plastique', 240, -1, 112, -1, 0.0031569208949804306), (987515173, 1982, 'Plastique', 272, -1, 112, -1, 0.007460438180714846), (987515173, 1982, 'Plastique', 304, -1, 112, -1, 0.05466697737574577), (987515173, 1982, 'Plastique', 336, -1, 112, -1, 0.05961107835173607), (987515173, 1982, 'Plastique', 112, -1, 144, -1, 2.7866769869433483e-06), (987515173, 1982, 'Plastique', 144, -1, 144, -1, 1.904137752717361e-05), (987515173, 1982, 'Plastique', 176, -1, 144, -1, 0.00017628348723519593), (987515173, 1982, 'Plastique', 208, -1, 144, -1, 0.0015008171321824193), (987515173, 1982, 'Plastique', 240, -1, 144, -1, 0.005057745147496462), (987515173, 1982, 'Plastique', 272, -1, 144, -1, 0.08484913408756256), (987515173, 1982, 'Plastique', 304, -1, 144, -1, 0.40972793102264404), (987515173, 1982, 'Plastique', 336, -1, 144, -1, 0.07087679207324982), (987515173, 1982, 'Plastique', 112, -1, 176, -1, 3.935522727260832e-06), (987515173, 1982, 'Plastique', 144, -1, 176, -1, 9.543946362100542e-05), (987515173, 1982, 'Plastique', 176, -1, 176, -1, 0.00034031973336823285), (987515173, 1982, 'Plastique', 208, -1, 176, -1, 0.0003886849735863507), (987515173, 1982, 'Plastique', 240, -1, 176, -1, 0.0006228170823305845), (987515173, 1982, 'Plastique', 272, -1, 176, -1, 0.008931935764849186), (987515173, 1982, 'Plastique', 304, -1, 176, -1, 0.037116147577762604), (987515173, 1982, 'Plastique', 336, -1, 176, -1, 0.0020805762615054846), (987515173, 1982, 'Plastique', 112, -1, 208, -1, 7.629734318470582e-05), (987515173, 1982, 'Plastique', 144, -1, 208, -1, 4.113754039281048e-05), (987515173, 1982, 'Plastique', 176, -1, 208, -1, 8.234679989982396e-05), (987515173, 1982, 'Plastique', 208, -1, 208, -1, 3.830824061878957e-05), (987515173, 1982, 'Plastique', 240, -1, 208, -1, 8.385713954339735e-06), (987515173, 1982, 'Plastique', 272, -1, 208, -1, 1.2024496754747815e-05), (987515173, 1982, 'Plastique', 304, -1, 208, -1, 1.208805315400241e-05), (987515173, 1982, 'Plastique', 336, -1, 208, -1, 4.098301815247396e-06), (987515173, 1982, 'Plastique', 112, -1, 240, -1, 0.0001447406248189509), (987515173, 1982, 'Plastique', 144, -1, 240, -1, 3.530857793521136e-05), (987515173, 1982, 'Plastique', 176, -1, 240, -1, 2.1154617570573464e-05), (987515173, 1982, 'Plastique', 208, -1, 240, -1, 7.3572232395235915e-06), (987515173, 1982, 'Plastique', 240, -1, 240, -1, 4.8577680900052655e-06), (987515173, 1982, 'Plastique', 272, -1, 240, -1, 2.780695467663463e-06), (987515173, 1982, 'Plastique', 304, -1, 240, -1, 7.964539463500842e-07), (987515173, 1982, 'Plastique', 336, -1, 240, -1, 3.5284702448734606e-07), (987515173, 1982, 'Plastique', 112, -1, 272, -1, 3.3701402571750805e-05), (987515173, 1982, 'Plastique', 144, -1, 272, -1, 1.164870263892226e-05), (987515173, 1982, 'Plastique', 176, -1, 272, -1, 1.3387130820774473e-05), (987515173, 1982, 'Plastique', 208, -1, 272, -1, 5.430733835964929e-06), (987515173, 1982, 'Plastique', 240, -1, 272, -1, 9.562522791384254e-07), (987515173, 1982, 'Plastique', 272, -1, 272, -1, 6.880003411424696e-07), (987515173, 1982, 'Plastique', 304, -1, 272, -1, 5.693135562978568e-07), (987515173, 1982, 'Plastique', 336, -1, 272, -1, 7.457582569259102e-07), (987515173, 1982, 'Plastique', 112, -1, 304, -1, 3.371861566847656e-06), (987515173, 1982, 'Plastique', 144, -1, 304, -1, 3.3535009151819395e-06), (987515173, 1982, 'Plastique', 176, -1, 304, -1, 6.50782385491766e-06), (987515173, 1982, 'Plastique', 208, -1, 304, -1, 8.752262147027068e-06), (987515173, 1982, 'Plastique', 240, -1, 304, -1, 3.387981223568204e-06), (987515173, 1982, 'Plastique', 272, -1, 304, -1, 4.764699497172842e-06), (987515173, 1982, 'Plastique', 304, -1, 304, -1, 4.499734586715931e-06), (987515173, 1982, 'Plastique', 336, -1, 304, -1, 2.416558572804206e-06), (987515173, 1982, 'Plastique', 112, -1, 336, -1, 1.550018168927636e-05), (987515173, 1982, 'Plastique', 144, -1, 336, -1, 5.3455103625310585e-05), (987515173, 1982, 'Plastique', 176, -1, 336, -1, 3.4266584407305345e-05), (987515173, 1982, 'Plastique', 208, -1, 336, -1, 2.3813272491679527e-05), (987515173, 1982, 'Plastique', 240, -1, 336, -1, 3.865895268972963e-05), (987515173, 1982, 'Plastique', 272, -1, 336, -1, 2.576561200839933e-05), (987515173, 1982, 'Plastique', 304, -1, 336, -1, 3.321040276205167e-05), (987515173, 1982, 'Plastique', 336, -1, 336, -1, 4.1911825974239036e-05), (987515173, 1982, 'Sol_Environement', 112, -1, 112, -1, 2.9147684699887266e-12), (987515173, 1982, 'Sol_Environement', 144, -1, 112, -1, 5.572824979260815e-10), (987515173, 1982, 'Sol_Environement', 176, -1, 112, -1, 4.931148964715248e-07), (987515173, 1982, 'Sol_Environement', 208, -1, 112, -1, 1.0146173735847697e-05), (987515173, 1982, 'Sol_Environement', 240, -1, 112, -1, 9.16911358217476e-06), (987515173, 1982, 'Sol_Environement', 272, -1, 112, -1, 5.6088851124513894e-05), (987515173, 1982, 'Sol_Environement', 304, -1, 112, -1, 0.00037712411722168326), (987515173, 1982, 'Sol_Environement', 336, -1, 112, -1, 0.00016502899234183133), (987515173, 1982, 'Sol_Environement', 112, -1, 144, -1, 1.0388486515466866e-07), (987515173, 1982, 'Sol_Environement', 144, -1, 144, -1, 5.797362518933369e-07), (987515173, 1982, 'Sol_Environement', 176, -1, 144, -1, 1.4367750509336474e-06), (987515173, 1982, 'Sol_Environement', 208, -1, 144, -1, 5.894803507544566e-06), (987515173, 1982, 'Sol_Environement', 240, -1, 144, -1, 1.8237458789371885e-05), (987515173, 1982, 'Sol_Environement', 272, -1, 144, -1, 0.00012657007027883083), (987515173, 1982, 'Sol_Environement', 304, -1, 144, -1, 0.00032424667733721435), (987515173, 1982, 'Sol_Environement', 336, -1, 144, -1, 9.640173084335402e-05), (987515173, 1982, 'Sol_Environement', 112, -1, 176, -1, 4.1924576521523704e-07), (987515173, 1982, 'Sol_Environement', 144, -1, 176, -1, 1.0222153150607483e-06), (987515173, 1982, 'Sol_Environement', 176, -1, 176, -1, 6.5917988649744075e-06), (987515173, 1982, 'Sol_Environement', 208, -1, 176, -1, 1.714908307803853e-06), (987515173, 1982, 'Sol_Environement', 240, -1, 176, -1, 3.718574134836672e-06), (987515173, 1982, 'Sol_Environement', 272, -1, 176, -1, 2.8040711185894907e-05), (987515173, 1982, 'Sol_Environement', 304, -1, 176, -1, 0.00015287206042557955), (987515173, 1982, 'Sol_Environement', 336, -1, 176, -1, 0.00024334668705705553), (987515173, 1982, 'Sol_Environement', 112, -1, 208, -1, 4.016848379251314e-06), (987515173, 1982, 'Sol_Environement', 144, -1, 208, -1, 7.449643248946813e-07), (987515173, 1982, 'Sol_Environement', 176, -1, 208, -1, 1.3342541933525354e-06), (987515173, 1982, 'Sol_Environement', 208, -1, 208, -1, 9.046160016623617e-07), (987515173, 1982, 'Sol_Environement', 240, -1, 208, -1, 6.972634309931891e-07), (987515173, 1982, 'Sol_Environement', 272, -1, 208, -1, 1.0830935934791341e-06), (987515173, 1982, 'Sol_Environement', 304, -1, 208, -1, 1.4432730495173018e-06), (987515173, 1982, 'Sol_Environement', 336, -1, 208, -1, 2.3145382783695823e-06), (987515173, 1982, 'Sol_Environement', 112, -1, 240, -1, 4.611034455592744e-06), (987515173, 1982, 'Sol_Environement', 144, -1, 240, -1, 3.769544889564713e-07), (987515173, 1982, 'Sol_Environement', 176, -1, 240, -1, 2.489750841050409e-07), (987515173, 1982, 'Sol_Environement', 208, -1, 240, -1, 2.1691661800105067e-07), (987515173, 1982, 'Sol_Environement', 240, -1, 240, -1, 5.564348271036579e-07), (987515173, 1982, 'Sol_Environement', 272, -1, 240, -1, 8.692940127730253e-07), (987515173, 1982, 'Sol_Environement', 304, -1, 240, -1, 6.635166300839046e-07), (987515173, 1982, 'Sol_Environement', 336, -1, 240, -1, 7.824715453352837e-07), (987515173, 1982, 'Sol_Environement', 112, -1, 272, -1, 2.996717512360192e-06), (987515173, 1982, 'Sol_Environement', 144, -1, 272, -1, 6.862887858005706e-07), (987515173, 1982, 'Sol_Environement', 176, -1, 272, -1, 5.020879711992166e-07), (987515173, 1982, 'Sol_Environement', 208, -1, 272, -1, 2.0106818965359707e-07), (987515173, 1982, 'Sol_Environement', 240, -1, 272, -1, 1.316578988053152e-07), (987515173, 1982, 'Sol_Environement', 272, -1, 272, -1, 2.7000481850336655e-07), (987515173, 1982, 'Sol_Environement', 304, -1, 272, -1, 1.611626601061289e-07), (987515173, 1982, 'Sol_Environement', 336, -1, 272, -1, 4.856589157498092e-07), (987515173, 1982, 'Sol_Environement', 112, -1, 304, -1, 7.128397214728466e-07), (987515173, 1982, 'Sol_Environement', 144, -1, 304, -1, 4.044945569603442e-07), (987515173, 1982, 'Sol_Environement', 176, -1, 304, -1, 8.910733413358685e-07), (987515173, 1982, 'Sol_Environement', 208, -1, 304, -1, 2.3422926460625604e-06), (987515173, 1982, 'Sol_Environement', 240, -1, 304, -1, 3.0637415875389706e-06), (987515173, 1982, 'Sol_Environement', 272, -1, 304, -1, 2.6779441668622894e-06), (987515173, 1982, 'Sol_Environement', 304, -1, 304, -1, 1.4986285350460093e-06), (987515173, 1982, 'Sol_Environement', 336, -1, 304, -1, 1.6102003428386524e-06), (987515173, 1982, 'Sol_Environement', 112, -1, 336, -1, 4.6376644036172365e-07), (987515173, 1982, 'Sol_Environement', 144, -1, 336, -1, 1.126954202845809e-06), (987515173, 1982, 'Sol_Environement', 176, -1, 336, -1, 6.800661935812968e-07), (987515173, 1982, 'Sol_Environement', 208, -1, 336, -1, 1.435661147297651e-06), (987515173, 1982, 'Sol_Environement', 240, -1, 336, -1, 3.3446758607169613e-06), (987515173, 1982, 'Sol_Environement', 272, -1, 336, -1, 2.120417093465221e-06), (987515173, 1982, 'Sol_Environement', 304, -1, 336, -1, 2.293623083460261e-06), (987515173, 1982, 'Sol_Environement', 336, -1, 336, -1, 3.3894764328579186e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 112, -1, 112, -1, 4.596810413204366e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 144, -1, 112, -1, 2.4632665827084566e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 176, -1, 112, -1, 1.7659805962466635e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 208, -1, 112, -1, 0.0004745331534650177), (987515173, 1982, 'Teint_Dans_La_Masse', 240, -1, 112, -1, 0.00011552977230167016), (987515173, 1982, 'Teint_Dans_La_Masse', 272, -1, 112, -1, 0.00020928923913743347), (987515173, 1982, 'Teint_Dans_La_Masse', 304, -1, 112, -1, 0.0014383975649252534), (987515173, 1982, 'Teint_Dans_La_Masse', 336, -1, 112, -1, 0.0020228715147823095), (987515173, 1982, 'Teint_Dans_La_Masse', 112, -1, 144, -1, 3.083619230892509e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 144, -1, 144, -1, 1.720988620945718e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 176, -1, 144, -1, 2.07681496249279e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 208, -1, 144, -1, 0.00027426431188359857), (987515173, 1982, 'Teint_Dans_La_Masse', 240, -1, 144, -1, 0.00046499818563461304), (987515173, 1982, 'Teint_Dans_La_Masse', 272, -1, 144, -1, 0.0005293394206091762), (987515173, 1982, 'Teint_Dans_La_Masse', 304, -1, 144, -1, 0.00018999373423866928), (987515173, 1982, 'Teint_Dans_La_Masse', 336, -1, 144, -1, 0.00020561015116982162), (987515173, 1982, 'Teint_Dans_La_Masse', 112, -1, 176, -1, 1.5055226867843885e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 144, -1, 176, -1, 6.7920245783170685e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 176, -1, 176, -1, 2.1078474674141034e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 208, -1, 176, -1, 2.4128898076014593e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 240, -1, 176, -1, 3.5300439776619896e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 272, -1, 176, -1, 7.951473526190966e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 304, -1, 176, -1, 0.00010045941598946229), (987515173, 1982, 'Teint_Dans_La_Masse', 336, -1, 176, -1, 0.0003794369986280799), (987515173, 1982, 'Teint_Dans_La_Masse', 112, -1, 208, -1, 1.5120272109925281e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 144, -1, 208, -1, 1.4640936569776386e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 176, -1, 208, -1, 4.738924872071948e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 208, -1, 208, -1, 3.8270859477052e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 240, -1, 208, -1, 3.669352736324072e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 272, -1, 208, -1, 1.1683931916195434e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 304, -1, 208, -1, 1.3372040484682657e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 336, -1, 208, -1, 5.937371315667406e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 112, -1, 240, -1, 4.23176861659158e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 144, -1, 240, -1, 5.303556918079266e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 176, -1, 240, -1, 3.9016103983158246e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 208, -1, 240, -1, 2.4134531031450024e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 240, -1, 240, -1, 5.75672902414226e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 272, -1, 240, -1, 2.1093366740387864e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 304, -1, 240, -1, 1.776229328243062e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 336, -1, 240, -1, 2.269659671583213e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 112, -1, 272, -1, 0.00020108804164920002), (987515173, 1982, 'Teint_Dans_La_Masse', 144, -1, 272, -1, 6.680710066575557e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 176, -1, 272, -1, 4.355606870376505e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 208, -1, 272, -1, 1.4587301848223433e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 240, -1, 272, -1, 7.089161499607144e-06), (987515173, 1982, 'Teint_Dans_La_Masse', 272, -1, 272, -1, 1.549914486531634e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 304, -1, 272, -1, 1.5423898730659857e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 336, -1, 272, -1, 0.00010414663847768679), (987515173, 1982, 'Teint_Dans_La_Masse', 112, -1, 304, -1, 9.57615629886277e-05), (987515173, 1982, 'Teint_Dans_La_Masse', 144, -1, 304, -1, 0.00010980851948261261), (987515173, 1982, 'Teint_Dans_La_Masse', 176, -1, 304, -1, 0.00015770187019370496), (987515173, 1982, 'Teint_Dans_La_Masse', 208, -1, 304, -1, 0.0005425253184512258), (987515173, 1982, 'Teint_Dans_La_Masse', 240, -1, 304, -1, 0.00234618759714067), (987515173, 1982, 'Teint_Dans_La_Masse', 272, -1, 304, -1, 0.007439217064529657), (987515173, 1982, 'Teint_Dans_La_Masse', 304, -1, 304, -1, 0.005846815183758736), (987515173, 1982, 'Teint_Dans_La_Masse', 336, -1, 304, -1, 0.004471165128052235), (987515173, 1982, 'Teint_Dans_La_Masse', 112, -1, 336, -1, 0.00024185025540646166), (987515173, 1982, 'Teint_Dans_La_Masse', 144, -1, 336, -1, 0.0020053659100085497), (987515173, 1982, 'Teint_Dans_La_Masse', 176, -1, 336, -1, 0.0007500680512748659), (987515173, 1982, 'Teint_Dans_La_Masse', 208, -1, 336, -1, 0.0033546038903295994), (987515173, 1982, 'Teint_Dans_La_Masse', 240, -1, 336, -1, 0.06375189125537872), (987515173, 1982, 'Teint_Dans_La_Masse', 272, -1, 336, -1, 0.038221243768930435), (987515173, 1982, 'Teint_Dans_La_Masse', 304, -1, 336, -1, 0.03881365805864334), (987515173, 1982, 'Teint_Dans_La_Masse', 336, -1, 336, -1, 0.005883656442165375), (987515173, 1982, 'autre_refus', 112, -1, 112, -1, 5.812370029723013e-10), (987515173, 1982, 'autre_refus', 144, -1, 112, -1, 2.0696997893310254e-08), (987515173, 1982, 'autre_refus', 176, -1, 112, -1, 1.4419156286749057e-06), (987515173, 1982, 'autre_refus', 208, -1, 112, -1, 9.239451901521534e-05), (987515173, 1982, 'autre_refus', 240, -1, 112, -1, 0.00021657411707565188), (987515173, 1982, 'autre_refus', 272, -1, 112, -1, 0.0017446751007810235), (987515173, 1982, 'autre_refus', 304, -1, 112, -1, 0.01764024794101715), (987515173, 1982, 'autre_refus', 336, -1, 112, -1, 0.009007013402879238), (987515173, 1982, 'autre_refus', 112, -1, 144, -1, 9.233241371475742e-07), (987515173, 1982, 'autre_refus', 144, -1, 144, -1, 2.9129250833648257e-06), (987515173, 1982, 'autre_refus', 176, -1, 144, -1, 1.0356764505559113e-05), (987515173, 1982, 'autre_refus', 208, -1, 144, -1, 0.00013480613415595144), (987515173, 1982, 'autre_refus', 240, -1, 144, -1, 0.0003479480219539255), (987515173, 1982, 'autre_refus', 272, -1, 144, -1, 0.004960400052368641), (987515173, 1982, 'autre_refus', 304, -1, 144, -1, 0.04341026395559311), (987515173, 1982, 'autre_refus', 336, -1, 144, -1, 0.09556104242801666), (987515173, 1982, 'autre_refus', 112, -1, 176, -1, 1.517724376753904e-05), (987515173, 1982, 'autre_refus', 144, -1, 176, -1, 0.00012808706378564239), (987515173, 1982, 'autre_refus', 176, -1, 176, -1, 0.00023708897060714662), (987515173, 1982, 'autre_refus', 208, -1, 176, -1, 0.0008086910238489509), (987515173, 1982, 'autre_refus', 240, -1, 176, -1, 0.0006526345387101173), (987515173, 1982, 'autre_refus', 272, -1, 176, -1, 0.004315905272960663), (987515173, 1982, 'autre_refus', 304, -1, 176, -1, 0.023204928264021873), (987515173, 1982, 'autre_refus', 336, -1, 176, -1, 0.018731890246272087), (987515173, 1982, 'autre_refus', 112, -1, 208, -1, 8.882414840627462e-05), (987515173, 1982, 'autre_refus', 144, -1, 208, -1, 0.00018469721544533968), (987515173, 1982, 'autre_refus', 176, -1, 208, -1, 0.0003194631717633456), (987515173, 1982, 'autre_refus', 208, -1, 208, -1, 0.00035719756851904094), (987515173, 1982, 'autre_refus', 240, -1, 208, -1, 0.0001989541488001123), (987515173, 1982, 'autre_refus', 272, -1, 208, -1, 0.00028685841243714094), (987515173, 1982, 'autre_refus', 304, -1, 208, -1, 0.00020254986884538084), (987515173, 1982, 'autre_refus', 336, -1, 208, -1, 0.0002437039656797424), (987515173, 1982, 'autre_refus', 112, -1, 240, -1, 0.0002339483326068148), (987515173, 1982, 'autre_refus', 144, -1, 240, -1, 0.00010851729894056916), (987515173, 1982, 'autre_refus', 176, -1, 240, -1, 6.494591070804745e-05), (987515173, 1982, 'autre_refus', 208, -1, 240, -1, 2.53166272159433e-05), (987515173, 1982, 'autre_refus', 240, -1, 240, -1, 7.286608160939068e-05), (987515173, 1982, 'autre_refus', 272, -1, 240, -1, 0.00014085727161727846), (987515173, 1982, 'autre_refus', 304, -1, 240, -1, 8.926200825953856e-05), (987515173, 1982, 'autre_refus', 336, -1, 240, -1, 8.167748455889523e-05), (987515173, 1982, 'autre_refus', 112, -1, 272, -1, 0.00026843580417335033), (987515173, 1982, 'autre_refus', 144, -1, 272, -1, 0.00011144220479764044), (987515173, 1982, 'autre_refus', 176, -1, 272, -1, 0.00012479646829888225), (987515173, 1982, 'autre_refus', 208, -1, 272, -1, 5.112917278893292e-05), (987515173, 1982, 'autre_refus', 240, -1, 272, -1, 2.917945857916493e-05), (987515173, 1982, 'autre_refus', 272, -1, 272, -1, 4.275592073099688e-05), (987515173, 1982, 'autre_refus', 304, -1, 272, -1, 6.835387466708198e-05), (987515173, 1982, 'autre_refus', 336, -1, 272, -1, 0.00014238253061193973), (987515173, 1982, 'autre_refus', 112, -1, 304, -1, 0.00011768297554226592), (987515173, 1982, 'autre_refus', 144, -1, 304, -1, 0.00021710699365939945), (987515173, 1982, 'autre_refus', 176, -1, 304, -1, 0.0004253224760759622), (987515173, 1982, 'autre_refus', 208, -1, 304, -1, 0.0004317373677622527), (987515173, 1982, 'autre_refus', 240, -1, 304, -1, 6.571003905264661e-05), (987515173, 1982, 'autre_refus', 272, -1, 304, -1, 3.1749023037264124e-05), (987515173, 1982, 'autre_refus', 304, -1, 304, -1, 1.1654428817564622e-05), (987515173, 1982, 'autre_refus', 336, -1, 304, -1, 1.8815018847817555e-05), (987515173, 1982, 'autre_refus', 112, -1, 336, -1, 0.0002470784238539636), (987515173, 1982, 'autre_refus', 144, -1, 336, -1, 0.0004691357316914946), (987515173, 1982, 'autre_refus', 176, -1, 336, -1, 0.0003347588353790343), (987515173, 1982, 'autre_refus', 208, -1, 336, -1, 0.00023712823167443275), (987515173, 1982, 'autre_refus', 240, -1, 336, -1, 0.00010695509263314307), (987515173, 1982, 'autre_refus', 272, -1, 336, -1, 9.554780990583822e-05), (987515173, 1982, 'autre_refus', 304, -1, 336, -1, 0.00013107464474160224), (987515173, 1982, 'autre_refus', 336, -1, 336, -1, 0.0007321758894249797)]} ############################### TEST certificat_qualite_papier ################################ TEST certificat qualite papier Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! Step 4442 tile have less inputs used (1) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 4441 detect_points is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 4443 count_percent_refus is not consistent : 4 used against 3 in the step definition ! Step 4444 send_mail_dechet have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : output 1 of step 4440 have datatype=1 whereas input 0 of step 4443 have datatype=2 WARNING : type of output 1 of step 4441 doesn't seem to be define in the database( WARNING : type of input 4 of step 4443 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : init_dechet, tile, detect_points, count_percent_refus, brightness, blur_detection, send_mail_dechet list_input_json : [] origin Catched exception ! Connect or reconnect ! We have 1 , BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.49870920181274414 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 7 step1:init_dechet Sun Oct 5 05:26:54 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec debut step init detect dechets input : temp/1759634814_2282988_987321136_6a08497399a24a3041045c21475a90ea.jpg ON MODIFIE NB AVEC LE INPUT map photo id path extension : temp/1759634814_2282988_987321136_6a08497399a24a3041045c21475a90ea.jpg scale : 0.9481481481481482 FIN step init dechet Inside saveOutput : final : False verbose : False saveOutput not yet implemented for datou_step.type : init_dechet we use saveGeneral [987321136] Looping around the photos to save general results len do output : 1 /987321136Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('1848', None, None, None, None, None, None, None, None) ('1848', '1902940', '987321136', None, None, None, None, None, None) begin to insert list_values into mtr_datou_result : length of list_values in save_final : 4 time used for this insertion : 0.0394282341003418 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.00018405914306640625 time spend to save output : 0.03976106643676758 total time spend for step 1 : 0.039945125579833984 step2:tile Sun Oct 5 05:26:54 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure verbose : False param_json : {'token': '78d09a0790ec6ecbf119343125a81fdc', 'portfolio_name': 'tile_correct_upm', 'ETA': 86400, 'new_width': 1500, 'new_height': 20000, 'host': 'www.fotonower.com', 'protocol': 'https', 'photo_tile_type': 1522, 'option_bande': 'True'} type(crop_hashtag_type) : type(crop_hashtag_type_tiled) : We consider crop_hashtag_type is an integer ! map_chi_type_to_chi_type_cropped : {406: 410} map_filenames : {987321136: 'temp/1759634814_2282988_987321136_6a08497399a24a3041045c21475a90ea.jpg'} list_pids : 1 list_pids : 2 list_subpids to replace list_pids : 1 batch 1 Loaded 0 chid ids of type : 0 created feed_id_new_photos : 27518751 with name tile_correct_upm feed_id_new_photos : 27518751 filename : temp/1759634814_2282988_987321136_6a08497399a24a3041045c21475a90ea.jpg photo_id : 987321136 height_image_input : 439 width_image_input : 562 new_width : 1500 new_height : 20000 stride : 0 stride_relative : 0.1 chi to copy from the main photo to the tiled photo input_chi_for_this_image_as_chi : 0 list_bib_to_crops : 1 [(0, 562, 0, 439, 0)] new_crops_tiles : 1 crop_transformed : 0 batch 1 Loaded 1 chid ids of type : 1522 treat the image : temp/1759634814_2282988_987321136_6a08497399a24a3041045c21475a90ea.jpg , 0 before upload mediasElapsed time : 0.010763168334960938 About to upload 1 photos upload in portfolio : 27518751 400 uploaded one batch 0 Elapsed time : 0.14161467552185059 upload mediasElapsed time : 0.1524498462677002 ERROR in datou_step_exec, will save and exit ! 'temp/1759634814_2282988_987321136_6a08497399a24a3041045c21475a90ea_0.jpg' File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2339, in datou_exec output = datou_step_exec(sNext, args, cache, context, map_info, verbose, mtr_user_id) File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2448, in datou_step_exec return pre_process.datou_step_exec_tile(param, json_param, args, context, map_info, verbose, mtr_user_id) File "/home/admin/workarea/git/Velours/python/mtr/datou/lib_step_exec/lib_step_pre_processing.py", line 474, in datou_step_exec_tile map_new_photo_id_files = crop_and_record(photo_id, filename, new_crops_tiles, File "/home/admin/workarea/git/Velours/python/mtr/simple_image_editor/rotate_crop_and_images.py", line 1269, in crop_and_record new_photo_id = map_path_new_photo_id[path_photo_id_temp] [987321136, 987321136] begin to insert list_values into mtr_datou_result : length of list_values in save_final : 2 time used for this insertion : 0.039263010025024414 save_final ERROR in last step tile, 'temp/1759634814_2282988_987321136_6a08497399a24a3041045c21475a90ea_0.jpg' time spend for datou_step_exec : 6.402935266494751 time spend to save output : 0.04540538787841797 total time spend for step 1 : 6.448340654373169 Useless call to update_current_state caffe_path_current : About to save ! 2 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 7 output : None probably due to empty image bug ERROR expected : {987321136: (-110, -0.39870825574700136, -5.392404060312662, 30.0, 61.64383561643836, {'carton': 3, 'Papier_Magazine': 7}, {'refus_total': 30.0, 'carton': 30.0, 'Papier_Magazine': 70.0}, {'refus_total': 61.64383561643836, 'carton': 61.64383561643836, 'Papier_Magazine': 38.35616438356164})} got : None ERROR certificat_qualite_papier FAILED ############################### TEST image_temperature_detection ################################ t Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : image_temperature_detection list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.17905354499816895 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:image_temperature_detection Sun Oct 5 05:27:01 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec inside step blanche_jaune_detection treat image : temp/1759634821_2282988_984484223_2e25dc219a9a57a9f85bcae482a80c35.jpg 984484223 1.004309911525615 time spend for datou_step_exec : 0.20682001113891602 time spend to save output : 5.364418029785156e-05 total time spend for step 1 : 0.20687365531921387 caffe_path_current : About to save ! 0 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {984484223: [(984484223, 1.004309911525615, 492630606)]} {984484223: [(984484223, 1.004309911525615, 492630606)]} ############################### TEST broca ################################ t Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : split_time_score list_input_json : [] origin We have 1 , we have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB time to download the photos : 0.053436279296875 About to test input to load Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:split_time_score Sun Oct 5 05:27:01 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec split portfolio by speed calcul order for each photo with time calcul time for a portfolio 2021-12-01 10:11:30 2021-12-01 10:11:32 2021-12-01 10:11:30 2021-12-01 10:11:34 2021-12-01 10:11:32 2021-12-01 10:11:40 2021-12-01 10:11:34 2021-12-01 10:12:17 2021-12-01 10:11:40 2021-12-01 10:12:24 2021-12-01 10:12:17 2021-12-01 10:12:27 2021-12-01 10:12:24 2021-12-01 10:12:29 2021-12-01 10:12:27 2021-12-01 10:12:56 2021-12-01 10:12:29 2021-12-01 10:13:04 2021-12-01 10:12:56 2021-12-01 10:13:13 2021-12-01 10:13:04 2021-12-01 10:13:04 distance 1.4513659170185111 2021-12-01 10:13:13 2021-12-01 10:13:22 2021-12-01 10:13:13 2021-12-01 10:13:30 2021-12-01 10:13:22 2021-12-01 10:16:14 2021-12-01 10:13:30 2021-12-01 10:13:30 distance 8.382409567451603 2021-12-01 10:16:14 2021-12-01 10:16:18 2021-12-01 10:16:14 2021-12-01 10:16:47 2021-12-01 10:16:18 2021-12-01 10:16:53 2021-12-01 10:16:47 2021-12-01 10:16:47 distance 8.03396608896571 2021-12-01 10:16:53 2021-12-01 10:16:57 2021-12-01 10:16:53 dict_time_useful: {0: [1098136690, 1098136784, 48.864288393888884, 2.19199505125, [datetime.datetime(2021, 12, 1, 10, 11, 30), datetime.datetime(2021, 12, 1, 10, 13, 4), 94]], 1: [1098136974, 1098137007, 48.86291258986111, 2.19361357125, [datetime.datetime(2021, 12, 1, 10, 16, 14), datetime.datetime(2021, 12, 1, 10, 16, 47), 33]]} get gps info of PAV SELECT id,Y_WGS84,X_WGS84 FROM MTRLabel.info_PAV; get gps info of PAV SELECT id,Y_WGS84,X_WGS84 FROM MTRLabel.info_PAV WHERE type_pav = "CS"; get gps info of PAV SELECT id,Y_WGS84,X_WGS84 FROM MTRLabel.info_PAV WHERE type_pav = "OM"; distance: RUEIL14CS [48.864288393888884, 2.19199505125] 16.57008455321128 time spend for datou_step_exec : 0.387648344039917 time spend to save output : 0.00012755393981933594 total time spend for step 1 : 0.38777589797973633 caffe_path_current : About to save ! 0 After save, about to update current ! {15: [(27518753, 48.864288393888884, 2.19199505125, 10, 1064919752, [datetime.datetime(2021, 12, 1, 10, 11, 30), datetime.datetime(2021, 12, 1, 10, 13, 4), 94.0], 5205529)]} résultat du premier test BROCA : True True ############################### TEST crop_conditional ################################ t Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : step 1335 frcnn is not linked in the step_by_step architecture ! WARNING : step 1336 crop_condition is not linked in the step_by_step architecture ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! DataTypes for each output/input checked ! List Step Type Loaded in datou : frcnn, crop_condition list_input_json : [] origin We have 1 , BBBFBFBFFBFFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 6 ; length of list_pids : 6 ; length of list_args : 6 time to download the photos : 0.5065286159515381 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : False number of steps : 2 step1:frcnn Sun Oct 5 05:27:02 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step Faster rcnn ! Inside try reload ! To loadFromThcl() model_param file didn't exist model_name : learn_piece_voiture_0808_v2 model_type : caffe_faster_rcnn list file need : ['caffemodel', 'test.prototxt'] file exist in s3 : ['caffemodel', 'test.prototxt'] file manque in s3 : [] WARNING: Logging before InitGoogleLogging() is written to STDERR F1005 05:27:05.672667 2282988 syncedmem.cpp:71] Check failed: error == cudaSuccess (2 vs. 0) out of memory *** Check failure stack trace: *** Aborted (core dumped) No data to report. No data to report. ret : 34304 command : coverage3 html -i --omit=/usr/local/lib/python3.8/dist-packages/*,/home/admin/.local/lib/python3.8/site-packages/*,/usr/lib/python3/dist-packages/* -d htmlcov ret : 256 command : coverage3 report -i -m ret : 256 90.85user 46.20system 6:40.64elapsed 34%CPU (0avgtext+0avgdata 6380280maxresident)k 6275248inputs+43192outputs (9309major+6015961minor)pagefaults 0swaps