python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120
import MySQLdb succeeded
Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 3935224
load datou : 3318
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycle checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case
We can either keep the dependence or, better, keep an order compatible with the step ids when a step has no sons, i.e. a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
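The reordering the log describes (parents before sons, ties broken by the lexical key (number_son, step_id), followed by a cycle check) can be sketched as a Kahn-style topological sort. This is an illustration of the idea only, not the actual datou code; all names here are assumptions.

```python
from collections import defaultdict

def reorder_steps(steps, edges):
    """Order step ids so parents come before their sons.

    `edges` maps parent step id -> list of son step ids. Ready steps are
    picked in the lexical order (number_of_sons, step_id) the log mentions.
    Raises when the graph has a cycle (the checkNoCycle analogue).
    """
    sons = defaultdict(list)
    indegree = {s: 0 for s in steps}
    for parent, children in edges.items():
        for child in children:
            sons[parent].append(child)
            indegree[child] += 1

    key = lambda s: (len(sons[s]), s)          # lexical tie-break
    ready = sorted((s for s in steps if indegree[s] == 0), key=key)
    ordered = []
    while ready:
        step = ready.pop(0)
        ordered.append(step)
        for child in sons[step]:
            indegree[child] -= 1
            if indegree[child] == 0:
                ready.append(child)
        ready.sort(key=key)

    if len(ordered) != len(steps):
        # some steps were never freed: a dependency cycle exists
        raise ValueError("cycle detected in datou graph")
    return ordered
```

With two parents feeding one final step, both parents come out first, ordered by step id.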
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
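The count check above compares how many inputs/outputs are actually wired to each step against the step definition in the DB: fewer is tolerated (optional or unused ports), more triggers a WARNING. A minimal sketch of that rule, with hypothetical names (the real checkConsistencyNbInputNbOutput lives in the datou code and is not shown in the log):

```python
def check_nb_input_output(step_id, step_name,
                          nb_used_in, nb_used_out,
                          nb_def_in, nb_def_out):
    """Return log-style messages comparing wired vs defined port counts."""
    msgs = []
    if nb_used_in < nb_def_in:
        msgs.append(f"Step {step_id} {step_name} has fewer inputs used ({nb_used_in}) "
                    f"than in the step definition ({nb_def_in}) : maybe we manage optional inputs !")
    elif nb_used_in > nb_def_in:
        msgs.append(f"WARNING : number of inputs for step {step_id} {step_name} is not consistent : "
                    f"{nb_used_in} used against {nb_def_in} in the step definition !")
    if nb_used_out < nb_def_out:
        msgs.append(f"Step {step_id} {step_name} has fewer outputs used ({nb_used_out}) "
                    f"than in the step definition ({nb_def_out}) : some outputs may not be used !")
    elif nb_used_out > nb_def_out:
        msgs.append(f"WARNING : number of outputs for step {step_id} {step_name} is not consistent : "
                    f"{nb_used_out} used against {nb_def_out} in the step definition !")
    return msgs
```

For step 7933 above (2 inputs and 2 outputs used against 1 and 1 defined), this yields the two WARNING lines seen in the log.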
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0)
Tried to parse :
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
[(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed, should we ?
photo path was removed, should we ?
[ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed, should we ?
photo path was removed, should we ?
photo id (can be local or global) was removed, should we ?
photo path was removed, should we ?
(x0, y0, x1, y1) was removed, should we ?
photo path was removed, should we ?
data as text was removed, should we ?
[ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed, should we ?
None was removed, should we ?
data as text was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo id (can be local or global) was removed, should we ?
data as text was removed, should we ?
data as text was removed, should we ?
data as text was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
None was removed, should we ?
data as a number was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
data as text was removed, should we ?
None was removed, should we ?
data as text was removed, should we ?
[ptf_id0,ptf_id1...] was removed, should we ?
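The "can't parse json string Expecting value: line 1 column 1 (char 0)" error above is what `json.loads` raises on an empty string, and the pipeline evidently falls back to an empty `list_input_json`. A sketch of that defensive parse, assuming names and the fallback behaviour (the log shows `list_input_json: []` afterwards, but the real implementation is not visible):

```python
import json

def parse_input_json(raw):
    """Parse a list_input_json string, falling back to [] when the
    string is empty/invalid JSON or decodes to a non-list value."""
    try:
        value = json.loads(raw)
    except (TypeError, ValueError) as exc:
        print("ERROR or WARNING : can't parse json string", exc)
        print("Tried to parse :", raw)
        return []
    if not isinstance(value, list):
        print("Unexpected type for variable list_input_json")
        return []
    return value
```

An empty string, `None`, or a bare JSON scalar all degrade gracefully to an empty list instead of aborting the job.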
FOUND : 1
Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39))
load thcls
load THCL from format json or kwargs
add thcl : 2847 in CacheModelConfig
load pdts
add pdt : 5275 in CacheModelConfig
Running datou job : batch_current
TODO datou_current to load; maybe to take outside batchDatouExec
updating current state to 1
list_input_json: []
Current got : datou_id : 3318, datou_cur_ids : ['2913534'] with mtr_portfolio_ids : ['23180311'] and first list_photo_ids : []
new path : /proc/3935224/
Inside batchDatouExec : verbose : 0
List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score
over limit max, limiting to limit_max 40
list_input_json : []
origin We have 1 , BFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBF
we have missing 0 photos in the step downloads : photo missing : []
try to delete the photos missing in DB
length of list_filenames : 20 ; length of list_pids : 20 ; length of list_args : 20
time to download the photos : 4.372506856918335
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
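Before launching detection, the pipeline checks free GPU memory and may wait ("free memory gpu now", "max_wait", "gpu_flag : 0" in the messages that follow). A minimal sketch of such a wait loop, with every name an assumption; the real check reads the GPU (e.g. via `nvidia-smi --query-gpu=memory.free`) rather than a callable:

```python
import time

def wait_for_gpu(read_free_mib, needed_mib, max_wait, sleep_s=0):
    """Poll free GPU memory until `needed_mib` MiB is available.

    `read_free_mib` is a callable returning free memory in MiB.
    Returns a gpu_flag in the log's spirit: 0 when memory is available,
    1 after max_wait extra polls without enough free memory.
    """
    for _attempt in range(max_wait + 1):
        free = read_free_mib()
        print("free memory gpu now :", free)
        if free >= needed_mib:
            return 0
        time.sleep(sleep_s)
    return 1
```

With 10593 MiB free and, say, an 8000 MiB requirement, the flag is 0 on the first poll, matching the run above.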
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 10
step1:mask_detect Tue May 20 21:00:33 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree; some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 10593
max_wait_temp : 1 max_wait : 0
gpu_flag : 0
2025-05-20 21:00:36.408567: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2025-05-20 21:00:36.435284: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493065000 Hz
2025-05-20 21:00:36.437408: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f3f30000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2025-05-20 21:00:36.437440: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2025-05-20 21:00:36.442006: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2025-05-20 21:00:36.586049: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x2df7cdb0 initialized for platform CUDA (this does not guarantee that XLA will be used).
Devices:
2025-05-20 21:00:36.586123: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5
2025-05-20 21:00:36.587501: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-20 21:00:36.587983: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-20 21:00:36.591286: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-20 21:00:36.593613: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-20 21:00:36.593959: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-20 21:00:36.596296: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-20 21:00:36.597399: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-20 21:00:36.602094: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-20 21:00:36.603705: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-20 21:00:36.603810: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-20 21:00:36.604567: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-05-20 21:00:36.604584: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-05-20 21:00:36.604593: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-05-20 21:00:36.606496: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9671 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead.
2025-05-20 21:00:36.953336: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-20 21:00:36.953512: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-20 21:00:36.953542: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-20 21:00:36.953567: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-20 21:00:36.953591: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-20 21:00:36.953614: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-20 21:00:36.953636: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-20 21:00:36.953660: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-20 21:00:36.955310: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-20 21:00:36.956814: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-20 21:00:36.956875: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-20 21:00:36.956898: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-20 21:00:36.956918: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-20 21:00:36.956938: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-20 21:00:36.956957: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-20 21:00:36.956977: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-20 21:00:36.956998: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-20 21:00:36.958364: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-20 21:00:36.958405: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-05-20 21:00:36.958414: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-05-20 21:00:36.958422: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-05-20 21:00:36.959911: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9671 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0,
compute capability: 7.5)
Using TensorFlow backend.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version.
Instructions for updating: box_ind is deprecated, use box_indices instead
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
Inside mask_sub_process
Inside mask_detect
About to load cache.load_thcl_param
To do loadFromThcl(), then load ParamDescType : thcl2847
thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}]
thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}
Update svm_hashtag_type_desc : 5275
FOUND : 1
Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1,
datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39))
{'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0}
list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement']
Configurations:
BACKBONE resnet101
BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]]
BACKBONE_STRIDES [4, 8, 16, 32, 64]
BATCH_SIZE 1
BBOX_STD_DEV [0.1 0.1 0.2 0.2]
DETECTION_MAX_INSTANCES 100
DETECTION_MIN_CONFIDENCE 0.3
DETECTION_NMS_THRESHOLD 0.3
GPU_COUNT 1
IMAGES_PER_GPU 1
IMAGE_MAX_DIM 640
IMAGE_MIN_DIM 640
IMAGE_PADDING True
IMAGE_SHAPE [640 640 3]
LEARNING_MOMENTUM 0.9
LEARNING_RATE 0.001
LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0}
MASK_POOL_SIZE 14
MASK_SHAPE [28, 28]
MAX_GT_INSTANCES 100
MEAN_PIXEL [123.7 116.8 103.9]
MINI_MASK_SHAPE (56, 56)
NAME learn_RUBBIA_REFUS_AMIENS_23
NUM_CLASSES 9
POOL_SIZE 7
POST_NMS_ROIS_INFERENCE 1000
POST_NMS_ROIS_TRAINING 2000
ROI_POSITIVE_RATIO 0.33
RPN_ANCHOR_RATIOS [0.5, 1, 2]
RPN_ANCHOR_SCALES (16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE 1
RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD 0.7
RPN_TRAIN_ANCHORS_PER_IMAGE 256
STEPS_PER_EPOCH 1000
TRAIN_ROIS_PER_IMAGE 200
USE_MINI_MASK True
USE_RPN_ROIS True
VALIDATION_STEPS 50
WEIGHT_DECAY 0.0001
model_param file didn't exist
model_name : learn_RUBBIA_REFUS_AMIENS_23
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files existing in s3 : ['mask_model.h5']
files missing in s3 : []
2025-05-20 21:00:52.525271: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-20 21:00:52.735035: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23
/data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5
size_local : 256009536 size in s3 : 256009536
create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04
mask_model.h5 already exists and didn't need an update
list_images length : 20
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 9.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 33
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 3.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 44
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 4.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 27
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 12.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 45
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 5.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 38
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 5.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 33
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 10.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 48
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 2.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 41
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 49
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 7.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 36
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 7.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 22
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 42
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 35
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 2.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 43
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 33
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 5.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 34
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 6.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 35
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 7.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 28
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 12.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 43
NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 8.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 number of objects found : 30
Detection mask done !
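The molded ranges above are explained by Mask R-CNN's input molding: each 2160x3840 photo is resized and zero-padded to 640x640, then the per-channel MEAN_PIXEL [123.7 116.8 103.9] from the configuration is subtracted. That is why molded_images always bottoms out at -123.7 (padded zero pixels minus the red mean) and tops out at 151.1 (255 - 103.9). A sketch of just the mean-subtraction part (the resize/pad steps are omitted):

```python
import numpy as np

def mold_image(image, mean_pixel=(123.7, 116.8, 103.9)):
    """Subtract the per-channel mean pixel, as Mask R-CNN molding does.

    A zero (padded) pixel becomes -123.7 in the red channel and a
    saturated 255 pixel becomes 151.1 in the blue channel, matching
    the min/max the log reports for molded_images.
    """
    return image.astype(np.float32) - np.asarray(mean_pixel, dtype=np.float32)
```

This explains why even a photo whose raw minimum is 9 still shows a molded minimum of -123.7: the padding, not the photo content, sets the floor.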
Trying to reset tf kernel 3935848
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 1525
tf kernel not reset
sub process len(results) : 20 len(list_Values) 0
None
max_time_sub_proc : 3600
parent process len(results) : 20 len(list_Values) 0
process is alive
process is alive
process is alive
finish correctly or not : True
after detect
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 6814
list_Values should be empty []
To do loadFromThcl(), then load ParamDescType : thcl2847
Caught exception ! Connect or reconnect !
thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}]
thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}
Update svm_hashtag_type_desc : 5275
['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement']
time to compute the mask position with numpy : 0.06281757354736328 nb_pixel_total : 29854 time to create 1 rle with old method : 0.05976104736328125 length of segment : 143
time to compute the mask position with numpy : 0.07470440864562988 nb_pixel_total : 25805 time to create 1 rle with old method : 0.03534054756164551 length of segment : 472
time to compute the mask position with numpy : 0.12665677070617676 nb_pixel_total : 49228 time to create 1 rle with old method : 0.05965375900268555 length of segment : 392
time to compute the mask position with numpy : 0.09538888931274414 nb_pixel_total : 47764 time to create 1 rle with old method : 0.05982708930969238 length of segment : 428
time to compute the mask position with numpy : 0.30714941024780273 nb_pixel_total : 185800 time to create 1 rle with new method : 0.019425153732299805 length of segment : 531
time to compute the mask position with numpy : 0.03769850730895996 nb_pixel_total : 8958 time to create 1 rle with old method : 0.017465591430664062 length of segment : 166
time to compute the mask position with numpy : 0.008356332778930664 nb_pixel_total : 34324 time to create 1 rle with old method : 0.038910865783691406 length of segment : 269
time to compute the mask position with numpy : 0.050928592681884766 nb_pixel_total : 19056 time to create 1 rle with old method : 0.026173830032348633 length of segment : 135
time to compute the mask position with numpy : 0.1009829044342041 nb_pixel_total : 50386 time to create 1 rle with old method : 0.06400704383850098 length of segment : 307
time to compute the mask position with numpy : 0.21448683738708496 nb_pixel_total : 144762 time to create 1 rle with old method : 0.17057085037231445 length of segment : 484
time to compute the mask position with numpy : 0.06057929992675781 nb_pixel_total : 67833 time to create 1 rle with old method : 0.08729434013366699 length of segment : 170
time to compute the mask position with numpy : 0.00015807151794433594 nb_pixel_total : 3952 time to create 1 rle with old method : 0.006805419921875 length of segment : 79
time to compute the mask position with numpy : 0.05108284950256348 nb_pixel_total : 12295 time to create 1 rle with old method : 0.01999807357788086 length of segment : 150
time to compute the mask position with numpy : 0.17693686485290527 nb_pixel_total : 99546 time to create 1 rle with old method : 0.11906242370605469 length of segment : 533
time to compute the mask position with numpy : 0.029488325119018555 nb_pixel_total : 31334 time to create 1 rle with old method : 0.04055476188659668 length of segment : 248
time to compute the mask position with numpy : 0.03562569618225098 nb_pixel_total : 37582 time to create 1 rle with old method : 0.047629594802856445 length of segment : 385
time to compute the mask position with numpy : 0.11908316612243652 nb_pixel_total : 67699 time to create 1 rle with old method : 0.08237171173095703 length of segment : 377
time to compute the mask position with numpy : 0.09339714050292969 nb_pixel_total : 56208 time to create 1 rle with old method : 0.06927132606506348 length of segment : 314
time to compute the mask position with numpy : 0.2777247428894043 nb_pixel_total : 144984 time to create 1 rle with old method : 0.16621637344360352 length of segment : 401
time to compute the mask position with numpy : 0.11645364761352539 nb_pixel_total : 52431 time to create 1 rle with old method : 0.06340622901916504 length of segment : 342
time to compute the mask position with numpy : 0.11368560791015625 nb_pixel_total : 44986 time to create 1 rle with old method : 0.05896949768066406 length of segment : 262
time to compute the mask position with numpy : 0.09526324272155762 nb_pixel_total : 28736 time to create 1 rle with old method : 0.05620265007019043 length of segment : 260
time to compute the mask position with numpy : 0.05919361114501953 nb_pixel_total : 24448 time to create 1 rle with old method : 0.028348922729492188 length of segment : 196
time to compute the mask position with numpy : 0.0548703670501709 nb_pixel_total : 19286 time to create 1 rle with old method : 0.022148609161376953 length of segment : 209
time to compute the mask position with numpy : 0.06330394744873047 nb_pixel_total : 29562 time to create 1 rle with old method : 0.03398919105529785 length of segment : 143
time to compute the mask position with numpy : 0.053702354431152344 nb_pixel_total : 22990 time to create 1 rle with old method : 0.03286242485046387 length of segment : 213
time to compute the mask position with numpy : 0.5782928466796875 nb_pixel_total : 229172 time to create 1 rle with new method : 0.047921180725097656 length of segment : 1041
time to compute the mask position with numpy : 0.05165600776672363 nb_pixel_total : 34401 time to create 1 rle with old method : 0.04456663131713867 length of segment : 185
time to compute the mask position with numpy : 0.3204805850982666 nb_pixel_total : 255726 time to create 1 rle with new method : 0.021178722381591797 length of segment : 606
time to compute the mask position with numpy : 0.03985309600830078 nb_pixel_total : 38757 time to create 1 rle with old method : 0.052866458892822266 length of segment : 174
time to compute the mask position with numpy : 0.03571939468383789 nb_pixel_total : 23004 time to create 1 rle with old method : 0.031129837036132812 length of segment : 153
time to compute the mask position with numpy : 0.18362045288085938 nb_pixel_total : 74072 time to create 1 rle with old method : 0.11347460746765137 length of segment : 728
time to compute the mask position with numpy : 0.10361313819885254 nb_pixel_total : 115659 time to create 1 rle with old method : 0.13308000564575195 length of segment : 600
time to compute the mask position with numpy : 0.06338143348693848 nb_pixel_total : 30470 time to create 1 rle with old method : 0.0403139591217041 length of segment : 176
time to compute the mask position with numpy : 0.024601221084594727 nb_pixel_total : 8572 time to create 1 rle with old method : 0.014486312866210938 length of segment : 127
time to compute the mask position with numpy : 0.05022692680358887 nb_pixel_total : 19045 time to create 1 rle with old method : 0.02368330955505371 length of segment : 115
time to compute the mask position with numpy : 0.1243736743927002 nb_pixel_total : 82931 time to create 1 rle with old method : 0.09821248054504395 length of segment : 334
time to compute the mask position with numpy : 0.2828054428100586
nb_pixel_total : 278993 time to create 1 rle with new method : 0.01663064956665039 length of segment : 543 time for calcul the mask position with numpy : 0.054987430572509766 nb_pixel_total : 31372 time to create 1 rle with old method : 0.06048941612243652 length of segment : 213 time for calcul the mask position with numpy : 0.0724036693572998 nb_pixel_total : 30911 time to create 1 rle with old method : 0.06319403648376465 length of segment : 399 time for calcul the mask position with numpy : 0.2656729221343994 nb_pixel_total : 230180 time to create 1 rle with new method : 0.0171356201171875 length of segment : 606 time for calcul the mask position with numpy : 0.04998207092285156 nb_pixel_total : 22465 time to create 1 rle with old method : 0.030253171920776367 length of segment : 207 time for calcul the mask position with numpy : 0.1435532569885254 nb_pixel_total : 230237 time to create 1 rle with new method : 0.013439178466796875 length of segment : 739 time for calcul the mask position with numpy : 0.01233530044555664 nb_pixel_total : 5709 time to create 1 rle with old method : 0.010992288589477539 length of segment : 69 time for calcul the mask position with numpy : 0.0001842975616455078 nb_pixel_total : 7044 time to create 1 rle with old method : 0.008510351181030273 length of segment : 87 time for calcul the mask position with numpy : 0.08704066276550293 nb_pixel_total : 91486 time to create 1 rle with old method : 0.1089482307434082 length of segment : 333 time for calcul the mask position with numpy : 0.04768228530883789 nb_pixel_total : 34536 time to create 1 rle with old method : 0.04479551315307617 length of segment : 196 time for calcul the mask position with numpy : 0.06393599510192871 nb_pixel_total : 40850 time to create 1 rle with old method : 0.05103492736816406 length of segment : 145 time for calcul the mask position with numpy : 0.17918181419372559 nb_pixel_total : 91706 time to create 1 rle with old method : 0.1102454662322998 length of 
segment : 306 time for calcul the mask position with numpy : 0.12923407554626465 nb_pixel_total : 35565 time to create 1 rle with old method : 0.04528665542602539 length of segment : 255 time for calcul the mask position with numpy : 0.07731747627258301 nb_pixel_total : 32845 time to create 1 rle with old method : 0.04404926300048828 length of segment : 284 time for calcul the mask position with numpy : 0.060814857482910156 nb_pixel_total : 27570 time to create 1 rle with old method : 0.03386259078979492 length of segment : 237 time for calcul the mask position with numpy : 0.29314184188842773 nb_pixel_total : 165917 time to create 1 rle with new method : 0.01803278923034668 length of segment : 409 time for calcul the mask position with numpy : 0.08532905578613281 nb_pixel_total : 62010 time to create 1 rle with old method : 0.07360100746154785 length of segment : 328 time for calcul the mask position with numpy : 0.05824995040893555 nb_pixel_total : 25793 time to create 1 rle with old method : 0.03376317024230957 length of segment : 161 time for calcul the mask position with numpy : 0.06102871894836426 nb_pixel_total : 44019 time to create 1 rle with old method : 0.056000709533691406 length of segment : 235 time for calcul the mask position with numpy : 0.04626655578613281 nb_pixel_total : 28807 time to create 1 rle with old method : 0.04045438766479492 length of segment : 164 time for calcul the mask position with numpy : 0.07455134391784668 nb_pixel_total : 34750 time to create 1 rle with old method : 0.042873382568359375 length of segment : 276 time for calcul the mask position with numpy : 0.07553625106811523 nb_pixel_total : 52645 time to create 1 rle with old method : 0.06984210014343262 length of segment : 273 time for calcul the mask position with numpy : 0.06995677947998047 nb_pixel_total : 30449 time to create 1 rle with old method : 0.04143095016479492 length of segment : 180 time for calcul the mask position with numpy : 0.14555573463439941 
nb_pixel_total : 73948 time to create 1 rle with old method : 0.1411118507385254 length of segment : 468 time for calcul the mask position with numpy : 0.08287549018859863 nb_pixel_total : 70547 time to create 1 rle with old method : 0.08820486068725586 length of segment : 266 time for calcul the mask position with numpy : 0.10126781463623047 nb_pixel_total : 72933 time to create 1 rle with old method : 0.0947270393371582 length of segment : 440 time for calcul the mask position with numpy : 0.06810355186462402 nb_pixel_total : 42389 time to create 1 rle with old method : 0.05339407920837402 length of segment : 136 time for calcul the mask position with numpy : 0.13462185859680176 nb_pixel_total : 72280 time to create 1 rle with old method : 0.0847315788269043 length of segment : 326 time for calcul the mask position with numpy : 0.014343500137329102 nb_pixel_total : 10907 time to create 1 rle with old method : 0.017923831939697266 length of segment : 111 time for calcul the mask position with numpy : 0.05578112602233887 nb_pixel_total : 59599 time to create 1 rle with old method : 0.08270835876464844 length of segment : 322 time for calcul the mask position with numpy : 0.09683537483215332 nb_pixel_total : 115968 time to create 1 rle with old method : 0.1496269702911377 length of segment : 382 time for calcul the mask position with numpy : 0.050469398498535156 nb_pixel_total : 33982 time to create 1 rle with old method : 0.04302191734313965 length of segment : 329 time for calcul the mask position with numpy : 0.016088247299194336 nb_pixel_total : 11767 time to create 1 rle with old method : 0.01696467399597168 length of segment : 119 time for calcul the mask position with numpy : 0.029210805892944336 nb_pixel_total : 129140 time to create 1 rle with old method : 0.1689300537109375 length of segment : 510 time for calcul the mask position with numpy : 0.03124690055847168 nb_pixel_total : 123693 time to create 1 rle with old method : 0.14238858222961426 length of 
segment : 443 time for calcul the mask position with numpy : 0.02209162712097168 nb_pixel_total : 12226 time to create 1 rle with old method : 0.02038407325744629 length of segment : 204 time for calcul the mask position with numpy : 0.02654242515563965 nb_pixel_total : 171629 time to create 1 rle with new method : 0.010942220687866211 length of segment : 635 time for calcul the mask position with numpy : 0.07577872276306152 nb_pixel_total : 129676 time to create 1 rle with old method : 0.1588442325592041 length of segment : 376 time for calcul the mask position with numpy : 0.04606270790100098 nb_pixel_total : 73256 time to create 1 rle with old method : 0.08639216423034668 length of segment : 224 time for calcul the mask position with numpy : 0.0013396739959716797 nb_pixel_total : 10269 time to create 1 rle with old method : 0.01185297966003418 length of segment : 104 time for calcul the mask position with numpy : 0.025844573974609375 nb_pixel_total : 105411 time to create 1 rle with old method : 0.12136006355285645 length of segment : 445 time for calcul the mask position with numpy : 0.029100418090820312 nb_pixel_total : 20282 time to create 1 rle with old method : 0.026185274124145508 length of segment : 130 time for calcul the mask position with numpy : 0.0038945674896240234 nb_pixel_total : 29796 time to create 1 rle with old method : 0.03637528419494629 length of segment : 210 time for calcul the mask position with numpy : 0.011978864669799805 nb_pixel_total : 46721 time to create 1 rle with old method : 0.05687141418457031 length of segment : 208 time for calcul the mask position with numpy : 0.0013413429260253906 nb_pixel_total : 18290 time to create 1 rle with old method : 0.021157264709472656 length of segment : 145 time for calcul the mask position with numpy : 0.04511404037475586 nb_pixel_total : 75688 time to create 1 rle with old method : 0.0859529972076416 length of segment : 496 time for calcul the mask position with numpy : 0.0002276897430419922 
nb_pixel_total : 8985 time to create 1 rle with old method : 0.010740280151367188 length of segment : 75 time for calcul the mask position with numpy : 0.011093616485595703 nb_pixel_total : 53551 time to create 1 rle with old method : 0.06049036979675293 length of segment : 327 time for calcul the mask position with numpy : 0.1544969081878662 nb_pixel_total : 87057 time to create 1 rle with old method : 0.10867643356323242 length of segment : 313 time for calcul the mask position with numpy : 0.26221299171447754 nb_pixel_total : 159544 time to create 1 rle with new method : 0.015079736709594727 length of segment : 986 time for calcul the mask position with numpy : 0.005931854248046875 nb_pixel_total : 8038 time to create 1 rle with old method : 0.013033151626586914 length of segment : 98 time for calcul the mask position with numpy : 0.15172290802001953 nb_pixel_total : 75064 time to create 1 rle with old method : 0.09339404106140137 length of segment : 462 time for calcul the mask position with numpy : 0.07670855522155762 nb_pixel_total : 84557 time to create 1 rle with old method : 0.10517597198486328 length of segment : 527 time for calcul the mask position with numpy : 0.009687185287475586 nb_pixel_total : 32606 time to create 1 rle with old method : 0.04212474822998047 length of segment : 264 time for calcul the mask position with numpy : 0.020853519439697266 nb_pixel_total : 78468 time to create 1 rle with old method : 0.09361696243286133 length of segment : 417 time for calcul the mask position with numpy : 0.005524396896362305 nb_pixel_total : 9568 time to create 1 rle with old method : 0.013614892959594727 length of segment : 105 time for calcul the mask position with numpy : 0.008201122283935547 nb_pixel_total : 31887 time to create 1 rle with old method : 0.03971362113952637 length of segment : 330 time for calcul the mask position with numpy : 0.06244349479675293 nb_pixel_total : 22662 time to create 1 rle with old method : 0.031010150909423828 length 
of segment : 226 time for calcul the mask position with numpy : 0.008151769638061523 nb_pixel_total : 26242 time to create 1 rle with old method : 0.035378456115722656 length of segment : 175 time for calcul the mask position with numpy : 0.01161646842956543 nb_pixel_total : 31188 time to create 1 rle with old method : 0.049794673919677734 length of segment : 198 time for calcul the mask position with numpy : 0.02751469612121582 nb_pixel_total : 37336 time to create 1 rle with old method : 0.04760551452636719 length of segment : 607 time for calcul the mask position with numpy : 0.11766457557678223 nb_pixel_total : 93364 time to create 1 rle with old method : 0.10837674140930176 length of segment : 439 time for calcul the mask position with numpy : 0.015137434005737305 nb_pixel_total : 22579 time to create 1 rle with old method : 0.02622532844543457 length of segment : 218 time for calcul the mask position with numpy : 0.02302408218383789 nb_pixel_total : 35086 time to create 1 rle with old method : 0.04184365272521973 length of segment : 231 time for calcul the mask position with numpy : 0.03126263618469238 nb_pixel_total : 67379 time to create 1 rle with old method : 0.08070564270019531 length of segment : 218 time for calcul the mask position with numpy : 0.03424358367919922 nb_pixel_total : 45673 time to create 1 rle with old method : 0.0515139102935791 length of segment : 451 time for calcul the mask position with numpy : 0.058016300201416016 nb_pixel_total : 67886 time to create 1 rle with old method : 0.08054924011230469 length of segment : 442 time for calcul the mask position with numpy : 0.03378558158874512 nb_pixel_total : 98696 time to create 1 rle with old method : 0.1184396743774414 length of segment : 418 time for calcul the mask position with numpy : 0.018938302993774414 nb_pixel_total : 19716 time to create 1 rle with old method : 0.027537822723388672 length of segment : 158 time for calcul the mask position with numpy : 0.09564018249511719 
nb_pixel_total : 38526 time to create 1 rle with old method : 0.04724860191345215 length of segment : 424 time for calcul the mask position with numpy : 0.010915517807006836 nb_pixel_total : 48598 time to create 1 rle with old method : 0.05926012992858887 length of segment : 258 time for calcul the mask position with numpy : 0.0730741024017334 nb_pixel_total : 88170 time to create 1 rle with old method : 0.10601615905761719 length of segment : 471 time for calcul the mask position with numpy : 0.010661125183105469 nb_pixel_total : 30389 time to create 1 rle with old method : 0.040199995040893555 length of segment : 247 time for calcul the mask position with numpy : 0.06456279754638672 nb_pixel_total : 70713 time to create 1 rle with old method : 0.11640620231628418 length of segment : 426 time for calcul the mask position with numpy : 0.044653892517089844 nb_pixel_total : 38182 time to create 1 rle with old method : 0.04977536201477051 length of segment : 234 time for calcul the mask position with numpy : 0.01138925552368164 nb_pixel_total : 38317 time to create 1 rle with old method : 0.04775595664978027 length of segment : 284 time for calcul the mask position with numpy : 0.0004305839538574219 nb_pixel_total : 12575 time to create 1 rle with old method : 0.014808177947998047 length of segment : 114 time for calcul the mask position with numpy : 0.020325183868408203 nb_pixel_total : 15545 time to create 1 rle with old method : 0.025212764739990234 length of segment : 128 time for calcul the mask position with numpy : 0.012934684753417969 nb_pixel_total : 28591 time to create 1 rle with old method : 0.0371859073638916 length of segment : 232 time for calcul the mask position with numpy : 0.02322530746459961 nb_pixel_total : 99678 time to create 1 rle with old method : 0.1120612621307373 length of segment : 500 time for calcul the mask position with numpy : 0.03020501136779785 nb_pixel_total : 52014 time to create 1 rle with old method : 0.07606196403503418 length 
of segment : 253 time for calcul the mask position with numpy : 0.04966259002685547 nb_pixel_total : 49032 time to create 1 rle with old method : 0.05848407745361328 length of segment : 466 time for calcul the mask position with numpy : 0.015085935592651367 nb_pixel_total : 20088 time to create 1 rle with old method : 0.027123451232910156 length of segment : 168 time for calcul the mask position with numpy : 0.0531618595123291 nb_pixel_total : 115341 time to create 1 rle with old method : 0.13826322555541992 length of segment : 347 time for calcul the mask position with numpy : 0.07074141502380371 nb_pixel_total : 73840 time to create 1 rle with old method : 0.08899998664855957 length of segment : 457 time for calcul the mask position with numpy : 0.043355703353881836 nb_pixel_total : 104322 time to create 1 rle with old method : 0.12005901336669922 length of segment : 621 time for calcul the mask position with numpy : 0.011848926544189453 nb_pixel_total : 15983 time to create 1 rle with old method : 0.023199081420898438 length of segment : 213 time for calcul the mask position with numpy : 0.007392168045043945 nb_pixel_total : 31221 time to create 1 rle with old method : 0.03548288345336914 length of segment : 249 time for calcul the mask position with numpy : 0.011005401611328125 nb_pixel_total : 20887 time to create 1 rle with old method : 0.027109861373901367 length of segment : 198 time for calcul the mask position with numpy : 0.014864921569824219 nb_pixel_total : 16491 time to create 1 rle with old method : 0.018707752227783203 length of segment : 212 time for calcul the mask position with numpy : 0.07672929763793945 nb_pixel_total : 155814 time to create 1 rle with new method : 0.011610984802246094 length of segment : 467 time for calcul the mask position with numpy : 0.03160667419433594 nb_pixel_total : 136242 time to create 1 rle with old method : 0.15101337432861328 length of segment : 453 time for calcul the mask position with numpy : 
0.0049245357513427734 nb_pixel_total : 16485 time to create 1 rle with old method : 0.019006967544555664 length of segment : 173 time for calcul the mask position with numpy : 0.006745576858520508 nb_pixel_total : 10301 time to create 1 rle with old method : 0.012248516082763672 length of segment : 105 time for calcul the mask position with numpy : 0.0057430267333984375 nb_pixel_total : 24020 time to create 1 rle with old method : 0.02917313575744629 length of segment : 220 time for calcul the mask position with numpy : 0.012447834014892578 nb_pixel_total : 18933 time to create 1 rle with old method : 0.03188681602478027 length of segment : 147 time for calcul the mask position with numpy : 0.01714324951171875 nb_pixel_total : 94169 time to create 1 rle with old method : 0.10664820671081543 length of segment : 442 time for calcul the mask position with numpy : 0.009534120559692383 nb_pixel_total : 12864 time to create 1 rle with old method : 0.014987468719482422 length of segment : 110 time for calcul the mask position with numpy : 0.015567779541015625 nb_pixel_total : 36608 time to create 1 rle with old method : 0.04122447967529297 length of segment : 318 time for calcul the mask position with numpy : 0.005600690841674805 nb_pixel_total : 75876 time to create 1 rle with old method : 0.08464932441711426 length of segment : 334 time for calcul the mask position with numpy : 0.014647960662841797 nb_pixel_total : 30993 time to create 1 rle with old method : 0.04104304313659668 length of segment : 234 time for calcul the mask position with numpy : 0.006220340728759766 nb_pixel_total : 29732 time to create 1 rle with old method : 0.035707712173461914 length of segment : 244 time for calcul the mask position with numpy : 0.0036814212799072266 nb_pixel_total : 7613 time to create 1 rle with old method : 0.010690450668334961 length of segment : 107 time for calcul the mask position with numpy : 0.030836105346679688 nb_pixel_total : 191442 time to create 1 rle with new 
method : 0.011445760726928711 length of segment : 634 time for calcul the mask position with numpy : 0.004400491714477539 nb_pixel_total : 16639 time to create 1 rle with old method : 0.019681930541992188 length of segment : 118 time for calcul the mask position with numpy : 0.02717733383178711 nb_pixel_total : 107697 time to create 1 rle with old method : 0.12348723411560059 length of segment : 425 time for calcul the mask position with numpy : 0.010455608367919922 nb_pixel_total : 20885 time to create 1 rle with old method : 0.029095172882080078 length of segment : 195 time for calcul the mask position with numpy : 0.0025641918182373047 nb_pixel_total : 22266 time to create 1 rle with old method : 0.02548670768737793 length of segment : 167 time for calcul the mask position with numpy : 0.0022699832916259766 nb_pixel_total : 6012 time to create 1 rle with old method : 0.007797956466674805 length of segment : 77 time for calcul the mask position with numpy : 0.002956390380859375 nb_pixel_total : 15828 time to create 1 rle with old method : 0.01849532127380371 length of segment : 241 time for calcul the mask position with numpy : 0.013469219207763672 nb_pixel_total : 27683 time to create 1 rle with old method : 0.04145479202270508 length of segment : 221 time for calcul the mask position with numpy : 0.03014826774597168 nb_pixel_total : 82439 time to create 1 rle with old method : 0.0980081558227539 length of segment : 622 time for calcul the mask position with numpy : 0.1468672752380371 nb_pixel_total : 87792 time to create 1 rle with old method : 0.1015157699584961 length of segment : 413 time for calcul the mask position with numpy : 0.03697395324707031 nb_pixel_total : 25628 time to create 1 rle with old method : 0.04590344429016113 length of segment : 231 time for calcul the mask position with numpy : 0.002968311309814453 nb_pixel_total : 12694 time to create 1 rle with old method : 0.017421722412109375 length of segment : 101 time for calcul the mask position 
with numpy : 0.011912345886230469 nb_pixel_total : 9494 time to create 1 rle with old method : 0.016115427017211914 length of segment : 128 time for calcul the mask position with numpy : 0.015372276306152344 nb_pixel_total : 63910 time to create 1 rle with old method : 0.07874488830566406 length of segment : 431 time for calcul the mask position with numpy : 0.007641315460205078 nb_pixel_total : 13549 time to create 1 rle with old method : 0.017915725708007812 length of segment : 186 time for calcul the mask position with numpy : 0.0010716915130615234 nb_pixel_total : 17837 time to create 1 rle with old method : 0.020635366439819336 length of segment : 158 time for calcul the mask position with numpy : 0.10206747055053711 nb_pixel_total : 51391 time to create 1 rle with old method : 0.062059879302978516 length of segment : 581 time for calcul the mask position with numpy : 0.0018811225891113281 nb_pixel_total : 16697 time to create 1 rle with old method : 0.01917409896850586 length of segment : 214 time for calcul the mask position with numpy : 0.0008814334869384766 nb_pixel_total : 8749 time to create 1 rle with old method : 0.010181427001953125 length of segment : 160 time for calcul the mask position with numpy : 0.05240917205810547 nb_pixel_total : 72094 time to create 1 rle with old method : 0.08739495277404785 length of segment : 284 time for calcul the mask position with numpy : 0.0022220611572265625 nb_pixel_total : 13034 time to create 1 rle with old method : 0.020726919174194336 length of segment : 155 time for calcul the mask position with numpy : 0.07725787162780762 nb_pixel_total : 110249 time to create 1 rle with old method : 0.13217854499816895 length of segment : 388 time for calcul the mask position with numpy : 0.011168241500854492 nb_pixel_total : 73781 time to create 1 rle with old method : 0.08766341209411621 length of segment : 366 time for calcul the mask position with numpy : 0.0425877571105957 nb_pixel_total : 45335 time to create 1 rle 
with old method : 0.055010080337524414 length of segment : 248 time for calcul the mask position with numpy : 0.023530960083007812 nb_pixel_total : 136339 time to create 1 rle with old method : 0.18626952171325684 length of segment : 426 time for calcul the mask position with numpy : 0.01824474334716797 nb_pixel_total : 13838 time to create 1 rle with old method : 0.020542383193969727 length of segment : 132 time for calcul the mask position with numpy : 0.028295040130615234 nb_pixel_total : 50543 time to create 1 rle with old method : 0.06210803985595703 length of segment : 269 time for calcul the mask position with numpy : 0.0010924339294433594 nb_pixel_total : 15944 time to create 1 rle with old method : 0.018800973892211914 length of segment : 137 time for calcul the mask position with numpy : 0.006787300109863281 nb_pixel_total : 16212 time to create 1 rle with old method : 0.03077864646911621 length of segment : 242 time for calcul the mask position with numpy : 0.04949164390563965 nb_pixel_total : 100464 time to create 1 rle with old method : 0.12371706962585449 length of segment : 427 time for calcul the mask position with numpy : 0.003814220428466797 nb_pixel_total : 91167 time to create 1 rle with old method : 0.1085667610168457 length of segment : 243 time for calcul the mask position with numpy : 0.054784297943115234 nb_pixel_total : 224393 time to create 1 rle with new method : 0.015743017196655273 length of segment : 672 time for calcul the mask position with numpy : 0.01410675048828125 nb_pixel_total : 27375 time to create 1 rle with old method : 0.03504514694213867 length of segment : 221 time for calcul the mask position with numpy : 0.00832986831665039 nb_pixel_total : 38255 time to create 1 rle with old method : 0.045805931091308594 length of segment : 285 time for calcul the mask position with numpy : 0.04395174980163574 nb_pixel_total : 90847 time to create 1 rle with old method : 0.11244773864746094 length of segment : 324 time for calcul the 
mask position with numpy : 0.0012538433074951172 nb_pixel_total : 18313 time to create 1 rle with old method : 0.021144866943359375 length of segment : 197 time for calcul the mask position with numpy : 0.038620710372924805 nb_pixel_total : 61845 time to create 1 rle with old method : 0.0695497989654541 length of segment : 378 time for calcul the mask position with numpy : 0.08196115493774414 nb_pixel_total : 163604 time to create 1 rle with new method : 0.011511564254760742 length of segment : 538 time for calcul the mask position with numpy : 0.0071141719818115234 nb_pixel_total : 44418 time to create 1 rle with old method : 0.05341792106628418 length of segment : 301 time for calcul the mask position with numpy : 0.002887248992919922 nb_pixel_total : 32350 time to create 1 rle with old method : 0.04020428657531738 length of segment : 267 time for calcul the mask position with numpy : 0.04504036903381348 nb_pixel_total : 258760 time to create 1 rle with new method : 0.016217470169067383 length of segment : 640 time for calcul the mask position with numpy : 0.09152865409851074 nb_pixel_total : 287198 time to create 1 rle with new method : 0.03714156150817871 length of segment : 667 time for calcul the mask position with numpy : 0.0311129093170166 nb_pixel_total : 138104 time to create 1 rle with old method : 0.16133904457092285 length of segment : 323 time for calcul the mask position with numpy : 0.013279914855957031 nb_pixel_total : 40625 time to create 1 rle with old method : 0.05615496635437012 length of segment : 203 time for calcul the mask position with numpy : 0.003787994384765625 nb_pixel_total : 9838 time to create 1 rle with old method : 0.017470836639404297 length of segment : 99 time for calcul the mask position with numpy : 0.012079477310180664 nb_pixel_total : 107972 time to create 1 rle with old method : 0.1571817398071289 length of segment : 465 time for calcul the mask position with numpy : 0.00083160400390625 nb_pixel_total : 23323 time to 
create 1 rle with old method : 0.027428865432739258 length of segment : 191
[... per-mask timing lines repeated for every remaining mask ("time to calculate the mask position with numpy", "nb_pixel_total", "time to create 1 rle"); the old method takes roughly 0.006-0.17 s per RLE, while the new method, used for large masks (nb_pixel_total above roughly 150 000), takes roughly 0.01-0.04 s ...]
time spent for convertir_results : 45.152679204940796
Inside saveOutput : final : False verbose : 0
eke 12-6-18 : saveMask needs to be cleaned for the new output !
Number saved : None
batch 1
Loaded 339 chid ids of type : 3594
Number RLEs to save : 103797
save missing photos in datou_result :
time spent for datou_step_exec : 291.72016954421997
time spent to save output : 6.565809965133667
total time spent for step 1 : 298.28597950935364
step2:crop_condition Tue May 20 21:05:31 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of
different steps are analysed.
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : for now we do not clean the datou structure correctly
Loading chi in step crop with photo_hashtag_type : 3594
Loading chi in step crop for list_pids : 20 !
batch 1
Loaded 339 chid ids of type : 3594
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
begin to crop the class : papier
param for this class : {'min_score': 0.7}
filter for class : papier
hashtag_id of this class : 492668766
we have both polygon and rles Next one ! [message repeated once per papier detection]
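The step-1 timing lines earlier in this log compare an "old" and a "new" way of building a run-length encoding, with the new method only kicking in for large masks (large nb_pixel_total). A minimal sketch of what such a pair of methods could look like; the function names and the exact encoding (start-index, run-length pairs) are illustrative assumptions, not the pipeline's actual code:

```python
import numpy as np

def rle_old(mask):
    """Old method: scan the flattened mask pixel by pixel in Python."""
    flat = mask.flatten()
    runs = []          # (start_index, run_length) for each run of 1s
    start = None
    for i, v in enumerate(flat):
        if v and start is None:
            start = i
        elif not v and start is not None:
            runs.append((start, i - start))
            start = None
    if start is not None:
        runs.append((start, len(flat) - start))
    return runs

def rle_new(mask):
    """New method: locate run boundaries with vectorized numpy ops.

    This is O(n) in C instead of O(n) in the Python interpreter, which
    matches the speedups the log reports for large masks."""
    flat = mask.flatten().astype(np.int8)
    # Pad with zeros so every run of 1s has a rising and a falling edge.
    padded = np.concatenate(([0], flat, [0]))
    edges = np.diff(padded)
    starts = np.flatnonzero(edges == 1)
    ends = np.flatnonzero(edges == -1)
    return list(zip(starts.tolist(), (ends - starts).tolist()))

mask = np.zeros((4, 4), dtype=bool)
mask[1, 1:3] = True
mask[2, :] = True
assert rle_old(mask) == rle_new(mask)  # both give [(5, 2), (8, 4)]
```

The per-record timings in the log are consistent with this picture: the loop-based method scales with pixel count in interpreted Python, while the numpy variant amortizes the scan, so it only pays off once the mask is big enough to offset the array-setup overhead.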
map_result returned by crop_photo_return_map_crop : length : 226
About to insert : list_path_to_insert length 226 new photos from crops !
About to upload 226 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 226 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1747767989_3935224
we have uploaded 226 photos in the portfolio 3736932
time to upload the photos. Elapsed time : 59.21100401878357
we have finished the crop for the class : papier
begin to crop the class : carton
param for this class : {'min_score': 0.7}
filter for class : carton
hashtag_id of this class : 492774966
we have both polygon and rles Next one !
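The log walks through one class at a time (papier, carton, metal, ...), each with a param dict currently holding only `min_score`, used to filter detections before cropping. A minimal sketch of that per-class loop, assuming hypothetical names (`crop_class`, `CLASS_PARAMS`, the `cls`/`score` keys); in the real pipeline the params and hashtag_ids come from the database:

```python
# Hypothetical per-class parameters, mirroring "param for this class : {'min_score': 0.7}"
CLASS_PARAMS = {
    "papier": {"min_score": 0.7},
    "carton": {"min_score": 0.7},
}

def crop_class(name, detections, params):
    """Keep only detections of this class above the score threshold."""
    print(f"begin to crop the class : {name}")
    print(f"param for this class : {params}")
    return [d for d in detections
            if d["cls"] == name and d["score"] >= params["min_score"]]

dets = [
    {"cls": "papier", "score": 0.9},
    {"cls": "papier", "score": 0.5},  # below min_score, dropped
    {"cls": "carton", "score": 0.8},  # wrong class for this pass
]
kept = crop_class("papier", dets, CLASS_PARAMS["papier"])
```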
map_result returned by crop_photo_return_map_crop : length : 21
About to insert : list_path_to_insert length 21 new photos from crops !
About to upload 21 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 21 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1747768058_3935224
we have uploaded 21 photos in the portfolio 3736932
time to upload the photos. Elapsed time : 5.171204328536987
we have finished the crop for the class : carton
begin to crop the class : metal
param for this class : {'min_score': 0.7}
filter for class : metal
hashtag_id of this class : 492628673
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 7
About to insert : list_path_to_insert length 7 new photos from crops !
About to upload 7 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 7 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1747768068_3935224
we have uploaded 7 photos in the portfolio 3736932
time to upload the photos. Elapsed time : 2.041776180267334
we have finished the crop for the class : metal
begin to crop the class : pet_clair
param for this class : {'min_score': 0.7}
filter for class : pet_clair
hashtag_id of this class : 2107755846
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 68
About to insert : list_path_to_insert length 68 new photos from crops !
About to upload 68 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 68 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1747768106_3935224
we have uploaded 68 photos in the portfolio 3736932
time to upload the photos. Elapsed time : 18.57629108428955
we have finished the crop for the class : pet_clair
begin to crop the class : autre
param for this class : {'min_score': 0.7}
filter for class : autre
hashtag_id of this class : 494826614
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 8
About to insert : list_path_to_insert length 8 new photos from crops !
About to upload 8 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 8 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1747768130_3935224
we have uploaded 8 photos in the portfolio 3736932
time to upload the photos. Elapsed time : 2.5717060565948486
we have finished the crop for the class : autre
begin to crop the class : pehd
param for this class : {'min_score': 0.7}
filter for class : pehd
hashtag_id of this class : 628944319
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 1
About to insert : list_path_to_insert length 1 new photo from crops !
About to upload 1 photo
upload in portfolio : 3736932
init cache_photo without model_param
we have 1 photo to upload
uploaded to storage server : ovh
folder_temporaire : temp/1747768135_3935224
we have uploaded 1 photo in the portfolio 3736932
time to upload the photos. Elapsed time : 0.5417437553405762
we have finished the crop for the class : pehd
begin to crop the class : pet_fonce
param for this class : {'min_score': 0.7}
filter for class : pet_fonce
hashtag_id of this class : 2107755900
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 8
About to insert : list_path_to_insert length 8 new photos from crops !
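Each upload batch uses a temporary folder of the form `temp/<unix timestamp>_<process id>` (e.g. `temp/1747768135_3935224`, where 3935224 matches the process id printed at startup). A sketch of that naming scheme; the helper name `make_temp_folder` is an assumption, not the script's actual function:

```python
import os
import time

def make_temp_folder(ts=None, pid=None):
    """Build a temp folder name like the log's folder_temporaire values.

    Defaults to the current time and process id; both can be pinned
    for reproducibility (as done below to match a value from the log).
    """
    ts = int(ts if ts is not None else time.time())
    pid = pid if pid is not None else os.getpid()
    return f"temp/{ts}_{pid}"

folder = make_temp_folder(ts=1747768135, pid=3935224)
```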
About to upload 8 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 8 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1747768141_3935224
we have uploaded 8 photos in the portfolio 3736932
time to upload the photos. Elapsed time : 2.9787449836730957
we have finished the crop for the class : pet_fonce
delete rles from all chi
we have 0 chi objects containing the rles (message repeated 20 times)
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : crop_condition, we use saveGeneral
[1359829146, 1359829133, 1359829127, 1359829122, 1359829092, 1359829087, 1359829085, 1359829083, 1359829069, 1359829062, 1359829058, 1359829054, 1359829049, 1359829046, 1359829018, 1359829014, 1359829011, 1359829006, 1359828975, 1359828933]
Looping around the photos to save general results
len do output : 339
/1359858480 Didn't retrieve data . Didn't retrieve data . Didn't retrieve data .
(same message, three times per photo, for photo ids /1359858481 through /1359858484)
("Didn't retrieve data" printed three times per photo, repeated for every photo id from /1359858485 through /1359858914)
Didn't retrieve data (message printed 3 times per id) for ids:
/1359858917 /1359858953 /1359858954 /1359858955 /1359858956 /1359858957 /1359858958 /1359858959 /1359858960
before output type
Here is an output not treated by saveGeneral : (×3)
Managing all output in save final without adding information in the mtr_datou_result
Each saved output produced a pair of rows: always ('3318', None, None, None, None, None, None, None, '2913534'), followed by ('3318', '23180311', <output_id>, None, None, None, None, None, '2913534'), for output_id in:
1359829146 1359829133 1359829127 1359829122 1359829092 1359829087 1359829085 1359829083 1359829069 1359829062 1359829058 1359829054 1359829049 1359829046 1359829018 1359829014 1359829011 1359829006 1359828975 1359828933
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1037
time used for this insertion : 0.052378177642822266
save_final
save missing photos in datou_result :
time spent for datou_step_exec : 212.07932138442993
time spent to save output :
0.059958457946777344
total time spent for step 2 : 212.1392798423767
step3:rle_unique_nms_with_priority Tue May 20 21:09:04 2025
VR 17-11-17 : for now, only for a linear execution dependency tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : we now test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is fully tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step (id = -1), built in the code of datou_exec
VR 22-3-18 : we should handle the first-step case here instead of building that step before datou_exec
Currently we do not handle missing dependency information, which could perhaps be interpreted correctly with a default behavior.
Some of the work done when a step executes could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed.
complete output_args for input 0
We expect there to be only one output; this code path is used as long as no output is a tuple or array (×20)
VR 22-3-18 : For now we do not clean the datou structure correctly
Begin step rle-unique-nms
batch 1 Loaded 339 chid ids of type : 3594
nb_obj : 18 nb_hashtags : 4
time to prepare the origin masks : 11.988893508911133
mask position with numpy (s) | nb_pixel_total | rle method | rle creation (s)
1.0973916053771973 | 7322014 | new | 1.0795443058013916
0.04909229278564453 | 56208 | old | 0.06318426132202148
0.041849613189697266 | 67699 | old | 0.0758974552154541
0.0417170524597168 | 37582 | old | 0.042128801345825195
0.04083824157714844 | 31334 | old | 0.03541064262390137
0.045319318771362305 | 99546 | old | 0.11070966720581055
0.04090142250061035 | 12295 | old | 0.013765335083007812
0.04060053825378418 | 3952 | old | 0.004544496536254883
0.04384160041809082 | 67833 | old | 0.07862472534179688
0.04086637496948242 | 144762 | old | 0.16335773468017578
0.03938436508178711 | 50386 | old | 0.054775238037109375
0.04210686683654785 | 19056 | old | 0.03069758415222168
0.045003414154052734 | 34324 | old | 0.042047739028930664
0.04037976264953613 | 8958 | old | 0.010084867477416992
0.04032564163208008 | 185800 | new | 1.1067848205566406
0.03808426856994629 | 47764 | old | 0.05482363700866699
0.04390716552734375 | 49228 | old | 0.06132960319519043
0.02664494514465332 | 25805 | old | 0.029291391372680664
0.025467872619628906 | 29854 | old | 0.03460383415222168
create new chi : 4.9851696491241455
time to delete rle : 0.017678499221801758
batch 1 Loaded 37 chid ids of type : 3594
Number RLEs to save : 13326
TO DO : save crop sub photo not yet done !
save time : 0.8603231906890869
nb_obj : 18 nb_hashtags : 3
time to prepare the origin masks : 9.498635530471802
mask position with numpy (s) | nb_pixel_total | rle method | rle creation (s)
0.7700340747833252 | 7098119 | new | 1.5060067176818848
0.04081892967224121 | 19045 | old | 0.02105116844177246
0.04039454460144043 | 8572 | old | 0.009782552719116211
0.041611671447753906 | 30470 | old | 0.034343719482421875
0.042232513427734375 | 115659 | old | 0.12948393821716309
0.040190696716308594 | 74072 | old | 0.08736276626586914
0.039702415466308594 | 23004 | old | 0.025400638580322266
0.04033493995666504 | 38737 | old | 0.04255223274230957
0.04275703430175781 | 255726 | new | 0.7342684268951416
0.04118704795837402 | 34401 | old | 0.039649009704589844
0.04287576675415039 | 229172 | new | 0.813176155090332
0.04159855842590332 | 22990 | old | 0.02599930763244629
0.04156160354614258 | 29562 | old | 0.03308892250061035
0.0373990535736084 | 19286 | old | 0.021666288375854492
0.03824734687805176 | 24448 | old | 0.02739405632019043
0.04252195358276367 | 28736 | old | 0.0321352481842041
0.04382967948913574 | 44986 | old | 0.05063223838806152
0.04023313522338867 | 52431 | old | 0.058042049407958984
0.03972339630126953 | 144984 | old | 0.16201448440551758
create new chi : 5.4609174728393555
time to delete rle : 0.002144336700439453
batch 1 Loaded 37 chid ids of type : 3594
Number RLEs to save : 14018
TO DO : save crop sub photo not yet done !
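The "time to create 1 rle" lines above measure run-length encoding of binary masks. The script's actual "old" and "new" methods are not shown in this log; as a minimal illustration, an RLE encoder for a binary mask can be written with numpy's change-point trick (the function name, the Fortran flattening order, and the start/length output format are assumptions, not the script's real code):

```python
import numpy as np

def rle_encode(mask: np.ndarray):
    """Run-length encode a binary mask (hypothetical sketch, not the
    log's actual 'old'/'new' methods, which are not shown here)."""
    flat = mask.flatten(order="F").astype(np.uint8)
    # Pad with zeros so runs touching the borders are detected too.
    padded = np.concatenate([[0], flat, [0]])
    # Indices where the value changes: run starts at even positions,
    # run ends at odd positions.
    changes = np.where(padded[1:] != padded[:-1])[0]
    starts = changes[0::2]
    lengths = changes[1::2] - starts
    return starts, lengths

mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                 # a 2x2 square of foreground pixels
starts, lengths = rle_encode(mask)
print("nb_pixel_total :", int(mask.sum()))
```

The per-mask `nb_pixel_total` values printed in the log would correspond to `mask.sum()` here.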
save time : 0.8974242210388184
nb_obj : 11 nb_hashtags : 4
time to prepare the origin masks : 5.271130800247192
mask position with numpy (s) | nb_pixel_total | rle method | rle creation (s)
0.4300215244293213 | 7252375 | new | 0.6755056381225586
0.03300309181213379 | 34536 | old | 0.03873252868652344
0.029624462127685547 | 91486 | old | 0.10345172882080078
0.02578425407409668 | 3205 | old | 0.0037491321563720703
0.025768041610717773 | 5709 | old | 0.0066356658935546875
0.027715444564819336 | 230237 | new | 0.6923024654388428
0.029143810272216797 | 22465 | old | 0.03853940963745117
0.035300493240356445 | 230180 | new | 0.52372145652771
0.02709817886352539 | 30911 | old | 0.03466796875
0.026076793670654297 | 31372 | old | 0.03920555114746094
0.04570317268371582 | 278993 | new | 1.0520365238189697
0.04499506950378418 | 82931 | old | 0.09259819984436035
create new chi : 4.21972918510437
time to delete rle : 0.001562356948852539
batch 1 Loaded 23 chid ids of type : 3594
Number RLEs to save : 9515
TO DO : save crop sub photo not yet done !
save time : 0.6445512771606445
nb_obj : 20 nb_hashtags : 3
time to prepare the origin masks : 8.33352017402649
mask position with numpy (s) | nb_pixel_total | rle method | rle creation (s)
0.6168265342712402 | 7218871 | new | 1.5706815719604492
0.024541139602661133 | 59599 | old | 0.06403040885925293
0.0245821475982666 | 10907 | old | 0.011951446533203125
0.02541327476501465 | 72280 | old | 0.0773313045501709
0.025516748428344727 | 42389 | old | 0.04672074317932129
0.02388763427734375 | 72933 | old | 0.07959270477294922
0.026102781295776367 | 70547 | old | 0.07740330696105957
0.025471925735473633 | 73948 | old | 0.08199644088745117
0.025569915771484375 | 30449 | old | 0.03433847427368164
0.026063919067382812 | 52645 | old | 0.06747269630432129
0.042350053787231445 | 34750 | old | 0.0395052433013916
0.04381608963012695 | 28807 | old | 0.03617072105407715
0.0799403190612793 | 44019 | old | 0.07053136825561523
0.07108759880065918 | 25793 | old | 0.046233177185058594
0.06035017967224121 | 62010 | old | 0.07478094100952148
0.02900385856628418 | 165917 | new | 1.0139625072479248
0.025995969772338867 | 27570 | old | 0.031671762466430664
0.02803349494934082 | 32845 | old | 0.037563323974609375
0.027951955795288086 | 35565 | old | 0.04117631912231445
0.02988290786743164 | 91706 | old | 0.10280370712280273
0.027773618698120117 | 40850 | old | 0.04725503921508789
create new chi : 5.03411340713501
time to delete rle : 0.002185344696044922
batch 1 Loaded 41 chid ids of type : 3594
Number RLEs to save : 12804
TO DO : save crop sub photo not yet done !
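The recurring "mask position with numpy" timings above presumably come from locating the foreground pixels of each mask; `np.nonzero` gives the positions and `np.count_nonzero` the `nb_pixel_total` count. A small sketch (the array shape, variable names, and timing style are illustrative, not the script's real code):

```python
import time
import numpy as np

# Hypothetical binary mask, roughly the size suggested by the
# 7-million-pixel nb_pixel_total values in the log.
mask = np.zeros((1080, 1920), dtype=bool)
mask[100:400, 200:500] = True

t0 = time.time()
ys, xs = np.nonzero(mask)                    # row/col positions of mask pixels
nb_pixel_total = int(np.count_nonzero(mask)) # same count the log prints
elapsed = time.time() - t0

print("time for calculating the mask position with numpy :", elapsed)
print("nb_pixel_total :", nb_pixel_total)
```

Working on the `(ys, xs)` index arrays instead of the full image is what keeps the later per-mask work proportional to `nb_pixel_total` rather than to the image size.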
save time : 0.9316699504852295
nb_obj : 17 nb_hashtags : 5
time to prepare the origin masks : 10.96747636795044
mask position with numpy (s) | nb_pixel_total | rle method | rle creation (s)
0.7461094856262207 | 7185398 | new | 1.2231926918029785
0.04098248481750488 | 1208 | old | 0.0015130043029785156
0.02807021141052246 | 75688 | old | 0.08266329765319824
0.025967121124267578 | 18290 | old | 0.020416259765625
0.0261077880859375 | 46721 | old | 0.05159425735473633
0.025478601455688477 | 29796 | old | 0.032587289810180664
0.025382518768310547 | 20282 | old | 0.021921634674072266
0.025847673416137695 | 105411 | old | 0.1146090030670166
0.024005651473999023 | 10269 | old | 0.01114964485168457
0.02422046661376953 | 73256 | old | 0.07958817481994629
0.02481389045715332 | 129676 | old | 0.15504956245422363
0.027104854583740234 | 171629 | new | 0.6978981494903564
0.02438950538635254 | 12226 | old | 0.013198375701904297
0.024671077728271484 | 123693 | old | 0.13726472854614258
0.024541378021240234 | 129140 | old | 0.14049220085144043
0.02484130859375 | 11767 | old | 0.013094186782836914
0.02476954460144043 | 33982 | old | 0.03883028030395508
0.02708888053894043 | 115968 | old | 0.13550901412963867
create new chi : 4.2394421100616455
time to delete rle : 0.0019054412841796875
batch 1 Loaded 35 chid ids of type : 3594
Number RLEs to save : 12119
TO DO : save crop sub photo not yet done !
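Nearly every line above pairs an operation with a wall-clock measurement in the same "time for X : Y" format. A context manager can produce timing lines in this style; this is a hypothetical helper for readability, the script most likely calls `time.time()` inline around each operation:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Print a timing line in the same style as the log above
    (hypothetical helper, not part of the original script)."""
    t0 = time.time()
    yield
    print("time for %s : %s" % (label, time.time() - t0))

# Usage: wrap any block whose duration should be logged.
with timed("calculating the mask position with numpy"):
    total = sum(i * i for i in range(100000))
```

Centralizing the formatting this way would also make it easy to switch the whole log to the `logging` module later.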
save time : 0.8258965015411377 nb_obj : 12 nb_hashtags : 3 time to prepare the origin masks : 7.924539804458618 time for calcul the mask position with numpy : 0.8284027576446533 nb_pixel_total : 7625156 time to create 1 rle with new method : 0.8147408962249756 time for calcul the mask position with numpy : 0.04218745231628418 nb_pixel_total : 26242 time to create 1 rle with old method : 0.029576539993286133 time for calcul the mask position with numpy : 0.041474342346191406 nb_pixel_total : 22662 time to create 1 rle with old method : 0.025324583053588867 time for calcul the mask position with numpy : 0.04115176200866699 nb_pixel_total : 31887 time to create 1 rle with old method : 0.04519224166870117 time for calcul the mask position with numpy : 0.04588127136230469 nb_pixel_total : 9568 time to create 1 rle with old method : 0.011280298233032227 time for calcul the mask position with numpy : 0.04170846939086914 nb_pixel_total : 78468 time to create 1 rle with old method : 0.08927154541015625 time for calcul the mask position with numpy : 0.04437136650085449 nb_pixel_total : 32606 time to create 1 rle with old method : 0.036527395248413086 time for calcul the mask position with numpy : 0.041043996810913086 nb_pixel_total : 84557 time to create 1 rle with old method : 0.09780716896057129 time for calcul the mask position with numpy : 0.0463709831237793 nb_pixel_total : 75064 time to create 1 rle with old method : 0.0893709659576416 time for calcul the mask position with numpy : 0.04396557807922363 nb_pixel_total : 8038 time to create 1 rle with old method : 0.009077072143554688 time for calcul the mask position with numpy : 0.04142308235168457 nb_pixel_total : 159544 time to create 1 rle with new method : 0.7368407249450684 time for calcul the mask position with numpy : 0.03788924217224121 nb_pixel_total : 87057 time to create 1 rle with old method : 0.10026979446411133 time for calcul the mask position with numpy : 0.03701615333557129 nb_pixel_total : 53551 time 
time to create 1 rle with old method : 0.0594
create new chi : 3.5589 | time to delete rle : 0.0019
batch 1 | Loaded 25 chid ids of type : 3594 | Number RLEs to save : 10620
TO DO : save crop sub photo not yet done !
save time : 0.6855 | nb_obj : 29 | nb_hashtags : 4 | time to prepare the origin masks : 4.7894
origin mask : 6845081 px -> 1 rle with new method in 0.9966 (mask position with numpy : 0.6760)
object masks : 29 with old method, 9387-115341 px, 0.011-0.129 per rle (mask position with numpy : ~0.034 each)
create new chi : 4.3585 | time to delete rle : 0.0050
batch 1 | Loaded 59 chid ids of type : 3594 | Number RLEs to save : 20934
TO DO : save crop sub photo not yet done !
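Each record above times two steps per mask: locating the mask pixels with numpy, then building an RLE. The script's actual old/new encoders are not shown in this log; the following is only a minimal sketch, with hypothetical names, of what a numpy-based run-length encoding of a binary mask can look like (COCO-style counts, column-major order):

```python
import numpy as np

def rle_encode(mask):
    """Run-length encode a binary mask (hypothetical sketch, not the script's code).

    Returns counts of alternating 0-runs and 1-runs over the flattened mask,
    starting with the length of the leading 0-run (COCO convention).
    """
    flat = mask.flatten(order="F").astype(np.uint8)
    # indices where the value changes between consecutive pixels
    change = np.flatnonzero(np.diff(flat)) + 1
    # run boundaries: start of array, every change point, end of array
    bounds = np.concatenate(([0], change, [flat.size]))
    counts = np.diff(bounds).tolist()
    if flat[0] == 1:          # COCO RLE always starts with a zero-run
        counts = [0] + counts
    return counts

mask = np.array([[0, 1, 1],
                 [0, 1, 0]], dtype=np.uint8)
counts = rle_encode(mask)          # column-major runs: two 0s, three 1s, one 0
nb_pixel_total = int(mask.sum())   # the "nb_pixel_total" figure in the log
```

The run counts always sum to the mask size, which makes the encoding easy to sanity-check.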
save time : 1.2736 | nb_obj : 16 | nb_hashtags : 5 | time to prepare the origin masks : 10.1358
origin mask : 7415930 px -> 1 rle with new method in 0.7371 (mask position with numpy : 0.4497)
object masks : 14 with old method, 7613-136242 px, 0.009-0.153 per rle
object masks (new method) : 191442 px in 0.8983, 155814 px in 0.4897
create new chi : 3.7769 | time to delete rle : 0.0019
batch 1 | Loaded 33 chid ids of type : 3594 | Number RLEs to save : 10956
TO DO : save crop sub photo not yet done !
save time : 0.7215 | nb_obj : 24 | nb_hashtags : 4 | time to prepare the origin masks : 9.1614
origin mask : 7238320 px -> 1 rle with new method in 1.2907 (mask position with numpy : 0.3771)
object masks : 24 with old method, 6012-136339 px, 0.007-0.160 per rle
create new chi : 3.8956 | time to delete rle : 0.0038
batch 1 | Loaded 49 chid ids of type : 3594 | Number RLEs to save : 15112
TO DO : save crop sub photo not yet done !
save time : 0.9200 | nb_obj : 15 | nb_hashtags : 3 | time to prepare the origin masks : 6.9011
origin mask : 7304832 px -> 1 rle with new method in 1.2145 (mask position with numpy : 0.4527)
object masks : 13 with old method, 13838-100464 px, 0.016-0.117 per rle
object masks (new method) : 163604 px in 0.5612, 224393 px in 0.5408
create new chi : 4.1117 | time to delete rle : 0.0021
batch 1 | Loaded 31 chid ids of type : 3594 | Number RLEs to save : 11426
TO DO : save crop sub photo not yet done !
save time : 0.7313 | nb_obj : 10 | nb_hashtags : 3 | time to prepare the origin masks : 4.5305
origin mask : 7267973 px -> 1 rle with new method in 1.0327 (mask position with numpy : 0.5468)
object masks : 8 with old method, 9838-138104 px, 0.011-0.168 per rle
object masks (new method) : 287198 px in 0.6789, 258760 px in 0.7954
create new chi : 4.0239 | time to delete rle : 0.0020
batch 1 | Loaded 21 chid ids of type : 3594 | Number RLEs to save : 9044
TO DO : save crop sub photo not yet done !
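Across these batches the "new method" appears only on the largest masks (the ~7M-pixel origin mask and objects above roughly 150k pixels), while small masks go through the "old method". The log does not show the script's actual selection rule; the sketch below assumes a simple pixel-count cut-over, and both the threshold and the function name are guesses for illustration:

```python
import numpy as np

# Assumed cut-over, inferred loosely from the log; NOT taken from the script.
NEW_METHOD_MIN_PIXELS = 150_000

def pick_rle_method(mask):
    """Choose an encoder by foreground pixel count (hypothetical dispatch)."""
    nb_pixel_total = int(mask.sum())
    return "new" if nb_pixel_total >= NEW_METHOD_MIN_PIXELS else "old"

# ~7.3M foreground pixels, like the origin masks in the log
origin = np.ones((2700, 2700), dtype=np.uint8)
# 10 000 foreground pixels, like a small object mask
obj = np.zeros((300, 300), dtype=np.uint8)
obj[:100, :100] = 1

method_origin = pick_rle_method(origin)
method_obj = pick_rle_method(obj)
```

A size-based dispatch like this matches the pattern visible in the timings: the new method has a higher fixed cost but scales better on very large masks.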
save time : 0.5761 | nb_obj : 22 | nb_hashtags : 4 | time to prepare the origin masks : 9.6624
origin mask : 6913309 px -> 1 rle with new method in 0.4480 (mask position with numpy : 0.3684)
object masks : 21 with old method, 486-121126 px, 0.0008-0.150 per rle
object masks (new method) : 458949 px in 0.4645
create new chi : 3.3265 | time to delete rle : 0.0042
batch 1 | Loaded 45 chid ids of type : 3594 | Number RLEs to save : 14586
TO DO : save crop sub photo not yet done !
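Lines like "time for calcul the mask position with numpy : 0.034…" read like `time.time()` deltas printed around each step. A small sketch of that instrumentation pattern with a context manager (the helper name and the stand-in workload are hypothetical, not from the script):

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Print an elapsed-time line in the same style as the log above."""
    t0 = time.time()
    yield
    print(f"time for {label} : {time.time() - t0}")

with timed("calcul the mask position with numpy"):
    # stand-in work; the real script would compute mask pixel positions here
    positions = [i for i in range(10_000) if i % 7 == 0]
```

Wrapping each step this way keeps the timing code out of the main logic and guarantees the "time for …" line is printed even if the step is refactored.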
save time : 0.9105124473571777 nb_obj : 16 nb_hashtags : 5 time to prepare the origin masks : 11.194282293319702 time for calcul the mask position with numpy : 0.7305393218994141 nb_pixel_total : 7353712 time to create 1 rle with new method : 1.05617356300354 time for calcul the mask position with numpy : 0.03536081314086914 nb_pixel_total : 22227 time to create 1 rle with old method : 0.025344133377075195 time for calcul the mask position with numpy : 0.05834507942199707 nb_pixel_total : 36023 time to create 1 rle with old method : 0.059189796447753906 time for calcul the mask position with numpy : 0.04192018508911133 nb_pixel_total : 25874 time to create 1 rle with old method : 0.028914928436279297 time for calcul the mask position with numpy : 0.040921926498413086 nb_pixel_total : 38154 time to create 1 rle with old method : 0.04285407066345215 time for calcul the mask position with numpy : 0.04402971267700195 nb_pixel_total : 274237 time to create 1 rle with new method : 0.8142709732055664 time for calcul the mask position with numpy : 0.03509807586669922 nb_pixel_total : 82829 time to create 1 rle with old method : 0.0939631462097168 time for calcul the mask position with numpy : 0.04303622245788574 nb_pixel_total : 107946 time to create 1 rle with old method : 0.12705302238464355 time for calcul the mask position with numpy : 0.0268247127532959 nb_pixel_total : 48808 time to create 1 rle with old method : 0.05624103546142578 time for calcul the mask position with numpy : 0.02563190460205078 nb_pixel_total : 12634 time to create 1 rle with old method : 0.014237642288208008 time for calcul the mask position with numpy : 0.026767492294311523 nb_pixel_total : 27880 time to create 1 rle with old method : 0.03131413459777832 time for calcul the mask position with numpy : 0.025367259979248047 nb_pixel_total : 11514 time to create 1 rle with old method : 0.012937545776367188 time for calcul the mask position with numpy : 0.025857210159301758 nb_pixel_total : 63458 
time to create 1 rle with old method : 0.0710446834564209 time for calcul the mask position with numpy : 0.025775909423828125 nb_pixel_total : 10492 time to create 1 rle with old method : 0.011910200119018555 time for calcul the mask position with numpy : 0.026595592498779297 nb_pixel_total : 5288 time to create 1 rle with old method : 0.00597834587097168 time for calcul the mask position with numpy : 0.026256561279296875 nb_pixel_total : 89211 time to create 1 rle with old method : 0.10359430313110352 time for calcul the mask position with numpy : 0.0303800106048584 nb_pixel_total : 84113 time to create 1 rle with old method : 0.10530853271484375 create new chi : 4.003806829452515 time to delete rle : 0.002814054489135742 batch 1 Loaded 33 chid ids of type : 3594 +++++++++++++++++Number RLEs to save : 11430 TO DO : save crop sub photo not yet done ! save time : 0.7059528827667236 nb_obj : 19 nb_hashtags : 3 time to prepare the origin masks : 7.953498840332031 time for calcul the mask position with numpy : 0.9828441143035889 nb_pixel_total : 7647828 time to create 1 rle with new method : 1.0134997367858887 time for calcul the mask position with numpy : 0.038988351821899414 nb_pixel_total : 20125 time to create 1 rle with old method : 0.02252197265625 time for calcul the mask position with numpy : 0.04082012176513672 nb_pixel_total : 16770 time to create 1 rle with old method : 0.018917560577392578 time for calcul the mask position with numpy : 0.0423431396484375 nb_pixel_total : 25069 time to create 1 rle with old method : 0.0278928279876709 time for calcul the mask position with numpy : 0.03267359733581543 nb_pixel_total : 89064 time to create 1 rle with old method : 0.10410761833190918 time for calcul the mask position with numpy : 0.0340425968170166 nb_pixel_total : 43920 time to create 1 rle with old method : 0.04932904243469238 time for calcul the mask position with numpy : 0.0257108211517334 nb_pixel_total : 23443 time to create 1 rle with old method : 
0.028673887252807617 time for calcul the mask position with numpy : 0.02530527114868164 nb_pixel_total : 20193 time to create 1 rle with old method : 0.022708892822265625 time for calcul the mask position with numpy : 0.026994943618774414 nb_pixel_total : 201805 time to create 1 rle with new method : 0.9535007476806641 time for calcul the mask position with numpy : 0.026929855346679688 nb_pixel_total : 19293 time to create 1 rle with old method : 0.021528244018554688 time for calcul the mask position with numpy : 0.028037309646606445 nb_pixel_total : 8508 time to create 1 rle with old method : 0.010565996170043945 time for calcul the mask position with numpy : 0.025480270385742188 nb_pixel_total : 11357 time to create 1 rle with old method : 0.013651847839355469 time for calcul the mask position with numpy : 0.02675175666809082 nb_pixel_total : 5179 time to create 1 rle with old method : 0.005974531173706055 time for calcul the mask position with numpy : 0.026998281478881836 nb_pixel_total : 15043 time to create 1 rle with old method : 0.017794370651245117 time for calcul the mask position with numpy : 0.03133440017700195 nb_pixel_total : 19274 time to create 1 rle with old method : 0.046262502670288086 time for calcul the mask position with numpy : 0.044834136962890625 nb_pixel_total : 28245 time to create 1 rle with old method : 0.03224611282348633 time for calcul the mask position with numpy : 0.02523016929626465 nb_pixel_total : 39489 time to create 1 rle with old method : 0.0474703311920166 time for calcul the mask position with numpy : 0.026340484619140625 nb_pixel_total : 17979 time to create 1 rle with old method : 0.02035045623779297 time for calcul the mask position with numpy : 0.026270151138305664 nb_pixel_total : 19745 time to create 1 rle with old method : 0.02241659164428711 time for calcul the mask position with numpy : 0.026192903518676758 nb_pixel_total : 22071 time to create 1 rle with old method : 0.02509617805480957 create new chi : 
4.144962787628174 time to delete rle : 0.0018825531005859375 batch 1 Loaded 39 chid ids of type : 3594 ++++++++++++++++++++++Number RLEs to save : 10549 TO DO : save crop sub photo not yet done ! save time : 0.6673731803894043 nb_obj : 12 nb_hashtags : 2 time to prepare the origin masks : 4.458384037017822 time for calcul the mask position with numpy : 0.7063980102539062 nb_pixel_total : 7722937 time to create 1 rle with new method : 1.0150947570800781 time for calcul the mask position with numpy : 0.025120019912719727 nb_pixel_total : 96118 time to create 1 rle with old method : 0.10659384727478027 time for calcul the mask position with numpy : 0.02453780174255371 nb_pixel_total : 12983 time to create 1 rle with old method : 0.01472783088684082 time for calcul the mask position with numpy : 0.025891542434692383 nb_pixel_total : 95329 time to create 1 rle with old method : 0.10669946670532227 time for calcul the mask position with numpy : 0.02617335319519043 nb_pixel_total : 57332 time to create 1 rle with old method : 0.06399083137512207 time for calcul the mask position with numpy : 0.027442455291748047 nb_pixel_total : 177721 time to create 1 rle with new method : 0.7743093967437744 time for calcul the mask position with numpy : 0.025780677795410156 nb_pixel_total : 4463 time to create 1 rle with old method : 0.005094766616821289 time for calcul the mask position with numpy : 0.024450063705444336 nb_pixel_total : 9086 time to create 1 rle with old method : 0.010327577590942383 time for calcul the mask position with numpy : 0.02595829963684082 nb_pixel_total : 12081 time to create 1 rle with old method : 0.013737678527832031 time for calcul the mask position with numpy : 0.025443315505981445 nb_pixel_total : 8993 time to create 1 rle with old method : 0.010209798812866211 time for calcul the mask position with numpy : 0.025322914123535156 nb_pixel_total : 33686 time to create 1 rle with old method : 0.03772330284118652 time for calcul the mask position with numpy 
: 0.025465726852416992 nb_pixel_total : 37353 time to create 1 rle with old method : 0.04167366027832031 time for calcul the mask position with numpy : 0.02765202522277832 nb_pixel_total : 26318 time to create 1 rle with old method : 0.1860511302947998 create new chi : 3.47648024559021 time to delete rle : 0.0013918876647949219 batch 1 Loaded 25 chid ids of type : 3594 +++++++++++++++Number RLEs to save : 7950 TO DO : save crop sub photo not yet done ! save time : 0.48607492446899414 nb_obj : 13 nb_hashtags : 4 time to prepare the origin masks : 7.338286399841309 time for calcul the mask position with numpy : 0.4609346389770508 nb_pixel_total : 7714818 time to create 1 rle with new method : 0.6323192119598389 time for calcul the mask position with numpy : 0.027620792388916016 nb_pixel_total : 11591 time to create 1 rle with old method : 0.015524625778198242 time for calcul the mask position with numpy : 0.03509163856506348 nb_pixel_total : 19742 time to create 1 rle with old method : 0.031447649002075195 time for calcul the mask position with numpy : 0.04026961326599121 nb_pixel_total : 59571 time to create 1 rle with old method : 0.07309436798095703 time for calcul the mask position with numpy : 0.0421910285949707 nb_pixel_total : 40845 time to create 1 rle with old method : 0.045670270919799805 time for calcul the mask position with numpy : 0.0417630672454834 nb_pixel_total : 40844 time to create 1 rle with old method : 0.04575371742248535 time for calcul the mask position with numpy : 0.03237271308898926 nb_pixel_total : 149970 time to create 1 rle with old method : 0.16791749000549316 time for calcul the mask position with numpy : 0.025941848754882812 nb_pixel_total : 15843 time to create 1 rle with old method : 0.017816543579101562 time for calcul the mask position with numpy : 0.025841236114501953 nb_pixel_total : 17267 time to create 1 rle with old method : 0.019398212432861328 time for calcul the mask position with numpy : 0.02559947967529297 nb_pixel_total 
: 93034 time to create 1 rle with old method : 0.10384607315063477 time for calcul the mask position with numpy : 0.02680659294128418 nb_pixel_total : 48757 time to create 1 rle with old method : 0.07534027099609375 time for calcul the mask position with numpy : 0.027317047119140625 nb_pixel_total : 17435 time to create 1 rle with old method : 0.01970958709716797 time for calcul the mask position with numpy : 0.025288105010986328 nb_pixel_total : 42804 time to create 1 rle with old method : 0.04799151420593262 time for calcul the mask position with numpy : 0.02527594566345215 nb_pixel_total : 21879 time to create 1 rle with old method : 0.02466297149658203 create new chi : 2.230760097503662 time to delete rle : 0.001584768295288086 batch 1 Loaded 27 chid ids of type : 3594 +++++++++++++++Number RLEs to save : 10350 TO DO : save crop sub photo not yet done ! save time : 0.6625242233276367 nb_obj : 21 nb_hashtags : 5 time to prepare the origin masks : 9.569113492965698 time for calcul the mask position with numpy : 0.6779627799987793 nb_pixel_total : 6834625 time to create 1 rle with new method : 1.4148142337799072 time for calcul the mask position with numpy : 0.03672599792480469 nb_pixel_total : 91756 time to create 1 rle with old method : 0.10247182846069336 time for calcul the mask position with numpy : 0.04256796836853027 nb_pixel_total : 98685 time to create 1 rle with old method : 0.1339585781097412 time for calcul the mask position with numpy : 0.042368412017822266 nb_pixel_total : 7180 time to create 1 rle with old method : 0.008607149124145508 time for calcul the mask position with numpy : 0.04149031639099121 nb_pixel_total : 116481 time to create 1 rle with old method : 0.1332709789276123 time for calcul the mask position with numpy : 0.04361987113952637 nb_pixel_total : 12021 time to create 1 rle with old method : 0.013570308685302734 time for calcul the mask position with numpy : 0.04331851005554199 nb_pixel_total : 64101 time to create 1 rle with old 
method : 0.0721426010131836
time to compute the mask position with numpy : 0.04544377326965332 nb_pixel_total : 65489 time to create 1 rle with old method : 0.07379317283630371
time to compute the mask position with numpy : 0.04130816459655762 nb_pixel_total : 152098 time to create 1 rle with new method : 1.0014746189117432
time to compute the mask position with numpy : 0.044000864028930664 nb_pixel_total : 173232 time to create 1 rle with new method : 0.9487912654876709
time to compute the mask position with numpy : 0.04005074501037598 nb_pixel_total : 13689 time to create 1 rle with old method : 0.01543426513671875
time to compute the mask position with numpy : 0.0402371883392334 nb_pixel_total : 20782 time to create 1 rle with old method : 0.023394346237182617
time to compute the mask position with numpy : 0.04181194305419922 nb_pixel_total : 109151 time to create 1 rle with old method : 0.12178468704223633
time to compute the mask position with numpy : 0.04006195068359375 nb_pixel_total : 22772 time to create 1 rle with old method : 0.02555680274963379
time to compute the mask position with numpy : 0.040859222412109375 nb_pixel_total : 142999 time to create 1 rle with old method : 0.1631927490234375
time to compute the mask position with numpy : 0.0445094108581543 nb_pixel_total : 37127 time to create 1 rle with old method : 0.04187726974487305
time to compute the mask position with numpy : 0.041124820709228516 nb_pixel_total : 49197 time to create 1 rle with old method : 0.05491304397583008
time to compute the mask position with numpy : 0.04015636444091797 nb_pixel_total : 40827 time to create 1 rle with old method : 0.04857301712036133
time to compute the mask position with numpy : 0.04087710380554199 nb_pixel_total : 66672 time to create 1 rle with old method : 0.07898259162902832
time to compute the mask position with numpy : 0.04456663131713867 nb_pixel_total : 99131 time to create 1 rle with old method : 0.11698126792907715
time to compute the mask position with numpy : 0.045476436614990234 nb_pixel_total : 58740 time to create 1 rle with old method : 0.06779932975769043
time to compute the mask position with numpy : 0.045595645904541016 nb_pixel_total : 17645 time to create 1 rle with old method : 0.020862817764282227
create new chi : 6.346140623092651
time to delete rle : 0.005383491516113281
batch 1
Loaded 43 chid ids of type : 3594
+++++++++++++++++++++++++++++++++Number RLEs to save : 16224
TO DO : save crop sub photo not yet done !
save time : 0.9425289630889893
nb_obj : 14 nb_hashtags : 3
time to prepare the origin masks : 6.671878337860107
time to compute the mask position with numpy : 0.5851485729217529 nb_pixel_total : 7369413 time to create 1 rle with new method : 0.7780072689056396
time to compute the mask position with numpy : 0.027703046798706055 nb_pixel_total : 327325 time to create 1 rle with new method : 0.8751146793365479
time to compute the mask position with numpy : 0.040721893310546875 nb_pixel_total : 16426 time to create 1 rle with old method : 0.018528461456298828
time to compute the mask position with numpy : 0.04077959060668945 nb_pixel_total : 84319 time to create 1 rle with old method : 0.09406328201293945
time to compute the mask position with numpy : 0.040699005126953125 nb_pixel_total : 57900 time to create 1 rle with old method : 0.06492733955383301
time to compute the mask position with numpy : 0.043096065521240234 nb_pixel_total : 40842 time to create 1 rle with old method : 0.0455164909362793
time to compute the mask position with numpy : 0.04036998748779297 nb_pixel_total : 68056 time to create 1 rle with old method : 0.07623457908630371
time to compute the mask position with numpy : 0.04168128967285156 nb_pixel_total : 69291 time to create 1 rle with old method : 0.07733678817749023
time to compute the mask position with numpy : 0.03912234306335449 nb_pixel_total : 11637 time to create 1 rle with old method : 0.013085126876831055
time to compute the mask position with numpy : 0.03920483589172363 nb_pixel_total : 75817 time to create 1 rle with old method : 0.08383727073669434
time to compute the mask position with numpy : 0.026169300079345703 nb_pixel_total : 64512 time to create 1 rle with old method : 0.07201862335205078
time to compute the mask position with numpy : 0.02564549446105957 nb_pixel_total : 20225 time to create 1 rle with old method : 0.023108243942260742
time to compute the mask position with numpy : 0.026123046875 nb_pixel_total : 49865 time to create 1 rle with old method : 0.05579495429992676
time to compute the mask position with numpy : 0.02614116668701172 nb_pixel_total : 11514 time to create 1 rle with old method : 0.012980937957763672
time to compute the mask position with numpy : 0.025866985321044922 nb_pixel_total : 27258 time to create 1 rle with old method : 0.030525684356689453
create new chi : 3.460689067840576
time to delete rle : 0.0029380321502685547
batch 1
Loaded 29 chid ids of type : 3594
+++++++++++++++++++++Number RLEs to save : 12388
TO DO : save crop sub photo not yet done !
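The timings above come from building one RLE per object mask, with an "old" and a "new" method chosen per mask (the new method appears mostly for the largest nb_pixel_total values). For reference, a minimal run-length encoder over a binary mask can be sketched with numpy as below; this is an illustration of the technique only, not the Velours implementation, and the name `rle_encode` is ours.

```python
import numpy as np

def rle_encode(mask):
    """Run-length encode a binary mask (sketch, not the Velours code).

    Returns (starts, lengths) of the runs of 1-pixels in the
    flattened mask, computed vectorized rather than pixel by pixel.
    """
    flat = np.asarray(mask, dtype=np.uint8).ravel()
    # Pad with zeros so every run has a well-defined start and end edge.
    padded = np.concatenate(([0], flat, [0]))
    edges = np.flatnonzero(np.diff(padded))
    starts, ends = edges[0::2], edges[1::2]
    return starts, ends - starts

mask = np.array([[0, 1, 1, 0],
                 [1, 1, 0, 0]])
starts, lengths = rle_encode(mask)
# Runs of 1s begin at flat indices 1 and 4, each of length 2.
print(starts.tolist(), lengths.tolist())  # [1, 4] [2, 2]
```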
save time : 0.7868421077728271
nb_obj : 17 nb_hashtags : 2
time to prepare the origin masks : 7.304316520690918
time to compute the mask position with numpy : 0.7366149425506592 nb_pixel_total : 7576189 time to create 1 rle with new method : 0.6317176818847656
time to compute the mask position with numpy : 0.03570437431335449 nb_pixel_total : 20437 time to create 1 rle with old method : 0.02330493927001953
time to compute the mask position with numpy : 0.03845500946044922 nb_pixel_total : 100962 time to create 1 rle with old method : 0.11242246627807617
time to compute the mask position with numpy : 0.032202959060668945 nb_pixel_total : 15280 time to create 1 rle with old method : 0.017453432083129883
time to compute the mask position with numpy : 0.03438282012939453 nb_pixel_total : 31276 time to create 1 rle with old method : 0.05092167854309082
time to compute the mask position with numpy : 0.027114391326904297 nb_pixel_total : 8828 time to create 1 rle with old method : 0.010117530822753906
time to compute the mask position with numpy : 0.026938676834106445 nb_pixel_total : 65393 time to create 1 rle with old method : 0.07329702377319336
time to compute the mask position with numpy : 0.025699138641357422 nb_pixel_total : 17525 time to create 1 rle with old method : 0.0197446346282959
time to compute the mask position with numpy : 0.02625417709350586 nb_pixel_total : 77291 time to create 1 rle with old method : 0.08671188354492188
time to compute the mask position with numpy : 0.032854557037353516 nb_pixel_total : 19949 time to create 1 rle with old method : 0.022431135177612305
time to compute the mask position with numpy : 0.027163028717041016 nb_pixel_total : 129406 time to create 1 rle with old method : 0.14432859420776367
time to compute the mask position with numpy : 0.025210857391357422 nb_pixel_total : 14210 time to create 1 rle with old method : 0.016019105911254883
time to compute the mask position with numpy : 0.024777889251708984 nb_pixel_total : 8336 time to create 1 rle with old method : 0.009382963180541992
time to compute the mask position with numpy : 0.02558135986328125 nb_pixel_total : 17562 time to create 1 rle with old method : 0.01971888542175293
time to compute the mask position with numpy : 0.025382280349731445 nb_pixel_total : 37987 time to create 1 rle with old method : 0.04223012924194336
time to compute the mask position with numpy : 0.02576470375061035 nb_pixel_total : 37045 time to create 1 rle with old method : 0.041272640228271484
time to compute the mask position with numpy : 0.026152372360229492 nb_pixel_total : 62476 time to create 1 rle with old method : 0.07011628150939941
time to compute the mask position with numpy : 0.024945497512817383 nb_pixel_total : 54248 time to create 1 rle with old method : 0.06043672561645508
create new chi : 2.7189979553222656
time to delete rle : 0.0018110275268554688
batch 1
Loaded 35 chid ids of type : 3594
++++++++++++++++++++Number RLEs to save : 11566
TO DO : save crop sub photo not yet done !
save time : 0.7618205547332764
nb_obj : 15 nb_hashtags : 4
time to prepare the origin masks : 7.014562606811523
time to compute the mask position with numpy : 1.362236738204956 nb_pixel_total : 7011220 time to create 1 rle with new method : 0.9638991355895996
time to compute the mask position with numpy : 0.025174379348754883 nb_pixel_total : 17037 time to create 1 rle with old method : 0.019302845001220703
time to compute the mask position with numpy : 0.026402711868286133 nb_pixel_total : 104643 time to create 1 rle with old method : 0.11548590660095215
time to compute the mask position with numpy : 0.03010082244873047 nb_pixel_total : 268875 time to create 1 rle with new method : 0.7575507164001465
time to compute the mask position with numpy : 0.026487112045288086 nb_pixel_total : 24051 time to create 1 rle with old method : 0.030931472778320312
time to compute the mask position with numpy : 0.028171777725219727 nb_pixel_total : 31380 time to create 1 rle with old method : 0.04833579063415527
time to compute the mask position with numpy : 0.02904510498046875 nb_pixel_total : 11742 time to create 1 rle with old method : 0.013184547424316406
time to compute the mask position with numpy : 0.024702072143554688 nb_pixel_total : 29687 time to create 1 rle with old method : 0.03267359733581543
time to compute the mask position with numpy : 0.027219295501708984 nb_pixel_total : 13753 time to create 1 rle with old method : 0.015194416046142578
time to compute the mask position with numpy : 0.02705526351928711 nb_pixel_total : 324797 time to create 1 rle with new method : 0.8158740997314453
time to compute the mask position with numpy : 0.028377294540405273 nb_pixel_total : 67778 time to create 1 rle with old method : 0.07562756538391113
time to compute the mask position with numpy : 0.025938034057617188 nb_pixel_total : 39836 time to create 1 rle with old method : 0.04449272155761719
time to compute the mask position with numpy : 0.025835037231445312 nb_pixel_total : 33724 time to create 1 rle with old method : 0.03786778450012207
time to compute the mask position with numpy : 0.02791309356689453 nb_pixel_total : 189941 time to create 1 rle with new method : 0.6157567501068115
time to compute the mask position with numpy : 0.026275157928466797 nb_pixel_total : 90215 time to create 1 rle with old method : 0.09920001029968262
time to compute the mask position with numpy : 0.025519609451293945 nb_pixel_total : 35721 time to create 1 rle with old method : 0.03972911834716797
create new chi : 5.6278111934661865
time to delete rle : 0.0022194385528564453
batch 1
Loaded 31 chid ids of type : 3594
++++++++++++++++++Number RLEs to save : 12890
TO DO : save crop sub photo not yet done !
save time : 0.7991905212402344
map_output_result : {1359829146: (0.0, 'Should be the crop_list due to order', 0), 1359829133: (0.0, 'Should be the crop_list due to order', 0), 1359829127: (0.0, 'Should be the crop_list due to order', 0), 1359829122: (0.0, 'Should be the crop_list due to order', 0), 1359829092: (0.0, 'Should be the crop_list due to order', 0), 1359829087: (0.0, 'Should be the crop_list due to order', 0), 1359829085: (0.0, 'Should be the crop_list due to order', 0), 1359829083: (0.0, 'Should be the crop_list due to order', 0), 1359829069: (0.0, 'Should be the crop_list due to order', 0), 1359829062: (0.0, 'Should be the crop_list due to order', 0), 1359829058: (0.0, 'Should be the crop_list due to order', 0), 1359829054: (0.0, 'Should be the crop_list due to order', 0), 1359829049: (0.0, 'Should be the crop_list due to order', 0), 1359829046: (0.0, 'Should be the crop_list due to order', 0), 1359829018: (0.0, 'Should be the crop_list due to order', 0), 1359829014: (0.0, 'Should be the crop_list due to order', 0), 1359829011: (0.0, 'Should be the crop_list due to order', 0), 1359829006: (0.0, 'Should be the crop_list due to order', 0), 1359828975: (0.0, 'Should be the crop_list due to order', 0), 1359828933: (0.0, 'Should be the crop_list
due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1359829146, 1359829133, 1359829127, 1359829122, 1359829092, 1359829087, 1359829085, 1359829083, 1359829069, 1359829062, 1359829058, 1359829054, 1359829049, 1359829046, 1359829018, 1359829014, 1359829011, 1359829006, 1359828975, 1359828933] Looping around the photos to save general results len do output : 20 /1359829146.Didn't retrieve data . /1359829133.Didn't retrieve data . /1359829127.Didn't retrieve data . /1359829122.Didn't retrieve data . /1359829092.Didn't retrieve data . /1359829087.Didn't retrieve data . /1359829085.Didn't retrieve data . /1359829083.Didn't retrieve data . /1359829069.Didn't retrieve data . /1359829062.Didn't retrieve data . /1359829058.Didn't retrieve data . /1359829054.Didn't retrieve data . /1359829049.Didn't retrieve data . /1359829046.Didn't retrieve data . /1359829018.Didn't retrieve data . /1359829014.Didn't retrieve data . /1359829011.Didn't retrieve data . /1359829006.Didn't retrieve data . /1359828975.Didn't retrieve data . /1359828933.Didn't retrieve data . 
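The "begin to insert list_values into mtr_datou_result" timings in this log correspond to single batched inserts of the row tuples printed below. The pattern can be sketched with the stdlib `sqlite3` standing in for MySQLdb; the 4-column table here is an invented simplification (the real `mtr_datou_result` rows have 9 fields, as the tuples in the log show).

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE mtr_datou_result (
    datou_id TEXT, portfolio_id TEXT, photo_id TEXT, exec_id TEXT)""")

# Shape mirrors the (datou_id, portfolio, photo, ..., exec_id) tuples
# in the log, trimmed to 4 columns for this sketch.
list_values = [("3318", "23180311", str(pid), "2913534")
               for pid in range(1359828933, 1359828953)]

t0 = time.time()
# One round trip for all rows instead of one INSERT per row.
conn.executemany("INSERT INTO mtr_datou_result VALUES (?, ?, ?, ?)",
                 list_values)
conn.commit()
print("time used for this insertion :", time.time() - t0)
print(conn.execute("SELECT COUNT(*) FROM mtr_datou_result").fetchone()[0])  # 20
```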
before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829146', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829133', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829127', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829122', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829092', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829087', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829085', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829083', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829069', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829062', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829058', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829054', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829049', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, 
None, None, '2913534') ('3318', '23180311', '1359829046', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829018', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829014', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829011', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829006', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359828975', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359828933', None, None, None, None, None, '2913534') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 60 time used for this insertion : 0.018851280212402344 save_final save missing photos in datou_result : time spend for datou_step_exec : 262.5501425266266 time spend to save output : 0.02176809310913086 total time spend for step 3 : 262.5719106197357 step4:ventilate_hashtags_in_portfolio Tue May 20 21:13:26 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the 
step could be done before, when the tree of execution is built and the dependencies of the different steps are analysed
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : For now we do not clean the datou structure correctly
beginning of datou step ventilate_hashtags_in_portfolio : To implement !
Iterating over portfolio : 23180311
get user id for portfolio 23180311
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23180311 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','pet_fonce','environnement','mal_croppe','pet_clair','metal','pehd','carton','background','flou','autre')) AND mptpi.`min_score`=0.5
To do
To do
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23180311 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','pet_fonce','environnement','mal_croppe','pet_clair','metal','pehd','carton','background','flou','autre')) AND mptpi.`min_score`=0.5
To do
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
last two messages repeated 11 times
To do ! Use context local managing function !
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23180311 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','pet_fonce','environnement','mal_croppe','pet_clair','metal','pehd','carton','background','flou','autre')) AND mptpi.`min_score`=0.5
To do
link used in velours : https://www.fotonower.com/velours/23181403,23181404,23181405,23181406,23181407,23181408,23181409,23181410,23181411,23181412,23181413?tags=papier,pet_fonce,environnement,mal_croppe,pet_clair,metal,pehd,carton,background,flou,autre
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral
[1359829146, 1359829133, 1359829127, 1359829122, 1359829092, 1359829087, 1359829085, 1359829083, 1359829069, 1359829062, 1359829058, 1359829054, 1359829049, 1359829046, 1359829018, 1359829014, 1359829011, 1359829006, 1359828975, 1359828933]
Looping around the photos to save general results
len do output : 1
/23180311.
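The repeated 1064 error above, failing just before `and cspi.crop_hashtag_id = chi.id`, is the classic symptom of interpolating an empty Python list into an `IN (...)` clause, which renders as the invalid SQL `IN ()`. A sketch of a guard is shown below; the helper name `in_clause` is ours, and the actual query-building code is not visible in this log.

```python
def in_clause(ids):
    """Build a safe `IN (...)` placeholder fragment for MySQLdb-style %s params.

    An empty id list would otherwise render as `IN ()`, which raises
    exactly the 1064 syntax error seen in the log (hypothetical helper,
    not the Velours code).
    """
    if not ids:
        # Match nothing instead of emitting the invalid `IN ()`.
        return "IN (NULL)"
    return "IN ({})".format(",".join(["%s"] * len(ids)))

print(in_clause([]))         # IN (NULL)
print(in_clause([1, 2, 3]))  # IN (%s,%s,%s)
```

The placeholders are then passed with the id list to `cursor.execute`, so values are never formatted directly into the SQL string.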
before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829146', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829133', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829127', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829122', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829092', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829087', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829085', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829083', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829069', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829062', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829058', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829054', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829049', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, 
'2913534') ('3318', '23180311', '1359829046', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829018', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829014', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829011', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829006', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359828975', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359828933', None, None, None, None, None, '2913534') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 21 time used for this insertion : 0.01530313491821289 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.76790452003479 time spend to save output : 0.015932798385620117 total time spend for step 4 : 1.7838373184204102 step5:final Tue May 20 21:13:28 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree 
of execution is built and the dependencies of the different steps are analysed
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
complete output_args for input 2
VR 22-3-18 : For now we do not clean the datou structure correctly
Beginning of datou step final !
Caught exception ! Connect or reconnect !
Inside saveOutput : final : False verbose : 0
original output for save of step final : {1359829146: ('0.12038170331790125',), 1359829133: ('0.12038170331790125',), 1359829127: ('0.12038170331790125',), 1359829122: ('0.12038170331790125',), 1359829092: ('0.12038170331790125',), 1359829087: ('0.12038170331790125',), 1359829085: ('0.12038170331790125',), 1359829083: ('0.12038170331790125',), 1359829069: ('0.12038170331790125',), 1359829062: ('0.12038170331790125',), 1359829058: ('0.12038170331790125',), 1359829054: ('0.12038170331790125',), 1359829049: ('0.12038170331790125',), 1359829046: ('0.12038170331790125',), 1359829018: ('0.12038170331790125',), 1359829014: ('0.12038170331790125',), 1359829011: ('0.12038170331790125',), 1359829006: ('0.12038170331790125',), 1359828975: ('0.12038170331790125',), 1359828933: ('0.12038170331790125',)}
new output for save of step final : {1359829146: ('0.12038170331790125',), 1359829133: ('0.12038170331790125',), 1359829127: ('0.12038170331790125',), 1359829122: ('0.12038170331790125',), 1359829092: ('0.12038170331790125',), 1359829087: ('0.12038170331790125',), 1359829085: ('0.12038170331790125',), 1359829083: ('0.12038170331790125',), 1359829069: ('0.12038170331790125',), 1359829062: ('0.12038170331790125',), 1359829058: ('0.12038170331790125',), 1359829054: ('0.12038170331790125',), 1359829049: ('0.12038170331790125',), 1359829046: ('0.12038170331790125',), 1359829018: ('0.12038170331790125',), 1359829014: ('0.12038170331790125',), 1359829011: ('0.12038170331790125',), 1359829006:
('0.12038170331790125',), 1359828975: ('0.12038170331790125',), 1359828933: ('0.12038170331790125',)} [1359829146, 1359829133, 1359829127, 1359829122, 1359829092, 1359829087, 1359829085, 1359829083, 1359829069, 1359829062, 1359829058, 1359829054, 1359829049, 1359829046, 1359829018, 1359829014, 1359829011, 1359829006, 1359828975, 1359828933] Looping around the photos to save general results len do output : 20 /1359829146.Didn't retrieve data . /1359829133.Didn't retrieve data . /1359829127.Didn't retrieve data . /1359829122.Didn't retrieve data . /1359829092.Didn't retrieve data . /1359829087.Didn't retrieve data . /1359829085.Didn't retrieve data . /1359829083.Didn't retrieve data . /1359829069.Didn't retrieve data . /1359829062.Didn't retrieve data . /1359829058.Didn't retrieve data . /1359829054.Didn't retrieve data . /1359829049.Didn't retrieve data . /1359829046.Didn't retrieve data . /1359829018.Didn't retrieve data . /1359829014.Didn't retrieve data . /1359829011.Didn't retrieve data . /1359829006.Didn't retrieve data . /1359828975.Didn't retrieve data . /1359828933.Didn't retrieve data . 
before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829146', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829133', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829127', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829122', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829092', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829087', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829085', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829083', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829069', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829062', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829058', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829054', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829049', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', 
'23180311', '1359829046', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829018', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829014', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829011', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829006', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359828975', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359828933', None, None, None, None, None, '2913534') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 60 time used for this insertion : 0.01517033576965332 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.14042258262634277 time spend to save output : 0.016430139541625977 total time spend for step 5 : 0.15685272216796875 step6:blur_detection Tue May 20 21:13:28 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of 
execution is built and the dependencies of the different steps are analysed
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : For now we do not clean the datou structure correctly
inside step blur_detection
method: ratio and variance
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed.jpg resize: (2160, 3840) 1359829146 -5.681171819834477
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df.jpg resize: (2160, 3840) 1359829133 -5.775637919782563
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf.jpg resize: (2160, 3840) 1359829127 -5.902173284317751
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb.jpg resize: (2160, 3840) 1359829122 -5.755227704330263
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae.jpg resize: (2160, 3840) 1359829092 -5.7961720836510215
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f.jpg resize: (2160, 3840) 1359829087 -5.857180255534818
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9.jpg resize: (2160, 3840) 1359829085 -5.720045014715731
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b.jpg resize: (2160, 3840) 1359829083 -5.891315567462301
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb.jpg resize: (2160, 3840) 1359829069 -5.841597054309442
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765.jpg resize: (2160, 3840) 1359829062 -5.762358309287807
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d.jpg resize: (2160, 3840) 1359829058 -5.632424055985447
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468.jpg resize: (2160, 3840) 1359829054 -5.7192397105420225
treat image :
temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae.jpg resize: (2160, 3840) 1359829049 -5.738726198304633
treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be.jpg resize: (2160, 3840) 1359829046 -5.871192776227551
treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db.jpg resize: (2160, 3840) 1359829018 -5.706058865122633
treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb.jpg resize: (2160, 3840) 1359829014 -5.8083448881130515
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531.jpg resize: (2160, 3840) 1359829011 -5.728650128918898
treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee.jpg resize: (2160, 3840) 1359829006 -5.696535119048737
treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6.jpg resize: (2160, 3840) 1359828975 -5.74503223560426
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821.jpg resize: (2160, 3840) 1359828933 -5.695248171830614
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564689_0.png resize: (306, 222) 1359858480 -1.5269434348735047
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564682_0.png resize: (299, 153) 1359858481 -3.524531169046143
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564698_0.png resize: (292, 310) 1359858482 0.6864228920162371
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564696_0.png resize: (210, 268) 1359858483 -1.5020025792912417
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564683_0.png resize: (371, 347) 1359858484 1.5729166116092086
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564687_0.png resize: (258, 229) 1359858485 -3.7218145951526087
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564685_0.png resize: (531, 558) 1359858486 -3.7711327985045315
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564688_0.png resize: (132, 262) 1359858487 -3.5695627782385064
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564692_0.png resize: (77, 63) 1359858488 0.5977718169840505
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564681_0.png resize: (131, 403) 1359858489 -1.0755691728201267
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564686_0.png resize: (166, 84) 1359858490 -1.7870228667696864
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564715_0.png resize: (127, 82) 1359858491 -2.2010471785047554
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564709_0.png resize: (589, 591) 1359858492 -4.072354076286311
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564704_0.png resize: (209, 130) 1359858493 -0.5501198635808952
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564706_0.png resize: (194, 199) 1359858494 -2.393261472086231
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564710_0.png resize: (182, 342) 1359858495 -4.1404377816962254
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564707_0.png resize: (687, 541) 1359858496 -3.63189864885557
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564713_0.png resize: (594, 439) 1359858497 -3.057140808006705
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564711_0.png resize: (150, 268) 1359858498 0.07525987975513232
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564714_0.png resize: (170, 219) 1359858499 -3.4424707635545717
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564702_0.png resize: (260, 180) 1359858500 -1.7509512603879196
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564720_0.png resize: (259, 260) 1359858501 -3.2721044556607337
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564719_0.png resize: (197, 221) 1359858502 -1.5205513736945868
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564727_0.png resize: (179, 233) 1359858504 -3.528161444386938
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564722_0.png resize: (207, 160) 1359858505 -3.810421354011595
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564724_0.png resize: (62, 141) 1359858506 -3.718601825957815
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564721_0.png resize: (605, 588) 1359858507 -3.7760984642638316
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564725_0.png resize: (86, 111) 1359858508 -1.2288913407037527
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564743_0.png resize: (385, 259) 1359858510 1.1927156700442951
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564731_0.png resize: (284, 179) 1359858511 -1.1545086577904506
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564740_0.png resize: (179,
237) 1359858512 -1.0980359178438022 treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564734_0.png resize: (303, 291) 1359858513 -1.4822361383176919 treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564736_0.png resize: (235, 252) 1359858514 -0.6802332374334327 treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564739_0.png resize: (267, 292) 1359858515 -2.4872547028600303 treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564738_0.png resize: (249, 208) 1359858516 -1.62085216486336 treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564732_0.png resize: (236, 133) 1359858517 -1.0372329536746059 treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564735_0.png resize: (160, 283) 1359858518 1.133238582609245 treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564742_0.png resize: (264, 367) 1359858519 -4.0239323056679845 treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564730_0.png resize: (255, 244) 1359858520 -0.3294137252661857 treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564746_0.png resize: (109, 119) 1359858521 -2.2112034598611987 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564761_0.png resize: (207, 346) 1359858522 -1.2469158787938188 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564759_0.png resize: (113, 231) 1359858523 0.7260873055307566 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564753_0.png resize: (200, 131) 1359858524 -3.3418448678145047 treat image : 
temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564755_0.png resize: (372, 629) 1359858525 -1.3326813081105058 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564749_0.png resize: (251, 242) 1359858526 -1.8707990360852274 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564764_0.png resize: (73, 175) 1359858527 -0.06352906615350518 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564752_0.png resize: (427, 509) 1359858528 -2.7315506087902603 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564750_0.png resize: (119, 168) 1359858529 -2.2985268088657644 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564757_0.png resize: (98, 145) 1359858530 -2.3328336133361316 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564762_0.png resize: (123, 200) 1359858531 -2.9037348293058245 treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564760_0.png resize: (209, 215) 1359858532 -3.8167981010704657 treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564765_0.png resize: (326, 248) 1359858533 -3.7278009155542584 treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564775_0.png resize: (226, 142) 1359858534 -2.816476229826836 treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564767_0.png resize: (641, 482) 1359858535 -0.21606255460725385 treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564771_0.png resize: (263, 231) 1359858536 -1.7935824945137424 treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564770_0.png resize: 
(527, 273) 1359858537 -2.4479697445757505 treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564774_0.png resize: (185, 345) 1359858538 -2.9830717984320883 treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564776_0.png resize: (173, 278) 1359858539 -2.9031875137992396 treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564768_0.png resize: (98, 114) 1359858540 -2.2704655352914327 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564788_0.png resize: (238, 292) 1359858541 -2.091291033789677 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564793_0.png resize: (284, 232) 1359858542 -2.20959437638051 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564799_0.png resize: (425, 231) 1359858544 -2.9967409649230676 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564783_0.png resize: (424, 325) 1359858545 -2.1238110070477147 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564786_0.png resize: (158, 154) 1359858546 -4.072037546914669 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564795_0.png resize: (127, 179) 1359858547 -3.113801813777152 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564779_0.png resize: (439, 340) 1359858548 -1.500199532382826 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564794_0.png resize: (114, 166) 1359858549 -0.07313568293495738 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564804_0.png resize: (169, 220) 1359858550 -2.9741195156852993 treat image : 
temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564777_0.png resize: (185, 245) 1359858551 1.4355704351893723 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564784_0.png resize: (396, 265) 1359858552 0.09076405697216783 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564789_0.png resize: (434, 310) 1359858553 -1.9203511089295313 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564805_0.png resize: (246, 177) 1359858554 -3.2994369856002153 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564802_0.png resize: (427, 293) 1359858555 -3.7016140364520176 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564780_0.png resize: (218, 144) 1359858556 -0.7143641110619889 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564790_0.png resize: (247, 171) 1359858557 -3.5564456550835972 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564781_0.png resize: (209, 226) 1359858558 -3.257674039607696 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564796_0.png resize: (215, 193) 1359858559 -3.5138503179207503 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564787_0.png resize: (424, 307) 1359858560 -2.3173350986043224 treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564778_0.png resize: (339, 167) 1359858561 -2.7298687324892446 treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564808_0.png resize: (467, 502) 1359858562 -4.081967517187248 treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564816_0.png resize: 
(301, 252) 1359858563 -3.5343660263877443 treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564811_0.png resize: (105, 140) 1359858564 -2.4606138953264396 treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564806_0.png resize: (198, 156) 1359858565 -2.335683130516141 treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564814_0.png resize: (442, 345) 1359858566 3.464569253749276 treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564820_0.png resize: (107, 93) 1359858568 -3.1025737826818625 treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564818_0.png resize: (234, 220) 1359858569 -3.5675527833305734 treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564807_0.png resize: (211, 95) 1359858570 -1.8156933598628617 treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564812_0.png resize: (218, 128) 1359858571 -2.3109107547565158 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564837_0.png resize: (495, 148) 1359858572 -4.15125086182016 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564841_0.png resize: (155, 124) 1359858573 -0.46048882024136983 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564831_0.png resize: (228, 148) 1359858574 -4.3140938797022566 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564829_0.png resize: (351, 401) 1359858575 -3.7142916371333023 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564838_0.png resize: (145, 188) 1359858576 -1.5319288514057545 treat image : 
temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564828_0.png resize: (214, 150) 1359858577 -2.185884641076791 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564822_0.png resize: (110, 198) 1359858579 -3.6436463470102867 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564843_0.png resize: (268, 451) 1359858580 -2.1104005170718914 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564830_0.png resize: (413, 322) 1359858581 3.10473856457186 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564825_0.png resize: (161, 196) 1359858582 -2.9376963706345562 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564826_0.png resize: (71, 146) 1359858583 -3.2214940213162486 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564835_0.png resize: (184, 92) 1359858585 -1.45791001510168 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564823_0.png resize: (384, 482) 1359858586 -1.728628917754707 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564832_0.png resize: (95, 242) 1359858587 -4.069594245101514 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564833_0.png resize: (111, 145) 1359858588 -4.2591509501597935 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564824_0.png resize: (193, 133) 1359858589 -3.426744340052951 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564827_0.png resize: (173, 144) 1359858590 -3.560490848072406 treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564839_0.png resize: (159, 88) 
1359858591 -2.58069507246129 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564853_0.png resize: (208, 203) 1359858592 -3.373505412892915 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564847_0.png resize: (269, 274) 1359858593 -0.8316534210603475 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564846_0.png resize: (132, 166) 1359858594 -1.0421495191167334 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564856_0.png resize: (161, 185) 1359858595 -1.5616930554738062 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564855_0.png resize: (296, 443) 1359858596 -2.603699694906645 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564852_0.png resize: (665, 508) 1359858597 -2.9807300903400757 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564858_0.png resize: (538, 423) 1359858598 -4.333343683441917 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564848_0.png resize: (137, 160) 1359858599 -3.008475923884083 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564849_0.png resize: (199, 136) 1359858600 -1.8206366445961353 treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564860_0.png resize: (266, 161) 1359858601 -3.467317384574306 treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564868_0.png resize: (210, 220) 1359858602 -3.552279636180329 treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564863_0.png resize: (316, 560) 1359858603 -2.722755988396921 treat image : 
temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564869_0.png resize: (339, 402) 1359858604 -3.584485948183626 treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564862_0.png resize: (642, 820) 1359858605 -1.9258191233658613 treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564861_0.png resize: (630, 533) 1359858606 -0.5171727842119782 treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564864_0.png resize: (194, 369) 1359858607 -3.3397100406629305 treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564866_0.png resize: (463, 416) 1359858608 -2.3550759466973044 treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564865_0.png resize: (93, 130) 1359858609 -2.315575635763086 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564890_0.png resize: (230, 205) 1359858610 -2.622285128202436 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564886_0.png resize: (164, 162) 1359858611 -3.7468585072554403 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564881_0.png resize: (314, 309) 1359858612 -2.384002755128433 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564875_0.png resize: (434, 439) 1359858613 -0.3064139206657339 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564873_0.png resize: (206, 423) 1359858614 -2.7225927716679483 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564871_0.png resize: (104, 334) 1359858615 -2.880689574106397 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564877_0.png resize: (232, 
189) 1359858616 -3.5929913640109548 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564874_0.png resize: (195, 222) 1359858617 -3.0480836046509086 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564888_0.png resize: (139, 150) 1359858618 -0.5494201596690899 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564884_0.png resize: (215, 143) 1359858619 -3.290347108354569 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564883_0.png resize: (106, 243) 1359858620 -2.32965200832172 treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564876_0.png resize: (280, 212) 1359858621 -3.4558342780370603 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564904_0.png resize: (657, 741) 1359858622 -4.5281646310866766 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564902_0.png resize: (371, 435) 1359858623 -2.654532067178163 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564906_0.png resize: (252, 158) 1359858624 -3.551521776679739 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564895_0.png resize: (116, 80) 1359858625 -2.7664402042652343 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564908_0.png resize: (155, 215) 1359858626 -0.9597332287531846 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564894_0.png resize: (289, 493) 1359858627 -4.398651933832395 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564893_0.png resize: (378, 407) 1359858628 3.3470253973721484 treat image : 
temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564901_0.png resize: (276, 251) 1359858629 -3.5611515544412953 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564897_0.png resize: (259, 399) 1359858630 -0.9585168661880489 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564899_0.png resize: (233, 145) 1359858631 -2.888858572215606 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564896_0.png resize: (118, 114) 1359858632 -0.2996306104437851 treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564900_0.png resize: (146, 110) 1359858633 -1.99245748554744 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564926_0.png resize: (159, 189) 1359858634 -1.6523752063852708 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564924_0.png resize: (506, 372) 1359858635 -3.366215288589168 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564927_0.png resize: (284, 105) 1359858636 -3.625790992854733 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564914_0.png resize: (113, 236) 1359858637 -3.098833291952361 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564923_0.png resize: (260, 289) 1359858638 -2.6881786392784908 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564922_0.png resize: (192, 182) 1359858640 -0.30046813934313454 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564911_0.png resize: (109, 232) 1359858641 -3.5558475764386652 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564918_0.png resize: (137, 
84) 1359858642 -0.5969510392369315 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564921_0.png resize: (249, 121) 1359858643 -3.0036251099886315 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564912_0.png resize: (281, 255) 1359858644 -2.020589675401354 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564909_0.png resize: (188, 155) 1359858645 -2.642547061742265 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564915_0.png resize: (125, 154) 1359858646 -1.8777076944119593 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564913_0.png resize: (219, 160) 1359858647 -3.524024351645315 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564916_0.png resize: (100, 107) 1359858648 -2.304481037284538 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564910_0.png resize: (179, 159) 1359858649 -2.0329157877829127 treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564917_0.png resize: (132, 105) 1359858650 -0.6411741840775663 treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564936_0.png resize: (337, 215) 1359858651 -3.549010653703427 treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564930_0.png resize: (257, 199) 1359858652 -2.9775614176414034 treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564931_0.png resize: (94, 136) 1359858653 -0.6330707956007197 treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564938_0.png resize: (113, 180) 1359858654 -2.639984797202894 treat image : 
temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564935_0.png resize: (512, 488) 1359858655 -3.650929037083915 treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564929_0.png resize: (213, 267) 1359858656 -2.082325123862603 treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564933_0.png resize: (123, 117) 1359858658 -4.50769692673677 treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564934_0.png resize: (70, 110) 1359858659 -3.33774268568734 treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564932_0.png resize: (132, 128) 1359858660 -3.5454510132051604 treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564928_0.png resize: (102, 346) 1359858661 0.5728079324188102 treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564944_0.png resize: (432, 356) 1359858662 -1.6077244784160463 treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564949_0.png resize: (329, 186) 1359858663 -4.0885164058975 treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564941_0.png resize: (325, 230) 1359858664 -2.4243831942762157 treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564943_0.png resize: (298, 277) 1359858665 -3.7197956903181972 treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564940_0.png resize: (203, 256) 1359858666 -2.9405551539436305 treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564947_0.png resize: (668, 387) 1359858667 -4.383141332147011 treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564946_0.png resize: (158, 135) 
1359858668 -0.49715667210081194 treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564952_0.png resize: (130, 121) 1359858669 -2.2363934578305433 treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564951_0.png resize: (227, 150) 1359858670 -3.3038564063621494 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564955_0.png resize: (445, 276) 1359858671 -3.148520002898424 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564958_0.png resize: (343, 221) 1359858672 -3.267680146381413 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564968_0.png resize: (419, 353) 1359858673 -2.425868877106145 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564965_0.png resize: (474, 518) 1359858674 -2.496618338697877 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564953_0.png resize: (161, 216) 1359858675 -1.2334886390846371 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564961_0.png resize: (157, 219) 1359858676 -3.7620987071446907 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564954_0.png resize: (313, 285) 1359858678 -1.592291799892902 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564959_0.png resize: (199, 280) 1359858679 -2.8641211639925506 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564963_0.png resize: (190, 176) 1359858680 -3.5897152200151936 treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564957_0.png resize: (604, 189) 1359858681 -3.5161656224653117 treat image : 
temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564982_0.png resize: (416, 359) 1359858682 -3.4087282839928053 treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564984_0.png resize: (291, 402) 1359858683 -2.953529018962315 treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564985_0.png resize: (352, 311) 1359858684 2.5614369793351166 treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564977_0.png resize: (333, 113) 1359858685 -2.7780021077645767 treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564976_0.png resize: (345, 438) 1359858686 -3.1999584070582263 treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564986_0.png resize: (163, 131) 1359858687 -1.7471991103188358 treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564975_0.png resize: (167, 86) 1359858688 -2.8913998495815867 treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564980_0.png resize: (155, 101) 1359858690 2.4113061016027655 treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564974_0.png resize: (326, 106) 1359858691 -3.287397348413972 treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564978_0.png resize: (222, 416) 1359858692 -3.2933578597273323 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564994_0.png resize: (158, 128) 1359858693 -0.9613741683373153 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564988_0.png resize: (272, 345) 1359858694 -3.323487532327271 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564998_0.png resize: (90, 
220) 1359858695 -1.8929041854831228 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564993_0.png resize: (110, 103) 1359858696 -0.8483839909706477 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564989_0.png resize: (293, 406) 1359858697 -1.092116055711002 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565002_0.png resize: (94, 192) 1359858698 1.0194557811631149 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565004_0.png resize: (156, 189) 1359858699 -4.058217866712048 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565000_0.png resize: (121, 143) 1359858700 -1.6910591874731469 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564999_0.png resize: (296, 362) 1359858701 -3.319332919035705 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564992_0.png resize: (153, 153) 1359858702 -2.89333692805411 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564996_0.png resize: (204, 178) 1359858703 -4.546235645705292 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564990_0.png resize: (230, 204) 1359858704 -0.8020969637802089 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564995_0.png resize: (406, 507) 1359858705 -3.6610522544832844 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564997_0.png resize: (488, 532) 1359858706 -1.5065497470564624 treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565003_0.png resize: (635, 241) 1359858707 -3.9143055148509474 treat image : 
temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565001_0.png resize: (280, 145) 1359858708 -4.13204026167154
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565006_0.png resize: (462, 372) 1359858709 -1.1190902984616884
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565008_0.png resize: (198, 289) 1359858710 1.3717817149550107
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565005_0.png resize: (199, 319) 1359858711 -2.5011308029661325
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565014_0.png resize: (138, 119) 1359858712 0.5344152057157724
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565019_0.png resize: (219, 108) 1359858713 -0.9857468544578365
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565012_0.png resize: (154, 147) 1359858715 -2.574220156485902
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565015_0.png resize: (282, 178) 1359858716 -3.30936085163865
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564703_0.png resize: (196, 184) 1359858727 -3.721414728645864
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564717_0.png resize: (322, 382) 1359858728 -3.7310749773053113
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564737_0.png resize: (164, 229) 1359858729 -3.296552445417675
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564728_0.png resize: (137, 334) 1359858730 -1.3624943221749153
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564810_0.png resize: (168, 121)
1359858731 -2.92893863569565
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564813_0.png resize: (147, 169) 1359858732 -3.175006070407766
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564836_0.png resize: (158, 148) 1359858733 -2.638831966978695
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564867_0.png resize: (191, 170) 1359858734 -0.2829146202792227
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564879_0.png resize: (339, 304) 1359858736 -3.2774374997788636
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564885_0.png resize: (283, 212) 1359858737 -3.197209750452959
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564891_0.png resize: (126, 89) 1359858738 -3.6080099103828998
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564889_0.png resize: (169, 225) 1359858739 -3.3744687857294173
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564878_0.png resize: (257, 149) 1359858740 -3.7246339314492047
treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564898_0.png resize: (138, 109) 1359858741 -1.3678305407019704
treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564942_0.png resize: (174, 150) 1359858742 -2.9183451985470947
treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564950_0.png resize: (351, 239) 1359858743 -3.170855616303463
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564972_0.png resize: (332, 416) 1359858744 -4.685376533167724
treat image :
temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564970_0.png resize: (333, 516) 1359858745 -3.7253100835957738
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564973_0.png resize: (365, 392) 1359858746 -4.062103945113225
treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564991_0.png resize: (221, 224) 1359858747 -2.621262246689844
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565017_0.png resize: (475, 812) 1359858748 -1.9708723605356842
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564697_0.png resize: (376, 218) 1359858753 -3.337581315336121
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564763_0.png resize: (496, 256) 1359858754 -4.480731556101404
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564800_0.png resize: (168, 167) 1359858755 -4.209236057524442
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564815_0.png resize: (109, 154) 1359858756 -1.779450166298014
treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564907_0.png resize: (207, 231) 1359858757 -4.012270399198178
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564969_0.png resize: (169, 208) 1359858758 -3.687539996953522
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565013_0.png resize: (217, 191) 1359858759 -4.525458758159926
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564694_0.png resize: (438, 341) 1359858808 -3.986343330161806
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564695_0.png resize: (244, 153)
1359858810 -4.367921126406634
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564690_0.png resize: (398, 496) 1359858812 -4.396964231811088
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564684_0.png resize: (362, 194) 1359858814 -3.989902639455715
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564716_0.png resize: (112, 258) 1359858815 -2.259693818247655
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564699_0.png resize: (369, 503) 1359858816 -4.69712081295792
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564701_0.png resize: (262, 254) 1359858818 -1.9400439816136812
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564705_0.png resize: (143, 300) 1359858820 -2.2102146423292597
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564708_0.png resize: (185, 244) 1359858822 -3.8426572046053513
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564712_0.png resize: (362, 485) 1359858824 -4.148543707383525
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564700_0.png resize: (342, 212) 1359858826 -3.249799270233814
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564718_0.png resize: (540, 615) 1359858828 -5.026218510906114
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564723_0.png resize: (728, 402) 1359858829 -4.379241593637794
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564733_0.png resize: (392, 607) 1359858830 -3.018471981152807
treat image :
temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564741_0.png resize: (396, 294) 1359858831 -3.7193984691300073
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564729_0.png resize: (301, 354) 1359858832 -3.907914487590947
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564744_0.png resize: (136, 396) 1359858833 -2.895189357314906
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564747_0.png resize: (301, 265) 1359858834 -3.5743920324900222
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564745_0.png resize: (325, 266) 1359858835 -3.332578027737439
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564754_0.png resize: (545, 416) 1359858836 -3.9398793001410315
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564751_0.png resize: (340, 564) 1359858837 -3.701637097903312
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564758_0.png resize: (445, 354) 1359858838 -3.2743098092572485
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564766_0.png resize: (313, 513) 1359858839 -2.780760537908043
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564772_0.png resize: (417, 270) 1359858840 -3.7485808671000105
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564769_0.png resize: (462, 257) 1359858841 -3.4324044466534045
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564801_0.png resize: (263, 598) 1359858842 -3.909569769512447
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564792_0.png resize: (229,
219) 1359858843 -3.994095078113198
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564797_0.png resize: (443, 302) 1359858845 -4.416522170677504
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564791_0.png resize: (401, 207) 1359858846 -2.1809608310164093
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564782_0.png resize: (217, 412) 1359858847 -2.5361905300116616
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564785_0.png resize: (413, 321) 1359858848 -4.147365744934607
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564803_0.png resize: (455, 543) 1359858849 -3.8815704511462243
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564821_0.png resize: (610, 388) 1359858850 -3.5895310057426086
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564819_0.png resize: (244, 170) 1359858851 -2.592643029537919
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564809_0.png resize: (448, 394) 1359858852 -3.764665310674531
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564844_0.png resize: (244, 244) 1359858853 -3.0832957247864616
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564845_0.png resize: (426, 420) 1359858854 -3.8118507049276027
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564842_0.png resize: (374, 432) 1359858855 -3.2898513790774797
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564834_0.png resize: (423, 231) 1359858856 -4.066196173542434
treat image :
temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564851_0.png resize: (242, 462) 1359858857 -2.797379868063803
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564859_0.png resize: (300, 232) 1359858858 -3.4968433581645675
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564857_0.png resize: (363, 265) 1359858859 -4.360378966732793
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564854_0.png resize: (285, 179) 1359858860 -3.1111453376024785
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564870_0.png resize: (298, 172) 1359858861 -3.9535738494175527
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564892_0.png resize: (259, 499) 1359858862 -4.394091291752199
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564872_0.png resize: (1053, 560) 1359858863 -4.2049654338499565
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564882_0.png resize: (334, 296) 1359858864 -3.7334307258200576
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564887_0.png resize: (463, 196) 1359858865 -4.180834160731928
treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564903_0.png resize: (264, 390) 1359858866 -3.1383787452590814
treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564920_0.png resize: (498, 533) 1359858867 -4.293111379227381
treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564919_0.png resize: (152, 173) 1359858868 -1.7913434275646853
treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564939_0.png resize: (342,
447) 1359858869 -3.462977834123818
treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564937_0.png resize: (376, 345) 1359858870 -3.380523736849253
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564971_0.png resize: (454, 403) 1359858871 -2.7169142006204394
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564956_0.png resize: (346, 413) 1359858872 -2.192494881609322
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564967_0.png resize: (272, 345) 1359858873 -3.831593256019117
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564962_0.png resize: (456, 320) 1359858874 -4.63468282065792
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564966_0.png resize: (427, 564) 1359858875 -4.395958930049214
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564960_0.png resize: (302, 661) 1359858876 -4.291311838602947
treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564987_0.png resize: (728, 564) 1359858877 -3.745640172582933
treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564981_0.png resize: (273, 333) 1359858878 -2.671098319991051
treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564979_0.png resize: (335, 380) 1359858879 -3.2590308674928643
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565011_0.png resize: (942, 585) 1359858880 -3.6154666626252614
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565009_0.png resize: (308, 214) 1359858881 -4.743612370752239
treat image :
temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565018_0.png resize: (412, 386) 1359858882 -4.764874199947343
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565010_0.png resize: (184, 568) 1359858883 -4.367477748132166
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565007_0.png resize: (531, 446) 1359858884 -3.947563307608069
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565016_0.png resize: (176, 168) 1359858885 -2.937771171311273
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564691_0.png resize: (167, 477) 1359858906 -0.5584763255078059
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564693_0.png resize: (149, 123) 1359858908 -1.987491746525443
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564748_0.png resize: (378, 375) 1359858909 -0.9575472602707339
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564773_0.png resize: (105, 118) 1359858910 -1.432119941912175
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564850_0.png resize: (427, 332) 1359858911 -2.8405554851640353
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564880_0.png resize: (294, 346) 1359858912 -4.15863139651452
treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564945_0.png resize: (156, 141) 1359858913 -2.6225145847818387
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564964_0.png resize: (120, 142) 1359858914 -2.156328490259261
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564726_0.png resize: (331, 343)
1359858917 0.5167007615812708
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564756_0.png resize: (224, 427) 1359858953 -4.502488351872914
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564798_0.png resize: (250, 263) 1359858954 -4.362358886969503
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564817_0.png resize: (287, 312) 1359858955 -4.928095377352387
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564840_0.png resize: (281, 310) 1359858956 -4.761182795711762
treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564905_0.png resize: (298, 167) 1359858957 -3.2044880707990226
treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564925_0.png resize: (179, 163) 1359858958 -2.798993303965778
treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564948_0.png resize: (323, 164) 1359858959 -4.333444377943715
treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564983_0.png resize: (211, 271) 1359858960 -3.3065493868063425
Inside saveOutput : final : False verbose : 0
begin to insert list_values into class_photo_scores : length of list_values in save_photo_hashtag_id_thcl_score : 359
time used for this insertion : 0.02756786346435547
begin to insert list_values into photo_hahstag_ids : length of list_values in save_photo_hashtag_id_type : 359
time used for this insertion : 0.06324124336242676
save missing photos in datou_result :
time spent for datou_step_exec : 89.8001720905304
time spent to save output : 0.09763908386230469
total time spent for step 6 : 89.8978111743927
step7:brightness Tue May 20 21:14:58 2025
VR 17-11-17 : for now, only for a linear execution dependency tree, some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependency tree, but we keep two separate code paths for datou_prepare_output_input until the code is fully tested, cleaned up, and works in both cases
VR 22-3-18 : but we use the first code path for the first step (id = -1), built in the code of datou_exec
VR 22-3-18 : we should handle the first-step case here instead of building this step before datou_exec
Currently we do not handle missing dependency information, which could perhaps be interpreted correctly with a default behavior
Some of the work done when a step executes could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : For now we do not clean the datou structure correctly inside the step
calculate brightness
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed.jpg
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df.jpg
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf.jpg
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb.jpg
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae.jpg
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f.jpg
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9.jpg
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b.jpg
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb.jpg
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765.jpg
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d.jpg
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468.jpg
treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae.jpg
treat image :
temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be.jpg
treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db.jpg
treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb.jpg
treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531.jpg
treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee.jpg
treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6.jpg
treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821.jpg
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564689_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564682_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564698_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564696_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564683_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564687_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564685_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564688_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564692_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564681_0.png
treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564686_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564715_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564709_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564704_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564706_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564710_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564707_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564713_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564711_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564714_0.png
treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564702_0.png
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564720_0.png
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564719_0.png
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564727_0.png
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564722_0.png
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564724_0.png
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564721_0.png
treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564725_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564743_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564731_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564740_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564734_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564736_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564739_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564738_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564732_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564735_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564742_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564730_0.png
treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564746_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564761_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564759_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564753_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564755_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564749_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564764_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564752_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564750_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564757_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564762_0.png
treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564760_0.png
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564765_0.png
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564775_0.png
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564767_0.png
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564771_0.png
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564770_0.png
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564774_0.png
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564776_0.png
treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564768_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564788_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564793_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564799_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564783_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564786_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564795_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564779_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564794_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564804_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564777_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564784_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564789_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564805_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564802_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564780_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564790_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564781_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564796_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564787_0.png
treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564778_0.png
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564808_0.png
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564816_0.png
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564811_0.png
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564806_0.png
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564814_0.png
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564820_0.png
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564818_0.png
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564807_0.png
treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564812_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564837_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564841_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564831_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564829_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564838_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564828_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564822_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564843_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564830_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564825_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564826_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564835_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564823_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564832_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564833_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564824_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564827_0.png
treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564839_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564853_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564847_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564846_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564856_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564855_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564852_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564858_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564848_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564849_0.png
treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564860_0.png
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564868_0.png
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564863_0.png
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564869_0.png
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564862_0.png
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564861_0.png
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564864_0.png
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564866_0.png
treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564865_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564890_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564886_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564881_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564875_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564873_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564871_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564877_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564874_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564888_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564884_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564883_0.png
treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564876_0.png
treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564904_0.png
treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564902_0.png
treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564906_0.png
treat image :
temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564895_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564908_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564894_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564893_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564901_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564897_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564899_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564896_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564900_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564926_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564924_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564927_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564914_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564923_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564922_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564911_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564918_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564921_0.png treat image : 
temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564912_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564909_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564915_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564913_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564916_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564910_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564917_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564936_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564930_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564931_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564938_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564935_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564929_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564933_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564934_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564932_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564928_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564944_0.png treat image : 
temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564949_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564941_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564943_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564940_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564947_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564946_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564952_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564951_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564955_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564958_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564968_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564965_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564953_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564961_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564954_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564959_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564963_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564957_0.png treat image : 
temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564982_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564984_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564985_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564977_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564976_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564986_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564975_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564980_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564974_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564978_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564994_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564988_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564998_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564993_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564989_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565002_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565004_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565000_0.png treat image : 
temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564999_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564992_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564996_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564990_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564995_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564997_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565003_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806565001_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565006_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565008_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565005_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565014_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565019_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565012_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565015_0.png treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564703_0.png treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564717_0.png treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564737_0.png treat image : 
temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564728_0.png treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564810_0.png treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564813_0.png treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564836_0.png treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564867_0.png treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564879_0.png treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564885_0.png treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564891_0.png treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564889_0.png treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564878_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564898_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564942_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564950_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564972_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564970_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564973_0.png treat image : temp/1747767629_3935224_1359828975_57518dbdc79f9c4e92dee90ea82712c6_rle_crop_3806564991_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565017_0.png treat image : 
temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564697_0.png treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564763_0.png treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564800_0.png treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564815_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564907_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564969_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565013_0.png treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564694_0.png treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564695_0.png treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564690_0.png treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564684_0.png treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564716_0.png treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564699_0.png treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564701_0.png treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564705_0.png treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564708_0.png treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564712_0.png treat image : temp/1747767629_3935224_1359829133_402efead3e670fcb7fe61fc07505b5df_rle_crop_3806564700_0.png treat image : 
temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564718_0.png treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564723_0.png treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564733_0.png treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564741_0.png treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564729_0.png treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564744_0.png treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564747_0.png treat image : temp/1747767629_3935224_1359829122_ef16f9d8d028f622ffc7a5104e8c80eb_rle_crop_3806564745_0.png treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564754_0.png treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564751_0.png treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564758_0.png treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564766_0.png treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564772_0.png treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564769_0.png treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564801_0.png treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564792_0.png treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564797_0.png treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564791_0.png treat image : 
temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564782_0.png treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564785_0.png treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564803_0.png treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564821_0.png treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564819_0.png treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564809_0.png treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564844_0.png treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564845_0.png treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564842_0.png treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564834_0.png treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564851_0.png treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564859_0.png treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564857_0.png treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564854_0.png treat image : temp/1747767629_3935224_1359829058_09a0200d1ca2bdfaaa3119418891de9d_rle_crop_3806564870_0.png treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564892_0.png treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564872_0.png treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564882_0.png treat image : 
temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564887_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564903_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564920_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564919_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564939_0.png treat image : temp/1747767629_3935224_1359829018_0945680868b2046b2de19628b29131db_rle_crop_3806564937_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564971_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564956_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564967_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564962_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564966_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564960_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564987_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564981_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564979_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565011_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565009_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565018_0.png treat image : 
temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565010_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565007_0.png treat image : temp/1747767629_3935224_1359828933_e5199faba5fa1878f887593e2fa49821_rle_crop_3806565016_0.png treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564691_0.png treat image : temp/1747767629_3935224_1359829146_f89668ee1847e71eff2bfbb1dfcab8ed_rle_crop_3806564693_0.png treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564748_0.png treat image : temp/1747767629_3935224_1359829087_cc70aeba33865c0fb4be9160f8ef013f_rle_crop_3806564773_0.png treat image : temp/1747767629_3935224_1359829062_e5fc0577063d00c9402611b5930c0765_rle_crop_3806564850_0.png treat image : temp/1747767629_3935224_1359829054_dc51c82967e987eda28976a285e19468_rle_crop_3806564880_0.png treat image : temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564945_0.png treat image : temp/1747767629_3935224_1359829011_538bb128beea0a7de84f091a167ee531_rle_crop_3806564964_0.png treat image : temp/1747767629_3935224_1359829127_7acb72af522cd2e859e59021c7b3fddf_rle_crop_3806564726_0.png treat image : temp/1747767629_3935224_1359829092_783817010ffe5e3d970a54e934443bae_rle_crop_3806564756_0.png treat image : temp/1747767629_3935224_1359829085_13e7df90b99ff94ac92ee1c78d71a1e9_rle_crop_3806564798_0.png treat image : temp/1747767629_3935224_1359829083_aa05fd4859c0ca5dd1b41e53ded53e3b_rle_crop_3806564817_0.png treat image : temp/1747767629_3935224_1359829069_2d47fc41997dcc9a5cfe06e68dc1cfeb_rle_crop_3806564840_0.png treat image : temp/1747767629_3935224_1359829049_cc85b598ceee93f397101fae087fadae_rle_crop_3806564905_0.png treat image : temp/1747767629_3935224_1359829046_4aab4ed365696695e299621a3d1d23be_rle_crop_3806564925_0.png treat image : 
temp/1747767629_3935224_1359829014_05acb44d9f70c9b3269b4d2dcec2f2cb_rle_crop_3806564948_0.png treat image : temp/1747767629_3935224_1359829006_04b8b7306209e8443e358c61e5dffaee_rle_crop_3806564983_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 359 time used for this insertion : 0.029084205627441406 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 359 time used for this insertion : 0.06473088264465332 save missing photos in datou_result : time spend for datou_step_exec : 21.546195030212402 time spend to save output : 0.10093069076538086 total time spend for step 7 : 21.647125720977783 step8:velours_tree Tue May 20 21:15:20 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 1.5841524600982666 time spend to save output : 4.982948303222656e-05 total time spend for step 8 : 1.5842022895812988 step9:send_mail_cod 
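The VR notes in the step logs above describe the linear-execution fallback, where each step's outputs fill the inputs of the next step. A minimal sketch of that chaining, assuming steps are plain callables over lists of values (the names here are illustrative, not the actual datou code):

```python
def run_linear(steps, initial_inputs):
    """Linear exec fallback: each step's outputs become the next step's inputs."""
    values = initial_inputs
    for step in steps:
        # outputs of this step fill the inputs of the next one
        values = step(values)
    return values

# two toy steps chained linearly
double_scores = lambda xs: [2 * x for x in xs]
add_offset = lambda xs: [x + 1 for x in xs]
```

Under these toy steps, `run_linear([double_scores, add_offset], [1, 2])` yields `[3, 5]`; the real dependency-tree execution generalizes this to non-linear wiring of outputs to inputs.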
Tue May 20 21:15:21 2025
VR 17-11-17 : for now, only for linear exec dependency trees, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
complete output_args for input 0
complete output_args for input 1
Inconsistent number of inputs and outputs : a step that parallelizes and manages input errors by not emitting an output for that data can't be used in tree dependencies of inputs and outputs
complete output_args for input 2
Inconsistent number of inputs and outputs : a step that parallelizes and manages input errors by not emitting an output for that data can't be used in tree dependencies of inputs and outputs
complete output_args for input 3
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
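The warnings emitted by checkConsistencyNbInputNbOutput earlier in the run follow an asymmetric rule: fewer inputs used than defined is tolerated as possibly-optional inputs, fewer outputs used as possibly-unused outputs, while any excess triggers a WARNING. A rough sketch of that rule (function name and message wording are approximations, not the real implementation):

```python
def check_nb_io(step_id, name, n_in_used, n_out_used, n_in_def, n_out_def):
    """Compare used input/output counts against the step definition."""
    msgs = []
    if n_in_used < n_in_def:
        # fewer inputs than defined: may be optional inputs, not fatal
        msgs.append(f"Step {step_id} {name} has fewer inputs used ({n_in_used}) than "
                    f"in the step definition ({n_in_def}) : maybe optional inputs")
    elif n_in_used > n_in_def:
        msgs.append(f"WARNING : number of inputs for step {step_id} {name} is not "
                    f"consistent : {n_in_used} used against {n_in_def} in the step definition")
    if n_out_used < n_out_def:
        # fewer outputs than defined: some outputs may simply be unused
        msgs.append(f"Step {step_id} {name} has fewer outputs used ({n_out_used}) than "
                    f"in the step definition ({n_out_def}) : some outputs may be unused")
    elif n_out_used > n_out_def:
        msgs.append(f"WARNING : number of outputs for step {step_id} {name} is not "
                    f"consistent : {n_out_used} used against {n_out_def} in the step definition")
    return msgs
```

For example, step 7933 above (2 inputs and 2 outputs used against 1/1 defined) would produce two WARNING lines, while step 8092 (1 input used against 2 defined) would produce the softer "maybe optional inputs" note.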
VR 22-3-18 : for now we do not clean the datou structure correctly
in the send_mail_cod step
work_area: /home/admin/workarea/git/Velours/python
in order to get the selector url, please enter the license of selector
results_Auto_P23180311_20-05-2025_21_15_21.pdf
23181403 change filename to text . [... repeated per photo ...] imagette231814031747768521
23181404 change filename to text . [... repeated per photo ...] imagette231814041747768523
23181406 imagette231814061747768524
23181407 change filename to text . [... repeated per photo ...] imagette231814071747768524
23181408 change filename to text . [... repeated per photo ...] imagette231814081747768526
23181409 change filename to text . imagette231814091747768526
23181410 change filename to text . [... repeated per photo ...] imagette231814101747768526
23181411 imagette231814111747768528
23181412 imagette231814121747768528
23181413 change filename to text . [... repeated per photo ...] imagette231814131747768528
SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=23180311 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id;
velour_link : https://www.fotonower.com/velours/23181403,23181404,23181405,23181406,23181407,23181408,23181409,23181410,23181411,23181412,23181413?tags=papier,pet_fonce,environnement,mal_croppe,pet_clair,metal,pehd,carton,background,flou,autre
args[1359829146] : ((1359829146, -5.681171819834477, 492609224), (1359829146, 0.8244116572432387, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829133] : ((1359829133, -5.775637919782563, 492609224), (1359829133, 0.7830930864046324, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829127] : ((1359829127, -5.902173284317751, 492609224), (1359829127, 0.7715347249419641, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829122] : ((1359829122, -5.755227704330263, 492609224), (1359829122, 0.8470549046045038, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829092] : ((1359829092, -5.7961720836510215, 492609224), (1359829092, 0.609033517490448, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829087] : ((1359829087, -5.857180255534818, 492609224), (1359829087, 0.7431795389520265, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829085] : ((1359829085, -5.720045014715731, 492609224), (1359829085, 0.791865976502204, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829083] : ((1359829083, -5.891315567462301, 492609224), (1359829083, 0.7528174570561502, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829069] : ((1359829069, -5.841597054309442, 492609224), (1359829069, 0.7807221976228149, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829062] : ((1359829062, -5.762358309287807, 492609224), (1359829062, 0.7172407260120679, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829058] : ((1359829058, -5.632424055985447, 492609224), (1359829058, 0.8665736293345655, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829054] : ((1359829054, -5.7192397105420225, 492609224), (1359829054, 0.822086005500699, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829049] : ((1359829049, -5.738726198304633, 492609224), (1359829049, 0.6964993650930784, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829046] : ((1359829046, -5.871192776227551, 492609224), (1359829046, 0.7688183094766181, 2107752395), '0.12038170331790125')
We are sending mail with results to report@fotonower.com
args[1359829018] : ((1359829018, -5.706058865122633, 492609224), (1359829018, 0.7545339786241432, 2107752395), '0.12038170331790125')
We are sending mail with results to
report@fotonower.com args[1359829014] : ((1359829014, -5.8083448881130515, 492609224), (1359829014, 0.8555810821943493, 2107752395), '0.12038170331790125') We are sending mail with results at report@fotonower.com args[1359829011] : ((1359829011, -5.728650128918898, 492609224), (1359829011, 0.839638431243875, 2107752395), '0.12038170331790125') We are sending mail with results at report@fotonower.com args[1359829006] : ((1359829006, -5.696535119048737, 492609224), (1359829006, 0.8075063152742784, 2107752395), '0.12038170331790125') We are sending mail with results at report@fotonower.com args[1359828975] : ((1359828975, -5.74503223560426, 492609224), (1359828975, 0.9037234854254488, 2107752395), '0.12038170331790125') We are sending mail with results at report@fotonower.com args[1359828933] : ((1359828933, -5.695248171830614, 492609224), (1359828933, 0.7761998247812497, 2107752395), '0.12038170331790125') We are sending mail with results at report@fotonower.com refus_total : 0.12038170331790125 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=23180311 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23180311_20-05-2025_21_15_21.pdf results_Auto_P23180311_20-05-2025_21_15_21.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23180311_20-05-2025_21_15_21.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values 
('3318','23180311','results_Auto_P23180311_20-05-2025_21_15_21.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23180311_20-05-2025_21_15_21.pdf','pdf','','1.32','0.12038170331790125')
message_in_mail: Hello,
Please find below the results of the carac on demand service for the portfolio: https://www.fotonower.com/view/23180311

https://www.fotonower.com/image?json=false&list_photos_id=1359829146
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829133
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829127
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829122
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829092
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829087
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829085
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829083
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829069
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829062
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829058
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829054
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829049
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829046
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829018
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829014
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829011
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359829006
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359828975
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1359828933
Well done, the photo is well taken.

Under these conditions, the rejection rate is: 12.04%
Please find the photos of the contaminants.

contaminant examples: papier: https://www.fotonower.com/view/23181403?limit=200
contaminant examples: pet_fonce: https://www.fotonower.com/view/23181404?limit=200
contaminant examples: pet_clair: https://www.fotonower.com/view/23181407?limit=200
contaminant examples: metal: https://www.fotonower.com/view/23181408?limit=200
contaminant examples: pehd: https://www.fotonower.com/view/23181409?limit=200
contaminant examples: carton: https://www.fotonower.com/view/23181410?limit=200
contaminant examples: autre: https://www.fotonower.com/view/23181413?limit=200
Please find the PDF report: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23180311_20-05-2025_21_15_21.pdf

Link to velours: https://www.fotonower.com/velours/23181403,23181404,23181405,23181406,23181407,23181408,23181409,23181410,23181411,23181412,23181413?tags=papier,pet_fonce,environnement,mal_croppe,pet_clair,metal,pehd,carton,background,flou,autre


The Fotonower team
202 b''
Server: nginx
Date: Tue, 20 May 2025 19:15:34 GMT
Content-Length: 0
Connection: close
X-Message-Id: MtNKipGqTlqHeKB7BtHG8g
Access-Control-Allow-Origin: https://sendgrid.api-docs.io
Access-Control-Allow-Methods: POST
Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl
Access-Control-Max-Age: 600
X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html
Strict-Transport-Security: max-age=31536000; includeSubDomains
Content-Security-Policy: frame-ancestors 'none'
Cache-Control: no-cache
X-Content-Type-Options: no-sniff
Referrer-Policy: strict-origin-when-cross-origin
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : send_mail_cod ; we use saveGeneral
[1359829146, 1359829133, 1359829127, 1359829122, 1359829092, 1359829087, 1359829085, 1359829083, 1359829069, 1359829062, 1359829058, 1359829054, 1359829049, 1359829046, 1359829018, 1359829014, 1359829011, 1359829006, 1359828975, 1359828933]
Looping around the photos to save general results
len do output : 0 before output type Used above
Managing all output in save final without adding information in the mtr_datou_result
('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829146', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829133', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829127', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829122', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829092', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829087', None,
None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829085', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829083', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829069', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829062', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829058', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829054', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829049', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829046', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829018', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829014', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829011', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359829006', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359828975', None, None, None, None, None, '2913534') ('3318', None, None, None, None, None, None, None, '2913534') ('3318', '23180311', '1359828933', None, None, None, None, None, '2913534') begin to insert list_values into 
mtr_datou_result : length of list_values in save_final : 20
time used for this insertion : 0.01629924774169922
save_final save missing photos in datou_result :
time spent for datou_step_exec : 12.727157831192017
time spent to save output : 0.016578197479248047
total time spent for step 9 : 12.743736028671265
step10:split_time_score Tue May 20 21:15:34 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information; it could maybe be interpreted correctly with a default behavior
Some of the work done at step execution could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR, but same_nb_input_output==True : this should be an optional input !
complete output_args for input 1
VR 22-3-18 : For now we do not clean correctly the datou structure
begin split time score
Caught exception ! Connect or reconnect !
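The VR notes above describe how, along the dependencies tree, the outputs of a finished step fill the inputs of the next step, and how a missing input with same_nb_input_output==True is treated as optional rather than a fatal error. A minimal sketch of that plumbing (the function name and shapes are assumptions, not the actual datou_prepare_output_input code):

```python
def prepare_output_input(prev_outputs, nb_inputs_expected):
    """Map the outputs of a finished step onto the declared inputs of
    the next step.  Extra outputs are ignored; missing trailing inputs
    are filled with None and treated as optional, mirroring the log's
    'this should be an optional input !' branch."""
    inputs = list(prev_outputs[:nb_inputs_expected])
    while len(inputs) < nb_inputs_expected:
        inputs.append(None)  # optional input, filled with a default
    return inputs
```

A real executor would also have to honor the dependency edges when several parent steps feed one child; this sketch only covers the linear case the first VR note mentions.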
TODO : Insert select and so on
Begin split_port_in_batch_balle
thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}]
thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('16', 20),)
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {}
20052025 23180311 Number of photos uploaded : 20 / 23040 (0%)
20052025 23180311 Number of photos tagged (waste types) : 0 / 20 (0%)
20052025 23180311 Number of photos tagged (volume) : 0 / 20 (0%)
elapsed_time : load_data_split_time_score 2.1457672119140625e-06
elapsed_time : order_list_meta_photo_and_scores 5.0067901611328125e-06
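The dashboard entries above report a count against a total with an integer percentage (20 / 23040 shows as 0%), which suggests the percentage is truncated toward zero. A sketch of that formatting (the helper name is hypothetical; the truncation behavior is an assumption inferred from the log):

```python
def dashboard_line(label, done, total):
    """Format a dashboard progress entry as 'label : done / total (p%)'.
    The percentage is truncated to an integer, which is why a small
    count over a large total is displayed as 0%."""
    pct = int(100 * done / total) if total else 0
    return "%s : %d / %d (%d%%)" % (label, done, total, pct)
```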
elapsed_time : fill_and_build_computed_from_old_data 0.0007033348083496094
elapsed_time : insert_dashboard_record_day_entry 0.029105186462402344
We will return after consolidate, but for now we need the day; how to get it currently depends on the previous heavy steps
Qualite : 0.09201400421027917
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23157181_20-05-2025_08_46_06.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23157181 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the case tile - detect - glue
We could keep the dependence order, but it is better to keep an order compatible with the ids of the steps when there are no sons, so a lexical order : (number_son, step_id)
All sons are already in current list ! (x9)
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
TODO
Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
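The checkConsistencyNbInputNbOutput messages above distinguish a hard inconsistency (more inputs wired than declared) from a tolerated one (fewer wired, treated as possibly optional inputs). A sketch of that rule for the input side (a simplified reconstruction, not the actual Velours code; the function name is hypothetical):

```python
def check_nb_inputs(step_id, step_name, nb_used, nb_defined):
    """Compare the number of inputs actually wired for a step against
    its definition.  More than declared yields a WARNING; fewer is
    reported as possibly-optional inputs; equal returns None."""
    if nb_used > nb_defined:
        return ("WARNING : number of inputs for step %d %s is not consistent : "
                "%d used against %d in the step definition !"
                % (step_id, step_name, nb_used, nb_defined))
    if nb_used < nb_defined:
        return ("Step %d %s has fewer inputs used (%d) than in the step "
                "definition (%d) : maybe we manage optional inputs !"
                % (step_id, step_name, nb_used, nb_defined))
    return None
```

The output side would be symmetric, with "some outputs may not be used" for the fewer-than-declared case.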
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23157181 AND mptpi.`type`=3594
To do
Qualite : 0.10180426772060878
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23157188_20-05-2025_08_35_35.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23157188 order by id desc limit 1
(datou graph check and consistency warnings repeated verbatim, identical to the run above)
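The reordering rule quoted in the graph checks — keep an order compatible with the step ids when steps have no sons, i.e. a lexical order on (number_son, step_id) — can be sketched as a plain sort (the tuple representation is an assumption for illustration):

```python
def reorder_steps(steps):
    """steps: list of (step_id, number_of_sons) pairs.  Sorting on the
    lexical key (number_of_sons, step_id) puts sonless steps first,
    ordered by their ids, as the log describes."""
    return sorted(steps, key=lambda s: (s[1], s[0]))
```

Usage: among the sonless steps the step-id order is preserved, while steps with more sons sort later regardless of id.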
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23157188 AND mptpi.`type`=3594
To do
Qualite : 0.09871742380401231
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23157192_20-05-2025_08_28_29.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23157192 order by id desc limit 1
(datou graph check and consistency warnings repeated verbatim, identical to the run above)
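The "DONE and to test : checkNoCycle !" line in the graph checks refers to validating that the datou step graph is acyclic before the steps are reordered. A depth-first-search sketch of such a check (a generic reconstruction, not the Velours implementation):

```python
def check_no_cycle(sons):
    """sons maps a step id to the list of its son step ids.  Returns
    True when the graph has no cycle.  A son found while it is still
    on the DFS stack is a back edge, i.e. a cycle."""
    visited, on_stack = set(), set()

    def visit(step):
        visited.add(step)
        on_stack.add(step)
        for son in sons.get(step, []):
            if son in on_stack:
                return False  # back edge: cycle detected
            if son not in visited and not visit(son):
                return False
        on_stack.discard(step)
        return True

    return all(visit(s) for s in list(sons) if s not in visited)
```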
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23157192 AND mptpi.`type`=3594
To do
Qualite : 0.10408518822159352
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23161557_20-05-2025_10_30_41.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23161557 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case. Rather than keeping that dependence, it is better to keep an order compatible with the step ids when a step has no sons, i.e. a lexical order : (number_son, step_id)
All sons are already in the current list ! (printed 9 times)
DONE and to test : checkNoCycle !
Here we check that the number of inputs/outputs is consistent between the given ones and the DB !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
TODO
Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
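The reordering the log describes (a dependency-compatible order with a lexical `(number_son, step_id)` tie-break, followed by the `checkNoCycle` pass) can be sketched as a topological sort. This is a hypothetical reconstruction: the `Step` structure and field names below are assumptions, not the actual datou schema.

```python
# Hypothetical sketch of the step re-ordering: topological order with a
# deterministic (number_of_sons, step_id) tie-break, as the log describes.
from dataclasses import dataclass, field

@dataclass
class Step:
    step_id: int
    sons: list = field(default_factory=list)  # step_ids that depend on this step

def reorder_steps(steps):
    """Return step_ids in dependency order; ties broken by (len(sons), step_id)."""
    by_id = {s.step_id: s for s in steps}
    # count unresolved parents for each step
    n_parents = {s.step_id: 0 for s in steps}
    for s in steps:
        for son in s.sons:
            n_parents[son] += 1
    ready = sorted((s for s in steps if n_parents[s.step_id] == 0),
                   key=lambda s: (len(s.sons), s.step_id))
    ordered = []
    while ready:
        cur = ready.pop(0)
        ordered.append(cur.step_id)
        for son_id in cur.sons:
            n_parents[son_id] -= 1
            if n_parents[son_id] == 0:
                ready.append(by_id[son_id])
        ready.sort(key=lambda s: (len(s.sons), s.step_id))
    if len(ordered) != len(steps):
        raise ValueError("cycle detected")  # what checkNoCycle guards against
    return ordered
```

Under this reading, two steps with no mutual dependence still get a stable order, which is exactly the "lexical order" fix the log mentions for the tile - detect - glue case.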
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23161557 AND mptpi.`type`=3594
To do
Qualite : 0.1308608699845679
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23164737_20-05-2025_11_54_17.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23164737 order by id desc limit 1
[datou graph check and consistency warnings identical to the first run above — omitted]
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23164737 AND mptpi.`type`=3594
To do
Qualite : 0.1064864810896508
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23164744_20-05-2025_12_03_09.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23164744 order by id desc limit 1
[datou graph check and consistency warnings identical to the first run above — omitted]
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23164744 AND mptpi.`type`=3594
To do
find url:
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23170102 order by id desc limit 1
Qualite : 0.11795572916666669
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23170104_20-05-2025_15_34_40.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23170104 order by id desc limit 1
[datou graph check and consistency warnings identical to the first run above — omitted]
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23170104 AND mptpi.`type`=3594
To do
Qualite : 0.02726871178050528
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23166958_20-05-2025_13_30_06.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23166958 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case. Rather than keeping that dependence, it is better to keep an order compatible with the step ids when a step has no sons, i.e. a lexical order : (number_son, step_id)
All sons are already in the current list ! (printed 10 times)
DONE and to test : checkNoCycle !
Here we check that the number of inputs/outputs is consistent between the given ones and the DB !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 11452 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Step 11452 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used !
Step 11453 merge_mask_thcl_custom has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition !
WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 11478 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 11455 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 11455 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 11458 send_mail_cod has fewer inputs used (3) than in the step definition (5) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 11449 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11452 doesn't seem to be defined in the database
WARNING : output 1 of step 11449 has datatype=2 whereas input 1 of step 11453 has datatype=7
WARNING : type of output 2 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11454 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 3 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of output 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of input 3 of step 11455 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 11456 has datatype=10 whereas input 3 of step 11458 has datatype=6
WARNING : type of input 5 of step 11458 doesn't seem to be defined in the database
WARNING : output 0 of step 11477 has datatype=11 whereas input 5 of step 11458 has datatype=None
WARNING : output 0 of step 11456 has datatype=10 whereas input 0 of step 11477 has datatype=18
WARNING : type of input 2 of step 11478 doesn't seem to be defined in the database
WARNING : output 1 of step 11454 has datatype=7 whereas input 2 of step 11478 has datatype=None
WARNING : type of output 3 of step 11478 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11456 doesn't seem to be defined in the database
WARNING : output 0 of step 11453 has datatype=1 whereas input 0 of step 11454 has datatype=2
DataTypes for each output/input checked !
TODO
Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
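The two consistency passes reported in these runs, `checkConsistencyNbInputNbOutput` and `checkConsistencyTypeOutputInput`, boil down to comparing used counts and datatypes against the step definition. A minimal sketch follows; the function signatures are assumptions (the real implementations live in the datou codebase and are not shown in this log), and the warning wording mirrors the log:

```python
# Hypothetical sketch of the two checks this log reports
# (checkConsistencyNbInputNbOutput / checkConsistencyTypeOutputInput).
# Signatures are assumptions; the warning wording mirrors the log.

def check_nb_io(step_id, name, used, defined, kind):
    """Compare the number of used inputs/outputs against the step definition.
    kind is 'inputs' or 'outputs'; returns a list of warning strings."""
    if used > defined:
        return [f"WARNING : number of {kind} for step {step_id} {name} is not "
                f"consistent : {used} used against {defined} in the step definition !"]
    if used < defined:
        hint = ("maybe we manage optional inputs" if kind == "inputs"
                else "some outputs may not be used")
        return [f"Step {step_id} {name} has fewer {kind} used ({used}) than in "
                f"the step definition ({defined}) : {hint} !"]
    return []

def check_type_io(src_step, out_idx, out_type, dst_step, in_idx, in_type):
    """Compare the datatype of a connected output/input pair; None means the
    type is not defined in the database."""
    msgs = []
    if out_type is None:
        msgs.append(f"WARNING : type of output {out_idx} of step {src_step} "
                    f"doesn't seem to be defined in the database")
    if in_type is None:
        msgs.append(f"WARNING : type of input {in_idx} of step {dst_step} "
                    f"doesn't seem to be defined in the database")
    if out_type != in_type:
        msgs.append(f"WARNING : output {out_idx} of step {src_step} has "
                    f"datatype={out_type} whereas input {in_idx} of step "
                    f"{dst_step} has datatype={in_type}")
    return msgs
```

Under this reading, a warning with `datatype=None` on one side (as for step 11458 above) is an undefined type in the DB rather than a genuine mismatch between two defined types.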
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23166958 AND mptpi.`type`=3726
To do
Qualite : 0.09543954354745371
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23166961_20-05-2025_13_20_30.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23166961 order by id desc limit 1
[datou graph check and consistency warnings identical to the first run above — omitted]
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23166961 AND mptpi.`type`=3594
To do
find url:
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23170110 order by id desc limit 1
Qualite : 0.10659472484016758
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23170112_20-05-2025_15_29_13.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23170112 order by id desc limit 1
[datou graph check and consistency warnings identical to the first run above — omitted]
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23170112 AND mptpi.`type`=3594 To do Qualite : 0.11812746691001888 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23171910_20-05-2025_17_03_19.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23171910 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! 
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23171910 AND mptpi.`type`=3594
To do
Quality : 0.19817297078617974
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23172899_20-05-2025_16_57_08.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23172899 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23172899 AND mptpi.`type`=3594
To do
Quality : 0.20049805783529734
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23172907_20-05-2025_16_49_21.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23172907 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23172907 AND mptpi.`type`=3594
To do
Quality : 0.12038170331790125
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23180311_20-05-2025_21_15_21.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23180311 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23180311 AND mptpi.`type`=3594
To do
Quality : 0.08694197959533603
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23180313_20-05-2025_20_55_56.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23180313 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23180313 AND mptpi.`type`=3594
To do
Quality : 0.13942144974046014
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23176165_20-05-2025_20_19_35.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23176165 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23176165 AND mptpi.`type`=3594
To do
Quality : 0.14197543925218625
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23176167_20-05-2025_18_56_37.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23176167 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23176167 AND mptpi.`type`=3594
To do
find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23179278 order by id desc limit 1
Quality : 0.09473824295343136
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23179281_20-05-2025_20_30_26.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23179281 order by id desc limit 1
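The mtr_port_to_port_ids lookup above is built with literal values embedded in the SQL; the same query can be run with driver-side parameters instead. A minimal sketch, assuming a DB-API 2.0 connection such as the one MySQLdb provides (the log shows `import MySQLdb succeeded`); `fetch_port_links` is a hypothetical helper name, and the schema/table/column names come from the logged query.

```python
# Hedged sketch: the logged mtr_port_to_port_ids join, parameterized.
# The connection object is expected to follow the DB-API 2.0 shape
# (cursor / execute / fetchall) that MySQLdb connections provide.
QUERY = """
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2,
       mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id,
       mptpi.created_at, mptpi.updated_at,
       mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id = mptpi.hashtag_id
  AND mptpi.`mtr_portfolio_id_1` = %s
  AND mptpi.`type` = %s
"""

def fetch_port_links(conn, portfolio_id, link_type=3594):
    """Fetch port-to-port links for one portfolio; values are escaped by the driver."""
    cur = conn.cursor()
    cur.execute(QUERY, (portfolio_id, link_type))
    return cur.fetchall()
```

With MySQLdb this would be called as `fetch_port_links(MySQLdb.connect(...), 23176167)`; parameterization avoids manual quoting of the portfolio id and type.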
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23179281 AND mptpi.`type`=3594
To do
find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23179282 order by id desc limit 1
Quality : 0.07927160493827161
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23179284_20-05-2025_20_16_37.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23179284 order by id desc limit 1
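The `find url` step above uses the classic "latest row per key" pattern: `ORDER BY id DESC LIMIT 1` on dashboard_results. A minimal sketch of wrapping it, under stated assumptions: the connection is DB-API 2.0 shaped, and the PDF url is assumed to be a key inside completion_json (the log shows only the query and the resolved url, not the JSON layout, so `url_key` is hypothetical).

```python
# Hedged sketch of the logged "find url" lookup: fetch the most recent
# dashboard_results row for a portfolio and pull a url out of its JSON payload.
import json

LATEST_RESULT_SQL = (
    "SELECT completion_json, dashboard_run_id "
    "FROM MTRPhoto.dashboard_results "
    "WHERE mtr_portfolio_id = %s "
    "ORDER BY id DESC LIMIT 1"
)

def find_url(conn, portfolio_id, url_key="url"):
    """Return the url stored in the latest completion_json, or None if absent."""
    cur = conn.cursor()
    cur.execute(LATEST_RESULT_SQL, (portfolio_id,))
    row = cur.fetchone()
    if row is None:
        return None  # no dashboard_results row yet for this portfolio
    completion = json.loads(row[0]) if row[0] else {}
    return completion.get(url_key)
```

`ORDER BY id DESC LIMIT 1` relies on the auto-increment id being monotonic with insertion time, which matches how the log resolves a single pdf per portfolio.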
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23179284 AND mptpi.`type`=3594
To do
Quality : 0.12132243269124779
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P23180321_20-05-2025_20_49_49.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 23180321 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=23180321 AND mptpi.`type`=3594
To do
NUMBER BATCH : 0
# DISPLAY ALL COLLECTED DATA : {'20052025': {'nb_upload': 20, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}}
Inside saveOutput : final : True verbose : 0
saveOutput not yet implemented for datou_step.type : split_time_score, we use saveGeneral
[1359829146, 1359829133, 1359829127, 1359829122, 1359829092, 1359829087, 1359829085, 1359829083, 1359829069, 1359829062, 1359829058, 1359829054, 1359829049, 1359829046, 1359829018, 1359829014, 1359829011, 1359829006, 1359828975, 1359828933]
Looping around the photos to save general results
len of output : 1
/23180311 Didn't retrieve data.
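The saveGeneral loop above ("Looping around the photos to save general results") can be sketched as building one row per photo, with a placeholder row when no data was retrieved ("Didn't retrieve data"). The 9-slot row layout mimics the tuples printed in the log; the function name, its arguments, and the exact semantics of each slot are assumptions, and this is a simplification (the log actually prints two tuples per photo).

```python
# Hedged sketch of building list_values for mtr_datou_result in the saveGeneral
# fallback. Row shape mimics the logged 9-tuples: (datou_id, portfolio_id,
# photo_id, five unused slots, treatment_id); all names are hypothetical.
def build_list_values(datou_id, portfolio_id, photo_ids, results, treatment_id):
    list_values = []
    for photo_id in photo_ids:
        if photo_id in results:
            # data retrieved for this photo: full row
            list_values.append((datou_id, portfolio_id, str(photo_id),
                                None, None, None, None, None, treatment_id))
        else:
            # mirrors "Didn't retrieve data": keep a placeholder row so the
            # photo is still counted as treated
            list_values.append((datou_id, None, None,
                                None, None, None, None, None, treatment_id))
    return list_values
```

This keeps `len(input) + len(total_photo_id_missing)` accounting consistent with the log: every photo id contributes a row whether or not its data came back.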
before output type
Here is an output not treated by saveGeneral : managing all output in save_final without adding information in the mtr_datou_result
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829146', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829133', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829127', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829122', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829092', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829087', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829085', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829083', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829069', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829062', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829058', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829054', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829049', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829046', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829018', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829014', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829011', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359829006', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359828975', None, None, None, None, None, '2913534')
('3318', None, None, None, None, None, None, None, '2913534')
('3318', '23180311', '1359828933', None, None, None, None, None, '2913534')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 21
time used for this insertion : 0.018561124801635742
save_final : save missing photos in datou_result
time spent for datou_step_exec : 4.980631113052368
time spent to save output : 0.018909454345703125
total time spent for step 10 : 4.999540567398071
caffe_path_current :
About to save ! 2
After save, about to update current ! ret : 2
len(input) + len(total_photo_id_missing) : 20
set_done_treatment
358.44user 370.02system 15:13.35elapsed 79%CPU (0avgtext+0avgdata 5681192maxresident)k 16875112inputs+294728outputs (575556major+36255545minor)pagefaults 0swaps
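The final insertion logged above ("begin to insert list_values into mtr_datou_result", 21 rows in about 0.019 s) is consistent with a single batched insert. A minimal sketch, assuming a DB-API 2.0 connection (as MySQLdb provides) and hypothetical column names; only the table name and the 9-value row shape come from the log.

```python
# Hedged sketch of the batched mtr_datou_result insert; column names c4..c8
# are placeholders for the five slots logged as None in every tuple.
import time

INSERT_SQL = (
    "INSERT INTO MTRPhoto.mtr_datou_result "
    "(datou_id, mtr_portfolio_id, photo_id, c4, c5, c6, c7, c8, treatment_id) "
    "VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)"
)

def insert_list_values(conn, list_values):
    """Insert all rows in one executemany call and report the elapsed time."""
    start = time.time()
    cur = conn.cursor()
    cur.executemany(INSERT_SQL, list_values)  # one batched round trip
    conn.commit()
    print("time used for this insertion :", time.time() - start)
    return len(list_values)
```

Using `executemany` instead of one `execute` per row keeps the per-batch timing in the log plausible even for larger list_values.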