python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 3479611 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['3370219'] with mtr_portfolio_ids : ['25399044'] and first list_photo_ids : [] new path : /proc/3479611/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , BFBFBFBFBFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 13 ; length of list_pids : 13 ; length of list_args : 13 time to download the photos : 2.1628339290618896 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Mon Jul 28 12:50:29 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 3411 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-07-28 12:50:37.570873: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-07-28 12:50:37.607636: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493035000 Hz 2025-07-28 12:50:37.610604: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f6630000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-07-28 12:50:37.610666: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-07-28 12:50:37.615145: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-07-28 12:50:37.775606: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x3a1ee560 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-07-28 12:50:37.775670: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-07-28 12:50:37.777042: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-28 12:50:37.778773: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-28 12:50:37.816356: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-28 12:50:37.836858: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-28 12:50:37.841620: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-28 12:50:37.877383: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-28 12:50:37.881344: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-28 12:50:37.897100: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-28 12:50:37.898245: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-28 12:50:37.899522: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-28 12:50:37.900321: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-07-28 12:50:37.900337: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-07-28 12:50:37.900347: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-07-28 12:50:37.901680: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 3048 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-07-28 12:50:38.246245: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-28 12:50:38.246358: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-28 12:50:38.246379: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-28 12:50:38.246396: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-28 12:50:38.246412: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-28 12:50:38.246428: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-28 12:50:38.246444: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-28 12:50:38.246460: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-28 12:50:38.247384: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-28 12:50:38.248640: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-28 12:50:38.248720: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-28 12:50:38.248737: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-28 12:50:38.248752: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-28 12:50:38.248815: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-28 12:50:38.248840: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-28 12:50:38.248863: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-28 12:50:38.248880: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-28 12:50:38.250065: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-28 12:50:38.250131: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-07-28 12:50:38.250142: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-07-28 12:50:38.250151: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-07-28 12:50:38.251151: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 3048 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-07-28 12:50:53.671312: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-28 12:50:54.161218: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-28 12:50:57.151292: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.152231: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.23G (2394331392 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.152271: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.153401: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.153424: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.161340: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.161376: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.161979: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.161998: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.171502: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.171542: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 466.56MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.172357: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.172379: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 466.56MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.208808: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.208903: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.209550: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.209571: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.216473: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.216510: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 243.25MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.217333: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.217353: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 243.25MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-28 12:50:57.261052: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.261782: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.263796: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.264414: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.316800: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.317464: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.319884: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.320699: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.367207: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.368078: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.370797: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.371441: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.388981: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.389774: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.392404: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.393062: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.405033: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.405799: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.407705: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.408449: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.444809: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.445615: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.446455: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.447128: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.452399: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.453125: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.480619: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.481598: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.482427: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.483302: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.499748: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.500451: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.501100: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.501821: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.507249: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.507873: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.512716: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.513501: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.526092: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.526822: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.532114: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.532895: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.533597: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.534334: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.535563: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.536295: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.547480: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.548759: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.549782: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.550552: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.551150: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.551759: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.568291: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.583975: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.678832: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.678896: W tensorflow/core/kernels/gpu_utils.cc:49] Failed to allocate memory for convolution redzone checking; skipping this check. This is benign and only means that we won't check cudnn for out-of-bounds reads and writes. This message will only be printed once. 2025-07-28 12:50:57.679956: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.680976: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.703739: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.704811: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.705820: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.706608: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.714715: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.715436: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.775733: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.792108: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.793074: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.807045: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.820828: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.821610: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.822209: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.822802: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.839288: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.847372: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.848556: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.858605: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.859561: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.860353: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.861051: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.862074: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-28 12:50:57.862751: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.48G (2660368384 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 13 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 34 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 37 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 40 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 39 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 28 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 36 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 46 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 33 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 31 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 37 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 38 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 36 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 34 Detection mask done ! Trying to reset tf kernel 3480156 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 4791 tf kernel not reseted sub process len(results) : 13 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 13 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 5984 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.0009248256683349609 nb_pixel_total : 27329 time to create 1 rle with old method : 0.03440356254577637 length of segment : 197 time for calcul the mask position with numpy : 0.0026869773864746094 nb_pixel_total : 116300 time to create 1 rle with old method : 0.13677215576171875 length of segment : 699 time for calcul the mask position with numpy : 0.0005564689636230469 nb_pixel_total : 24824 time to create 1 rle with old method : 0.0301513671875 length of segment : 167 time for calcul the mask position with numpy : 0.0005478858947753906 nb_pixel_total : 23653 time to create 1 rle with old method : 0.028577566146850586 length of segment : 277 time for calcul the mask position with numpy : 0.0006594657897949219 nb_pixel_total : 29035 time to create 1 rle with old method : 0.0350034236907959 length of segment : 180 time for calcul the mask position with numpy : 0.00047659873962402344 nb_pixel_total : 22731 time to create 1 rle with old method : 0.027405261993408203 length of segment : 187 time for calcul the mask position with numpy : 0.0019478797912597656 nb_pixel_total : 93079 time to create 1 rle with old method : 0.11105585098266602 length of segment : 352 time for calcul the mask position with numpy : 0.0002522468566894531 nb_pixel_total : 8753 time to create 1 rle with old method : 0.010752201080322266 length of segment : 85 time for calcul the mask position with numpy : 0.0014009475708007812 nb_pixel_total : 73133 time to create 1 rle with old method : 0.08776307106018066 length of segment : 443 time for calcul the mask position with numpy : 0.0006115436553955078 nb_pixel_total : 28245 time to create 1 rle with old method : 0.033689260482788086 length of segment : 278 time for calcul the mask position with numpy : 0.00042748451232910156 nb_pixel_total : 24948 time to create 1 rle with old method : 0.030105113983154297 length of segment : 164 time for calcul the mask position with numpy : 0.001295328140258789 nb_pixel_total : 63152 time to create 1 rle with old method : 0.07629585266113281 length of segment : 321 time for calcul the mask position with numpy : 0.0005578994750976562 nb_pixel_total : 21517 time to create 1 rle with old method : 0.025675296783447266 length of segment : 226 time for calcul the mask position with numpy : 0.001222848892211914 nb_pixel_total : 67060 time to create 1 rle with old method : 0.07879495620727539 length of segment : 357 time for calcul the mask position with numpy : 0.0012843608856201172 nb_pixel_total : 36345 time to create 1 rle with old method : 0.04997873306274414 length of segment : 327 time for calcul the mask position with numpy : 0.0009717941284179688 nb_pixel_total : 58946 time to create 1 rle with old method : 0.07101583480834961 length of segment : 516 time for calcul the mask position with numpy : 0.00028824806213378906 nb_pixel_total : 14770 time to create 1 rle with old method : 0.02188849449157715 length of segment : 152 time for calcul the mask position with numpy : 0.0005164146423339844 nb_pixel_total : 14622 time to create 1 rle with old method : 0.017838478088378906 length of segment : 117 time for calcul the mask position with numpy : 0.012404918670654297 nb_pixel_total : 259837 time to create 1 rle with new method : 0.03179287910461426 length of segment : 837 time for calcul the mask position with numpy : 0.0022535324096679688 nb_pixel_total : 40956 time to create 1 rle with old method : 0.04851126670837402 length of segment : 404 time for calcul the mask position with numpy : 0.0004229545593261719 nb_pixel_total : 8475 time to create 1 rle with old method : 0.010799407958984375 length of segment : 111 time for calcul the mask position with numpy : 0.0022411346435546875 nb_pixel_total : 60277 time to create 1 rle with old method : 0.07169771194458008 length of segment : 303 time for calcul the mask position with numpy : 0.0037224292755126953 nb_pixel_total : 107624 time to create 1 rle with old method : 0.1302199363708496 length of segment : 399 time for calcul the mask position with numpy : 0.0003554821014404297 nb_pixel_total : 6661 time to create 1 rle with old method : 0.008055925369262695 length of segment : 135 time for calcul the mask position with numpy : 0.0007691383361816406 nb_pixel_total : 14808 time to create 1 rle with old method : 0.017527103424072266 length of segment : 169 time for calcul the mask position with numpy : 0.001119375228881836 nb_pixel_total : 24041 time to create 1 rle with old method : 0.02864384651184082 length of segment : 323 time for calcul the mask position with numpy : 0.00138092041015625 nb_pixel_total : 59258 time to create 1 rle with old method : 0.07103371620178223 length of segment : 395 time for calcul the mask position with numpy : 0.0008995532989501953 nb_pixel_total : 16322 time to create 1 rle with old method : 0.01934671401977539 length of segment : 293 time for calcul the mask position with numpy : 0.0011606216430664062 nb_pixel_total : 30987 time to create 1 rle with old method : 0.036806583404541016 length of segment : 169 time for calcul the mask position with numpy : 0.0003910064697265625 nb_pixel_total : 6007 time to create 1 rle with old method : 0.0072710514068603516 length of segment : 123 time for calcul the mask position with numpy : 0.0018596649169921875 nb_pixel_total : 30968 time to create 1 rle with old method : 0.03703594207763672 length of segment : 219 time for calcul the mask position with numpy : 0.0015342235565185547 nb_pixel_total : 33330 time to create 1 rle with old method : 0.03996467590332031 length of segment : 204 time for calcul the mask position with numpy : 0.002199888229370117 nb_pixel_total : 44801 time to create 1 rle with old method : 0.05341958999633789 length of segment : 233 time for calcul the mask position with numpy : 0.0017323493957519531 nb_pixel_total : 53767 time to create 1 rle with old method : 0.06351161003112793 length of segment : 288 time for calcul the mask position with numpy : 0.004323720932006836 nb_pixel_total : 112947 time to create 1 rle with old method : 0.1317136287689209 length of segment : 416 time for calcul the mask position with numpy : 0.0010135173797607422 nb_pixel_total : 20496 time to create 1 rle with old method : 0.029112815856933594 length of segment : 179 time for calcul the mask position with numpy : 0.0007009506225585938 nb_pixel_total : 12818 time to create 1 rle with old method : 0.020615339279174805 length of segment : 119 time for calcul the mask position with numpy : 0.0009768009185791016 nb_pixel_total : 20001 time to create 1 rle with old method : 0.024239301681518555 length of segment : 198 time for calcul the mask position with numpy : 0.0013225078582763672 nb_pixel_total : 31730 time to create 1 rle with old method : 0.03751492500305176 length of segment : 306 time for calcul the mask position with numpy : 0.0015361309051513672 nb_pixel_total : 28694 time to create 1 rle with old method : 0.03457903861999512 length of segment : 276 time for calcul the mask position with numpy : 0.0007810592651367188 nb_pixel_total : 11658 time to create 1 rle with old method : 0.014242410659790039 length of segment : 225 time for calcul the mask position with numpy : 0.0020647048950195312 nb_pixel_total : 55974 time to create 1 rle with old method : 0.07182478904724121 length of segment : 226 time for calcul the mask position with numpy : 0.0016024112701416016 nb_pixel_total : 25192 time to create 1 rle with old method : 0.030126333236694336 length of segment : 315 time for calcul the mask position with numpy : 0.002785205841064453 nb_pixel_total : 75577 time to create 1 rle with old method : 0.09034490585327148 length of segment : 318 time for calcul the mask position with numpy : 0.0018939971923828125 nb_pixel_total : 51308 time to create 1 rle with old method : 0.07647991180419922 length of segment : 429 time for calcul the mask position with numpy : 0.001165628433227539 nb_pixel_total : 26312 time to create 1 rle with old method : 0.03252077102661133 length of segment : 388 time for calcul the mask position with numpy : 0.0016434192657470703 nb_pixel_total : 28761 time to create 1 rle with old method : 0.0357973575592041 length of segment : 386 time for calcul the mask position with numpy : 0.0007989406585693359 nb_pixel_total : 17680 time to create 1 rle with old method : 0.02131366729736328 length of segment : 194 time for calcul the mask position with numpy : 0.0022094249725341797 nb_pixel_total : 51035 time to create 1 rle with old method : 0.07011723518371582 length of segment : 320 time for calcul the mask position with numpy : 0.002107858657836914 nb_pixel_total : 37713 time to create 1 rle with old method : 0.0500340461730957 length of segment : 351 time for calcul the mask position with numpy : 0.002778291702270508 nb_pixel_total : 74851 time to create 1 rle with old method : 0.09512472152709961 length of segment : 311 time for calcul the mask position with numpy : 0.0014722347259521484 nb_pixel_total : 21604 time to create 1 rle with old method : 0.029910802841186523 length of segment : 147 time for calcul the mask position with numpy : 0.0003867149353027344 nb_pixel_total : 7423 time to create 1 rle with old method : 0.009152412414550781 length of segment : 131 time for calcul the mask position with numpy : 0.0033538341522216797 nb_pixel_total : 81151 time to create 1 rle with old method : 0.10140442848205566 length of segment : 423 time for calcul the mask position with numpy : 0.003621339797973633 nb_pixel_total : 70911 time to create 1 rle with old method : 0.10192441940307617 length of segment : 345 time for calcul the mask position with numpy : 0.0010449886322021484 nb_pixel_total : 12684 time to create 1 rle with old method : 0.0242917537689209 length of segment : 142 time for calcul the mask position with numpy : 0.0020902156829833984 nb_pixel_total : 41488 time to create 1 rle with old method : 0.04926180839538574 length of segment : 347 time for calcul the mask position with numpy : 0.0015075206756591797 nb_pixel_total : 16415 time to create 1 rle with old method : 0.021938562393188477 length of segment : 304 time for calcul the mask position with numpy : 0.0008969306945800781 nb_pixel_total : 11533 time to create 1 rle with old method : 0.014221668243408203 length of segment : 131 time for calcul the mask position with numpy : 0.0006537437438964844 nb_pixel_total : 18847 time to create 1 rle with old method : 0.027298688888549805 length of segment : 124 time for calcul the mask position with numpy : 0.0014793872833251953 nb_pixel_total : 33861 time to create 1 rle with old method : 0.04012656211853027 length of segment : 206 time for calcul the mask position with numpy : 0.0015070438385009766 nb_pixel_total : 23619 time to create 1 rle with old method : 0.04002785682678223 length of segment : 196 time for calcul the mask position with numpy : 0.0015573501586914062 nb_pixel_total : 30779 time to create 1 rle with old method : 0.03657197952270508 length of segment : 267 time for calcul the mask position with numpy : 0.0038018226623535156 nb_pixel_total : 124084 time to create 1 rle with old method : 0.14881181716918945 length of segment : 605 time for calcul the mask position with numpy : 0.000997781753540039 nb_pixel_total : 22210 time to create 1 rle with old method : 0.026917457580566406 length of segment : 126 time for calcul the mask position with numpy : 0.0020723342895507812 nb_pixel_total : 42470 time to create 1 rle with old method : 0.051737070083618164 length of segment : 268 time for calcul the mask position with numpy : 0.0044515132904052734 nb_pixel_total : 130805 time to create 1 rle with old method : 0.15335297584533691 length of segment : 597 time for calcul the mask position with numpy : 0.0005395412445068359 nb_pixel_total : 6053 time to create 1 rle with old method : 0.0076427459716796875 length of segment : 156 time for calcul the mask position with numpy : 0.00057220458984375 nb_pixel_total : 14608 time to create 1 rle with old method : 0.017888545989990234 length of segment : 119 time for calcul the mask position with numpy : 0.001134634017944336 nb_pixel_total : 21390 time to create 1 rle with old method : 0.025859355926513672 length of segment : 292 time for calcul the mask position with numpy : 0.002675771713256836 nb_pixel_total : 133139 time to create 1 rle with old method : 0.15652942657470703 length of segment : 582 time for calcul the mask position with numpy : 0.0013971328735351562 nb_pixel_total : 23498 time to create 1 rle with old method : 0.027976036071777344 length of segment : 229 time for calcul the mask position with numpy : 0.0006477832794189453 nb_pixel_total : 23556 time to create 1 rle with old method : 0.028023958206176758 length of segment : 180 time for calcul the mask position with numpy : 0.0021224021911621094 nb_pixel_total : 43309 time to create 1 rle with old method : 0.0519864559173584 length of segment : 252 time for calcul the mask position with numpy : 0.002961397171020508 nb_pixel_total : 85608 time to create 1 rle with old method : 0.10355257987976074 length of segment : 306 time for calcul the mask position with numpy : 0.0022394657135009766 nb_pixel_total : 55780 time to create 1 rle with old method : 0.06615495681762695 length of segment : 494 time for calcul the mask position with numpy : 0.0014901161193847656 nb_pixel_total : 51159 time to create 1 rle with old method : 0.06178593635559082 length of segment : 304 time for calcul the mask position with numpy : 0.0006384849548339844 nb_pixel_total : 9016 time to create 1 rle with old method : 0.011392354965209961 length of segment : 303 time for calcul the mask position with numpy : 0.005071163177490234 nb_pixel_total : 109806 time to create 1 rle with old method : 0.13046550750732422 length of segment : 530 time for calcul the mask position with numpy : 0.0015499591827392578 nb_pixel_total : 40856 time to create 1 rle with old method : 0.048455238342285156 length of segment : 385 time for calcul the mask position with numpy : 0.0005931854248046875 nb_pixel_total : 13977 time to create 1 rle with old method : 0.016949176788330078 length of segment : 109 time for calcul the mask position with numpy : 0.0015475749969482422 nb_pixel_total : 42185 time to create 1 rle with old method : 0.051670074462890625 length of segment : 253 time for calcul the mask position with numpy : 0.001968860626220703 nb_pixel_total : 51334 time to create 1 rle with old method : 0.06099677085876465 length of segment : 343 time for calcul the mask position with numpy : 0.0009410381317138672 nb_pixel_total : 21936 time to create 1 rle with old method : 0.02630162239074707 length of segment : 219 time for calcul the mask position with numpy : 0.002392292022705078 nb_pixel_total : 82800 time to create 1 rle with old method : 0.0999598503112793 length of segment : 450 time for calcul the mask position with numpy : 0.0016734600067138672 nb_pixel_total : 29698 time to create 1 rle with old method : 0.03626418113708496 length of segment : 405 time for calcul the mask position with numpy : 0.0006759166717529297 nb_pixel_total : 18765 time to create 1 rle with old method : 0.023153305053710938 length of segment : 383 time for calcul the mask position with numpy : 0.0012912750244140625 nb_pixel_total : 26508 time to create 1 rle with old method : 0.03261446952819824 length of segment : 291 time for calcul the mask position with numpy : 0.003391265869140625 nb_pixel_total : 76963 time to create 1 rle with old method : 0.09244489669799805 length of segment : 620 time for calcul the mask position with numpy : 0.006371259689331055 nb_pixel_total : 84383 time to create 1 rle with old method : 0.10339832305908203 length of segment : 594 time for calcul the mask position with numpy : 0.0011143684387207031 nb_pixel_total : 20852 time to create 1 rle with old method : 0.026663541793823242 length of segment : 146 time for calcul the mask position with numpy : 0.007819414138793945 nb_pixel_total : 231340 time to create 1 rle with new method : 0.014589309692382812 length of segment : 730 time for calcul the mask position with numpy : 0.0013740062713623047 nb_pixel_total : 33411 time to create 1 rle with old method : 0.03954195976257324 length of segment : 265 time for calcul the mask position with numpy : 0.001882791519165039 nb_pixel_total : 43126 time to create 1 rle with old method : 0.051726341247558594 length of segment : 206 time for calcul the mask position with numpy : 0.0004260540008544922 nb_pixel_total : 20007 time to create 1 rle with old method : 0.023700714111328125 length of segment : 201 time for calcul the mask position with numpy : 0.0015604496002197266 nb_pixel_total : 65243 time to create 1 rle with old method : 0.07627344131469727 length of segment : 505 time for calcul the mask position with numpy : 0.0009441375732421875 nb_pixel_total : 28023 time to create 1 rle with old method : 0.033400774002075195 length of segment : 240 time for calcul the mask position with numpy : 0.0018773078918457031 nb_pixel_total : 35056 time to create 1 rle with old method : 0.04263901710510254 length of segment : 277 time for calcul the mask position with numpy : 0.001779794692993164 nb_pixel_total : 51612 time to create 1 rle with old method : 0.06135725975036621 length of segment : 204 time for calcul the mask position with numpy : 0.0017993450164794922 nb_pixel_total : 27965 time to create 1 rle with old method : 0.03301858901977539 length of segment : 351 time for calcul the mask position with numpy : 0.0006775856018066406 nb_pixel_total : 21076 time to create 1 rle with old method : 0.024199485778808594 length of segment : 408 time for calcul the mask position with numpy : 0.0007383823394775391 nb_pixel_total : 19003 time to create 1 rle with old method : 0.02179574966430664 length of segment : 129 time for calcul the mask position with numpy : 0.0016829967498779297 nb_pixel_total : 60483 time to create 1 rle with old method : 0.06939482688903809 length of segment : 339 time for calcul the mask position with numpy : 0.0016527175903320312 nb_pixel_total : 36714 time to create 1 rle with old method : 0.04257559776306152 length of segment : 306 time for calcul the mask position with numpy : 0.003787517547607422 nb_pixel_total : 129468 time to create 1 rle with old method : 0.14766240119934082 length of segment : 670 time for calcul the mask position with numpy : 0.005754947662353516 nb_pixel_total : 163470 time to create 1 rle with new method : 0.013964653015136719 length of segment : 639 time for calcul the mask position with numpy : 0.0006635189056396484 nb_pixel_total : 14677 time to create 1 rle with old method : 0.01773357391357422 length of segment : 170 time for calcul the mask position with numpy : 0.0004372596740722656 nb_pixel_total : 23292 time to create 1 rle with old method : 0.027823209762573242 length of segment : 225 time for calcul the mask position with numpy : 0.0009417533874511719 nb_pixel_total : 37106 time to create 1 rle with old method : 0.0441136360168457 length of segment : 227 time for calcul the mask position with numpy : 0.0007190704345703125 nb_pixel_total : 19388 time to create 1 rle with old method : 0.023106813430786133 length of segment : 176 time for calcul the mask position with numpy : 0.0019154548645019531 nb_pixel_total : 52292 time to create 1 rle with old method : 0.06209063529968262 length of segment : 404 time for calcul the mask position with numpy : 0.00042128562927246094 nb_pixel_total : 12853 time to create 1 rle with old method : 0.015618085861206055 length of segment : 263 time for calcul the mask position with numpy : 0.002050638198852539 nb_pixel_total : 64857 time to create 1 rle with old method : 0.07763457298278809 length of segment : 411 time for calcul the mask position with numpy : 0.0004336833953857422 nb_pixel_total : 8782 time to create 1 rle with old method : 0.010509729385375977 length of segment : 158 time for calcul the mask position with numpy : 0.002619504928588867 nb_pixel_total : 67917 time to create 1 rle with old method : 0.07990503311157227 length of segment : 304 time for calcul the mask position with numpy : 0.0026044845581054688 nb_pixel_total : 38904 time to create 1 rle with old method : 0.0463869571685791 length of segment : 240 time for calcul the mask position with numpy : 0.005540370941162109 nb_pixel_total : 150530 time to create 1 rle with new method : 0.00908207893371582 length of segment : 469 time for calcul the mask position with numpy : 0.004046916961669922 nb_pixel_total : 94426 time to create 1 rle with old method : 0.11089301109313965 length of segment : 529 time for calcul the mask position with numpy : 0.0011501312255859375 nb_pixel_total : 19165 time to create 1 rle with old method : 0.03248238563537598 length of segment : 343 time for calcul the mask position with numpy : 0.004057884216308594 nb_pixel_total : 86566 time to create 1 rle with old method : 0.12204599380493164 length of segment : 438 time for calcul the mask position with numpy : 0.0017452239990234375 nb_pixel_total : 35614 time to create 1 rle with old method : 0.04239678382873535 length of segment : 283 time for calcul the mask position with numpy : 0.006262063980102539 nb_pixel_total : 172934 time to create 1 rle with new method : 0.0102996826171875 length of segment : 476 time for calcul the mask position with numpy : 0.0012218952178955078 nb_pixel_total : 30838 time to create 1 rle with old method : 0.03714251518249512 length of segment : 187 time for calcul the mask position with numpy : 0.0015683174133300781 nb_pixel_total : 27633 time to create 1 rle with old method : 0.0340571403503418 length of segment : 165 time for calcul the mask position with numpy : 0.00421452522277832 nb_pixel_total : 57817 time to create 1 rle with old method : 0.06784796714782715 length of segment : 645 time for calcul the mask position with numpy : 0.0025680065155029297 nb_pixel_total : 72694 time to create 1 rle with old method : 0.08601021766662598 length of segment : 410 time for calcul the mask position with numpy : 0.01213526725769043 nb_pixel_total : 226303 time to create 1 rle with new method : 0.03023219108581543 length of segment : 702 time for calcul the mask position with numpy : 0.0011925697326660156 nb_pixel_total : 21539 time to create 1 rle with old method : 0.0258328914642334 length of segment : 316 time for calcul the mask position with numpy : 0.0004990100860595703 nb_pixel_total : 15023 time to create 1 rle with old method : 0.01767730712890625 length of segment : 200 time for calcul the mask position with numpy : 0.0006029605865478516 nb_pixel_total : 12192 time to create 1 rle with old method : 0.014809846878051758 length of segment : 131 time for calcul the mask position with numpy : 0.00039577484130859375 nb_pixel_total : 17712 time to create 1 rle with old method : 0.027488231658935547 length of segment : 201 time for calcul the mask position with numpy : 0.0007276535034179688 nb_pixel_total : 11921 time to create 1 rle with old method : 0.014772891998291016 length of segment : 275 time for calcul the mask position with numpy : 0.0011186599731445312 nb_pixel_total : 25030 time to create 1 rle with old method : 0.029885053634643555 length of segment : 229 time for calcul the mask position with numpy : 0.0003390312194824219 nb_pixel_total : 16822 time to create 1 rle with old method : 0.02029561996459961 length of segment : 134 time for calcul the mask position with numpy : 0.0012161731719970703 nb_pixel_total : 33046 time to create 1 rle with old method : 0.053998708724975586 length of segment : 226 time for calcul the mask position with numpy : 0.0008533000946044922 nb_pixel_total : 15242 time to create 1 rle with old method : 0.01904749870300293 length of segment : 146 time for calcul the mask position with numpy : 0.001150369644165039 nb_pixel_total : 13699 time to create 1 rle with old method : 0.023837566375732422 length of segment : 309 time for calcul the mask position with numpy : 0.0030040740966796875 nb_pixel_total : 106220 time to create 1 rle with old method : 0.12211728096008301 length of segment : 500 time for calcul the mask position with numpy : 0.00074005126953125 nb_pixel_total : 19945 time to create 1 rle with old method : 0.023848533630371094 length of segment : 204 time for calcul the mask position with numpy : 0.001316070556640625 nb_pixel_total : 34415 time to create 1 rle with old method : 0.04046773910522461 length of segment : 273 time for calcul the mask position with numpy : 0.0013272762298583984 nb_pixel_total : 35179 time to create 1 rle with old method : 0.041611433029174805 length of segment : 273 time for calcul the mask position with numpy : 0.0016391277313232422 nb_pixel_total : 45143 time to create 1 rle with old method : 0.05364489555358887 length of segment : 253 time for calcul the mask position with numpy : 0.0019969940185546875 nb_pixel_total : 45086 time to create 1 rle with old method : 0.052211761474609375 length of segment : 348 time for calcul the mask position with numpy : 0.0018024444580078125 nb_pixel_total : 46567 time to create 1 rle with old method : 0.05422472953796387 length of segment : 289 time for calcul the mask position with numpy : 0.002054929733276367 nb_pixel_total : 77347 time to create 1 rle with old method : 0.09061741828918457 length of segment : 289 time for calcul the mask position with numpy : 0.0009047985076904297 nb_pixel_total : 23930 time to create 1 rle with old method : 0.028511524200439453 length of segment : 266 time for calcul the mask position with numpy : 0.003222942352294922 nb_pixel_total : 77272 time to create 1 rle with old method : 0.09107708930969238 length of segment : 569 time for calcul the mask position with numpy : 0.004209280014038086 nb_pixel_total : 166585 time to create 1 rle with new method : 0.00694584846496582 length of segment : 386 time for calcul the mask position with numpy : 0.002853870391845703 nb_pixel_total : 54660 time to create 1 rle with old method : 0.06446433067321777 length of segment : 409 time for calcul the mask position with numpy : 0.0007505416870117188 nb_pixel_total : 15807 time to create 1 rle with old method : 0.021652936935424805 length of segment : 224 time for calcul the mask position with numpy : 0.002897500991821289 nb_pixel_total : 47955 time to create 1 rle with old method : 0.06824970245361328 length of segment : 230 time for calcul the mask position with numpy : 0.006406307220458984 nb_pixel_total : 104563 time to create 1 rle with old method : 0.12767243385314941 length of segment : 509 time for calcul the mask position with numpy : 0.0006556510925292969 nb_pixel_total : 15739 time to create 1 rle with old method : 0.018507719039916992 length of segment : 161 time for calcul the mask position with numpy : 0.0020101070404052734 nb_pixel_total : 54131 time to create 1 rle with old method : 0.06563735008239746 length of segment : 282 time for calcul the mask position with numpy : 0.003206014633178711 nb_pixel_total : 101077 time to create 1 rle with old method : 0.11783671379089355 length of segment : 429 time for calcul the mask position with numpy : 0.0006136894226074219 nb_pixel_total : 16191 time to create 1 rle with old method : 0.019725561141967773 length of segment : 208 time for calcul the mask position with numpy : 0.0007252693176269531 nb_pixel_total : 21221 time to create 1 rle with old method : 0.026679039001464844 length of segment : 144 time for calcul the mask position with numpy : 0.0013039112091064453 nb_pixel_total : 36092 time to create 1 rle with old method : 0.04260396957397461 length of segment : 287 time for calcul the mask position with numpy : 0.0008857250213623047 nb_pixel_total : 18800 time to create 1 rle with old method : 0.022151708602905273 length of segment : 255 time for calcul the mask position with numpy : 0.00093841552734375 nb_pixel_total : 26754 time to create 1 rle with old method : 0.03707265853881836 length of segment : 190 time for calcul the mask position with numpy : 0.0012125968933105469 nb_pixel_total : 42355 time to create 1 rle with old method : 0.0717775821685791 length of segment : 396 time for calcul the mask position with numpy : 0.0013012886047363281 nb_pixel_total : 37021 time to create 1 rle with old method : 0.043470144271850586 length of segment : 356 time for calcul the mask position with numpy : 0.0025436878204345703 nb_pixel_total : 78060 time to create 1 rle with old method : 0.09203481674194336 length of segment : 440 time for calcul the mask position with numpy : 0.0023717880249023438 nb_pixel_total : 80922 time to create 1 rle with old method : 0.09554290771484375 length of segment : 350 time for calcul the mask position with numpy : 0.0004425048828125 nb_pixel_total : 12351 time to create 1 rle with old method : 0.014592170715332031 length of segment : 114 time for calcul the mask position with numpy : 0.0013544559478759766 nb_pixel_total : 46612 time to create 1 rle with old method : 0.05520343780517578 length of segment : 219 time for calcul the mask position with numpy : 0.002329111099243164 nb_pixel_total : 65534 time to create 1 rle with old method : 0.0790560245513916 length of segment : 371 time for calcul the mask position with numpy : 0.001560211181640625 nb_pixel_total : 47881 time to create 1 rle with old method : 0.05515241622924805 length of segment : 234 time for calcul the mask position with numpy : 0.0005447864532470703 nb_pixel_total : 12236 time to create 1 rle with old method : 0.015072107315063477 length of segment : 186 time for calcul the mask position with numpy : 0.002882719039916992 nb_pixel_total : 71136 time to create 1 rle with old method : 0.0840444564819336 length of segment : 378 time for calcul the mask position with numpy : 0.005952119827270508 nb_pixel_total : 190008 time to create 1 rle with new method : 0.011737346649169922 length of segment : 472 time for calcul the mask position with numpy : 0.0006754398345947266 nb_pixel_total : 14381 time to create 1 rle with old method : 0.017742633819580078 length of segment : 163 time for calcul the mask position with numpy : 0.0018703937530517578 nb_pixel_total : 47124 time to create 1 rle with old method : 0.057257890701293945 length of segment : 213 time for calcul the mask position with numpy : 0.0011701583862304688 nb_pixel_total : 21556 time to create 1 rle with old method : 0.026081562042236328 length of segment : 283 time for calcul the mask position with numpy : 0.0010528564453125 nb_pixel_total : 27824 time to create 1 rle with old method : 0.033327341079711914 length of segment : 155 time for calcul the mask position with numpy : 0.0015201568603515625 nb_pixel_total : 34804 time to create 1 rle with old method : 0.04503321647644043 length of segment : 321 time for calcul the mask position with numpy : 0.005000114440917969 nb_pixel_total : 109348 time to create 1 rle with old method : 0.12526178359985352 length of segment : 447 time for calcul the mask position with numpy : 0.0005834102630615234 nb_pixel_total : 14452 time to create 1 rle with old method : 0.01747584342956543 length of segment : 101 time for calcul the mask position with numpy : 0.001976490020751953 nb_pixel_total : 54842 time to create 1 rle with old method : 0.06521749496459961 length of segment : 328 time for calcul the mask position with numpy : 0.0007083415985107422 nb_pixel_total : 14675 time to create 1 rle with old method : 0.01772928237915039 length of segment : 176 time for calcul the mask position with numpy : 0.0032434463500976562 nb_pixel_total : 73278 time to create 1 rle with old method : 0.08816337585449219 length of segment : 461 time for calcul the mask position with numpy : 0.0026721954345703125 nb_pixel_total : 60622 time to create 1 rle with old method : 0.07115364074707031 length of segment : 355 time for calcul the mask position with numpy : 0.0007865428924560547 nb_pixel_total : 17643 time to create 1 rle with old method : 0.021384716033935547 length of segment : 197 time for calcul the mask position with numpy : 0.00035691261291503906 nb_pixel_total : 13068 time to create 1 rle with old method : 0.015838146209716797 length of segment : 129 time for calcul the mask position with numpy : 0.0015969276428222656 nb_pixel_total : 44345 time to create 1 rle with old method : 0.052507877349853516 length of segment : 333 time for calcul the mask position with numpy : 0.001779317855834961 nb_pixel_total : 50068 time to create 1 rle with old method : 0.06119394302368164 length of segment : 365 time for calcul the mask position with numpy : 0.001619577407836914 nb_pixel_total : 38087 time to create 1 rle with old method : 0.04484081268310547 length of segment : 459 time for calcul the mask position with numpy : 0.0010066032409667969 nb_pixel_total : 23581 time to create 1 rle with old method : 0.030068397521972656 length of segment : 202 time for calcul the mask position with numpy : 0.0007214546203613281 nb_pixel_total : 18563 time to create 1 rle with old method : 0.02519822120666504 length of segment : 195 time for calcul the mask position with numpy : 0.0005528926849365234 nb_pixel_total : 10673 time to create 1 rle with old method : 0.01306915283203125 length of segment : 150 time for calcul the mask position with numpy : 0.0022504329681396484 nb_pixel_total : 77533 time to create 1 rle with old method : 0.09281539916992188 length of segment : 217 time for calcul the mask position with numpy : 0.0008611679077148438 nb_pixel_total : 12178 time to create 1 rle with old method : 0.014455080032348633 length of segment : 261 time for calcul the mask position with numpy : 0.0008604526519775391 nb_pixel_total : 24922 time to create 1 rle with old method : 0.029721498489379883 length of segment : 192 time for calcul the mask position with numpy : 0.0009500980377197266 nb_pixel_total : 23624 time to create 1 rle with old method : 0.028289794921875 length of segment : 282 time for calcul the mask position with numpy : 0.00148773193359375 nb_pixel_total : 39644 time to create 1 rle with old method : 0.04779362678527832 length of segment : 239 time for calcul the mask position with numpy : 0.0008227825164794922 nb_pixel_total : 18780 time to create 1 rle with old method : 0.022984981536865234 length of segment : 199 time for calcul the mask position with numpy : 0.0003628730773925781 nb_pixel_total : 7383 time to create 1 rle with old method : 0.009099721908569336 length of segment : 87 time for calcul the mask position with numpy : 0.0049037933349609375 nb_pixel_total : 38316 time to create 1 rle with old method : 0.04769587516784668 length of segment : 336 time for calcul the mask position with numpy : 0.0014882087707519531 nb_pixel_total : 32789 time to create 1 rle with old method : 0.0388641357421875 length of segment : 307 time for calcul the mask position with numpy : 0.0026040077209472656 nb_pixel_total : 65988 time to create 1 rle with old method : 0.07759881019592285 length of segment : 480 time for calcul the mask position with numpy : 0.0037050247192382812 nb_pixel_total : 101009 time to create 1 rle with old method : 0.11918139457702637 length of segment : 503 time for calcul the mask position with numpy : 0.0007140636444091797 nb_pixel_total : 18936 time to create 1 rle with old method : 0.02198171615600586 length of segment : 147 time for calcul the mask position with numpy : 0.0014870166778564453 nb_pixel_total : 33232 time to create 1 rle with old method : 0.0399632453918457 length of segment : 296 time for calcul the mask position with numpy : 0.0007297992706298828 nb_pixel_total : 18298 time to create 1 rle with old method : 0.022403955459594727 length of segment : 125 time for calcul the mask position with numpy : 0.0009055137634277344 nb_pixel_total : 22724 time to create 1 rle with old method : 0.027189254760742188 length of segment : 200 time for calcul the mask position with numpy : 0.001361846923828125 nb_pixel_total : 31054 time to create 1 rle with old method : 0.03701019287109375 length of segment : 189 time for calcul the mask position with numpy : 0.0006859302520751953 nb_pixel_total : 14361 time to create 1 rle with old method : 0.017410755157470703 length of segment : 150 time for calcul the mask position with numpy : 0.0009336471557617188 nb_pixel_total : 17119 time to create 1 rle with old method : 0.020612001419067383 length of segment : 199 time for calcul the mask position with numpy : 0.0014057159423828125 nb_pixel_total : 36637 time to create 1 rle with old method : 0.04302811622619629 length of segment : 367 time for calcul the mask position with numpy : 0.0022134780883789062 nb_pixel_total : 46795 time to create 1 rle with old method : 0.07031822204589844 length of segment : 251 time for calcul the mask position with numpy : 0.0013349056243896484 nb_pixel_total : 28832 time to create 1 rle with old method : 0.036978721618652344 length of segment : 164 time for calcul the mask position with numpy : 0.0006985664367675781 nb_pixel_total : 19790 time to create 1 rle with old method : 0.024106502532958984 length of segment : 159 time for calcul the mask position with numpy : 0.0007219314575195312 nb_pixel_total : 17829 time to create 1 rle with old method : 0.022089004516601562 length of segment : 158 time for calcul the mask position with numpy : 0.005584716796875 nb_pixel_total : 150678 time to create 1 rle with new method : 0.011944055557250977 length of segment : 430 time for calcul the mask position with numpy : 0.005532741546630859 nb_pixel_total : 147996 time to create 1 rle with old method : 0.1710827350616455 length of segment : 379 time for calcul the mask position with numpy : 0.0017781257629394531 nb_pixel_total : 48389 time to create 1 rle with old method : 0.05718255043029785 length of segment : 256 time for calcul the mask position with numpy : 0.0014238357543945312 nb_pixel_total : 38906 time to create 1 rle with old method : 0.04754495620727539 length of segment : 288 time for calcul the mask position with numpy : 0.0004127025604248047 nb_pixel_total : 16244 time to create 1 rle with old method : 0.019472122192382812 length of segment : 115 time for calcul the mask position with numpy : 0.003499746322631836 nb_pixel_total : 97235 time to create 1 rle with old method : 0.11342430114746094 length of segment : 309 time for calcul the mask position with numpy : 0.0012204647064208984 nb_pixel_total : 31388 time to create 1 rle with old method : 0.03652310371398926 length of segment : 567 time for calcul the mask position with numpy : 0.0018775463104248047 nb_pixel_total : 63955 time to create 1 rle with old method : 0.0776815414428711 length of segment : 801 time for calcul the mask position with numpy : 0.0012249946594238281 nb_pixel_total : 32176 time to create 1 rle with old method : 0.037819623947143555 length of segment : 250 time for calcul the mask position with numpy : 0.0024771690368652344 nb_pixel_total : 67402 time to create 1 rle with old method : 0.07924890518188477 length of segment : 267 time for calcul the mask position with numpy : 0.0008482933044433594 nb_pixel_total : 16456 time to create 1 rle with old method : 0.019726037979125977 length of segment : 147 time for calcul the mask position with numpy : 0.00031757354736328125 nb_pixel_total : 10570 time to create 1 rle with old method : 0.013232946395874023 length of segment : 99 time for calcul the mask position with numpy : 0.002393007278442383 nb_pixel_total : 74330 time to create 1 rle with old method : 0.08765935897827148 length of segment : 338 time for calcul the mask position with numpy : 0.00035381317138671875 nb_pixel_total : 14342 time to create 1 rle with old method : 0.01751708984375 length of segment : 150 time for calcul the mask position with numpy : 0.0003008842468261719 nb_pixel_total : 9666 time to create 1 rle with old method : 0.011831521987915039 length of segment : 135 time for calcul the mask position with numpy : 0.00023555755615234375 nb_pixel_total : 10558 time to create 1 rle with old method : 0.014999151229858398 length of segment : 131 time for calcul the mask position with numpy : 0.005495309829711914 nb_pixel_total : 183369 time to create 1 rle with new method : 0.010143756866455078 length of segment : 416 time for calcul the mask position with numpy : 0.0022764205932617188 nb_pixel_total : 67489 time to create 1 rle with old method : 0.0768272876739502 length of segment : 413 time for calcul the mask position with numpy : 0.002255678176879883 nb_pixel_total : 73298 time to create 1 rle with old method : 0.08343029022216797 length of segment : 434 time for calcul the mask position with numpy : 0.0060405731201171875 nb_pixel_total : 149783 time to create 1 rle with old method : 0.1738266944885254 length of segment : 768 time for calcul the mask position with numpy : 0.0007419586181640625 nb_pixel_total : 17981 time to create 1 rle with old method : 0.02125263214111328 length of segment : 190 time for calcul the mask position with numpy : 0.0003509521484375 nb_pixel_total : 7730 time to create 1 rle with old method : 0.00960540771484375 length of segment : 92 time for calcul the mask position with numpy : 0.0004456043243408203 nb_pixel_total : 11047 time to create 1 rle with old method : 0.013323783874511719 length of segment : 149 time for calcul the mask position with numpy : 0.0013127326965332031 nb_pixel_total : 32682 time to create 1 rle with old method : 0.04137134552001953 length of segment : 280 time for calcul the mask position with numpy : 0.0033593177795410156 nb_pixel_total : 86090 time to create 1 rle with old method : 0.1011049747467041 length of segment : 702 time for calcul the mask position with numpy : 0.0007650852203369141 nb_pixel_total : 18059 time to create 1 rle with old method : 0.021675825119018555 length of segment : 195 time for calcul the mask position with numpy : 0.0050296783447265625 nb_pixel_total : 161936 time to create 1 rle with new method : 0.012080907821655273 length of segment : 731 time for calcul the mask position with numpy : 0.0004012584686279297 nb_pixel_total : 12952 time to create 1 rle with old method : 0.01670241355895996 length of segment : 127 time for calcul the mask position with numpy : 0.003826141357421875 nb_pixel_total : 71069 time to create 1 rle with old method : 0.09610962867736816 length of segment : 618 time for calcul the mask position with numpy : 0.003192424774169922 nb_pixel_total : 59584 time to create 1 rle with old method : 0.07083868980407715 length of segment : 778 time for calcul the mask position with numpy : 0.0029935836791992188 nb_pixel_total : 76330 time to create 1 rle with old method : 0.08920621871948242 length of segment : 554 time for calcul the mask position with numpy : 0.0014657974243164062 nb_pixel_total : 21432 time to create 1 rle with old method : 0.025630950927734375 length of segment : 391 time for calcul the mask position with numpy : 0.0008440017700195312 nb_pixel_total : 23001 time to create 1 rle with old method : 0.0279996395111084 length of segment : 163 time for calcul the mask position with numpy : 0.0007083415985107422 nb_pixel_total : 31914 time to create 1 rle with old method : 0.04281497001647949 length of segment : 203 time for calcul the mask position with numpy : 0.0005469322204589844 nb_pixel_total : 28690 time to create 1 rle with old method : 0.0344693660736084 length of segment : 298 time for calcul the mask position with numpy : 0.003829479217529297 nb_pixel_total : 53102 time to create 1 rle with old method : 0.08641171455383301 length of segment : 483 time for calcul the mask position with numpy : 0.0016074180603027344 nb_pixel_total : 55946 time to create 1 rle with old method : 0.06952333450317383 length of segment : 337 time for calcul the mask position with numpy : 0.001705169677734375 nb_pixel_total : 66031 time to create 1 rle with old method : 0.07823753356933594 length of segment : 379 time for calcul the mask position with numpy : 0.0021047592163085938 nb_pixel_total : 60314 time to create 1 rle with old method : 0.0721287727355957 length of segment : 391 time spent for convertir_results : 24.17662787437439 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 252 chid ids of type : 3594 Number RLEs to save : 76562 save missing photos in datou_result : time spend for datou_step_exec : 170.376446723938 time spend to save output : 4.566611289978027 total time spend for step 1 : 174.94305801391602 step2:crop_condition Mon Jul 28 12:53:24 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 13 ! batch 1 Loaded 252 chid ids of type : 3594 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 221 About to insert : list_path_to_insert length 221 new photo from crops ! About to upload 221 photos upload in portfolio : 3736932 init cache_photo without model_param we have 221 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753700060_3479611 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 221 photos in the portfolio 3736932 time of upload the photos Elapsed time : 54.0616557598114 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 21 About to insert : list_path_to_insert length 21 new photo from crops ! About to upload 21 photos upload in portfolio : 3736932 init cache_photo without model_param we have 21 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753700120_3479611 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 21 photos in the portfolio 3736932 time of upload the photos Elapsed time : 4.664151906967163 we have finished the crop for the class : carton begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 3736932 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753700127_3479611 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 1 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.6038639545440674 we have finished the crop for the class : metal begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 9 About to insert : list_path_to_insert length 9 new photo from crops ! About to upload 9 photos upload in portfolio : 3736932 init cache_photo without model_param we have 9 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753700133_3479611 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 9 photos in the portfolio 3736932 time of upload the photos Elapsed time : 2.313178539276123 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1374030691, 1374030612, 1374030610, 1374030607, 1374030601, 1374030599, 1374028732, 1374028730, 1374028726, 1374028722, 1374028721, 1374028644, 1374028640] Looping around the photos to save general results len do output : 252 /1374146058Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146061Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146062Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146063Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146065Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146066Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146067Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146069Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146070Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146071Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146072Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146075Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146076Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146077Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146079Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146081Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146082Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146084Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146086Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146087Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146089Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146091Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146092Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146094Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146097Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146098Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146099Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146102Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146103Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146104Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146107Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146108Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146109Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146111Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146113Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146114Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146115Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146118Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146119Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146120Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146123Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146124Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146125Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146128Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146129Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146130Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146133Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146134Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146135Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146138Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146139Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146140Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146141Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146144Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146145Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146146Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146149Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146150Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146151Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146153Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146155Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146156Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146157Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146160Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146161Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146162Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146165Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146166Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146167Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146169Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146171Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146172Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146174Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146176Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146177Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146179Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146181Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146182Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146183Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146185Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146187Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146188Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146190Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146192Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146193Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146194Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146196Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146198Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146199Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146201Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146203Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146204Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146206Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146207Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146210Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146211Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146213Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146215Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146216Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146218Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146220Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146221Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146223Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146224Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146226Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146228Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146229Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146230Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146232Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146233Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146234Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146235Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146237Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146238Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146239Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146241Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146242Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146243Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146245Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146246Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146247Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146249Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146250Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146251Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146253Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146254Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146255Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146257Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146258Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146259Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146260Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146262Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146263Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146264Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146266Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146267Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146268Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146271Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146272Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146273Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146275Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146276Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146277Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146279Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146280Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146281Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146283Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146284Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146285Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146286Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146288Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146289Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146290Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146292Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146293Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146294Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146295Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146296Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146297Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146298Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146299Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146300Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146301Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146302Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146303Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146304Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146305Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146306Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146307Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146308Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146309Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146310Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146311Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146312Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146313Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146314Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146315Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146316Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146317Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146318Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146319Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146320Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146321Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146322Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146323Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146324Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146325Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146326Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146327Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146328Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146329Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146330Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146332Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146333Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146334Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146335Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146336Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146337Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146339Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146340Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146341Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146342Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146343Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146344Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146345Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146346Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146347Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146348Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146349Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146350Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146351Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146352Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146353Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146354Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146355Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146356Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146357Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146358Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146359Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146360Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146361Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146393Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146394Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146395Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146396Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146397Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146398Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146399Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146400Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146401Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146402Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146403Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146404Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146405Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146406Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146407Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146409Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146410Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146411Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146412Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146414Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146415Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146418Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146427Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146428Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146429Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146430Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146431Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146432Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146433Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146434Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374146435Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030691', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030612', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030610', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030607', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030601', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030599', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028732', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028730', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028726', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028722', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028721', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028644', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028640', None, None, None, None, None, '3370219') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 769 time used for this insertion : 0.04723072052001953 save_final save missing photos in datou_result : time spend for datou_step_exec : 130.8819065093994 time spend to save output : 0.052787065505981445 total time spend for step 2 : 130.9346935749054 step3:rle_unique_nms_with_priority Mon Jul 28 12:55:35 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 252 chid ids of type : 3594 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nb_obj : 17 nb_hashtags : 4 time to prepare the origin masks : 9.681830167770386 time for calcul the mask position with numpy : 0.6209888458251953 nb_pixel_total : 7560580 time to create 1 rle with new method : 0.8264102935791016 time for calcul the mask position with numpy : 0.04090118408203125 nb_pixel_total : 14770 time to create 1 rle with old method : 0.017134904861450195 time for calcul the mask position with numpy : 0.04103660583496094 nb_pixel_total : 58946 time to create 1 rle with old method : 0.06996655464172363 time for calcul the mask position with numpy : 0.04344320297241211 nb_pixel_total : 36345 time to create 1 rle with old method : 0.042658329010009766 time for calcul the mask position with numpy : 0.04166078567504883 nb_pixel_total : 67060 time to create 1 rle with old method : 0.08148980140686035 time for calcul the mask position with numpy : 0.040779829025268555 nb_pixel_total : 21517 time to create 1 rle with old method : 0.025126218795776367 time for calcul the mask position with numpy : 0.04040026664733887 nb_pixel_total : 63152 time to create 1 rle with old method : 0.07411384582519531 time for calcul the mask position with numpy : 0.0400540828704834 nb_pixel_total : 24948 time to create 1 rle with old method : 0.02941274642944336 time for calcul the mask position with numpy : 0.04086494445800781 nb_pixel_total : 28245 time to create 1 rle with old method : 0.047525644302368164 time for calcul the mask position with numpy : 0.044553518295288086 nb_pixel_total : 73133 time to create 1 rle with old method : 0.08708930015563965 time for calcul the mask position with numpy : 0.040856122970581055 nb_pixel_total : 8753 time to create 1 rle with old method : 0.010406017303466797 time for calcul the mask position with numpy : 0.04235529899597168 nb_pixel_total : 93079 time to create 1 rle with old method : 0.1268754005432129 time for calcul the mask position with numpy : 0.04281783103942871 nb_pixel_total : 22731 time to create 1 rle with old method : 0.026879549026489258 time for calcul the mask position with numpy : 0.041848182678222656 nb_pixel_total : 29035 time to create 1 rle with old method : 0.034561872482299805 time for calcul the mask position with numpy : 0.04304361343383789 nb_pixel_total : 23653 time to create 1 rle with old method : 0.028255462646484375 time for calcul the mask position with numpy : 0.04536700248718262 nb_pixel_total : 24824 time to create 1 rle with old method : 0.03227663040161133 time for calcul the mask position with numpy : 0.03907895088195801 nb_pixel_total : 116300 time to create 1 rle with old method : 0.13878822326660156 time for calcul the mask position with numpy : 0.025135040283203125 nb_pixel_total : 27329 time to create 1 rle with old method : 0.03204464912414551 create new chi : 3.08601975440979 time to delete rle : 0.03148460388183594 batch 1 Loaded 35 chid ids of type : 3594 ++++++++++++++++++++++Number RLEs to save : 12016 TO DO : save crop sub photo not yet done ! save time : 0.7415432929992676 nb_obj : 14 nb_hashtags : 3 time to prepare the origin masks : 10.278900384902954 time for calcul the mask position with numpy : 0.976722240447998 nb_pixel_total : 7613557 time to create 1 rle with new method : 0.7592194080352783 time for calcul the mask position with numpy : 0.02657341957092285 nb_pixel_total : 30968 time to create 1 rle with old method : 0.036481618881225586 time for calcul the mask position with numpy : 0.029282331466674805 nb_pixel_total : 6007 time to create 1 rle with old method : 0.007519960403442383 time for calcul the mask position with numpy : 0.02681255340576172 nb_pixel_total : 30987 time to create 1 rle with old method : 0.03781914710998535 time for calcul the mask position with numpy : 0.030233383178710938 nb_pixel_total : 16322 time to create 1 rle with old method : 0.020331859588623047 time for calcul the mask position with numpy : 0.033843278884887695 nb_pixel_total : 59258 time to create 1 rle with old method : 0.06891965866088867 time for calcul the mask position with numpy : 0.043781280517578125 nb_pixel_total : 24041 time to create 1 rle with old method : 0.03277444839477539 time for calcul the mask position with numpy : 0.04261970520019531 nb_pixel_total : 14808 time to create 1 rle with old method : 0.017511606216430664 time for calcul the mask position with numpy : 0.04332423210144043 nb_pixel_total : 6661 time to create 1 rle with old method : 0.008019685745239258 time for calcul the mask position with numpy : 0.05136418342590332 nb_pixel_total : 107624 time to create 1 rle with old method : 0.14739274978637695 time for calcul the mask position with numpy : 0.047803401947021484 nb_pixel_total : 60277 time to create 1 rle with old method : 0.08906388282775879 time for calcul the mask position with numpy : 0.0443730354309082 nb_pixel_total : 8475 time to create 1 rle with old method : 0.010114669799804688 time for calcul the mask position with numpy : 0.04266166687011719 nb_pixel_total : 40956 time to create 1 rle with old method : 0.048150062561035156 time for calcul the mask position with numpy : 0.03931403160095215 nb_pixel_total : 259837 time to create 1 rle with new method : 0.4345252513885498 time for calcul the mask position with numpy : 0.02602553367614746 nb_pixel_total : 14622 time to create 1 rle with old method : 0.018162012100219727 create new chi : 3.3147408962249756 time to delete rle : 0.0445406436920166 batch 1 Loaded 29 chid ids of type : 3594 ++++++++++++++++++Number RLEs to save : 10154 TO DO : save crop sub photo not yet done ! save time : 0.6600570678710938 nb_obj : 23 nb_hashtags : 3 time to prepare the origin masks : 14.892391204833984 time for calcul the mask position with numpy : 0.46352481842041016 nb_pixel_total : 7369577 time to create 1 rle with new method : 0.5727612972259521 time for calcul the mask position with numpy : 0.04490804672241211 nb_pixel_total : 81151 time to create 1 rle with old method : 0.09513163566589355 time for calcul the mask position with numpy : 0.041129350662231445 nb_pixel_total : 7423 time to create 1 rle with old method : 0.012466192245483398 time for calcul the mask position with numpy : 0.050493717193603516 nb_pixel_total : 21604 time to create 1 rle with old method : 0.03632950782775879 time for calcul the mask position with numpy : 0.046346426010131836 nb_pixel_total : 74851 time to create 1 rle with old method : 0.08815431594848633 time for calcul the mask position with numpy : 0.039275407791137695 nb_pixel_total : 37713 time to create 1 rle with old method : 0.04511857032775879 time for calcul the mask position with numpy : 0.04245567321777344 nb_pixel_total : 51035 time to create 1 rle with old method : 0.060509681701660156 time for calcul the mask position with numpy : 0.037371158599853516 nb_pixel_total : 17680 time to create 1 rle with old method : 0.0209958553314209 time for calcul the mask position with numpy : 0.045265913009643555 nb_pixel_total : 28761 time to create 1 rle with old method : 0.03728032112121582 time for calcul the mask position with numpy : 0.04460740089416504 nb_pixel_total : 26312 time to create 1 rle with old method : 0.03292560577392578 time for calcul the mask position with numpy : 0.04417896270751953 nb_pixel_total : 51308 time to create 1 rle with old method : 0.06013083457946777 time for calcul the mask position with numpy : 0.05413103103637695 nb_pixel_total : 75577 time to create 1 rle with old method : 0.08935308456420898 time for calcul the mask position with numpy : 0.05080366134643555 nb_pixel_total : 25192 time to create 1 rle with old method : 0.030730485916137695 time for calcul the mask position with numpy : 0.04437398910522461 nb_pixel_total : 55974 time to create 1 rle with old method : 0.07050585746765137 time for calcul the mask position with numpy : 0.05508756637573242 nb_pixel_total : 11658 time to create 1 rle with old method : 0.017228126525878906 time for calcul the mask position with numpy : 0.04823946952819824 nb_pixel_total : 28694 time to create 1 rle with old method : 0.040337324142456055 time for calcul the mask position with numpy : 0.04792976379394531 nb_pixel_total : 31730 time to create 1 rle with old method : 0.055338144302368164 time for calcul the mask position with numpy : 0.04977893829345703 nb_pixel_total : 20001 time to create 1 rle with old method : 0.026011228561401367 time for calcul the mask position with numpy : 0.05026888847351074 nb_pixel_total : 12818 time to create 1 rle with old method : 0.016257762908935547 time for calcul the mask position with numpy : 0.04521989822387695 nb_pixel_total : 20496 time to create 1 rle with old method : 0.02580881118774414 time for calcul the mask position with numpy : 0.04454326629638672 nb_pixel_total : 112947 time to create 1 rle with old method : 0.15303969383239746 time for calcul the mask position with numpy : 0.052910804748535156 nb_pixel_total : 53767 time to create 1 rle with old method : 0.08112502098083496 time for calcul the mask position with numpy : 0.04551529884338379 nb_pixel_total : 44801 time to create 1 rle with old method : 0.06107163429260254 time for calcul the mask position with numpy : 0.03698110580444336 nb_pixel_total : 33330 time to create 1 rle with old method : 0.0493009090423584 create new chi : 3.3556277751922607 time to delete rle : 0.00281524658203125 batch 1 Loaded 47 chid ids of type : 3594 +++++++++++++++++++++++++++Number RLEs to save : 14926 TO DO : save crop sub photo not yet done ! save time : 0.9808874130249023 nb_obj : 20 nb_hashtags : 3 time to prepare the origin masks : 12.29633641242981 time for calcul the mask position with numpy : 0.9206681251525879 nb_pixel_total : 7571229 time to create 1 rle with new method : 0.738703727722168 time for calcul the mask position with numpy : 0.04761934280395508 nb_pixel_total : 43309 time to create 1 rle with old method : 0.05801558494567871 time for calcul the mask position with numpy : 0.10052895545959473 nb_pixel_total : 22999 time to create 1 rle with old method : 0.06074666976928711 time for calcul the mask position with numpy : 0.06002068519592285 nb_pixel_total : 23498 time to create 1 rle with old method : 0.032688140869140625 time for calcul the mask position with numpy : 0.0497431755065918 nb_pixel_total : 12496 time to create 1 rle with old method : 0.015584707260131836 time for calcul the mask position with numpy : 0.049610137939453125 nb_pixel_total : 20502 time to create 1 rle with old method : 0.033965110778808594 time for calcul the mask position with numpy : 0.06695318222045898 nb_pixel_total : 14608 time to create 1 rle with old method : 0.0235598087310791 time for calcul the mask position with numpy : 0.05972576141357422 nb_pixel_total : 6053 time to create 1 rle with old method : 0.007317066192626953 time for calcul the mask position with numpy : 0.04854583740234375 nb_pixel_total : 130805 time to create 1 rle with old method : 0.16238999366760254 time for calcul the mask position with numpy : 0.0574193000793457 nb_pixel_total : 42470 time to create 1 rle with old method : 0.07196235656738281 time for calcul the mask position with numpy : 0.06109309196472168 nb_pixel_total : 22210 time to create 1 rle with old method : 0.03597617149353027 time for calcul the mask position with numpy : 0.05481433868408203 nb_pixel_total : 124084 time to create 1 rle with old method : 0.2144787311553955 time for calcul the mask position with numpy : 0.05665326118469238 nb_pixel_total : 30779 time to create 1 rle with old method : 0.03660321235656738 time for calcul the mask position with numpy : 0.060416460037231445 nb_pixel_total : 23619 time to create 1 rle with old method : 0.0283963680267334 time for calcul the mask position with numpy : 0.05558156967163086 nb_pixel_total : 33861 time to create 1 rle with old method : 0.04004192352294922 time for calcul the mask position with numpy : 0.05038881301879883 nb_pixel_total : 18847 time to create 1 rle with old method : 0.02234792709350586 time for calcul the mask position with numpy : 0.04735970497131348 nb_pixel_total : 11533 time to create 1 rle with old method : 0.013847589492797852 time for calcul the mask position with numpy : 0.046442508697509766 nb_pixel_total : 16415 time to create 1 rle with old method : 0.01972508430480957 time for calcul the mask position with numpy : 0.05048990249633789 nb_pixel_total : 41488 time to create 1 rle with old method : 0.04889726638793945 time for calcul the mask position with numpy : 0.04537677764892578 nb_pixel_total : 12684 time to create 1 rle with old method : 0.015093088150024414 time for calcul the mask position with numpy : 0.04537177085876465 nb_pixel_total : 70911 time to create 1 rle with old method : 0.08361315727233887 create new chi : 3.8506462574005127 time to delete rle : 0.0030913352966308594 batch 1 Loaded 41 chid ids of type : 3594 +++++++++++++++++++++++++++Number RLEs to save : 12491 TO DO : save crop sub photo not yet done ! save time : 0.7904331684112549 nb_obj : 16 nb_hashtags : 2 time to prepare the origin masks : 9.19605278968811 time for calcul the mask position with numpy : 0.8457236289978027 nb_pixel_total : 7497787 time to create 1 rle with new method : 0.6342997550964355 time for calcul the mask position with numpy : 0.04505348205566406 nb_pixel_total : 84383 time to create 1 rle with old method : 0.10622167587280273 time for calcul the mask position with numpy : 0.03926253318786621 nb_pixel_total : 76963 time to create 1 rle with old method : 0.09880518913269043 time for calcul the mask position with numpy : 0.0427851676940918 nb_pixel_total : 26508 time to create 1 rle with old method : 0.2503354549407959 time for calcul the mask position with numpy : 0.042406558990478516 nb_pixel_total : 18765 time to create 1 rle with old method : 0.02232646942138672 time for calcul the mask position with numpy : 0.044095754623413086 nb_pixel_total : 25537 time to create 1 rle with old method : 0.030083656311035156 time for calcul the mask position with numpy : 0.04098057746887207 nb_pixel_total : 82800 time to create 1 rle with old method : 0.09931182861328125 time for calcul the mask position with numpy : 0.03330659866333008 nb_pixel_total : 21936 time to create 1 rle with old method : 0.02588486671447754 time for calcul the mask position with numpy : 0.043762922286987305 nb_pixel_total : 51334 time to create 1 rle with old method : 0.060413360595703125 time for calcul the mask position with numpy : 0.05236077308654785 nb_pixel_total : 42185 time to create 1 rle with old method : 0.04963850975036621 time for calcul the mask position with numpy : 0.043325185775756836 nb_pixel_total : 13977 time to create 1 rle with old method : 0.01659679412841797 time for calcul the mask position with numpy : 0.04222369194030762 nb_pixel_total : 40856 time to create 1 rle with old method : 0.04832577705383301 time for calcul the mask position with numpy : 0.04334688186645508 nb_pixel_total : 109806 time to create 1 rle with old method : 0.1428523063659668 time for calcul the mask position with numpy : 0.04159116744995117 nb_pixel_total : 9016 time to create 1 rle with old method : 0.010760307312011719 time for calcul the mask position with numpy : 0.0395047664642334 nb_pixel_total : 51159 time to create 1 rle with old method : 0.0617060661315918 time for calcul the mask position with numpy : 0.02866816520690918 nb_pixel_total : 55780 time to create 1 rle with old method : 0.06767487525939941 time for calcul the mask position with numpy : 0.02618265151977539 nb_pixel_total : 85608 time to create 1 rle with old method : 0.0996849536895752 create new chi : 3.3690104484558105 time to delete rle : 0.0037937164306640625 batch 1 Loaded 33 chid ids of type : 3594 ++++++++++++++++++++++++++++++++Number RLEs to save : 13981 TO DO : save crop sub photo not yet done ! save time : 0.8705780506134033 nb_obj : 24 nb_hashtags : 1 time to prepare the origin masks : 10.814795017242432 time for calcul the mask position with numpy : 0.6207015514373779 nb_pixel_total : 7120596 time to create 1 rle with new method : 0.9138941764831543 time for calcul the mask position with numpy : 0.04660153388977051 nb_pixel_total : 8782 time to create 1 rle with old method : 0.010646581649780273 time for calcul the mask position with numpy : 0.0492100715637207 nb_pixel_total : 64857 time to create 1 rle with old method : 0.09561872482299805 time for calcul the mask position with numpy : 0.049523115158081055 nb_pixel_total : 12853 time to create 1 rle with old method : 0.017867565155029297 time for calcul the mask position with numpy : 0.0498809814453125 nb_pixel_total : 52292 time to create 1 rle with old method : 0.07886314392089844 time for calcul the mask position with numpy : 0.049744606018066406 nb_pixel_total : 19388 time to create 1 rle with old method : 0.023786544799804688 time for calcul the mask position with numpy : 0.05016589164733887 nb_pixel_total : 36990 time to create 1 rle with old method : 0.04460906982421875 time for calcul the mask position with numpy : 0.049913883209228516 nb_pixel_total : 23292 time to create 1 rle with old method : 0.0272824764251709 time for calcul the mask position with numpy : 0.03994607925415039 nb_pixel_total : 14677 time to create 1 rle with old method : 0.01716756820678711 time for calcul the mask position with numpy : 0.04265284538269043 nb_pixel_total : 163470 time to create 1 rle with new method : 0.5894298553466797 time for calcul the mask position with numpy : 0.04591250419616699 nb_pixel_total : 83329 time to create 1 rle with old method : 0.09970211982727051 time for calcul the mask position with numpy : 0.04005002975463867 nb_pixel_total : 36714 time to create 1 rle with old method : 0.04445528984069824 time for calcul the mask position with numpy : 0.03780698776245117 nb_pixel_total : 60483 time to create 1 rle with old method : 0.07445883750915527 time for calcul the mask position with numpy : 0.03952932357788086 nb_pixel_total : 19003 time to create 1 rle with old method : 0.02190542221069336 time for calcul the mask position with numpy : 0.0390629768371582 nb_pixel_total : 21076 time to create 1 rle with old method : 0.028649568557739258 time for calcul the mask position with numpy : 0.04238128662109375 nb_pixel_total : 27965 time to create 1 rle with old method : 0.04346513748168945 time for calcul the mask position with numpy : 0.03254508972167969 nb_pixel_total : 51612 time to create 1 rle with old method : 0.06313490867614746 time for calcul the mask position with numpy : 0.030351877212524414 nb_pixel_total : 35056 time to create 1 rle with old method : 0.05427074432373047 time for calcul the mask position with numpy : 0.031700849533081055 nb_pixel_total : 28023 time to create 1 rle with old method : 0.0335390567779541 time for calcul the mask position with numpy : 0.02740645408630371 nb_pixel_total : 65243 time to create 1 rle with old method : 0.07747077941894531 time for calcul the mask position with numpy : 0.026725292205810547 nb_pixel_total : 19970 time to create 1 rle with old method : 0.023859739303588867 time for calcul the mask position with numpy : 0.026421070098876953 nb_pixel_total : 43126 time to create 1 rle with old method : 0.04996776580810547 time for calcul the mask position with numpy : 0.02775716781616211 nb_pixel_total : 33411 time to create 1 rle with old method : 0.039674997329711914 time for calcul the mask position with numpy : 0.027788162231445312 nb_pixel_total : 231340 time to create 1 rle with new method : 1.3760592937469482 time for calcul the mask position with numpy : 0.035182952880859375 nb_pixel_total : 20852 time to create 1 rle with old method : 0.02654290199279785 create new chi : 5.5535852909088135 time to delete rle : 0.0058841705322265625 batch 1 Loaded 49 chid ids of type : 3594 +++++++++++++++++++++++++++++++Number RLEs to save : 16794 TO DO : save crop sub photo not yet done ! save time : 1.0443124771118164 nb_obj : 25 nb_hashtags : 1 time to prepare the origin masks : 12.701576471328735 time for calcul the mask position with numpy : 0.9311845302581787 nb_pixel_total : 6911382 time to create 1 rle with new method : 1.1195893287658691 time for calcul the mask position with numpy : 0.045938968658447266 nb_pixel_total : 19945 time to create 1 rle with old method : 0.02339458465576172 time for calcul the mask position with numpy : 0.05059552192687988 nb_pixel_total : 105197 time to create 1 rle with old method : 0.19782209396362305 time for calcul the mask position with numpy : 0.06065821647644043 nb_pixel_total : 13699 time to create 1 rle with old method : 0.016359567642211914 time for calcul the mask position with numpy : 0.0488123893737793 nb_pixel_total : 15242 time to create 1 rle with old method : 0.021987199783325195 time for calcul the mask position with numpy : 0.05075240135192871 nb_pixel_total : 33046 time to create 1 rle with old method : 0.04068326950073242 time for calcul the mask position with numpy : 0.048140525817871094 nb_pixel_total : 12957 time to create 1 rle with old method : 0.015755176544189453 time for calcul the mask position with numpy : 0.040598392486572266 nb_pixel_total : 25030 time to create 1 rle with old method : 0.03157615661621094 time for calcul the mask position with numpy : 0.045766592025756836 nb_pixel_total : 11912 time to create 1 rle with old method : 0.01678776741027832 time for calcul the mask position with numpy : 0.03781890869140625 nb_pixel_total : 17451 time to create 1 rle with old method : 0.02060699462890625 time for calcul the mask position with numpy : 0.04364347457885742 nb_pixel_total : 12192 time to create 1 rle with old method : 0.02574014663696289 time for calcul the mask position with numpy : 0.05029177665710449 nb_pixel_total : 15023 time to create 1 rle with old method : 0.04397249221801758 time for calcul the mask position with numpy : 0.0476071834564209 nb_pixel_total : 19983 time to create 1 rle with old method : 0.02480936050415039 time for calcul the mask position with numpy : 0.05382061004638672 nb_pixel_total : 226303 time to create 1 rle with new method : 0.7609202861785889 time for calcul the mask position with numpy : 0.04853010177612305 nb_pixel_total : 72694 time to create 1 rle with old method : 0.09059929847717285 time for calcul the mask position with numpy : 0.04529261589050293 nb_pixel_total : 57817 time to create 1 rle with old method : 0.09811711311340332 time for calcul the mask position with numpy : 0.048519134521484375 nb_pixel_total : 27633 time to create 1 rle with old method : 0.04044914245605469 time for calcul the mask position with numpy : 0.04978775978088379 nb_pixel_total : 30838 time to create 1 rle with old method : 0.036482810974121094 time for calcul the mask position with numpy : 0.05087399482727051 nb_pixel_total : 172934 time to create 1 rle with new method : 2.14911150932312 time for calcul the mask position with numpy : 0.04514598846435547 nb_pixel_total : 35614 time to create 1 rle with old method : 0.04764199256896973 time for calcul the mask position with numpy : 0.045102596282958984 nb_pixel_total : 86566 time to create 1 rle with old method : 0.10145235061645508 time for calcul the mask position with numpy : 0.04993176460266113 nb_pixel_total : 19165 time to create 1 rle with old method : 0.02268242835998535 time for calcul the mask position with numpy : 0.04170846939086914 nb_pixel_total : 94426 time to create 1 rle with old method : 0.11230921745300293 time for calcul the mask position with numpy : 0.042212486267089844 nb_pixel_total : 150530 time to create 1 rle with new method : 0.9963493347167969 time for calcul the mask position with numpy : 0.04842019081115723 nb_pixel_total : 38904 time to create 1 rle with old method : 0.06033754348754883 time for calcul the mask position with numpy : 0.04237627983093262 nb_pixel_total : 67917 time to create 1 rle with old method : 0.0799856185913086 create new chi : 8.457548141479492 time to delete rle : 0.005153656005859375 batch 1 Loaded 51 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++Number RLEs to save : 17804 TO DO : save crop sub photo not yet done ! save time : 1.0644230842590332 nb_obj : 22 nb_hashtags : 3 time to prepare the origin masks : 14.455929279327393 time for calcul the mask position with numpy : 0.39706993103027344 nb_pixel_total : 7187531 time to create 1 rle with new method : 0.6401033401489258 time for calcul the mask position with numpy : 0.05406665802001953 nb_pixel_total : 42355 time to create 1 rle with old method : 0.04970216751098633 time for calcul the mask position with numpy : 0.04959750175476074 nb_pixel_total : 26754 time to create 1 rle with old method : 0.03159737586975098 time for calcul the mask position with numpy : 0.04196643829345703 nb_pixel_total : 18800 time to create 1 rle with old method : 0.026717185974121094 time for calcul the mask position with numpy : 0.04398202896118164 nb_pixel_total : 36092 time to create 1 rle with old method : 0.04243946075439453 time for calcul the mask position with numpy : 0.04560279846191406 nb_pixel_total : 21221 time to create 1 rle with old method : 0.026992321014404297 time for calcul the mask position with numpy : 0.045075416564941406 nb_pixel_total : 16191 time to create 1 rle with old method : 0.019225597381591797 time for calcul the mask position with numpy : 0.05040931701660156 nb_pixel_total : 101077 time to create 1 rle with old method : 0.12410783767700195 time for calcul the mask position with numpy : 0.04366755485534668 nb_pixel_total : 54131 time to create 1 rle with old method : 0.07006072998046875 time for calcul the mask position with numpy : 0.04253816604614258 nb_pixel_total : 15739 time to create 1 rle with old method : 0.018579483032226562 time for calcul the mask position with numpy : 0.04461407661437988 nb_pixel_total : 104563 time to create 1 rle with old method : 0.14099788665771484 time for calcul the mask position with numpy : 0.044756174087524414 nb_pixel_total : 47955 time to create 1 rle with old method : 0.05669999122619629 time for calcul the mask position with numpy : 0.04610013961791992 nb_pixel_total : 15807 time to create 1 rle with old method : 0.018628835678100586 time for calcul the mask position with numpy : 0.04752779006958008 nb_pixel_total : 54660 time to create 1 rle with old method : 0.06456184387207031 time for calcul the mask position with numpy : 0.04501914978027344 nb_pixel_total : 166585 time to create 1 rle with new method : 1.35313081741333 time for calcul the mask position with numpy : 0.04619169235229492 nb_pixel_total : 77272 time to create 1 rle with old method : 0.12273192405700684 time for calcul the mask position with numpy : 0.05032920837402344 nb_pixel_total : 23930 time to create 1 rle with old method : 0.028829097747802734 time for calcul the mask position with numpy : 0.042534589767456055 nb_pixel_total : 77347 time to create 1 rle with old method : 0.09070467948913574 time for calcul the mask position with numpy : 0.04613041877746582 nb_pixel_total : 46567 time to create 1 rle with old method : 0.05594468116760254 time for calcul the mask position with numpy : 0.04067850112915039 nb_pixel_total : 45086 time to create 1 rle with old method : 0.05276918411254883 time for calcul the mask position with numpy : 0.0429685115814209 nb_pixel_total : 45143 time to create 1 rle with old method : 0.05325460433959961 time for calcul the mask position with numpy : 0.04328560829162598 nb_pixel_total : 35179 time to create 1 rle with old method : 0.042053937911987305 time for calcul the mask position with numpy : 0.04167604446411133 nb_pixel_total : 34415 time to create 1 rle with old method : 0.040598154067993164 create new chi : 4.648545026779175 time to delete rle : 0.003939151763916016 batch 1 Loaded 45 chid ids of type : 3594 ++++++++++++++++++++++++++++Number RLEs to save : 15500 TO DO : save crop sub photo not yet done ! save time : 0.9335365295410156 nb_obj : 13 nb_hashtags : 2 time to prepare the origin masks : 8.000866651535034 time for calcul the mask position with numpy : 1.4081966876983643 nb_pixel_total : 7569578 time to create 1 rle with new method : 1.2292182445526123 time for calcul the mask position with numpy : 0.0437469482421875 nb_pixel_total : 21556 time to create 1 rle with old method : 0.025601625442504883 time for calcul the mask position with numpy : 0.042877197265625 nb_pixel_total : 47124 time to create 1 rle with old method : 0.05698823928833008 time for calcul the mask position with numpy : 0.04596281051635742 nb_pixel_total : 14381 time to create 1 rle with old method : 0.01947498321533203 time for calcul the mask position with numpy : 0.043099164962768555 nb_pixel_total : 190008 time to create 1 rle with new method : 0.9899115562438965 time for calcul the mask position with numpy : 0.03883719444274902 nb_pixel_total : 71136 time to create 1 rle with old method : 0.08361220359802246 time for calcul the mask position with numpy : 0.03934836387634277 nb_pixel_total : 12236 time to create 1 rle with old method : 0.014523744583129883 time for calcul the mask position with numpy : 0.0383143424987793 nb_pixel_total : 47881 time to create 1 rle with old method : 0.055907487869262695 time for calcul the mask position with numpy : 0.04026031494140625 nb_pixel_total : 65534 time to create 1 rle with old method : 0.07782101631164551 time for calcul the mask position with numpy : 0.026473522186279297 nb_pixel_total : 46612 time to create 1 rle with old method : 0.05472445487976074 time for calcul the mask position with numpy : 0.03273415565490723 nb_pixel_total : 12351 time to create 1 rle with old method : 0.01840662956237793 time for calcul the mask position with numpy : 0.02769160270690918 nb_pixel_total : 80922 time to create 1 rle with old method : 0.09477949142456055 time for calcul the mask position with numpy : 0.025248050689697266 nb_pixel_total : 78060 time to create 1 rle with old method : 0.48921632766723633 time for calcul the mask position with numpy : 0.029284238815307617 nb_pixel_total : 37021 time to create 1 rle with old method : 0.04367828369140625 create new chi : 5.218881845474243 time to delete rle : 0.002338409423828125 batch 1 Loaded 27 chid ids of type : 3594 +++++++++++++++++++Number RLEs to save : 9718 TO DO : save crop sub photo not yet done ! save time : 0.624774694442749 nb_obj : 21 nb_hashtags : 3 time to prepare the origin masks : 12.539936304092407 time for calcul the mask position with numpy : 0.6596448421478271 nb_pixel_total : 7510675 time to create 1 rle with new method : 0.9297196865081787 time for calcul the mask position with numpy : 0.04231882095336914 nb_pixel_total : 39644 time to create 1 rle with old method : 0.04622006416320801 time for calcul the mask position with numpy : 0.04352235794067383 nb_pixel_total : 23624 time to create 1 rle with old method : 0.027802467346191406 time for calcul the mask position with numpy : 0.04392075538635254 nb_pixel_total : 24922 time to create 1 rle with old method : 0.0294342041015625 time for calcul the mask position with numpy : 0.04302644729614258 nb_pixel_total : 12178 time to create 1 rle with old method : 0.014534711837768555 time for calcul the mask position with numpy : 0.04227423667907715 nb_pixel_total : 77533 time to create 1 rle with old method : 0.11617422103881836 time for calcul the mask position with numpy : 0.04200029373168945 nb_pixel_total : 10673 time to create 1 rle with old method : 0.012643575668334961 time for calcul the mask position with numpy : 0.042588233947753906 nb_pixel_total : 18563 time to create 1 rle with old method : 0.02176380157470703 time for calcul the mask position with numpy : 0.043771982192993164 nb_pixel_total : 23581 time to create 1 rle with old method : 0.02787303924560547 time for calcul the mask position with numpy : 0.04233241081237793 nb_pixel_total : 38087 time to create 1 rle with old method : 0.04478335380554199 time for calcul the mask position with numpy : 0.0418548583984375 nb_pixel_total : 50068 time to create 1 rle with old method : 0.05867338180541992 time for calcul the mask position with numpy : 0.03393983840942383 nb_pixel_total : 44345 time to create 1 rle with old method : 0.0520632266998291 time for calcul the mask position with numpy : 0.02919745445251465 nb_pixel_total : 13068 time to create 1 rle with old method : 0.020154476165771484 time for calcul the mask position with numpy : 0.028258323669433594 nb_pixel_total : 17643 time to create 1 rle with old method : 0.02977442741394043 time for calcul the mask position with numpy : 0.02851700782775879 nb_pixel_total : 60622 time to create 1 rle with old method : 0.0716555118560791 time for calcul the mask position with numpy : 0.02889108657836914 nb_pixel_total : 73229 time to create 1 rle with old method : 0.08546924591064453 time for calcul the mask position with numpy : 0.02709364891052246 nb_pixel_total : 14675 time to create 1 rle with old method : 0.01750469207763672 time for calcul the mask position with numpy : 0.029674530029296875 nb_pixel_total : 54842 time to create 1 rle with old method : 0.06471490859985352 time for calcul the mask position with numpy : 0.028705120086669922 nb_pixel_total : 14452 time to create 1 rle with old method : 0.0200040340423584 time for calcul the mask position with numpy : 0.02906036376953125 nb_pixel_total : 109348 time to create 1 rle with old method : 0.1273798942565918 time for calcul the mask position with numpy : 0.027367830276489258 nb_pixel_total : 34804 time to create 1 rle with old method : 0.04093432426452637 time for calcul the mask position with numpy : 0.031073570251464844 nb_pixel_total : 27824 time to create 1 rle with old method : 0.032319068908691406 create new chi : 3.3513543605804443 time to delete rle : 0.002273082733154297 batch 1 Loaded 43 chid ids of type : 3594 ++++++++++++++++++++++++++++Number RLEs to save : 13272 TO DO : save crop sub photo not yet done ! save time : 0.8287539482116699 nb_obj : 14 nb_hashtags : 2 time to prepare the origin masks : 7.602465391159058 time for calcul the mask position with numpy : 0.6886634826660156 nb_pixel_total : 7837782 time to create 1 rle with new method : 0.7781875133514404 time for calcul the mask position with numpy : 0.043057918548583984 nb_pixel_total : 36629 time to create 1 rle with old method : 0.04276227951049805 time for calcul the mask position with numpy : 0.045589447021484375 nb_pixel_total : 17119 time to create 1 rle with old method : 0.019989490509033203 time for calcul the mask position with numpy : 0.041681528091430664 nb_pixel_total : 14361 time to create 1 rle with old method : 0.016971349716186523 time for calcul the mask position with numpy : 0.04273700714111328 nb_pixel_total : 31054 time to create 1 rle with old method : 0.03637194633483887 time for calcul the mask position with numpy : 0.04099845886230469 nb_pixel_total : 22724 time to create 1 rle with old method : 0.02667236328125 time for calcul the mask position with numpy : 0.0421605110168457 nb_pixel_total : 18298 time to create 1 rle with old method : 0.02216625213623047 time for calcul the mask position with numpy : 0.03658342361450195 nb_pixel_total : 33232 time to create 1 rle with old method : 0.038886070251464844 time for calcul the mask position with numpy : 0.031211137771606445 nb_pixel_total : 18936 time to create 1 rle with old method : 0.022252559661865234 time for calcul the mask position with numpy : 0.026146411895751953 nb_pixel_total : 101009 time to create 1 rle with old method : 0.11831784248352051 time for calcul the mask position with numpy : 0.02698206901550293 nb_pixel_total : 65988 time to create 1 rle with old method : 0.07973837852478027 time for calcul the mask position with numpy : 0.025980234146118164 nb_pixel_total : 32789 time to create 1 rle with old method : 0.03864264488220215 time for calcul the mask position with numpy : 0.03217363357543945 nb_pixel_total : 38316 time to create 1 rle with old method : 0.045228004455566406 time for calcul the mask position with numpy : 0.04252457618713379 nb_pixel_total : 7383 time to create 1 rle with old method : 0.008799314498901367 time for calcul the mask position with numpy : 0.04268908500671387 nb_pixel_total : 18780 time to create 1 rle with old method : 0.02225947380065918 create new chi : 2.574618101119995 time to delete rle : 0.0016074180603027344 batch 1 Loaded 29 chid ids of type : 3594 +++++++++++++++++++Number RLEs to save : 9324 TO DO : save crop sub photo not yet done ! save time : 0.5950901508331299 nb_obj : 20 nb_hashtags : 2 time to prepare the origin masks : 9.785780429840088 time for calcul the mask position with numpy : 1.4667162895202637 nb_pixel_total : 7402475 time to create 1 rle with new method : 0.8250837326049805 time for calcul the mask position with numpy : 0.04618501663208008 nb_pixel_total : 2401 time to create 1 rle with old method : 0.002966642379760742 time for calcul the mask position with numpy : 0.04176068305969238 nb_pixel_total : 9666 time to create 1 rle with old method : 0.01172018051147461 time for calcul the mask position with numpy : 0.042257070541381836 nb_pixel_total : 2 time to create 1 rle with old method : 3.0517578125e-05 time for calcul the mask position with numpy : 0.046433210372924805 nb_pixel_total : 63388 time to create 1 rle with old method : 0.07379913330078125 time for calcul the mask position with numpy : 0.04172348976135254 nb_pixel_total : 10570 time to create 1 rle with old method : 0.012568235397338867 time for calcul the mask position with numpy : 0.0300443172454834 nb_pixel_total : 16456 time to create 1 rle with old method : 0.01972508430480957 time for calcul the mask position with numpy : 0.03747963905334473 nb_pixel_total : 67402 time to create 1 rle with old method : 0.07903814315795898 time for calcul the mask position with numpy : 0.027981281280517578 nb_pixel_total : 32176 time to create 1 rle with old method : 0.03831076622009277 time for calcul the mask position with numpy : 0.029500722885131836 nb_pixel_total : 50398 time to create 1 rle with old method : 0.14426183700561523 time for calcul the mask position with numpy : 0.033254146575927734 nb_pixel_total : 27414 time to create 1 rle with old method : 0.03256797790527344 time for calcul the mask position with numpy : 0.030762672424316406 nb_pixel_total : 97235 time to create 1 rle with old method : 0.11656928062438965 time for calcul the mask position with numpy : 0.03427910804748535 nb_pixel_total : 16237 time to create 1 rle with old method : 0.019238710403442383 time for calcul the mask position with numpy : 0.02723217010498047 nb_pixel_total : 38906 time to create 1 rle with old method : 0.04638981819152832 time for calcul the mask position with numpy : 0.03026413917541504 nb_pixel_total : 48389 time to create 1 rle with old method : 0.0568232536315918 time for calcul the mask position with numpy : 0.0338592529296875 nb_pixel_total : 147996 time to create 1 rle with old method : 0.17233848571777344 time for calcul the mask position with numpy : 0.03294873237609863 nb_pixel_total : 150043 time to create 1 rle with new method : 0.6499178409576416 time for calcul the mask position with numpy : 0.027192115783691406 nb_pixel_total : 17829 time to create 1 rle with old method : 0.021584510803222656 time for calcul the mask position with numpy : 0.03177022933959961 nb_pixel_total : 19790 time to create 1 rle with old method : 0.023658275604248047 time for calcul the mask position with numpy : 0.044034481048583984 nb_pixel_total : 28832 time to create 1 rle with old method : 0.03451228141784668 time for calcul the mask position with numpy : 0.04301095008850098 nb_pixel_total : 46795 time to create 1 rle with old method : 0.05521559715270996 create new chi : 4.695166826248169 time to delete rle : 0.002146482467651367 batch 1 Loaded 41 chid ids of type : 3594 ++++++++++++++++++++++++++Number RLEs to save : 11625 TO DO : save crop sub photo not yet done ! save time : 0.7301850318908691 nb_obj : 23 nb_hashtags : 2 time to prepare the origin masks : 14.395859003067017 time for calcul the mask position with numpy : 0.4532186985015869 nb_pixel_total : 6951967 time to create 1 rle with new method : 0.9771561622619629 time for calcul the mask position with numpy : 0.033817291259765625 nb_pixel_total : 60314 time to create 1 rle with old method : 0.07091903686523438 time for calcul the mask position with numpy : 0.03049445152282715 nb_pixel_total : 66031 time to create 1 rle with old method : 0.0775296688079834 time for calcul the mask position with numpy : 0.028547048568725586 nb_pixel_total : 55946 time to create 1 rle with old method : 0.0658724308013916 time for calcul the mask position with numpy : 0.028326988220214844 nb_pixel_total : 53102 time to create 1 rle with old method : 0.062448740005493164 time for calcul the mask position with numpy : 0.02734827995300293 nb_pixel_total : 3139 time to create 1 rle with old method : 0.0038695335388183594 time for calcul the mask position with numpy : 0.029539823532104492 nb_pixel_total : 31733 time to create 1 rle with old method : 0.037067413330078125 time for calcul the mask position with numpy : 0.02661752700805664 nb_pixel_total : 23001 time to create 1 rle with old method : 0.026921987533569336 time for calcul the mask position with numpy : 0.027587175369262695 nb_pixel_total : 21432 time to create 1 rle with old method : 0.025139808654785156 time for calcul the mask position with numpy : 0.02876448631286621 nb_pixel_total : 76330 time to create 1 rle with old method : 0.09895205497741699 time for calcul the mask position with numpy : 0.026331663131713867 nb_pixel_total : 59584 time to create 1 rle with old method : 0.07419514656066895 time for calcul the mask position with numpy : 0.027341842651367188 nb_pixel_total : 71069 time to create 1 rle with old method : 0.08369016647338867 time for calcul the mask position with numpy : 0.03145861625671387 nb_pixel_total : 12952 time to create 1 rle with old method : 0.015135049819946289 time for calcul the mask position with numpy : 0.027513980865478516 nb_pixel_total : 160272 time to create 1 rle with new method : 0.648188591003418 time for calcul the mask position with numpy : 0.028966426849365234 nb_pixel_total : 18059 time to create 1 rle with old method : 0.021172285079956055 time for calcul the mask position with numpy : 0.028029441833496094 nb_pixel_total : 86090 time to create 1 rle with old method : 0.1065225601196289 time for calcul the mask position with numpy : 0.027202606201171875 nb_pixel_total : 32682 time to create 1 rle with old method : 0.03800368309020996 time for calcul the mask position with numpy : 0.024974346160888672 nb_pixel_total : 11047 time to create 1 rle with old method : 0.012993097305297852 time for calcul the mask position with numpy : 0.029410362243652344 nb_pixel_total : 7730 time to create 1 rle with old method : 0.009102821350097656 time for calcul the mask position with numpy : 0.042218685150146484 nb_pixel_total : 17981 time to create 1 rle with old method : 0.021137714385986328 time for calcul the mask position with numpy : 0.04509925842285156 nb_pixel_total : 149783 time to create 1 rle with old method : 0.18359589576721191 time for calcul the mask position with numpy : 0.04693341255187988 nb_pixel_total : 73298 time to create 1 rle with old method : 0.08863449096679688 time for calcul the mask position with numpy : 0.047814369201660156 nb_pixel_total : 67489 time to create 1 rle with old method : 0.08119654655456543 time for calcul the mask position with numpy : 0.029683589935302734 nb_pixel_total : 183369 time to create 1 rle with new method : 0.611778736114502 create new chi : 4.725204944610596 time to delete rle : 0.003372669219970703 batch 1 Loaded 47 chid ids of type : 3594 ++++++++++++++++++++++++++++++++++++++Number RLEs to save : 19877 TO DO : save crop sub photo not yet done ! save time : 1.2193176746368408 map_output_result : {1374030691: (0.0, 'Should be the crop_list due to order', 0), 1374030612: (0.0, 'Should be the crop_list due to order', 0), 1374030610: (0.0, 'Should be the crop_list due to order', 0), 1374030607: (0.0, 'Should be the crop_list due to order', 0), 1374030601: (0.0, 'Should be the crop_list due to order', 0), 1374030599: (0.0, 'Should be the crop_list due to order', 0), 1374028732: (0.0, 'Should be the crop_list due to order', 0), 1374028730: (0.0, 'Should be the crop_list due to order', 0), 1374028726: (0.0, 'Should be the crop_list due to order', 0), 1374028722: (0.0, 'Should be the crop_list due to order', 0), 1374028721: (0.0, 'Should be the crop_list due to order', 0), 1374028644: (0.0, 'Should be the crop_list due to order', 0), 1374028640: (0.0, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1374030691, 1374030612, 1374030610, 1374030607, 1374030601, 1374030599, 1374028732, 1374028730, 1374028726, 1374028722, 1374028721, 1374028644, 1374028640] Looping around the photos to save general results len do output : 13 /1374030691.Didn't retrieve data . /1374030612.Didn't retrieve data . /1374030610.Didn't retrieve data . /1374030607.Didn't retrieve data . /1374030601.Didn't retrieve data . /1374030599.Didn't retrieve data . /1374028732.Didn't retrieve data . /1374028730.Didn't retrieve data . /1374028726.Didn't retrieve data . /1374028722.Didn't retrieve data . /1374028721.Didn't retrieve data . /1374028644.Didn't retrieve data . /1374028640.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030691', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030612', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030610', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030607', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030601', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030599', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028732', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028730', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028726', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028722', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028721', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028644', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028640', None, None, None, None, None, '3370219') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 39 time used for this insertion : 0.016821622848510742 save_final save missing photos in datou_result : time spend for datou_step_exec : 216.82370924949646 time spend to save output : 0.023179292678833008 total time spend for step 3 : 216.8468885421753 step4:ventilate_hashtags_in_portfolio Mon Jul 28 12:59:12 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 25399044 get user id for portfolio 25399044 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25399044 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('environnement','pet_clair','autre','mal_croppe','pehd','pet_fonce','carton','background','flou','papier','metal')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25399044 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('environnement','pet_clair','autre','mal_croppe','pehd','pet_fonce','carton','background','flou','papier','metal')) AND mptpi.`min_score`=0.5 To do Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25399044 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('environnement','pet_clair','autre','mal_croppe','pehd','pet_fonce','carton','background','flou','papier','metal')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://www.fotonower.com/velours/25402986,25402987,25402988,25402989,25402990,25402991,25402992,25402993,25402994,25402995,25402996?tags=environnement,pet_clair,autre,mal_croppe,pehd,pet_fonce,carton,background,flou,papier,metal Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1374030691, 1374030612, 1374030610, 1374030607, 1374030601, 1374030599, 1374028732, 1374028730, 1374028726, 1374028722, 1374028721, 1374028644, 1374028640] Looping around the photos to save general results len do output : 1 /25399044. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030691', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030612', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030610', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030607', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030601', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030599', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028732', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028730', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028726', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028722', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028721', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028644', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028640', None, None, None, None, None, '3370219') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 14 time used for this insertion : 0.015637874603271484 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.7819709777832031 time spend to save output : 0.015995502471923828 total time spend for step 4 : 1.797966480255127 step5:final Mon Jul 28 12:59:14 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1374030691: ('0.10871546325973407',), 1374030612: ('0.10871546325973407',), 1374030610: ('0.10871546325973407',), 1374030607: ('0.10871546325973407',), 1374030601: ('0.10871546325973407',), 1374030599: ('0.10871546325973407',), 1374028732: ('0.10871546325973407',), 1374028730: ('0.10871546325973407',), 1374028726: ('0.10871546325973407',), 1374028722: ('0.10871546325973407',), 1374028721: ('0.10871546325973407',), 1374028644: ('0.10871546325973407',), 1374028640: ('0.10871546325973407',)} new output for save of step final : {1374030691: ('0.10871546325973407',), 1374030612: ('0.10871546325973407',), 1374030610: ('0.10871546325973407',), 1374030607: ('0.10871546325973407',), 1374030601: ('0.10871546325973407',), 1374030599: ('0.10871546325973407',), 1374028732: ('0.10871546325973407',), 1374028730: ('0.10871546325973407',), 1374028726: ('0.10871546325973407',), 1374028722: ('0.10871546325973407',), 1374028721: ('0.10871546325973407',), 1374028644: ('0.10871546325973407',), 1374028640: ('0.10871546325973407',)} [1374030691, 1374030612, 1374030610, 1374030607, 1374030601, 1374030599, 1374028732, 1374028730, 1374028726, 1374028722, 1374028721, 1374028644, 1374028640] Looping around the photos to save general results len do output : 13 /1374030691.Didn't retrieve data . /1374030612.Didn't retrieve data . /1374030610.Didn't retrieve data . /1374030607.Didn't retrieve data . /1374030601.Didn't retrieve data . /1374030599.Didn't retrieve data . /1374028732.Didn't retrieve data . /1374028730.Didn't retrieve data . /1374028726.Didn't retrieve data . /1374028722.Didn't retrieve data . /1374028721.Didn't retrieve data . /1374028644.Didn't retrieve data . /1374028640.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030691', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030612', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030610', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030607', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030601', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030599', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028732', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028730', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028726', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028722', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028721', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028644', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028640', None, None, None, None, None, '3370219') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 39 time used for this insertion : 0.015359163284301758 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.13630151748657227 time spend to save output : 0.01625967025756836 total time spend for step 5 : 0.15256118774414062 step6:blur_detection Mon Jul 28 12:59:14 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193.jpg resize: (2160, 3840) 1374030691 -7.092473384363186 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a.jpg resize: (2160, 3840) 1374030612 -7.46893604258404 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422.jpg resize: (2160, 3840) 1374030610 -7.202947112265012 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691.jpg resize: (2160, 3840) 1374030607 -7.226123258333117 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626.jpg resize: (2160, 3840) 1374030601 -7.641509282358543 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9.jpg resize: (2160, 3840) 1374030599 -7.335561120757522 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1.jpg resize: (2160, 3840) 1374028732 -7.30940663895572 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8.jpg resize: (2160, 3840) 1374028730 -7.499247030120027 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f.jpg resize: (2160, 3840) 1374028726 -7.193444110899314 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa.jpg resize: (2160, 3840) 1374028722 -7.7310301333476295 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54.jpg resize: (2160, 3840) 1374028721 -7.777705237572167 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59.jpg resize: (2160, 3840) 1374028644 -6.998933295021255 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf.jpg resize: (2160, 3840) 1374028640 -7.598691391431869 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375015_0.png resize: (152, 117) 1374146058 -5.07754298696399 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375002_0.png resize: (277, 122) 1374146061 -4.489874966026468 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375004_0.png resize: (187, 200) 1374146062 -4.55284745378422 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375008_0.png resize: (278, 142) 1374146063 -4.899119740370953 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375001_0.png resize: (167, 200) 1374146065 -5.17276557377733 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375007_0.png resize: (416, 282) 1374146066 -4.005506887698589 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896374999_0.png resize: (197, 215) 1374146067 -4.592648975941335 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375005_0.png resize: (347, 521) 1374146069 -4.738627314396276 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375009_0.png resize: (164, 189) 1374146070 -1.6245244937854761 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375003_0.png resize: (179, 267) 1374146071 -4.317522947664598 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375013_0.png resize: (237, 274) 1374146072 -4.379010981225315 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375010_0.png resize: (281, 329) 1374146075 -3.5360246341964854 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375012_0.png resize: (344, 286) 1374146076 -3.922120880085377 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375014_0.png resize: (332, 274) 1374146077 -4.9815566943032215 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375018_0.png resize: (310, 194) 1374146079 -4.469858974882495 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375026_0.png resize: (292, 75) 1374146081 -3.984527078212056 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375017_0.png resize: (716, 633) 1374146082 -2.2948362810694256 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375021_0.png resize: (394, 399) 1374146084 -4.317067167085267 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375023_0.png resize: (165, 148) 1374146086 -4.170564613801125 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375029_0.png resize: (219, 287) 1374146087 0.5491499727105855 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375024_0.png resize: (278, 145) 1374146089 -5.140085800369866 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375028_0.png resize: (123, 65) 1374146091 -3.2021927504964713 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375016_0.png resize: (117, 137) 1374146092 -0.6413860251284831 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375019_0.png resize: (103, 112) 1374146094 -3.812985509115736 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375022_0.png resize: (135, 87) 1374146097 -4.423957580780838 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375041_0.png resize: (315, 132) 1374146098 -2.7845233416437885 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375049_0.png resize: (240, 482) 1374146099 -4.583287873361654 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375042_0.png resize: (297, 364) 1374146102 -4.051854997908048 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375050_0.png resize: (143, 231) 1374146103 -2.1720847424692318 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375040_0.png resize: (208, 338) 1374146104 -0.8084404241493635 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375035_0.png resize: (119, 153) 1374146107 0.9817800030247172 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375032_0.png resize: (288, 276) 1374146108 -1.2694072580424194 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375052_0.png resize: (406, 355) 1374146109 -2.9628944484882527 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375045_0.png resize: (358, 172) 1374146111 -2.4468572685067564 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375039_0.png resize: (225, 73) 1374146113 -3.039449248253167 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375051_0.png resize: (130, 82) 1374146114 -3.4116223302530546 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375043_0.png resize: (429, 190) 1374146115 -3.5793752219841806 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375046_0.png resize: (173, 149) 1374146118 -4.70401669618696 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375038_0.png resize: (260, 199) 1374146119 -4.270500332284158 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375048_0.png resize: (318, 203) 1374146120 -4.116153980740545 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375058_0.png resize: (122, 172) 1374146123 -2.371738861880257 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375053_0.png resize: (330, 350) 1374146124 -3.5127676158431824 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375063_0.png resize: (126, 288) 1374146125 -4.395588086452982 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375059_0.png resize: (206, 267) 1374146128 -4.156078638642528 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375072_0.png resize: (244, 285) 1374146129 -0.7808028532645298 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375066_0.png resize: (99, 106) 1374146130 -3.544508344987092 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375054_0.png resize: (142, 120) 1374146133 -3.5958794452679776 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375062_0.png resize: (605, 263) 1374146134 -4.059798652063817 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375068_0.png resize: (315, 177) 1374146135 -3.4670376093737403 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375056_0.png resize: (301, 109) 1374146138 -3.755878718688927 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375064_0.png resize: (226, 304) 1374146139 -1.9285050255999485 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375055_0.png resize: (320, 215) 1374146140 -3.8055237307866614 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375071_0.png resize: (178, 164) 1374146141 -4.91606284012559 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375069_0.png resize: (452, 438) 1374146144 -4.844139936654852 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375061_0.png resize: (260, 183) 1374146145 -3.4798866502870447 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375070_0.png resize: (222, 154) 1374146146 -4.252701620650931 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375057_0.png resize: (131, 187) 1374146149 -2.4796458178254936 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375074_0.png resize: (276, 323) 1374146150 -0.580264876467164 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375080_0.png resize: (239, 280) 1374146151 -0.9585269705251301 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375083_0.png resize: (354, 327) 1374146153 -4.472271377905395 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375077_0.png resize: (407, 500) 1374146155 -1.2859973826524576 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375087_0.png resize: (448, 339) 1374146156 -1.9087472665827219 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375076_0.png resize: (178, 86) 1374146157 -1.283746919646504 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375085_0.png resize: (268, 138) 1374146160 -3.760205055750359 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375084_0.png resize: (274, 202) 1374146161 -2.130671162282155 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375078_0.png resize: (349, 226) 1374146162 -1.771026615462654 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375075_0.png resize: (301, 272) 1374146165 -5.35747146616911 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375073_0.png resize: (299, 434) 1374146166 -5.365196919776595 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375082_0.png resize: (218, 139) 1374146167 -3.438578360769171 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375088_0.png resize: (396, 632) 1374146169 -3.9875202727482986 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375079_0.png resize: (106, 172) 1374146171 3.765320288392298 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375086_0.png resize: (247, 249) 1374146172 -4.007966960866273 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375112_0.png resize: (132, 124) 1374146174 -2.814972205686225 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375102_0.png resize: (274, 194) 1374146176 -3.2983959861760805 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375105_0.png resize: (167, 115) 1374146177 -1.1147492444357108 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375109_0.png resize: (383, 220) 1374146179 -4.278136506043385 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375111_0.png resize: (405, 215) 1374146181 -4.647389626463703 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375096_0.png resize: (256, 265) 1374146182 -4.550393007387708 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375101_0.png resize: (315, 247) 1374146183 -3.2222100316463798 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375110_0.png resize: (168, 156) 1374146185 -3.634091000974814 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375089_0.png resize: (139, 217) 1374146187 -2.787181217842423 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375092_0.png resize: (201, 342) 1374146188 -3.87519719171588 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375108_0.png resize: (173, 157) 1374146190 -1.832493443075365 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375095_0.png resize: (238, 182) 1374146192 -4.264229722510371 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375093_0.png resize: (183, 194) 1374146193 -3.72639096936359 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375090_0.png resize: (645, 526) 1374146194 -4.737237202783312 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375106_0.png resize: (218, 145) 1374146196 -2.745077839629852 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375099_0.png resize: (365, 101) 1374146198 -2.1049952409258803 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375098_0.png resize: (344, 132) 1374146199 -4.16791500094619 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375107_0.png resize: (227, 220) 1374146201 -3.5737711940620542 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375100_0.png resize: (128, 217) 1374146203 -2.955299541995083 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375104_0.png resize: (549, 667) 1374146204 -3.3542855145552135 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375094_0.png resize: (505, 157) 1374146206 0.0034247835173102665 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375103_0.png resize: (489, 459) 1374146207 0.5526545618022001 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375097_0.png resize: (203, 368) 1374146210 -1.5004721130880883 treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375091_0.png resize: (265, 186) 1374146211 -3.507734813850834 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375116_0.png resize: (392, 359) 1374146213 -4.04790492569397 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375134_0.png resize: (143, 163) 1374146215 0.033759243100807315 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375128_0.png resize: (131, 129) 1374146216 -3.5332939722872605 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375130_0.png resize: (178, 127) 1374146218 -3.3723192919684077 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375117_0.png resize: (178, 187) 1374146220 -2.6087572578466744 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375129_0.png resize: (198, 104) 1374146221 -4.44370188087425 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375118_0.png resize: (388, 395) 1374146223 -2.6786078154500395 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375127_0.png resize: (195, 105) 1374146224 -4.772798304197228 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375126_0.png resize: (258, 181) 1374146226 -2.5716731401171367 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375135_0.png resize: (199, 155) 1374146228 1.6617335134562075 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375122_0.png resize: (156, 372) 1374146229 -2.971088369204697 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375115_0.png resize: (439, 521) 1374146230 -4.09069050742653 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375136_0.png resize: (440, 340) 1374146232 -4.026024614798275 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375124_0.png resize: (393, 371) 1374146233 -0.2933935450770851 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375132_0.png resize: (130, 152) 1374146234 -3.0941597935091725 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375120_0.png resize: (476, 481) 1374146235 -0.8804582684413464 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375113_0.png resize: (229, 419) 1374146237 -3.5125474939007253 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375121_0.png resize: (187, 235) 1374146238 -4.468531047387491 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375133_0.png resize: (223, 170) 1374146239 1.2541716079731573 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375131_0.png resize: (203, 183) 1374146241 -1.0402081144702584 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375125_0.png resize: (546, 851) 1374146242 -3.425531442337087 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375114_0.png resize: (231, 431) 1374146243 -3.4972131498038674 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375119_0.png resize: (280, 193) 1374146245 -1.9420251464901468 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375137_0.png resize: (200, 171) 1374146246 -3.0539677602340345 treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375123_0.png resize: (638, 254) 1374146247 -4.27482705177825 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375140_0.png resize: (233, 321) 1374146249 -1.5948631907978095 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375149_0.png resize: (203, 429) 1374146250 -0.5102774585719249 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375150_0.png resize: (509, 411) 1374146251 -4.647256448899551 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375156_0.png resize: (287, 173) 1374146253 -3.632984149369367 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375157_0.png resize: (203, 158) 1374146254 -4.545173032499084 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375155_0.png resize: (107, 299) 1374146255 -4.429002281783694 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375144_0.png resize: (258, 153) 1374146257 -2.8905469452303385 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375138_0.png resize: (270, 173) 1374146258 1.0016317326437456 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375145_0.png resize: (413, 382) 1374146259 -3.579501022428312 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375154_0.png resize: (208, 158) 1374146260 -4.243506959073342 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375141_0.png resize: (346, 222) 1374146262 -4.352262420482886 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375142_0.png resize: (257, 301) 1374146263 -4.338047242962173 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375146_0.png resize: (382, 589) 1374146264 -3.571475373687753 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375139_0.png resize: (273, 189) 1374146266 -1.3832088313776523 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375147_0.png resize: (380, 338) 1374146267 -3.6233544405928146 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375159_0.png resize: (326, 184) 1374146268 -4.671587093591819 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375143_0.png resize: (286, 357) 1374146271 -5.1841628360671566 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375152_0.png resize: (259, 371) 1374146272 -4.513959313731525 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375158_0.png resize: (167, 192) 1374146273 -3.7806843092058284 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375148_0.png resize: (224, 118) 1374146275 -2.409446479883743 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375167_0.png resize: (186, 82) 1374146276 -3.7596176347405215 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375163_0.png resize: (114, 122) 1374146277 -0.37766784540037074 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375165_0.png resize: (353, 290) 1374146279 -2.3387796709485116 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375169_0.png resize: (387, 766) 1374146280 -4.929166677712792 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375164_0.png resize: (218, 259) 1374146281 -2.256102366797052 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375166_0.png resize: (234, 281) 1374146283 1.8891066906242584 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375171_0.png resize: (213, 405) 1374146284 -1.2189487278588014 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375160_0.png resize: (337, 146) 1374146285 -4.754111268790048 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375168_0.png resize: (378, 327) 1374146286 -4.576090124162602 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375162_0.png resize: (335, 322) 1374146288 -0.37824801525905094 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375172_0.png resize: (187, 276) 1374146289 -4.4150857611590695 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375161_0.png resize: (438, 264) 1374146290 -4.0454403701896 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375185_0.png resize: (340, 228) 1374146292 -4.055348429379995 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375175_0.png resize: (446, 345) 1374146293 -3.7474571092892366 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375184_0.png resize: (346, 259) 1374146294 -4.653640660113521 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375177_0.png resize: (323, 277) 1374146295 -4.2039403626215295 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375191_0.png resize: (179, 179) 1374146296 -2.759017402095561 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375190_0.png resize: (214, 133) 1374146297 -3.445867741029981 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375180_0.png resize: (340, 316) 1374146298 0.5688181444156304 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375182_0.png resize: (129, 123) 1374146299 2.3143626368017296 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375173_0.png resize: (155, 222) 1374146300 -4.228549449759307 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375192_0.png resize: (277, 131) 1374146301 -4.198902678096676 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375183_0.png resize: (328, 194) 1374146302 -3.9980639787325107 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375188_0.png resize: (145, 106) 1374146303 -3.498006264030003 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375181_0.png resize: (152, 178) 1374146304 -3.3647636008967003 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375176_0.png resize: (100, 181) 1374146305 -1.3005383083186246 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375187_0.png resize: (195, 179) 1374146306 -4.760272372348236 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375179_0.png resize: (432, 282) 1374146307 -5.4801425873626775 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375178_0.png resize: (176, 144) 1374146308 -4.577106794498263 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375199_0.png resize: (402, 351) 1374146309 -4.209270173724921 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375195_0.png resize: (84, 111) 1374146310 -5.295478167084617 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375202_0.png resize: (125, 211) 1374146311 -1.5375782159457523 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375198_0.png resize: (411, 228) 1374146312 -4.8152908494510855 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375196_0.png resize: (335, 439) 1374146313 -3.694456485750097 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375201_0.png resize: (272, 182) 1374146314 -4.417685716241815 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375207_0.png resize: (323, 217) 1374146315 -0.3942146989905911 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375205_0.png resize: (134, 181) 1374146316 -0.00987455754909027 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375204_0.png resize: (159, 333) 1374146317 -4.796741117832693 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375194_0.png resize: (199, 129) 1374146318 -3.997864437888749 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375214_0.png resize: (215, 398) 1374146319 -3.4896035862498196 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375221_0.png resize: (219, 564) 1374146320 -4.585356820471037 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375223_0.png resize: (99, 119) 1374146321 -0.0872315414650666 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375212_0.png resize: (421, 510) 1374146322 -3.878888623737938 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375222_0.png resize: (147, 186) 1374146323 -2.43051384092895 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375224_0.png resize: (280, 386) 1374146324 -5.048910751171575 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375209_0.png resize: (163, 228) 1374146325 7.431039165176287 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375213_0.png resize: (377, 494) 1374146326 -4.712139509458687 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375208_0.png resize: (251, 321) 1374146327 -3.186646070217428 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375220_0.png resize: (181, 277) 1374146328 -3.8786940271809685 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375215_0.png resize: (245, 242) 1374146329 -1.7542646806144946 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375217_0.png resize: (300, 616) 1374146330 -4.721630981206442 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375210_0.png resize: (151, 180) 1374146332 -2.7736224615681544 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375218_0.png resize: (335, 152) 1374146333 -3.8816924755145914 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375227_0.png resize: (130, 111) 1374146334 1.3569909482756322 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375226_0.png resize: (135, 104) 1374146335 -4.172410079748244 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375219_0.png resize: (476, 250) 1374146336 -4.977522225012161 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375211_0.png resize: (143, 171) 1374146337 -3.8123716828087413 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375225_0.png resize: (144, 133) 1374146339 -4.949557300271476 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375243_0.png resize: (391, 110) 1374146340 -4.186549708783571 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375241_0.png resize: (387, 333) 1374146341 -2.871045147347374 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375242_0.png resize: (391, 375) 1374146342 -3.8482434610843677 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375234_0.png resize: (148, 93) 1374146343 -2.540517947346079 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375238_0.png resize: (523, 683) 1374146344 -4.220374771709074 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375239_0.png resize: (127, 137) 1374146345 -0.6660518287403375 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375228_0.png resize: (351, 812) 1374146346 -1.7474205485613818 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375229_0.png resize: (311, 283) 1374146347 -3.9486142947471397 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375231_0.png resize: (496, 548) 1374146348 -5.34087903586812 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375245_0.png resize: (199, 269) 1374146349 0.6282948115929777 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375246_0.png resize: (265, 192) 1374146350 -4.356518109599786 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375249_0.png resize: (351, 231) 1374146351 -5.245241716635884 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375236_0.png resize: (702, 181) 1374146352 -4.721355530248699 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375244_0.png resize: (163, 196) 1374146353 -5.81776703664212 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375235_0.png resize: (275, 175) 1374146354 -4.116775174639809 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375248_0.png resize: (333, 220) 1374146355 -1.8015980988049185 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375232_0.png resize: (190, 134) 1374146356 -5.819133942048022 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375250_0.png resize: (389, 244) 1374146357 -3.901931334585706 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375233_0.png resize: (92, 115) 1374146358 -3.028246571136917 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375237_0.png resize: (192, 137) 1374146359 -1.1553136142259812 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375247_0.png resize: (379, 404) 1374146360 -2.2451677744290297 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375240_0.png resize: (428, 363) 1374146361 -4.414949828760735 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375011_0.png resize: (205, 164) 1374146393 -4.439553505159569 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375025_0.png resize: (395, 215) 1374146394 -3.8978288434389525 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375020_0.png resize: (302, 289) 1374146395 -4.279519336212569 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375030_0.png resize: (204, 241) 1374146396 -5.26400920898274 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375047_0.png resize: (241, 313) 1374146397 -4.094792292426898 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375031_0.png resize: (233, 273) 1374146398 -3.1282482949831163 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375034_0.png resize: (174, 170) 1374146399 -2.5794671873366166 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375037_0.png resize: (302, 144) 1374146400 -2.60138245120824 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375044_0.png resize: (353, 107) 1374146401 -4.799611442107826 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375060_0.png resize: (196, 188) 1374146402 -4.599038674522186 treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375081_0.png resize: (304, 243) 1374146403 -5.0730014253433655 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375151_0.png resize: (161, 129) 1374146404 -4.691497841021616 treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375170_0.png resize: (163, 161) 1374146405 -3.4212291020630134 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375174_0.png resize: (320, 168) 1374146406 -3.6289212928615755 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375186_0.png resize: (202, 171) 1374146407 -4.278159976950756 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375206_0.png resize: (187, 138) 1374146409 -4.543657338958944 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375200_0.png resize: (147, 165) 1374146410 -2.8871778047848347 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375197_0.png resize: (307, 163) 1374146411 -3.5390462511614293 treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375203_0.png resize: (200, 169) 1374146412 -3.9985292257034573 treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375216_0.png resize: (115, 176) 1374146414 -3.564396955437865 treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375230_0.png resize: (375, 285) 1374146415 -4.8154696357115325 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375006_0.png resize: (85, 125) 1374146418 -4.97401477781468 treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375000_0.png resize: (622, 261) 1374146427 -5.739086697011787 treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375027_0.png resize: (168, 205) 1374146428 -2.103976013542193 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375033_0.png resize: (386, 423) 1374146429 -5.1113638513216415 treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375036_0.png resize: (198, 140) 1374146430 -4.845361011541221 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375067_0.png resize: (118, 167) 1374146431 -3.768879399189456 treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375065_0.png resize: (455, 402) 1374146432 -4.898222872380504 treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375153_0.png resize: (393, 330) 1374146433 -4.6799269968983355 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375193_0.png resize: (210, 253) 1374146434 -5.514482260274135 treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375189_0.png resize: (216, 458) 1374146435 -4.802030753770483 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 265 time used for this insertion : 0.0318446159362793 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 265 time used for this insertion : 0.05230379104614258 save missing photos in datou_result : time spend for datou_step_exec : 60.3142454624176 time spend to save output : 0.09099030494689941 total time spend for step 6 : 60.4052357673645 step7:brightness Mon Jul 28 13:00:14 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193.jpg treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a.jpg treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422.jpg treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691.jpg treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626.jpg treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9.jpg treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1.jpg treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8.jpg treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f.jpg treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa.jpg treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54.jpg treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59.jpg treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf.jpg treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375015_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375002_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375004_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375008_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375001_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375007_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896374999_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375005_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375009_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375003_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375013_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375010_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375012_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375014_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375018_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375026_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375017_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375021_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375023_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375029_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375024_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375028_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375016_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375019_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375022_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375041_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375049_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375042_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375050_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375040_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375035_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375032_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375052_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375045_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375039_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375051_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375043_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375046_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375038_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375048_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375058_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375053_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375063_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375059_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375072_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375066_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375054_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375062_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375068_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375056_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375064_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375055_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375071_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375069_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375061_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375070_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375057_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375074_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375080_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375083_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375077_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375087_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375076_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375085_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375084_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375078_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375075_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375073_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375082_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375088_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375079_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375086_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375112_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375102_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375105_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375109_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375111_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375096_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375101_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375110_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375089_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375092_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375108_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375095_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375093_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375090_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375106_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375099_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375098_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375107_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375100_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375104_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375094_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375103_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375097_0.png treat image : temp/1753699827_3479611_1374030599_35a3bfaa0c548f05efda6f64c81501f9_rle_crop_3896375091_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375116_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375134_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375128_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375130_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375117_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375129_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375118_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375127_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375126_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375135_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375122_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375115_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375136_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375124_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375132_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375120_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375113_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375121_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375133_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375131_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375125_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375114_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375119_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375137_0.png treat image : temp/1753699827_3479611_1374028732_e81ff88301f52deaa26856477d5fe1d1_rle_crop_3896375123_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375140_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375149_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375150_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375156_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375157_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375155_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375144_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375138_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375145_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375154_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375141_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375142_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375146_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375139_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375147_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375159_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375143_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375152_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375158_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375148_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375167_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375163_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375165_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375169_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375164_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375166_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375171_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375160_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375168_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375162_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375172_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375161_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375185_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375175_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375184_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375177_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375191_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375190_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375180_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375182_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375173_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375192_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375183_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375188_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375181_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375176_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375187_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375179_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375178_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375199_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375195_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375202_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375198_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375196_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375201_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375207_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375205_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375204_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375194_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375214_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375221_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375223_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375212_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375222_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375224_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375209_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375213_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375208_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375220_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375215_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375217_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375210_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375218_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375227_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375226_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375219_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375211_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375225_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375243_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375241_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375242_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375234_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375238_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375239_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375228_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375229_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375231_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375245_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375246_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375249_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375236_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375244_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375235_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375248_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375232_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375250_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375233_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375237_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375247_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375240_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375011_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375025_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375020_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375030_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375047_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375031_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375034_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375037_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375044_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375060_0.png treat image : temp/1753699827_3479611_1374030601_15f7e47008617fc216ed065320859626_rle_crop_3896375081_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375151_0.png treat image : temp/1753699827_3479611_1374028726_4b1d270eb47c9ee12ff8d171c11c8f1f_rle_crop_3896375170_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375174_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375186_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375206_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375200_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375197_0.png treat image : temp/1753699827_3479611_1374028721_41f1d20f59b7fe7ec8b81ed0d08d0b54_rle_crop_3896375203_0.png treat image : temp/1753699827_3479611_1374028644_da9a70347a236fe1eec3c8e4ba5fcc59_rle_crop_3896375216_0.png treat image : temp/1753699827_3479611_1374028640_a74225b140ba77bb19b010de9302ddcf_rle_crop_3896375230_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375006_0.png treat image : temp/1753699827_3479611_1374030691_9faf440a2407f0dce81019229d659193_rle_crop_3896375000_0.png treat image : temp/1753699827_3479611_1374030612_d46545d061b103e7e784061ef4ef212a_rle_crop_3896375027_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375033_0.png treat image : temp/1753699827_3479611_1374030610_5c1f251ef95e7a2c3563269118a5d422_rle_crop_3896375036_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375067_0.png treat image : temp/1753699827_3479611_1374030607_d5aecf7ab871de1cad52620aead95691_rle_crop_3896375065_0.png treat image : temp/1753699827_3479611_1374028730_d6987119c536cc5831fc7e33442e25f8_rle_crop_3896375153_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375193_0.png treat image : temp/1753699827_3479611_1374028722_4b58586b61f135f6ba6b8cb6199661aa_rle_crop_3896375189_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 265 time used for this insertion : 0.022623062133789062 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 265 time used for this insertion : 0.04794716835021973 save missing photos in datou_result : time spend for datou_step_exec : 17.153838872909546 time spend to save output : 0.07624006271362305 total time spend for step 7 : 17.23007893562317 step8:velours_tree Mon Jul 28 13:00:32 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 1.8498656749725342 time spend to save output : 5.2928924560546875e-05 total time spend for step 8 : 1.8499186038970947 step9:send_mail_cod Mon Jul 28 13:00:33 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P25399044_28-07-2025_13_00_33.pdf 25402987 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette254029871753700433 25402988 imagette254029881753700434 25402989 imagette254029891753700434 25402990 imagette254029901753700434 25402991 imagette254029911753700434 25402992 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette254029921753700434 25402993 imagette254029931753700436 25402994 imagette254029941753700436 25402995 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette254029951753700436 25402996 change filename to text .imagette254029961753700437 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=25399044 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/25402986,25402987,25402988,25402989,25402990,25402991,25402992,25402993,25402994,25402995,25402996?tags=environnement,pet_clair,autre,mal_croppe,pehd,pet_fonce,carton,background,flou,papier,metal args[1374030691] : ((1374030691, -7.092473384363186, 492609224), (1374030691, 0.18859664843024512, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374030612] : ((1374030612, -7.46893604258404, 492609224), (1374030612, 0.7030367896871949, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374030610] : ((1374030610, -7.202947112265012, 492609224), (1374030610, 0.7952977027672974, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374030607] : ((1374030607, -7.226123258333117, 492609224), (1374030607, 0.30963945160659523, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374030601] : ((1374030601, -7.641509282358543, 492609224), (1374030601, 1.3981293300259123, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374030599] : ((1374030599, -7.335561120757522, 492609224), (1374030599, 1.177356079219933, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374028732] : ((1374028732, -7.30940663895572, 492609224), (1374028732, 1.1835591603626723, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374028730] : ((1374028730, -7.499247030120027, 492609224), (1374028730, 1.346233861539527, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374028726] : ((1374028726, -7.193444110899314, 492609224), (1374028726, 1.230466969243491, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374028722] : ((1374028722, -7.7310301333476295, 492609224), (1374028722, 1.0062360615997177, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374028721] : ((1374028721, -7.777705237572167, 492609224), (1374028721, 0.9543516803473757, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374028644] : ((1374028644, -6.998933295021255, 492609224), (1374028644, 0.4098698402553625, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com args[1374028640] : ((1374028640, -7.598691391431869, 492609224), (1374028640, 1.6410388600982486, 2107752395), '0.10871546325973407') We are sending mail with results at report@fotonower.com refus_total : 0.10871546325973407 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=25399044 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25399044_28-07-2025_13_00_33.pdf results_Auto_P25399044_28-07-2025_13_00_33.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25399044_28-07-2025_13_00_33.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','25399044','results_Auto_P25399044_28-07-2025_13_00_33.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25399044_28-07-2025_13_00_33.pdf','pdf','','0.75','0.10871546325973407') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/25399044

https://www.fotonower.com/image?json=false&list_photos_id=1374030691
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374030612
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374030610
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374030607
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374030601
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374030599
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374028732
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374028730
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374028726
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374028722
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374028721
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374028644
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374028640
Bravo, la photo est bien prise.

Dans ces conditions,le taux de refus est: 10.87%
Veuillez trouver les photos des contaminants.

exemples de contaminants: pet_clair: https://www.fotonower.com/view/25402987?limit=200
exemples de contaminants: carton: https://www.fotonower.com/view/25402992?limit=200
exemples de contaminants: papier: https://www.fotonower.com/view/25402995?limit=200
exemples de contaminants: metal: https://www.fotonower.com/view/25402996?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25399044_28-07-2025_13_00_33.pdf.

Lien vers velours :https://www.fotonower.com/velours/25402986,25402987,25402988,25402989,25402990,25402991,25402992,25402993,25402994,25402995,25402996?tags=environnement,pet_clair,autre,mal_croppe,pehd,pet_fonce,carton,background,flou,papier,metal.


L'équipe Fotonower 202 b'' Server: nginx Date: Mon, 28 Jul 2025 11:00:41 GMT Content-Length: 0 Connection: close X-Message-Id: KJuDvh_LRyaWPXvONSqOlA Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1374030691, 1374030612, 1374030610, 1374030607, 1374030601, 1374030599, 1374028732, 1374028730, 1374028726, 1374028722, 1374028721, 1374028644, 1374028640] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030691', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030612', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030610', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030607', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030601', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030599', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028732', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028730', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028726', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028722', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028721', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028644', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028640', None, None, None, None, None, '3370219') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 13 time used for this insertion : 0.013605833053588867 save_final save missing photos in datou_result : time spend for datou_step_exec : 7.897520065307617 time spend to save output : 0.01415705680847168 total time spend for step 9 : 7.911677122116089 step10:split_time_score Mon Jul 28 13:00:41 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('14', 3), ('15', 10)) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 19072025 25399044 Nombre de photos uploadées : 13 / 23040 (0%) 19072025 25399044 Nombre de photos taguées (types de déchets): 0 / 13 (0%) 19072025 25399044 Nombre de photos taguées (volume) : 0 / 13 (0%) elapsed_time : load_data_split_time_score 1.2636184692382812e-05 elapsed_time : order_list_meta_photo_and_scores 3.457069396972656e-05 ????????????? elapsed_time : fill_and_build_computed_from_old_data 0.0009443759918212891 Catched exception ! Connect or reconnect ! Catched exception ! Connect or reconnect ! elapsed_time : insert_dashboard_record_day_entry 0.22196388244628906 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.03100127996184855 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398904_28-07-2025_12_32_34.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398904 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398904 AND mptpi.`type`=3726 To do Qualite : 0.01825568852062115 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398937_28-07-2025_12_13_45.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398937 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398937 AND mptpi.`type`=3726 To do Qualite : 0.02289449354025213 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398943_28-07-2025_12_02_50.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398943 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398943 AND mptpi.`type`=3726 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398994 order by id desc limit 1 Qualite : 0.03506406923321646 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398999_28-07-2025_11_56_56.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398999 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398999 AND mptpi.`type`=3726 To do Qualite : 0.024266514647614878 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25399030_28-07-2025_11_36_59.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25399030 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25399030 AND mptpi.`type`=3726 To do Qualite : 0.10871546325973407 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25399044_28-07-2025_13_00_33.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25399044 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25399044 AND mptpi.`type`=3594 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'19072025': {'nb_upload': 13, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1374030691, 1374030612, 1374030610, 1374030607, 1374030601, 1374030599, 1374028732, 1374028730, 1374028726, 1374028722, 1374028721, 1374028644, 1374028640] Looping around the photos to save general results len do output : 1 /25399044Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030691', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030612', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030610', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030607', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030601', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374030599', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028732', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028730', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028726', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028722', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028721', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028644', None, None, None, None, None, '3370219') ('3318', None, None, None, None, None, None, None, '3370219') ('3318', '25399044', '1374028640', None, None, None, None, None, '3370219') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 14 time used for this insertion : 0.013920783996582031 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.7130305767059326 time spend to save output : 0.014277219772338867 total time spend for step 10 : 1.7273077964782715 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 13 set_done_treatment 229.49user 291.38system 10:19.12elapsed 84%CPU (0avgtext+0avgdata 5181432maxresident)k 3627120inputs+182560outputs (71168major+26161475minor)pagefaults 0swaps