python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 1021063 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['3512843'] with mtr_portfolio_ids : ['25915180'] and first list_photo_ids : [] new path : /proc/1021063/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , BFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 32 ; length of list_pids : 32 ; length of list_args : 32 time to download the photos : 4.074708938598633 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Tue Aug 12 14:40:32 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 10733 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-08-12 14:40:35.221237: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-08-12 14:40:35.229289: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493035000 Hz 2025-08-12 14:40:35.230627: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f60f0000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-08-12 14:40:35.230672: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-08-12 14:40:35.233074: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-08-12 14:40:35.349207: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x260fa3b0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-08-12 14:40:35.349247: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-08-12 14:40:35.350516: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-08-12 14:40:35.350902: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-12 14:40:35.353779: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-12 14:40:35.356367: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-08-12 14:40:35.356845: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-08-12 14:40:35.360045: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-08-12 14:40:35.361061: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-08-12 14:40:35.364850: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-12 14:40:35.366295: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-08-12 14:40:35.366356: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-12 14:40:35.367119: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-08-12 14:40:35.367133: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-08-12 14:40:35.367142: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-08-12 14:40:35.368666: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9803 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-08-12 14:40:35.628450: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-08-12 14:40:35.628530: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-12 14:40:35.628555: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-12 14:40:35.628576: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-08-12 14:40:35.628595: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-08-12 14:40:35.628614: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-08-12 14:40:35.628633: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-08-12 14:40:35.628653: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-12 14:40:35.630247: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-08-12 14:40:35.631589: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-08-12 14:40:35.631626: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-12 14:40:35.631645: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-12 14:40:35.631661: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-08-12 14:40:35.631678: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-08-12 14:40:35.631694: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-08-12 14:40:35.631710: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-08-12 14:40:35.631727: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-12 14:40:35.633240: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-08-12 14:40:35.633268: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-08-12 14:40:35.633276: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-08-12 14:40:35.633283: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-08-12 14:40:35.634546: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9803 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-08-12 14:40:45.559443: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-12 14:40:45.834515: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-12 14:40:47.346280: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.346904: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.60G (3865470464 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.347484: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.24G (3478923264 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.348038: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.92G (3131030784 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.348597: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.62G (2817927680 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.349135: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.36G (2536134912 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.349661: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.12G (2282521344 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.349695: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.350349: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.350366: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.359043: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.359067: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.359639: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.359658: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.367479: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.367517: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 466.56MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.368177: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.368194: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 466.56MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.406462: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.406499: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.407031: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.407046: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.413665: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.413696: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 243.25MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.414244: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.414261: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 243.25MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-08-12 14:40:47.453503: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.454071: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.456716: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.457286: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.511136: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.511742: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.514106: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.514647: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.563368: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.563994: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.569244: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.569783: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.592621: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.593168: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.597458: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.598127: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.615758: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.616333: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.622080: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.622643: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.747384: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.747968: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.748528: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.749085: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.764904: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.765469: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.817314: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.817884: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.818442: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.818999: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.849842: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.850412: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.850970: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.851535: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.868134: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.868699: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.889600: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.890165: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.913614: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.914183: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.923577: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.924141: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.924700: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.925257: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.925979: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.926544: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.943312: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.943951: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.944526: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.945084: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.945642: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.946199: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.946755: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.947334: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.971225: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.971794: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.994205: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:47.994771: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.037553: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.037620: W tensorflow/core/kernels/gpu_utils.cc:49] Failed to allocate memory for convolution redzone checking; skipping this check. This is benign and only means that we won't check cudnn for out-of-bounds reads and writes. This message will only be printed once. 2025-08-12 14:40:48.038607: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.039600: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.049666: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.050233: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.050805: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.051372: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.074923: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.075511: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.126203: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.126823: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.127402: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.127964: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.135937: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.136500: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.137059: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.137616: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.138674: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.154286: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.155004: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.181216: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.181782: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.182351: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.182908: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.183481: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-08-12 14:40:48.184041: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 32 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 33.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 18 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 38.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 22 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 33.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 15 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 51.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 12 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 44.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 13 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 35.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 37.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 33.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 14 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 17.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 15 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 12.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 15 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 32.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 24.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 10 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 31.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 20 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 10 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 2.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 35.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 22.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 13 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 23.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 15 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 32.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 34.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 36.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 10 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 17 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 43.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 31.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 10 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 37.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 Detection mask done ! Trying to reset tf kernel 1021727 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 1986 tf kernel not reseted sub process len(results) : 32 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 32 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 3179 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.0020394325256347656 nb_pixel_total : 100906 time to create 1 rle with old method : 0.11324262619018555 length of segment : 538 time for calcul the mask position with numpy : 0.0005147457122802734 nb_pixel_total : 19917 time to create 1 rle with old method : 0.03034687042236328 length of segment : 180 time for calcul the mask position with numpy : 0.00017786026000976562 nb_pixel_total : 6229 time to create 1 rle with old method : 0.006855010986328125 length of segment : 130 time for calcul the mask position with numpy : 0.00018358230590820312 nb_pixel_total : 7771 time to create 1 rle with old method : 0.009011030197143555 length of segment : 132 time for calcul the mask position with numpy : 0.0003209114074707031 nb_pixel_total : 13795 time to create 1 rle with old method : 0.01685333251953125 length of segment : 145 time for calcul the mask position with numpy : 0.0005505084991455078 nb_pixel_total : 31558 time to create 1 rle with old method : 0.0368194580078125 length of segment : 198 time for calcul the mask position with numpy : 0.0002777576446533203 nb_pixel_total : 16333 time to create 1 rle with old method : 0.018244504928588867 length of segment : 230 time for calcul the mask position with numpy : 0.001775979995727539 nb_pixel_total : 96921 time to create 1 rle with old method : 0.10815072059631348 length of segment : 509 time for calcul the mask position with numpy : 0.000156402587890625 nb_pixel_total : 6817 time to create 1 rle with old method : 0.007975578308105469 length of segment : 86 time for calcul the mask position with numpy : 0.002474069595336914 nb_pixel_total : 110470 time to create 1 rle with old method : 0.1283092498779297 length of segment : 382 time for calcul the mask position with numpy : 0.0004978179931640625 nb_pixel_total : 25802 time to create 1 rle with old method : 0.02865910530090332 length of segment : 157 time for calcul the mask position with numpy : 0.0003814697265625 nb_pixel_total : 23887 time to create 1 rle with old method : 0.02697134017944336 length of segment : 137 time for calcul the mask position with numpy : 0.00018978118896484375 nb_pixel_total : 9897 time to create 1 rle with old method : 0.011377334594726562 length of segment : 100 time for calcul the mask position with numpy : 0.00022840499877929688 nb_pixel_total : 14103 time to create 1 rle with old method : 0.01618361473083496 length of segment : 125 time for calcul the mask position with numpy : 0.0003383159637451172 nb_pixel_total : 22650 time to create 1 rle with old method : 0.027013540267944336 length of segment : 156 time for calcul the mask position with numpy : 0.010792016983032227 nb_pixel_total : 566133 time to create 1 rle with new method : 0.024956703186035156 length of segment : 1158 time for calcul the mask position with numpy : 0.0022783279418945312 nb_pixel_total : 91898 time to create 1 rle with old method : 0.10558676719665527 length of segment : 468 time for calcul the mask position with numpy : 0.00012922286987304688 nb_pixel_total : 4288 time to create 1 rle with old method : 0.005907535552978516 length of segment : 61 time for calcul the mask position with numpy : 0.00024366378784179688 nb_pixel_total : 14379 time to create 1 rle with old method : 0.018390655517578125 length of segment : 105 time for calcul the mask position with numpy : 0.00045013427734375 nb_pixel_total : 27406 time to create 1 rle with old method : 0.031859397888183594 length of segment : 161 time for calcul the mask position with numpy : 0.0001957416534423828 nb_pixel_total : 7103 time to create 1 rle with old method : 0.008502006530761719 length of segment : 78 time for calcul the mask position with numpy : 0.0003247261047363281 nb_pixel_total : 17941 time to create 1 rle with old method : 0.0209197998046875 length of segment : 139 time for calcul the mask position with numpy : 0.0008723735809326172 nb_pixel_total : 25511 time to create 1 rle with old method : 0.02938699722290039 length of segment : 189 time for calcul the mask position with numpy : 0.00048351287841796875 nb_pixel_total : 16608 time to create 1 rle with old method : 0.019749879837036133 length of segment : 170 time for calcul the mask position with numpy : 0.00021147727966308594 nb_pixel_total : 10027 time to create 1 rle with old method : 0.011607170104980469 length of segment : 131 time for calcul the mask position with numpy : 0.0003333091735839844 nb_pixel_total : 23652 time to create 1 rle with old method : 0.02707529067993164 length of segment : 149 time for calcul the mask position with numpy : 0.00036525726318359375 nb_pixel_total : 19899 time to create 1 rle with old method : 0.025354862213134766 length of segment : 229 time for calcul the mask position with numpy : 0.0002105236053466797 nb_pixel_total : 4403 time to create 1 rle with old method : 0.006125211715698242 length of segment : 43 time for calcul the mask position with numpy : 0.0005819797515869141 nb_pixel_total : 18667 time to create 1 rle with old method : 0.021413564682006836 length of segment : 172 time for calcul the mask position with numpy : 0.0005624294281005859 nb_pixel_total : 14116 time to create 1 rle with old method : 0.015959501266479492 length of segment : 189 time for calcul the mask position with numpy : 0.0003306865692138672 nb_pixel_total : 9351 time to create 1 rle with old method : 0.01102137565612793 length of segment : 106 time for calcul the mask position with numpy : 0.0007576942443847656 nb_pixel_total : 19979 time to create 1 rle with old method : 0.025040388107299805 length of segment : 270 time for calcul the mask position with numpy : 0.0006325244903564453 nb_pixel_total : 31413 time to create 1 rle with old method : 0.0380864143371582 length of segment : 188 time for calcul the mask position with numpy : 0.00030231475830078125 nb_pixel_total : 17261 time to create 1 rle with old method : 0.025313615798950195 length of segment : 133 time for calcul the mask position with numpy : 0.0004227161407470703 nb_pixel_total : 15847 time to create 1 rle with old method : 0.018737316131591797 length of segment : 136 time for calcul the mask position with numpy : 0.00028014183044433594 nb_pixel_total : 6227 time to create 1 rle with old method : 0.007537364959716797 length of segment : 101 time for calcul the mask position with numpy : 0.0002391338348388672 nb_pixel_total : 3661 time to create 1 rle with old method : 0.006042957305908203 length of segment : 115 time for calcul the mask position with numpy : 0.0012996196746826172 nb_pixel_total : 43516 time to create 1 rle with old method : 0.0656578540802002 length of segment : 360 time for calcul the mask position with numpy : 0.0003464221954345703 nb_pixel_total : 14315 time to create 1 rle with old method : 0.016948223114013672 length of segment : 138 time for calcul the mask position with numpy : 0.00027108192443847656 nb_pixel_total : 13802 time to create 1 rle with old method : 0.01628899574279785 length of segment : 120 time for calcul the mask position with numpy : 0.00026488304138183594 nb_pixel_total : 7217 time to create 1 rle with old method : 0.008771181106567383 length of segment : 102 time for calcul the mask position with numpy : 0.00022268295288085938 nb_pixel_total : 10089 time to create 1 rle with old method : 0.011738300323486328 length of segment : 154 time for calcul the mask position with numpy : 0.00026869773864746094 nb_pixel_total : 6844 time to create 1 rle with old method : 0.00812387466430664 length of segment : 113 time for calcul the mask position with numpy : 0.0002465248107910156 nb_pixel_total : 7228 time to create 1 rle with old method : 0.008776187896728516 length of segment : 123 time for calcul the mask position with numpy : 0.0003902912139892578 nb_pixel_total : 8113 time to create 1 rle with old method : 0.009659767150878906 length of segment : 179 time for calcul the mask position with numpy : 0.0009768009185791016 nb_pixel_total : 28644 time to create 1 rle with old method : 0.03297710418701172 length of segment : 279 time for calcul the mask position with numpy : 0.00029397010803222656 nb_pixel_total : 6766 time to create 1 rle with old method : 0.008125543594360352 length of segment : 110 time for calcul the mask position with numpy : 0.0004227161407470703 nb_pixel_total : 10217 time to create 1 rle with old method : 0.012346982955932617 length of segment : 125 time for calcul the mask position with numpy : 0.0009133815765380859 nb_pixel_total : 30522 time to create 1 rle with old method : 0.035974979400634766 length of segment : 230 time for calcul the mask position with numpy : 0.0003867149353027344 nb_pixel_total : 5621 time to create 1 rle with old method : 0.006999015808105469 length of segment : 140 time for calcul the mask position with numpy : 0.00016117095947265625 nb_pixel_total : 2302 time to create 1 rle with old method : 0.0028963088989257812 length of segment : 57 time for calcul the mask position with numpy : 0.0005519390106201172 nb_pixel_total : 16547 time to create 1 rle with old method : 0.019740819931030273 length of segment : 162 time for calcul the mask position with numpy : 0.0004603862762451172 nb_pixel_total : 10353 time to create 1 rle with old method : 0.012314319610595703 length of segment : 134 time for calcul the mask position with numpy : 0.0036339759826660156 nb_pixel_total : 107837 time to create 1 rle with old method : 0.12316441535949707 length of segment : 553 time for calcul the mask position with numpy : 0.0006320476531982422 nb_pixel_total : 17165 time to create 1 rle with old method : 0.020440101623535156 length of segment : 152 time for calcul the mask position with numpy : 0.0006420612335205078 nb_pixel_total : 19823 time to create 1 rle with old method : 0.02324056625366211 length of segment : 197 time for calcul the mask position with numpy : 0.00030422210693359375 nb_pixel_total : 13270 time to create 1 rle with old method : 0.015877962112426758 length of segment : 116 time for calcul the mask position with numpy : 0.0002894401550292969 nb_pixel_total : 9756 time to create 1 rle with old method : 0.011609554290771484 length of segment : 130 time for calcul the mask position with numpy : 0.00041794776916503906 nb_pixel_total : 11901 time to create 1 rle with old method : 0.014242172241210938 length of segment : 151 time for calcul the mask position with numpy : 0.0002837181091308594 nb_pixel_total : 7609 time to create 1 rle with old method : 0.009165525436401367 length of segment : 131 time for calcul the mask position with numpy : 0.0002646446228027344 nb_pixel_total : 5465 time to create 1 rle with old method : 0.0065081119537353516 length of segment : 95 time for calcul the mask position with numpy : 0.00020885467529296875 nb_pixel_total : 4148 time to create 1 rle with old method : 0.005116701126098633 length of segment : 75 time for calcul the mask position with numpy : 0.0006194114685058594 nb_pixel_total : 23179 time to create 1 rle with old method : 0.027322769165039062 length of segment : 141 time for calcul the mask position with numpy : 0.0003349781036376953 nb_pixel_total : 8712 time to create 1 rle with old method : 0.010429143905639648 length of segment : 130 time for calcul the mask position with numpy : 0.0002818107604980469 nb_pixel_total : 7955 time to create 1 rle with old method : 0.010163307189941406 length of segment : 75 time for calcul the mask position with numpy : 0.0001342296600341797 nb_pixel_total : 5043 time to create 1 rle with old method : 0.006190776824951172 length of segment : 62 time for calcul the mask position with numpy : 0.0004134178161621094 nb_pixel_total : 8823 time to create 1 rle with old method : 0.010772705078125 length of segment : 149 time for calcul the mask position with numpy : 0.002248048782348633 nb_pixel_total : 100043 time to create 1 rle with old method : 0.11774420738220215 length of segment : 504 time for calcul the mask position with numpy : 0.0004284381866455078 nb_pixel_total : 9553 time to create 1 rle with old method : 0.011487960815429688 length of segment : 115 time for calcul the mask position with numpy : 0.000331878662109375 nb_pixel_total : 7848 time to create 1 rle with old method : 0.00976872444152832 length of segment : 85 time for calcul the mask position with numpy : 0.00019407272338867188 nb_pixel_total : 3406 time to create 1 rle with old method : 0.0043904781341552734 length of segment : 54 time for calcul the mask position with numpy : 0.0002682209014892578 nb_pixel_total : 5010 time to create 1 rle with old method : 0.00616145133972168 length of segment : 103 time for calcul the mask position with numpy : 0.00035858154296875 nb_pixel_total : 12250 time to create 1 rle with old method : 0.015117406845092773 length of segment : 127 time for calcul the mask position with numpy : 0.0002601146697998047 nb_pixel_total : 6843 time to create 1 rle with old method : 0.008628129959106445 length of segment : 96 time for calcul the mask position with numpy : 0.00038623809814453125 nb_pixel_total : 15989 time to create 1 rle with old method : 0.019640207290649414 length of segment : 113 time for calcul the mask position with numpy : 0.0003764629364013672 nb_pixel_total : 8625 time to create 1 rle with old method : 0.010717153549194336 length of segment : 118 time for calcul the mask position with numpy : 0.0003666877746582031 nb_pixel_total : 9342 time to create 1 rle with old method : 0.011580228805541992 length of segment : 97 time for calcul the mask position with numpy : 0.002324342727661133 nb_pixel_total : 97494 time to create 1 rle with old method : 0.11333513259887695 length of segment : 545 time for calcul the mask position with numpy : 0.0003521442413330078 nb_pixel_total : 7271 time to create 1 rle with old method : 0.008688926696777344 length of segment : 116 time for calcul the mask position with numpy : 0.0007760524749755859 nb_pixel_total : 25133 time to create 1 rle with old method : 0.02933812141418457 length of segment : 227 time for calcul the mask position with numpy : 0.00012183189392089844 nb_pixel_total : 3454 time to create 1 rle with old method : 0.004276752471923828 length of segment : 72 time for calcul the mask position with numpy : 0.00016641616821289062 nb_pixel_total : 5151 time to create 1 rle with old method : 0.006374359130859375 length of segment : 49 time for calcul the mask position with numpy : 0.0003998279571533203 nb_pixel_total : 10618 time to create 1 rle with old method : 0.012522459030151367 length of segment : 134 time for calcul the mask position with numpy : 0.0002090930938720703 nb_pixel_total : 3863 time to create 1 rle with old method : 0.004797220230102539 length of segment : 66 time for calcul the mask position with numpy : 0.00028777122497558594 nb_pixel_total : 7187 time to create 1 rle with old method : 0.008713483810424805 length of segment : 112 time for calcul the mask position with numpy : 0.0004067420959472656 nb_pixel_total : 10044 time to create 1 rle with old method : 0.011946678161621094 length of segment : 178 time for calcul the mask position with numpy : 0.0003933906555175781 nb_pixel_total : 10247 time to create 1 rle with old method : 0.01219487190246582 length of segment : 154 time for calcul the mask position with numpy : 0.00033855438232421875 nb_pixel_total : 5394 time to create 1 rle with old method : 0.0066356658935546875 length of segment : 90 time for calcul the mask position with numpy : 0.0004184246063232422 nb_pixel_total : 7682 time to create 1 rle with old method : 0.009272098541259766 length of segment : 163 time for calcul the mask position with numpy : 0.0005133152008056641 nb_pixel_total : 12722 time to create 1 rle with old method : 0.015183448791503906 length of segment : 137 time for calcul the mask position with numpy : 0.00017023086547851562 nb_pixel_total : 2657 time to create 1 rle with old method : 0.0032858848571777344 length of segment : 84 time for calcul the mask position with numpy : 0.00024247169494628906 nb_pixel_total : 3760 time to create 1 rle with old method : 0.0045795440673828125 length of segment : 82 time for calcul the mask position with numpy : 0.0004169940948486328 nb_pixel_total : 11206 time to create 1 rle with old method : 0.013419389724731445 length of segment : 117 time for calcul the mask position with numpy : 0.0006656646728515625 nb_pixel_total : 21077 time to create 1 rle with old method : 0.02451300621032715 length of segment : 260 time for calcul the mask position with numpy : 0.0027098655700683594 nb_pixel_total : 112000 time to create 1 rle with old method : 0.12640380859375 length of segment : 546 time for calcul the mask position with numpy : 0.0003447532653808594 nb_pixel_total : 7028 time to create 1 rle with old method : 0.008280515670776367 length of segment : 159 time for calcul the mask position with numpy : 0.0005724430084228516 nb_pixel_total : 18839 time to create 1 rle with old method : 0.02216815948486328 length of segment : 188 time for calcul the mask position with numpy : 0.0002639293670654297 nb_pixel_total : 5017 time to create 1 rle with old method : 0.006157875061035156 length of segment : 78 time for calcul the mask position with numpy : 0.00045561790466308594 nb_pixel_total : 11295 time to create 1 rle with old method : 0.013361930847167969 length of segment : 183 time for calcul the mask position with numpy : 0.0001876354217529297 nb_pixel_total : 2975 time to create 1 rle with old method : 0.0036139488220214844 length of segment : 88 time for calcul the mask position with numpy : 0.0002338886260986328 nb_pixel_total : 5312 time to create 1 rle with old method : 0.007214069366455078 length of segment : 61 time for calcul the mask position with numpy : 0.002168416976928711 nb_pixel_total : 66910 time to create 1 rle with old method : 0.07921481132507324 length of segment : 373 time for calcul the mask position with numpy : 0.0003421306610107422 nb_pixel_total : 5260 time to create 1 rle with old method : 0.0064239501953125 length of segment : 133 time for calcul the mask position with numpy : 0.0004398822784423828 nb_pixel_total : 13911 time to create 1 rle with old method : 0.016603946685791016 length of segment : 79 time for calcul the mask position with numpy : 0.0018699169158935547 nb_pixel_total : 79810 time to create 1 rle with old method : 0.09229874610900879 length of segment : 417 time for calcul the mask position with numpy : 0.0006337165832519531 nb_pixel_total : 20762 time to create 1 rle with old method : 0.023738622665405273 length of segment : 176 time for calcul the mask position with numpy : 0.00021147727966308594 nb_pixel_total : 6103 time to create 1 rle with old method : 0.007469654083251953 length of segment : 163 time for calcul the mask position with numpy : 0.0002598762512207031 nb_pixel_total : 6886 time to create 1 rle with old method : 0.008417606353759766 length of segment : 93 time for calcul the mask position with numpy : 0.0006270408630371094 nb_pixel_total : 22845 time to create 1 rle with old method : 0.02664947509765625 length of segment : 237 time for calcul the mask position with numpy : 0.00046062469482421875 nb_pixel_total : 18923 time to create 1 rle with old method : 0.021802902221679688 length of segment : 189 time for calcul the mask position with numpy : 0.0002384185791015625 nb_pixel_total : 4059 time to create 1 rle with old method : 0.0048487186431884766 length of segment : 110 time for calcul the mask position with numpy : 0.0004229545593261719 nb_pixel_total : 15658 time to create 1 rle with old method : 0.01879286766052246 length of segment : 133 time for calcul the mask position with numpy : 0.0018777847290039062 nb_pixel_total : 82078 time to create 1 rle with old method : 0.09214997291564941 length of segment : 448 time for calcul the mask position with numpy : 0.00014066696166992188 nb_pixel_total : 4722 time to create 1 rle with old method : 0.0056493282318115234 length of segment : 53 time for calcul the mask position with numpy : 0.00019121170043945312 nb_pixel_total : 3970 time to create 1 rle with old method : 0.0049207210540771484 length of segment : 83 time for calcul the mask position with numpy : 0.0011925697326660156 nb_pixel_total : 50148 time to create 1 rle with old method : 0.057680606842041016 length of segment : 321 time for calcul the mask position with numpy : 0.0005917549133300781 nb_pixel_total : 17809 time to create 1 rle with old method : 0.020528316497802734 length of segment : 145 time for calcul the mask position with numpy : 0.00039458274841308594 nb_pixel_total : 10630 time to create 1 rle with old method : 0.012689352035522461 length of segment : 174 time for calcul the mask position with numpy : 0.000270843505859375 nb_pixel_total : 5193 time to create 1 rle with old method : 0.006447792053222656 length of segment : 87 time for calcul the mask position with numpy : 0.0003292560577392578 nb_pixel_total : 8754 time to create 1 rle with old method : 0.010570764541625977 length of segment : 85 time for calcul the mask position with numpy : 0.0004286766052246094 nb_pixel_total : 13577 time to create 1 rle with old method : 0.015815258026123047 length of segment : 149 time for calcul the mask position with numpy : 0.017962217330932617 nb_pixel_total : 723424 time to create 1 rle with new method : 0.05888795852661133 length of segment : 1008 time spent for convertir_results : 6.026007652282715 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 122 chid ids of type : 3594 Number RLEs to save : 22336 save missing photos in datou_result : time spend for datou_step_exec : 43.44673204421997 time spend to save output : 1.2663049697875977 total time spend for step 1 : 44.71303701400757 step2:crop_condition Tue Aug 12 14:41:17 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 32 ! batch 1 Loaded 122 chid ids of type : 3594 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 39 About to insert : list_path_to_insert length 39 new photo from crops ! About to upload 39 photos upload in portfolio : 3736932 init cache_photo without model_param we have 39 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755002480_1021063 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 39 photos in the portfolio 3736932 time of upload the photos Elapsed time : 9.05862832069397 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 13 About to insert : list_path_to_insert length 13 new photo from crops ! About to upload 13 photos upload in portfolio : 3736932 init cache_photo without model_param we have 13 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755002491_1021063 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 13 photos in the portfolio 3736932 time of upload the photos Elapsed time : 2.8962998390197754 we have finished the crop for the class : carton begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 8 About to insert : list_path_to_insert length 8 new photo from crops ! About to upload 8 photos upload in portfolio : 3736932 init cache_photo without model_param we have 8 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755002495_1021063 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 8 photos in the portfolio 3736932 time of upload the photos Elapsed time : 2.6831893920898438 we have finished the crop for the class : metal begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 58 About to insert : list_path_to_insert length 58 new photo from crops ! About to upload 58 photos upload in portfolio : 3736932 init cache_photo without model_param we have 58 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755002510_1021063 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 58 photos in the portfolio 3736932 time of upload the photos Elapsed time : 14.302580833435059 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 3 About to insert : list_path_to_insert length 3 new photo from crops ! About to upload 3 photos upload in portfolio : 3736932 init cache_photo without model_param we have 3 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755002525_1021063 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 3 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.8289330005645752 we have finished the crop for the class : autre begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 3736932 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755002527_1021063 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 1 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.7931094169616699 we have finished the crop for the class : pet_fonce delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1376616345, 1376616343, 1376616340, 1376616339, 1376616338, 1376616336, 1376616255, 1376616247, 1376616225, 1376616190, 1376616156, 1376616121, 1376615988, 1376615939, 1376615935, 1376615933, 1376615912, 1376615910, 1376615907, 1376615906, 1376615905, 1376615901, 1376615680, 1376615677, 1376615673, 1376615668, 1376615663, 1376615636, 1376615433, 1376615400, 1376615357, 1376615325] Looping around the photos to save general results len do output : 122 /1376618751Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618752Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618753Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618754Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618755Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618756Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618757Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618758Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618759Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618760Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618761Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618762Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618763Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618764Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618765Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618766Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618767Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618768Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618769Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618770Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618771Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618773Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618774Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618775Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618776Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618777Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618778Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618779Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618780Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618781Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618782Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618784Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618785Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618786Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618787Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618788Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618789Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618790Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618791Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618792Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618793Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618794Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618795Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618796Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618797Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618798Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618799Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618800Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618801Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618802Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618804Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618805Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618808Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618809Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618810Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618811Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618812Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618813Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618814Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618815Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618839Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618840Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618841Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618842Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618843Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618844Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618845Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618846Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618847Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618848Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618849Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618850Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618851Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618852Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618853Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618854Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618855Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618856Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618857Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618858Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618859Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618860Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618861Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618862Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618863Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618864Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618865Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618866Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618867Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618868Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618869Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618870Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618871Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618872Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618874Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618875Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618876Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618878Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618879Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618880Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618881Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618883Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618884Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618885Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618887Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618888Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618889Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618890Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618892Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618893Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618894Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618896Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618897Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618898Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618899Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618901Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618902Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618903Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618915Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618916Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618917Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1376618920Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616345', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616343', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616340', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616339', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616338', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616336', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616255', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616247', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616225', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616190', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616156', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616121', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615988', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615939', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615935', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615933', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615912', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615910', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615907', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615906', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615905', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615901', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615680', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615677', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615673', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615668', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615663', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615636', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615433', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615400', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615357', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615325', None, None, None, None, None, '3512843') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 398 time used for this insertion : 0.03151082992553711 save_final save missing photos in datou_result : time spend for datou_step_exec : 50.395575284957886 time spend to save output : 0.036508798599243164 total time spend for step 2 : 50.43208408355713 step3:rle_unique_nms_with_priority Tue Aug 12 14:42:07 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 122 chid ids of type : 3594 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.10414552688598633 time for calcul the mask position with numpy : 0.0333712100982666 nb_pixel_total : 1972694 time to create 1 rle with new method : 0.05652952194213867 time for calcul the mask position with numpy : 0.006473064422607422 nb_pixel_total : 100906 time to create 1 rle with old method : 0.11293387413024902 create new chi : 0.2096543312072754 time to delete rle : 0.015781402587890625 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2156 TO DO : save crop sub photo not yet done ! save time : 0.15737676620483398 nb_obj : 8 nb_hashtags : 3 time to prepare the origin masks : 0.730506420135498 time for calcul the mask position with numpy : 0.3895869255065918 nb_pixel_total : 1874259 time to create 1 rle with new method : 0.09504008293151855 time for calcul the mask position with numpy : 0.006196260452270508 nb_pixel_total : 6817 time to create 1 rle with old method : 0.00766444206237793 time for calcul the mask position with numpy : 0.006375551223754883 nb_pixel_total : 96921 time to create 1 rle with old method : 0.10497307777404785 time for calcul the mask position with numpy : 0.006373405456542969 nb_pixel_total : 16333 time to create 1 rle with old method : 0.017830610275268555 time for calcul the mask position with numpy : 0.005936384201049805 nb_pixel_total : 31558 time to create 1 rle with old method : 0.0349881649017334 time for calcul the mask position with numpy : 0.0064013004302978516 nb_pixel_total : 13795 time to create 1 rle with old method : 0.018625259399414062 time for calcul the mask position with numpy : 0.0060710906982421875 nb_pixel_total : 7771 time to create 1 rle with old method : 0.008420228958129883 time for calcul the mask position with numpy : 0.006094455718994141 nb_pixel_total : 6229 time to create 1 rle with old method : 0.007031440734863281 time for calcul the mask position with numpy : 0.0062046051025390625 nb_pixel_total : 19917 time to create 1 rle with old method : 0.022121429443359375 create new chi : 0.767158031463623 time to delete rle : 0.0006608963012695312 batch 1 Loaded 17 chid ids of type : 3594 ++++++++Number RLEs to save : 4300 TO DO : save crop sub photo not yet done ! save time : 0.2785320281982422 nb_obj : 12 nb_hashtags : 2 time to prepare the origin masks : 0.44442296028137207 time for calcul the mask position with numpy : 0.2794067859649658 nb_pixel_total : 1185143 time to create 1 rle with new method : 0.09896135330200195 time for calcul the mask position with numpy : 0.0061686038970947266 nb_pixel_total : 7103 time to create 1 rle with old method : 0.007760286331176758 time for calcul the mask position with numpy : 0.0063037872314453125 nb_pixel_total : 27406 time to create 1 rle with old method : 0.030288219451904297 time for calcul the mask position with numpy : 0.006043910980224609 nb_pixel_total : 14379 time to create 1 rle with old method : 0.015861034393310547 time for calcul the mask position with numpy : 0.0060672760009765625 nb_pixel_total : 127 time to create 1 rle with old method : 0.0001957416534423828 time for calcul the mask position with numpy : 0.0073125362396240234 nb_pixel_total : 91898 time to create 1 rle with old method : 0.10088586807250977 time for calcul the mask position with numpy : 0.009423255920410156 nb_pixel_total : 566133 time to create 1 rle with new method : 0.3454768657684326 time for calcul the mask position with numpy : 0.005922079086303711 nb_pixel_total : 4187 time to create 1 rle with old method : 0.004549980163574219 time for calcul the mask position with numpy : 0.005987405776977539 nb_pixel_total : 14103 time to create 1 rle with old method : 0.015242338180541992 time for calcul the mask position with numpy : 0.006250143051147461 nb_pixel_total : 2962 time to create 1 rle with old method : 0.003350496292114258 time for calcul the mask position with numpy : 0.006030082702636719 nb_pixel_total : 23887 time to create 1 rle with old method : 0.02564382553100586 time for calcul the mask position with numpy : 0.0059735774993896484 nb_pixel_total : 25802 time to create 1 rle with old method : 0.028330087661743164 time for calcul the mask position with numpy : 0.006619930267333984 nb_pixel_total : 110470 time to create 1 rle with old method : 0.11945748329162598 create new chi : 1.1639678478240967 time to delete rle : 0.0009806156158447266 batch 1 Loaded 25 chid ids of type : 3594 +++++++++++++Number RLEs to save : 6751 TO DO : save crop sub photo not yet done ! save time : 0.40788841247558594 nb_obj : 6 nb_hashtags : 2 time to prepare the origin masks : 0.07808899879455566 time for calcul the mask position with numpy : 0.018945693969726562 nb_pixel_total : 1959962 time to create 1 rle with new method : 0.028919696807861328 time for calcul the mask position with numpy : 0.006253957748413086 nb_pixel_total : 19899 time to create 1 rle with old method : 0.02204418182373047 time for calcul the mask position with numpy : 0.006268739700317383 nb_pixel_total : 23652 time to create 1 rle with old method : 0.025882959365844727 time for calcul the mask position with numpy : 0.005944728851318359 nb_pixel_total : 10027 time to create 1 rle with old method : 0.010932207107543945 time for calcul the mask position with numpy : 0.006071329116821289 nb_pixel_total : 16608 time to create 1 rle with old method : 0.01863574981689453 time for calcul the mask position with numpy : 0.0061817169189453125 nb_pixel_total : 25511 time to create 1 rle with old method : 0.028158187866210938 time for calcul the mask position with numpy : 0.007023334503173828 nb_pixel_total : 17941 time to create 1 rle with old method : 0.019510746002197266 create new chi : 0.21820569038391113 time to delete rle : 0.0004699230194091797 batch 1 Loaded 13 chid ids of type : 3594 ++++++Number RLEs to save : 3094 TO DO : save crop sub photo not yet done ! save time : 0.2128279209136963 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.05027461051940918 time for calcul the mask position with numpy : 0.022811412811279297 nb_pixel_total : 2050530 time to create 1 rle with new method : 0.029949665069580078 time for calcul the mask position with numpy : 0.0061376094818115234 nb_pixel_total : 18667 time to create 1 rle with old method : 0.021103382110595703 time for calcul the mask position with numpy : 0.006010770797729492 nb_pixel_total : 4403 time to create 1 rle with old method : 0.004978179931640625 create new chi : 0.09128093719482422 time to delete rle : 0.0002570152282714844 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1510 TO DO : save crop sub photo not yet done ! save time : 0.12663912773132324 nb_obj : 6 nb_hashtags : 3 time to prepare the origin masks : 0.07777857780456543 time for calcul the mask position with numpy : 0.019273042678833008 nb_pixel_total : 1968766 time to create 1 rle with new method : 0.367389440536499 time for calcul the mask position with numpy : 0.006183624267578125 nb_pixel_total : 15847 time to create 1 rle with old method : 0.0211184024810791 time for calcul the mask position with numpy : 0.006448507308959961 nb_pixel_total : 14128 time to create 1 rle with old method : 0.015434026718139648 time for calcul the mask position with numpy : 0.006216764450073242 nb_pixel_total : 31413 time to create 1 rle with old method : 0.03386878967285156 time for calcul the mask position with numpy : 0.010205745697021484 nb_pixel_total : 19979 time to create 1 rle with old method : 0.022381067276000977 time for calcul the mask position with numpy : 0.010174751281738281 nb_pixel_total : 9351 time to create 1 rle with old method : 0.010454416275024414 time for calcul the mask position with numpy : 0.007876873016357422 nb_pixel_total : 14116 time to create 1 rle with old method : 0.01510930061340332 create new chi : 0.5613729953765869 time to delete rle : 0.000492095947265625 batch 1 Loaded 13 chid ids of type : 3594 ++++++++++Number RLEs to save : 3088 TO DO : save crop sub photo not yet done ! save time : 0.21550941467285156 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.07129907608032227 time for calcul the mask position with numpy : 0.02839493751525879 nb_pixel_total : 2063712 time to create 1 rle with new method : 0.03327751159667969 time for calcul the mask position with numpy : 0.00634002685546875 nb_pixel_total : 3661 time to create 1 rle with old method : 0.004125356674194336 time for calcul the mask position with numpy : 0.006169557571411133 nb_pixel_total : 6227 time to create 1 rle with old method : 0.007123470306396484 create new chi : 0.08572173118591309 time to delete rle : 0.0002799034118652344 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1512 TO DO : save crop sub photo not yet done ! save time : 0.12332463264465332 nb_obj : 5 nb_hashtags : 1 time to prepare the origin masks : 0.06973385810852051 time for calcul the mask position with numpy : 0.01848006248474121 nb_pixel_total : 1992246 time to create 1 rle with new method : 0.07972478866577148 time for calcul the mask position with numpy : 0.006028890609741211 nb_pixel_total : 10089 time to create 1 rle with old method : 0.011299610137939453 time for calcul the mask position with numpy : 0.00615692138671875 nb_pixel_total : 7217 time to create 1 rle with old method : 0.008016347885131836 time for calcul the mask position with numpy : 0.0060443878173828125 nb_pixel_total : 6217 time to create 1 rle with old method : 0.0069844722747802734 time for calcul the mask position with numpy : 0.006052732467651367 nb_pixel_total : 14315 time to create 1 rle with old method : 0.019233226776123047 time for calcul the mask position with numpy : 0.006419658660888672 nb_pixel_total : 43516 time to create 1 rle with old method : 0.04666924476623535 create new chi : 0.23021411895751953 time to delete rle : 0.00026416778564453125 batch 1 Loaded 11 chid ids of type : 3594 +++++Number RLEs to save : 2735 TO DO : save crop sub photo not yet done ! save time : 0.1891014575958252 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.037682533264160156 time for calcul the mask position with numpy : 0.0189211368560791 nb_pixel_total : 2059528 time to create 1 rle with new method : 0.3392782211303711 time for calcul the mask position with numpy : 0.006029605865478516 nb_pixel_total : 7228 time to create 1 rle with old method : 0.008065462112426758 time for calcul the mask position with numpy : 0.006066560745239258 nb_pixel_total : 6844 time to create 1 rle with old method : 0.007671356201171875 create new chi : 0.3863179683685303 time to delete rle : 0.000286102294921875 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1552 TO DO : save crop sub photo not yet done ! save time : 0.12533211708068848 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.032494544982910156 time for calcul the mask position with numpy : 0.019403934478759766 nb_pixel_total : 2065487 time to create 1 rle with new method : 0.028051137924194336 time for calcul the mask position with numpy : 0.00628972053527832 nb_pixel_total : 8113 time to create 1 rle with old method : 0.009027481079101562 create new chi : 0.06302022933959961 time to delete rle : 0.0002396106719970703 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1438 TO DO : save crop sub photo not yet done ! save time : 0.11232233047485352 nb_obj : 6 nb_hashtags : 3 time to prepare the origin masks : 0.07300209999084473 time for calcul the mask position with numpy : 0.3388967514038086 nb_pixel_total : 1989528 time to create 1 rle with new method : 0.09006500244140625 time for calcul the mask position with numpy : 0.006026268005371094 nb_pixel_total : 2302 time to create 1 rle with old method : 0.0026090145111083984 time for calcul the mask position with numpy : 0.005864381790161133 nb_pixel_total : 5621 time to create 1 rle with old method : 0.006125926971435547 time for calcul the mask position with numpy : 0.005984067916870117 nb_pixel_total : 30522 time to create 1 rle with old method : 0.033203125 time for calcul the mask position with numpy : 0.006028175354003906 nb_pixel_total : 10217 time to create 1 rle with old method : 0.011516332626342773 time for calcul the mask position with numpy : 0.0062329769134521484 nb_pixel_total : 6766 time to create 1 rle with old method : 0.007528781890869141 time for calcul the mask position with numpy : 0.0065004825592041016 nb_pixel_total : 28644 time to create 1 rle with old method : 0.031890869140625 create new chi : 0.569333553314209 time to delete rle : 0.0004916191101074219 batch 1 Loaded 13 chid ids of type : 3594 +++++++Number RLEs to save : 2962 TO DO : save crop sub photo not yet done ! save time : 0.20315074920654297 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.06075906753540039 time for calcul the mask position with numpy : 0.020526647567749023 nb_pixel_total : 1938030 time to create 1 rle with new method : 0.41530275344848633 time for calcul the mask position with numpy : 0.005930185317993164 nb_pixel_total : 833 time to create 1 rle with old method : 0.0011005401611328125 time for calcul the mask position with numpy : 0.006454944610595703 nb_pixel_total : 107837 time to create 1 rle with old method : 0.12163543701171875 time for calcul the mask position with numpy : 0.006250619888305664 nb_pixel_total : 10353 time to create 1 rle with old method : 0.011828184127807617 time for calcul the mask position with numpy : 0.006350278854370117 nb_pixel_total : 16547 time to create 1 rle with old method : 0.0186154842376709 create new chi : 0.6216726303100586 time to delete rle : 0.0005018711090087891 batch 1 Loaded 9 chid ids of type : 3594 +++++Number RLEs to save : 2970 TO DO : save crop sub photo not yet done ! save time : 0.19800090789794922 nb_obj : 5 nb_hashtags : 4 time to prepare the origin masks : 0.06278753280639648 time for calcul the mask position with numpy : 0.3547487258911133 nb_pixel_total : 2011241 time to create 1 rle with new method : 0.10307788848876953 time for calcul the mask position with numpy : 0.006235837936401367 nb_pixel_total : 7609 time to create 1 rle with old method : 0.00844430923461914 time for calcul the mask position with numpy : 0.0062944889068603516 nb_pixel_total : 11901 time to create 1 rle with old method : 0.013230323791503906 time for calcul the mask position with numpy : 0.006040334701538086 nb_pixel_total : 9756 time to create 1 rle with old method : 0.01086568832397461 time for calcul the mask position with numpy : 0.0059871673583984375 nb_pixel_total : 13270 time to create 1 rle with old method : 0.014553070068359375 time for calcul the mask position with numpy : 0.0061473846435546875 nb_pixel_total : 19823 time to create 1 rle with old method : 0.02176690101623535 create new chi : 0.5685222148895264 time to delete rle : 0.00040459632873535156 batch 1 Loaded 11 chid ids of type : 3594 +++++Number RLEs to save : 2530 TO DO : save crop sub photo not yet done ! save time : 0.1910097599029541 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.047803401947021484 time for calcul the mask position with numpy : 0.01879429817199707 nb_pixel_total : 2040808 time to create 1 rle with new method : 0.20253968238830566 time for calcul the mask position with numpy : 0.0061032772064208984 nb_pixel_total : 23179 time to create 1 rle with old method : 0.025150060653686523 time for calcul the mask position with numpy : 0.006109476089477539 nb_pixel_total : 4148 time to create 1 rle with old method : 0.005894899368286133 time for calcul the mask position with numpy : 0.0077474117279052734 nb_pixel_total : 5465 time to create 1 rle with old method : 0.010040998458862305 create new chi : 0.2898080348968506 time to delete rle : 0.0004589557647705078 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 1702 TO DO : save crop sub photo not yet done ! save time : 0.1312580108642578 nb_obj : 5 nb_hashtags : 4 time to prepare the origin masks : 0.1387636661529541 time for calcul the mask position with numpy : 0.14597010612487793 nb_pixel_total : 1943024 time to create 1 rle with new method : 0.3545341491699219 time for calcul the mask position with numpy : 0.007504463195800781 nb_pixel_total : 100043 time to create 1 rle with old method : 0.11542749404907227 time for calcul the mask position with numpy : 0.0060176849365234375 nb_pixel_total : 8823 time to create 1 rle with old method : 0.009806632995605469 time for calcul the mask position with numpy : 0.005783796310424805 nb_pixel_total : 5043 time to create 1 rle with old method : 0.005837202072143555 time for calcul the mask position with numpy : 0.00579071044921875 nb_pixel_total : 7955 time to create 1 rle with old method : 0.00896596908569336 time for calcul the mask position with numpy : 0.006011009216308594 nb_pixel_total : 8712 time to create 1 rle with old method : 0.010056257247924805 create new chi : 0.6914467811584473 time to delete rle : 0.0004875659942626953 batch 1 Loaded 11 chid ids of type : 3594 +++++Number RLEs to save : 2920 TO DO : save crop sub photo not yet done ! save time : 0.18836617469787598 No data in photo_id : 1376615933 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.05709671974182129 time for calcul the mask position with numpy : 0.018842220306396484 nb_pixel_total : 2047783 time to create 1 rle with new method : 0.08729124069213867 time for calcul the mask position with numpy : 0.006044149398803711 nb_pixel_total : 5010 time to create 1 rle with old method : 0.005454540252685547 time for calcul the mask position with numpy : 0.0058040618896484375 nb_pixel_total : 3406 time to create 1 rle with old method : 0.0037384033203125 time for calcul the mask position with numpy : 0.00581812858581543 nb_pixel_total : 7848 time to create 1 rle with old method : 0.008658170700073242 time for calcul the mask position with numpy : 0.0057260990142822266 nb_pixel_total : 9553 time to create 1 rle with old method : 0.010421514511108398 create new chi : 0.16484665870666504 time to delete rle : 0.000308990478515625 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 1794 TO DO : save crop sub photo not yet done ! save time : 0.1349477767944336 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.051074981689453125 time for calcul the mask position with numpy : 0.021192073822021484 nb_pixel_total : 2038518 time to create 1 rle with new method : 0.29761528968811035 time for calcul the mask position with numpy : 0.006056785583496094 nb_pixel_total : 15989 time to create 1 rle with old method : 0.017756223678588867 time for calcul the mask position with numpy : 0.006051301956176758 nb_pixel_total : 6843 time to create 1 rle with old method : 0.007596015930175781 time for calcul the mask position with numpy : 0.006047725677490234 nb_pixel_total : 12250 time to create 1 rle with old method : 0.013576030731201172 create new chi : 0.3827195167541504 time to delete rle : 0.00029850006103515625 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 1752 TO DO : save crop sub photo not yet done ! save time : 0.14145493507385254 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.038632869720458984 time for calcul the mask position with numpy : 0.01962900161743164 nb_pixel_total : 2055633 time to create 1 rle with new method : 0.32187891006469727 time for calcul the mask position with numpy : 0.005926847457885742 nb_pixel_total : 9342 time to create 1 rle with old method : 0.010159730911254883 time for calcul the mask position with numpy : 0.005744457244873047 nb_pixel_total : 8625 time to create 1 rle with old method : 0.009335994720458984 create new chi : 0.37294530868530273 time to delete rle : 0.0002620220184326172 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1510 TO DO : save crop sub photo not yet done ! save time : 0.12930703163146973 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.056545257568359375 time for calcul the mask position with numpy : 0.02133965492248535 nb_pixel_total : 1940248 time to create 1 rle with new method : 0.05825662612915039 time for calcul the mask position with numpy : 0.005874156951904297 nb_pixel_total : 3454 time to create 1 rle with old method : 0.0038607120513916016 time for calcul the mask position with numpy : 0.005921363830566406 nb_pixel_total : 25133 time to create 1 rle with old method : 0.027147769927978516 time for calcul the mask position with numpy : 0.005779266357421875 nb_pixel_total : 7271 time to create 1 rle with old method : 0.008003711700439453 time for calcul the mask position with numpy : 0.006257057189941406 nb_pixel_total : 97494 time to create 1 rle with old method : 0.10521435737609863 create new chi : 0.25481653213500977 time to delete rle : 0.0003058910369873047 batch 1 Loaded 9 chid ids of type : 3594 ++++++Number RLEs to save : 3000 TO DO : save crop sub photo not yet done ! save time : 0.1897118091583252 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.056665658950805664 time for calcul the mask position with numpy : 0.24987006187438965 nb_pixel_total : 2046781 time to create 1 rle with new method : 0.08379936218261719 time for calcul the mask position with numpy : 0.005760908126831055 nb_pixel_total : 7187 time to create 1 rle with old method : 0.007842302322387695 time for calcul the mask position with numpy : 0.0059587955474853516 nb_pixel_total : 3863 time to create 1 rle with old method : 0.004299163818359375 time for calcul the mask position with numpy : 0.00594639778137207 nb_pixel_total : 10618 time to create 1 rle with old method : 0.011472702026367188 time for calcul the mask position with numpy : 0.0057942867279052734 nb_pixel_total : 5151 time to create 1 rle with old method : 0.005533933639526367 create new chi : 0.38965892791748047 time to delete rle : 0.0003476142883300781 batch 1 Loaded 9 chid ids of type : 3594 +++++Number RLEs to save : 1802 TO DO : save crop sub photo not yet done ! save time : 0.13750958442687988 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.03846430778503418 time for calcul the mask position with numpy : 0.018725156784057617 nb_pixel_total : 2053309 time to create 1 rle with new method : 0.027411937713623047 time for calcul the mask position with numpy : 0.006005764007568359 nb_pixel_total : 10247 time to create 1 rle with old method : 0.011263370513916016 time for calcul the mask position with numpy : 0.005926370620727539 nb_pixel_total : 10044 time to create 1 rle with old method : 0.010918378829956055 create new chi : 0.08055543899536133 time to delete rle : 0.0002760887145996094 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1744 TO DO : save crop sub photo not yet done ! save time : 0.13357949256896973 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.04133439064025879 time for calcul the mask position with numpy : 0.019038677215576172 nb_pixel_total : 2060524 time to create 1 rle with new method : 0.027149438858032227 time for calcul the mask position with numpy : 0.00598597526550293 nb_pixel_total : 7682 time to create 1 rle with old method : 0.008388757705688477 time for calcul the mask position with numpy : 0.005808591842651367 nb_pixel_total : 5394 time to create 1 rle with old method : 0.006047725677490234 create new chi : 0.07268810272216797 time to delete rle : 0.0002760887145996094 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1586 TO DO : save crop sub photo not yet done ! save time : 0.1318979263305664 nb_obj : 6 nb_hashtags : 3 time to prepare the origin masks : 0.07285404205322266 time for calcul the mask position with numpy : 0.04275345802307129 nb_pixel_total : 1910178 time to create 1 rle with new method : 0.3211803436279297 time for calcul the mask position with numpy : 0.00661158561706543 nb_pixel_total : 112000 time to create 1 rle with old method : 0.12157726287841797 time for calcul the mask position with numpy : 0.006049156188964844 nb_pixel_total : 21077 time to create 1 rle with old method : 0.023546457290649414 time for calcul the mask position with numpy : 0.005929470062255859 nb_pixel_total : 11206 time to create 1 rle with old method : 0.01232147216796875 time for calcul the mask position with numpy : 0.005877256393432617 nb_pixel_total : 3760 time to create 1 rle with old method : 0.004181861877441406 time for calcul the mask position with numpy : 0.006450176239013672 nb_pixel_total : 2657 time to create 1 rle with old method : 0.0029659271240234375 time for calcul the mask position with numpy : 0.007580995559692383 nb_pixel_total : 12722 time to create 1 rle with old method : 0.013808965682983398 create new chi : 0.5911257266998291 time to delete rle : 0.0005791187286376953 batch 1 Loaded 13 chid ids of type : 3594 ++++++Number RLEs to save : 3532 TO DO : save crop sub photo not yet done ! save time : 0.2337644100189209 nb_obj : 3 nb_hashtags : 3 time to prepare the origin masks : 0.04491782188415527 time for calcul the mask position with numpy : 0.019104480743408203 nb_pixel_total : 2042716 time to create 1 rle with new method : 0.3343064785003662 time for calcul the mask position with numpy : 0.006158351898193359 nb_pixel_total : 5017 time to create 1 rle with old method : 0.005953550338745117 time for calcul the mask position with numpy : 0.007295131683349609 nb_pixel_total : 18839 time to create 1 rle with old method : 0.020951509475708008 time for calcul the mask position with numpy : 0.005841970443725586 nb_pixel_total : 7028 time to create 1 rle with old method : 0.010326623916625977 create new chi : 0.41651439666748047 time to delete rle : 0.0004222393035888672 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 1930 TO DO : save crop sub photo not yet done ! save time : 0.14533066749572754 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.052358388900756836 time for calcul the mask position with numpy : 0.019198894500732422 nb_pixel_total : 2062305 time to create 1 rle with new method : 0.03635573387145996 time for calcul the mask position with numpy : 0.005789279937744141 nb_pixel_total : 11295 time to create 1 rle with old method : 0.012506246566772461 create new chi : 0.07409906387329102 time to delete rle : 0.0002498626708984375 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1446 TO DO : save crop sub photo not yet done ! save time : 0.12262892723083496 nb_obj : 3 nb_hashtags : 3 time to prepare the origin masks : 0.13981294631958008 time for calcul the mask position with numpy : 0.0726313591003418 nb_pixel_total : 1998403 time to create 1 rle with new method : 0.4163069725036621 time for calcul the mask position with numpy : 0.007211208343505859 nb_pixel_total : 66910 time to create 1 rle with old method : 0.07939410209655762 time for calcul the mask position with numpy : 0.006857872009277344 nb_pixel_total : 5312 time to create 1 rle with old method : 0.008615732192993164 time for calcul the mask position with numpy : 0.006857395172119141 nb_pixel_total : 2975 time to create 1 rle with old method : 0.004724740982055664 create new chi : 0.6114330291748047 time to delete rle : 0.0004010200500488281 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 2124 TO DO : save crop sub photo not yet done ! save time : 0.16631340980529785 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.0959622859954834 time for calcul the mask position with numpy : 0.09676575660705566 nb_pixel_total : 1956811 time to create 1 rle with new method : 0.25793886184692383 time for calcul the mask position with numpy : 0.006309032440185547 nb_pixel_total : 20762 time to create 1 rle with old method : 0.022881031036376953 time for calcul the mask position with numpy : 0.006255388259887695 nb_pixel_total : 76856 time to create 1 rle with old method : 0.08422636985778809 time for calcul the mask position with numpy : 0.006287097930908203 nb_pixel_total : 13911 time to create 1 rle with old method : 0.015253305435180664 time for calcul the mask position with numpy : 0.006849765777587891 nb_pixel_total : 5260 time to create 1 rle with old method : 0.005856990814208984 create new chi : 0.5168445110321045 time to delete rle : 0.0004980564117431641 batch 1 Loaded 10 chid ids of type : 3594 ++++++++++Number RLEs to save : 2624 TO DO : save crop sub photo not yet done ! save time : 0.18971753120422363 nb_obj : 4 nb_hashtags : 1 time to prepare the origin masks : 0.07045888900756836 time for calcul the mask position with numpy : 0.02128005027770996 nb_pixel_total : 2022981 time to create 1 rle with new method : 0.22526979446411133 time for calcul the mask position with numpy : 0.0069277286529541016 nb_pixel_total : 4059 time to create 1 rle with old method : 0.0048444271087646484 time for calcul the mask position with numpy : 0.00714874267578125 nb_pixel_total : 16829 time to create 1 rle with old method : 0.021193742752075195 time for calcul the mask position with numpy : 0.0075533390045166016 nb_pixel_total : 22845 time to create 1 rle with old method : 0.025776386260986328 time for calcul the mask position with numpy : 0.006615400314331055 nb_pixel_total : 6886 time to create 1 rle with old method : 0.008036613464355469 create new chi : 0.3432657718658447 time to delete rle : 0.0004096031188964844 batch 1 Loaded 9 chid ids of type : 3594 +++++++++Number RLEs to save : 2286 TO DO : save crop sub photo not yet done ! save time : 0.1766648292541504 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.06180453300476074 time for calcul the mask position with numpy : 0.3147544860839844 nb_pixel_total : 1967172 time to create 1 rle with new method : 0.22036194801330566 time for calcul the mask position with numpy : 0.006445407867431641 nb_pixel_total : 3970 time to create 1 rle with old method : 0.0045168399810791016 time for calcul the mask position with numpy : 0.006186008453369141 nb_pixel_total : 4722 time to create 1 rle with old method : 0.005321025848388672 time for calcul the mask position with numpy : 0.0072362422943115234 nb_pixel_total : 82078 time to create 1 rle with old method : 0.09164643287658691 time for calcul the mask position with numpy : 0.006251096725463867 nb_pixel_total : 15658 time to create 1 rle with old method : 0.017323732376098633 create new chi : 0.6891255378723145 time to delete rle : 0.0004630088806152344 batch 1 Loaded 9 chid ids of type : 3594 +++++++Number RLEs to save : 2514 TO DO : save crop sub photo not yet done ! save time : 0.19271254539489746 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.05214095115661621 time for calcul the mask position with numpy : 0.019498825073242188 nb_pixel_total : 1995013 time to create 1 rle with new method : 0.18411493301391602 time for calcul the mask position with numpy : 0.006487131118774414 nb_pixel_total : 10630 time to create 1 rle with old method : 0.012009620666503906 time for calcul the mask position with numpy : 0.006674051284790039 nb_pixel_total : 17809 time to create 1 rle with old method : 0.020064830780029297 time for calcul the mask position with numpy : 0.0067141056060791016 nb_pixel_total : 50148 time to create 1 rle with old method : 0.05577230453491211 create new chi : 0.3191392421722412 time to delete rle : 0.0003936290740966797 batch 1 Loaded 7 chid ids of type : 3594 ++++Number RLEs to save : 2360 TO DO : save crop sub photo not yet done ! save time : 0.16852140426635742 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.05963444709777832 time for calcul the mask position with numpy : 0.015512943267822266 nb_pixel_total : 1322652 time to create 1 rle with new method : 0.26669788360595703 time for calcul the mask position with numpy : 0.016117334365844727 nb_pixel_total : 723424 time to create 1 rle with new method : 0.3383297920227051 time for calcul the mask position with numpy : 0.006094694137573242 nb_pixel_total : 13577 time to create 1 rle with old method : 0.016544580459594727 time for calcul the mask position with numpy : 0.006332874298095703 nb_pixel_total : 8754 time to create 1 rle with old method : 0.009574651718139648 time for calcul the mask position with numpy : 0.0060083866119384766 nb_pixel_total : 5193 time to create 1 rle with old method : 0.0058307647705078125 create new chi : 0.6949372291564941 time to delete rle : 0.0006082057952880859 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 3738 TO DO : save crop sub photo not yet done ! save time : 0.24781394004821777 map_output_result : {1376616345: (0.0, 'Should be the crop_list due to order', 0), 1376616343: (0.0, 'Should be the crop_list due to order', 0), 1376616340: (0.0, 'Should be the crop_list due to order', 0), 1376616339: (0.0, 'Should be the crop_list due to order', 0), 1376616338: (0.0, 'Should be the crop_list due to order', 0), 1376616336: (0.0, 'Should be the crop_list due to order', 0), 1376616255: (0.0, 'Should be the crop_list due to order', 0), 1376616247: (0.0, 'Should be the crop_list due to order', 0), 1376616225: (0.0, 'Should be the crop_list due to order', 0), 1376616190: (0.0, 'Should be the crop_list due to order', 0), 1376616156: (0.0, 'Should be the crop_list due to order', 0), 1376616121: (0.0, 'Should be the crop_list due to order', 0), 1376615988: (0.0, 'Should be the crop_list due to order', 0), 1376615939: (0.0, 'Should be the crop_list due to order', 0), 1376615935: (0.0, 'Should be the crop_list due to order', 0), 1376615933: (0.0, 'Should be the crop_list due to order', 0.0), 1376615912: (0.0, 'Should be the crop_list due to order', 0), 1376615910: (0.0, 'Should be the crop_list due to order', 0), 1376615907: (0.0, 'Should be the crop_list due to order', 0), 1376615906: (0.0, 'Should be the crop_list due to order', 0), 1376615905: (0.0, 'Should be the crop_list due to order', 0), 1376615901: (0.0, 'Should be the crop_list due to order', 0), 1376615680: (0.0, 'Should be the crop_list due to order', 0), 1376615677: (0.0, 'Should be the crop_list due to order', 0), 1376615673: (0.0, 'Should be the crop_list due to order', 0), 1376615668: (0.0, 'Should be the crop_list due to order', 0), 1376615663: (0.0, 'Should be the crop_list due to order', 0), 1376615636: (0.0, 'Should be the crop_list due to order', 0), 1376615433: (0.0, 'Should be the crop_list due to order', 0), 1376615400: (0.0, 'Should be the crop_list due to order', 0), 1376615357: (0.0, 'Should be the crop_list due to order', 0), 1376615325: (0.0, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1376616345, 1376616343, 1376616340, 1376616339, 1376616338, 1376616336, 1376616255, 1376616247, 1376616225, 1376616190, 1376616156, 1376616121, 1376615988, 1376615939, 1376615935, 1376615933, 1376615912, 1376615910, 1376615907, 1376615906, 1376615905, 1376615901, 1376615680, 1376615677, 1376615673, 1376615668, 1376615663, 1376615636, 1376615433, 1376615400, 1376615357, 1376615325] Looping around the photos to save general results len do output : 32 /1376616345.Didn't retrieve data . /1376616343.Didn't retrieve data . /1376616340.Didn't retrieve data . /1376616339.Didn't retrieve data . /1376616338.Didn't retrieve data . /1376616336.Didn't retrieve data . /1376616255.Didn't retrieve data . /1376616247.Didn't retrieve data . /1376616225.Didn't retrieve data . /1376616190.Didn't retrieve data . /1376616156.Didn't retrieve data . /1376616121.Didn't retrieve data . /1376615988.Didn't retrieve data . /1376615939.Didn't retrieve data . /1376615935.Didn't retrieve data . /1376615933.Didn't retrieve data . /1376615912.Didn't retrieve data . /1376615910.Didn't retrieve data . /1376615907.Didn't retrieve data . /1376615906.Didn't retrieve data . /1376615905.Didn't retrieve data . /1376615901.Didn't retrieve data . /1376615680.Didn't retrieve data . /1376615677.Didn't retrieve data . /1376615673.Didn't retrieve data . /1376615668.Didn't retrieve data . /1376615663.Didn't retrieve data . /1376615636.Didn't retrieve data . /1376615433.Didn't retrieve data . /1376615400.Didn't retrieve data . /1376615357.Didn't retrieve data . /1376615325.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616345', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616343', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616340', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616339', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616338', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616336', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616255', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616247', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616225', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616190', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616156', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616121', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615988', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615939', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615935', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615933', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615912', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615910', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615907', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615906', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615905', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615901', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615680', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615677', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615673', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615668', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615663', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615636', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615433', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615400', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615357', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615325', None, None, None, None, None, '3512843') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 96 time used for this insertion : 0.01516103744506836 save_final save missing photos in datou_result : time spend for datou_step_exec : 22.0281343460083 time spend to save output : 0.016507863998413086 total time spend for step 3 : 22.044642210006714 step4:ventilate_hashtags_in_portfolio Tue Aug 12 14:42:29 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 25915180 get user id for portfolio 25915180 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25915180 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('environnement','papier','flou','pehd','autre','pet_fonce','pet_clair','background','carton','mal_croppe','metal')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25915180 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('environnement','papier','flou','pehd','autre','pet_fonce','pet_clair','background','carton','mal_croppe','metal')) AND mptpi.`min_score`=0.5 To do To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25915180 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('environnement','papier','flou','pehd','autre','pet_fonce','pet_clair','background','carton','mal_croppe','metal')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://www.fotonower.com/velours/25915280,25915281,25915282,25915283,25915284,25915285,25915286,25915287,25915288,25915289,25915290?tags=environnement,papier,flou,pehd,autre,pet_fonce,pet_clair,background,carton,mal_croppe,metal Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1376616345, 1376616343, 1376616340, 1376616339, 1376616338, 1376616336, 1376616255, 1376616247, 1376616225, 1376616190, 1376616156, 1376616121, 1376615988, 1376615939, 1376615935, 1376615933, 1376615912, 1376615910, 1376615907, 1376615906, 1376615905, 1376615901, 1376615680, 1376615677, 1376615673, 1376615668, 1376615663, 1376615636, 1376615433, 1376615400, 1376615357, 1376615325] Looping around the photos to save general results len do output : 1 /25915180. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616345', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616343', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616340', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616339', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616338', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616336', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616255', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616247', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616225', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616190', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616156', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616121', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615988', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615939', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615935', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615933', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615912', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615910', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615907', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615906', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615905', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615901', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615680', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615677', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615673', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615668', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615663', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615636', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615433', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615400', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615357', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615325', None, None, None, None, None, '3512843') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 33 time used for this insertion : 0.021904945373535156 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.6990871429443359 time spend to save output : 0.022347450256347656 total time spend for step 4 : 0.7214345932006836 step5:final Tue Aug 12 14:42:30 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1376616345: ('0.054940908926504634',), 1376616343: ('0.054940908926504634',), 1376616340: ('0.054940908926504634',), 1376616339: ('0.054940908926504634',), 1376616338: ('0.054940908926504634',), 1376616336: ('0.054940908926504634',), 1376616255: ('0.054940908926504634',), 1376616247: ('0.054940908926504634',), 1376616225: ('0.054940908926504634',), 1376616190: ('0.054940908926504634',), 1376616156: ('0.054940908926504634',), 1376616121: ('0.054940908926504634',), 1376615988: ('0.054940908926504634',), 1376615939: ('0.054940908926504634',), 1376615935: ('0.054940908926504634',), 1376615933: ('0.054940908926504634',), 1376615912: ('0.054940908926504634',), 1376615910: ('0.054940908926504634',), 1376615907: ('0.054940908926504634',), 1376615906: ('0.054940908926504634',), 1376615905: ('0.054940908926504634',), 1376615901: ('0.054940908926504634',), 1376615680: ('0.054940908926504634',), 1376615677: ('0.054940908926504634',), 1376615673: ('0.054940908926504634',), 1376615668: ('0.054940908926504634',), 1376615663: ('0.054940908926504634',), 1376615636: ('0.054940908926504634',), 1376615433: ('0.054940908926504634',), 1376615400: ('0.054940908926504634',), 1376615357: ('0.054940908926504634',), 1376615325: ('0.054940908926504634',)} new output for save of step final : {1376616345: ('0.054940908926504634',), 1376616343: ('0.054940908926504634',), 1376616340: ('0.054940908926504634',), 1376616339: ('0.054940908926504634',), 1376616338: ('0.054940908926504634',), 1376616336: ('0.054940908926504634',), 1376616255: ('0.054940908926504634',), 1376616247: ('0.054940908926504634',), 1376616225: ('0.054940908926504634',), 1376616190: ('0.054940908926504634',), 1376616156: ('0.054940908926504634',), 1376616121: ('0.054940908926504634',), 1376615988: ('0.054940908926504634',), 1376615939: ('0.054940908926504634',), 1376615935: ('0.054940908926504634',), 1376615933: ('0.054940908926504634',), 1376615912: ('0.054940908926504634',), 1376615910: ('0.054940908926504634',), 1376615907: ('0.054940908926504634',), 1376615906: ('0.054940908926504634',), 1376615905: ('0.054940908926504634',), 1376615901: ('0.054940908926504634',), 1376615680: ('0.054940908926504634',), 1376615677: ('0.054940908926504634',), 1376615673: ('0.054940908926504634',), 1376615668: ('0.054940908926504634',), 1376615663: ('0.054940908926504634',), 1376615636: ('0.054940908926504634',), 1376615433: ('0.054940908926504634',), 1376615400: ('0.054940908926504634',), 1376615357: ('0.054940908926504634',), 1376615325: ('0.054940908926504634',)} [1376616345, 1376616343, 1376616340, 1376616339, 1376616338, 1376616336, 1376616255, 1376616247, 1376616225, 1376616190, 1376616156, 1376616121, 1376615988, 1376615939, 1376615935, 1376615933, 1376615912, 1376615910, 1376615907, 1376615906, 1376615905, 1376615901, 1376615680, 1376615677, 1376615673, 1376615668, 1376615663, 1376615636, 1376615433, 1376615400, 1376615357, 1376615325] Looping around the photos to save general results len do output : 32 /1376616345.Didn't retrieve data . /1376616343.Didn't retrieve data . /1376616340.Didn't retrieve data . /1376616339.Didn't retrieve data . /1376616338.Didn't retrieve data . /1376616336.Didn't retrieve data . /1376616255.Didn't retrieve data . /1376616247.Didn't retrieve data . /1376616225.Didn't retrieve data . /1376616190.Didn't retrieve data . /1376616156.Didn't retrieve data . /1376616121.Didn't retrieve data . /1376615988.Didn't retrieve data . /1376615939.Didn't retrieve data . /1376615935.Didn't retrieve data . /1376615933.Didn't retrieve data . /1376615912.Didn't retrieve data . /1376615910.Didn't retrieve data . /1376615907.Didn't retrieve data . /1376615906.Didn't retrieve data . /1376615905.Didn't retrieve data . /1376615901.Didn't retrieve data . /1376615680.Didn't retrieve data . /1376615677.Didn't retrieve data . /1376615673.Didn't retrieve data . /1376615668.Didn't retrieve data . /1376615663.Didn't retrieve data . /1376615636.Didn't retrieve data . /1376615433.Didn't retrieve data . /1376615400.Didn't retrieve data . /1376615357.Didn't retrieve data . /1376615325.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616345', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616343', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616340', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616339', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616338', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616336', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616255', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616247', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616225', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616190', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616156', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616121', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615988', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615939', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615935', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615933', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615912', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615910', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615907', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615906', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615905', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615901', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615680', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615677', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615673', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615668', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615663', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615636', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615433', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615400', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615357', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615325', None, None, None, None, None, '3512843') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 96 time used for this insertion : 0.01685190200805664 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.12457418441772461 time spend to save output : 0.018311023712158203 total time spend for step 5 : 0.1428852081298828 step6:blur_detection Tue Aug 12 14:42:30 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1755002428_1021063_1376616345_eb179660806c2458f319b164b8fd2875.jpg resize: (1080, 1920) 1376616345 1.3678124973524015 treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96.jpg resize: (1080, 1920) 1376616343 2.7483875309323076 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960.jpg resize: (1080, 1920) 1376616340 1.7781591264735965 treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1.jpg resize: (1080, 1920) 1376616339 2.5296744098632713 treat image : temp/1755002428_1021063_1376616338_7551026f552781689006e7774394694a.jpg resize: (1080, 1920) 1376616338 0.789852865277291 treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996.jpg resize: (1080, 1920) 1376616336 3.977393841840734 treat image : temp/1755002428_1021063_1376616255_8049566e22379de53578dbb77007546c.jpg resize: (1080, 1920) 1376616255 0.4574945920813751 treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd.jpg resize: (1080, 1920) 1376616247 2.1094321425233478 treat image : temp/1755002428_1021063_1376616225_ead42c0fa85550f15ac6e22cf35f5093.jpg resize: (1080, 1920) 1376616225 1.1030952925543622 treat image : temp/1755002428_1021063_1376616190_e2d329b9c22b9a77c65cf740660ca8fe.jpg resize: (1080, 1920) 1376616190 1.8472318561573158 treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed.jpg resize: (1080, 1920) 1376616156 0.11795734540366531 treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493.jpg resize: (1080, 1920) 1376616121 -2.86325021768498 treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a.jpg resize: (1080, 1920) 1376615988 -3.6931196348050146 treat image : temp/1755002428_1021063_1376615939_abf5aa7cdddf642f4c0b187a10079c41.jpg resize: (1080, 1920) 1376615939 0.5398208005083787 treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490.jpg resize: (1080, 1920) 1376615935 -0.051077777492504965 treat image : temp/1755002428_1021063_1376615933_32dd881aa87254d1fa8b7bcaa584226c.jpg resize: (1080, 1920) 1376615933 0.4067163189740742 treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba.jpg resize: (1080, 1920) 1376615912 0.9258607613406578 treat image : temp/1755002428_1021063_1376615910_ff695412a811ec5cc71ade82078c14ce.jpg resize: (1080, 1920) 1376615910 0.3834055549562788 treat image : temp/1755002428_1021063_1376615907_363af079b186585feedc904addf0c4a9.jpg resize: (1080, 1920) 1376615907 0.7404548500047579 treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d.jpg resize: (1080, 1920) 1376615906 -0.04808053514134563 treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e.jpg resize: (1080, 1920) 1376615905 -3.9595207802598558 treat image : temp/1755002428_1021063_1376615901_6e8399b10cf80e614a82e6929be5dc3f.jpg resize: (1080, 1920) 1376615901 1.8610503654734172 treat image : temp/1755002428_1021063_1376615680_373642d8658b39a3eb8595a55c315bd5.jpg resize: (1080, 1920) 1376615680 0.2091622960434755 treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362.jpg resize: (1080, 1920) 1376615677 0.3388924488030405 treat image : temp/1755002428_1021063_1376615673_a7f4e834c017cdedc06508e65c2cd4f2.jpg resize: (1080, 1920) 1376615673 1.7086942679766672 treat image : temp/1755002428_1021063_1376615668_a45ea13eecfdf999f3cec4b8155578a7.jpg resize: (1080, 1920) 1376615668 0.2024731908591126 treat image : temp/1755002428_1021063_1376615663_6b4a4c06634987de40cf3e3719f1dd1d.jpg resize: (1080, 1920) 1376615663 1.3870671611371004 treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5.jpg resize: (1080, 1920) 1376615636 0.955947786499488 treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089.jpg resize: (1080, 1920) 1376615433 0.9839343945695964 treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c.jpg resize: (1080, 1920) 1376615400 1.8722194538548635 treat image : temp/1755002428_1021063_1376615357_7f10fe0d8556c0f723e2ab4a5def730d.jpg resize: (1080, 1920) 1376615357 0.021510936326406774 treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225.jpg resize: (1080, 1920) 1376615325 -0.07726434110938954 treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137195_0.png resize: (180, 182) 1376618751 -0.9396846435976907 treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137198_0.png resize: (143, 184) 1376618752 -2.0039681068110466 treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137202_0.png resize: (86, 124) 1376618753 1.0426222578751436 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137211_0.png resize: (61, 98) 1376618754 -0.8769520649241791 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137212_0.png resize: (103, 163) 1376618755 1.567084807867083 treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137217_0.png resize: (164, 228) 1376618756 -1.2675687517088912 treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137218_0.png resize: (125, 102) 1376618757 0.2516500479611008 treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137225_0.png resize: (202, 233) 1376618758 -0.30093810666157417 treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137226_0.png resize: (181, 216) 1376618759 -0.40454105992832967 treat image : temp/1755002428_1021063_1376616225_ead42c0fa85550f15ac6e22cf35f5093_rle_crop_3912137237_0.png resize: (123, 119) 1376618760 -1.9677611362653051 treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137240_0.png resize: (110, 82) 1376618761 -0.7105696542601647 treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493_rle_crop_3912137248_0.png resize: (134, 182) 1376618762 -1.5819074722702484 treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137253_0.png resize: (108, 116) 1376618763 -3.6818775235384256 treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137258_0.png resize: (73, 137) 1376618764 0.5481685235408017 treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137259_0.png resize: (60, 109) 1376618765 0.0510240278726264 treat image : temp/1755002428_1021063_1376615910_ff695412a811ec5cc71ade82078c14ce_rle_crop_3912137266_0.png resize: (123, 152) 1376618766 -1.3139439681140062 treat image : temp/1755002428_1021063_1376615907_363af079b186585feedc904addf0c4a9_rle_crop_3912137269_0.png resize: (118, 115) 1376618767 -1.5005803000048992 treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d_rle_crop_3912137272_0.png resize: (114, 105) 1376618768 -0.019382716813735452 treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d_rle_crop_3912137273_0.png resize: (225, 225) 1376618769 -1.1860011770501717 treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d_rle_crop_3912137274_0.png resize: (68, 77) 1376618770 -0.5490038638940143 treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e_rle_crop_3912137275_0.png resize: (48, 119) 1376618771 7.1147211470008465 treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e_rle_crop_3912137277_0.png resize: (66, 93) 1376618773 -2.5030245431736056 treat image : temp/1755002428_1021063_1376615901_6e8399b10cf80e614a82e6929be5dc3f_rle_crop_3912137279_0.png resize: (169, 102) 1376618774 -1.2807605740374386 treat image : temp/1755002428_1021063_1376615680_373642d8658b39a3eb8595a55c315bd5_rle_crop_3912137281_0.png resize: (89, 123) 1376618775 -1.8482029349484608 treat image : temp/1755002428_1021063_1376615680_373642d8658b39a3eb8595a55c315bd5_rle_crop_3912137282_0.png resize: (163, 90) 1376618776 -1.8493937283316972 treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137283_0.png resize: (137, 148) 1376618777 -1.6197812207890085 treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137286_0.png resize: (117, 149) 1376618778 -0.34033410996235414 treat image : temp/1755002428_1021063_1376615673_a7f4e834c017cdedc06508e65c2cd4f2_rle_crop_3912137289_0.png resize: (158, 70) 1376618779 -1.5325642034100986 treat image : temp/1755002428_1021063_1376615668_a45ea13eecfdf999f3cec4b8155578a7_rle_crop_3912137292_0.png resize: (183, 115) 1376618780 -0.6528829977564775 treat image : temp/1755002428_1021063_1376615663_6b4a4c06634987de40cf3e3719f1dd1d_rle_crop_3912137293_0.png resize: (88, 47) 1376618781 -0.833623467058195 treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137296_0.png resize: (105, 89) 1376618782 -0.9804826313290712 treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137298_0.png resize: (220, 515) 1376618784 -2.451793402046918 treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137299_0.png resize: (166, 185) 1376618785 -1.5941991792873382 treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137300_0.png resize: (115, 121) 1376618786 -1.6636069164648246 treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c_rle_crop_3912137307_0.png resize: (53, 103) 1376618787 6.17358974585944 treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c_rle_crop_3912137308_0.png resize: (55, 117) 1376618788 -1.1290786232253016 treat image : temp/1755002428_1021063_1376615357_7f10fe0d8556c0f723e2ab4a5def730d_rle_crop_3912137311_0.png resize: (174, 109) 1376618789 -0.668720885487635 treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225_rle_crop_3912137312_0.png resize: (83, 128) 1376618790 -0.12340154568193569 treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225_rle_crop_3912137313_0.png resize: (82, 149) 1376618791 -1.2713290700937732 treat image : temp/1755002428_1021063_1376616255_8049566e22379de53578dbb77007546c_rle_crop_3912137230_0.png resize: (115, 39) 1376618792 2.546325426603026 treat image : temp/1755002428_1021063_1376616225_ead42c0fa85550f15ac6e22cf35f5093_rle_crop_3912137236_0.png resize: (113, 84) 1376618793 -0.5219737367905574 treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493_rle_crop_3912137245_0.png resize: (137, 182) 1376618794 -1.702372101391534 treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137260_0.png resize: (141, 100) 1376618795 -1.7055769589272498 treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba_rle_crop_3912137262_0.png resize: (115, 128) 1376618796 -1.5518593388429498 treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba_rle_crop_3912137263_0.png resize: (85, 121) 1376618797 -0.6629643849150834 treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba_rle_crop_3912137264_0.png resize: (54, 87) 1376618798 -0.655853873539143 treat image : temp/1755002428_1021063_1376615901_6e8399b10cf80e614a82e6929be5dc3f_rle_crop_3912137280_0.png resize: (153, 94) 1376618799 -1.5361609042875448 treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137284_0.png resize: (84, 47) 1376618800 -0.508180728555351 treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137287_0.png resize: (228, 147) 1376618801 -1.3179907448337693 treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137288_0.png resize: (545, 337) 1376618802 0.1261255037445873 treat image : temp/1755002428_1021063_1376615673_a7f4e834c017cdedc06508e65c2cd4f2_rle_crop_3912137291_0.png resize: (78, 119) 1376618804 -1.6241299035929222 treat image : temp/1755002428_1021063_1376615663_6b4a4c06634987de40cf3e3719f1dd1d_rle_crop_3912137294_0.png resize: (60, 98) 1376618805 4.060541561889823 treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137197_0.png resize: (131, 78) 1376618808 0.2475632978735161 treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137200_0.png resize: (226, 89) 1376618809 -0.039578876893092245 treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137224_0.png resize: (106, 114) 1376618810 0.7778409058700189 treat image : temp/1755002428_1021063_1376615939_abf5aa7cdddf642f4c0b187a10079c41_rle_crop_3912137254_0.png resize: (95, 71) 1376618811 1.161629726330361 treat image : temp/1755002428_1021063_1376615939_abf5aa7cdddf642f4c0b187a10079c41_rle_crop_3912137255_0.png resize: (75, 70) 1376618812 0.9440436717820511 treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137257_0.png resize: (129, 106) 1376618813 -1.9901786269889736 treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba_rle_crop_3912137265_0.png resize: (100, 61) 1376618814 3.2026942959186004 treat image : temp/1755002428_1021063_1376615907_363af079b186585feedc904addf0c4a9_rle_crop_3912137270_0.png resize: (97, 131) 1376618815 -1.6493471958330526 treat image : temp/1755002428_1021063_1376616345_eb179660806c2458f319b164b8fd2875_rle_crop_3912137194_0.png resize: (534, 327) 1376618839 -0.14840600638090423 treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137196_0.png resize: (130, 56) 1376618840 -1.5998430718674936 treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137199_0.png resize: (198, 244) 1376618841 -1.2195927951087715 treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137201_0.png resize: (506, 321) 1376618842 0.0270002588108535 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137203_0.png resize: (266, 686) 1376618843 -1.284455777702729 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137204_0.png resize: (154, 218) 1376618844 -0.9540422653987002 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137205_0.png resize: (136, 211) 1376618845 -1.3263297564637833 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137206_0.png resize: (100, 157) 1376618846 -1.6240964063628769 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137207_0.png resize: (125, 155) 1376618847 -0.5844035472750193 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137208_0.png resize: (152, 218) 1376618848 -0.016767715102086703 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137209_0.png resize: (953, 927) 1376618849 0.8580409368899027 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137210_0.png resize: (468, 338) 1376618850 0.04484886515003867 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137213_0.png resize: (161, 204) 1376618851 0.3023994977966988 treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137214_0.png resize: (76, 113) 1376618852 1.0689016296290668 treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137215_0.png resize: (138, 186) 1376618853 -0.9860430573479808 treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137216_0.png resize: (176, 324) 1376618854 -0.2852800297734395 treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137219_0.png resize: (149, 185) 1376618855 0.2815397653155887 treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137220_0.png resize: (225, 114) 1376618856 -0.8793913563175709 treat image : temp/1755002428_1021063_1376616338_7551026f552781689006e7774394694a_rle_crop_3912137221_0.png resize: (42, 135) 1376618857 -0.6111195690522752 treat image : temp/1755002428_1021063_1376616338_7551026f552781689006e7774394694a_rle_crop_3912137222_0.png resize: (170, 151) 1376618858 -1.8483910133740087 treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137223_0.png resize: (189, 115) 1376618859 -1.8461506364982854 treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137227_0.png resize: (133, 163) 1376618860 3.293313083649574 treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137228_0.png resize: (136, 145) 1376618861 1.1171577441049063 treat image : temp/1755002428_1021063_1376616255_8049566e22379de53578dbb77007546c_rle_crop_3912137229_0.png resize: (101, 107) 1376618862 -3.484633919060061 treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137231_0.png resize: (359, 151) 1376618863 -0.8411789434786835 treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137232_0.png resize: (130, 134) 1376618864 -0.12188861775880165 treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137233_0.png resize: (120, 160) 1376618865 -0.581304445271358 treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137234_0.png resize: (101, 87) 1376618866 0.19069238065979743 treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137235_0.png resize: (151, 84) 1376618867 0.3397161383467404 treat image : temp/1755002428_1021063_1376616190_e2d329b9c22b9a77c65cf740660ca8fe_rle_crop_3912137238_0.png resize: (179, 64) 1376618868 -2.2931986223221403 treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137239_0.png resize: (279, 152) 1376618869 -0.3253938011912863 treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137241_0.png resize: (124, 122) 1376618870 -1.0373635642913566 treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137242_0.png resize: (170, 291) 1376618871 -0.5229315375611207 treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137243_0.png resize: (138, 110) 1376618872 -2.754371024986361 treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493_rle_crop_3912137246_0.png resize: (134, 104) 1376618874 -2.951825363741391 treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493_rle_crop_3912137247_0.png resize: (552, 334) 1376618875 0.19216487777986413 treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137251_0.png resize: (130, 111) 1376618876 -4.460517576609363 treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137252_0.png resize: (135, 120) 1376618878 -2.050869472326977 treat image : temp/1755002428_1021063_1376615939_abf5aa7cdddf642f4c0b187a10079c41_rle_crop_3912137256_0.png resize: (141, 234) 1376618879 -0.26048030217197077 treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137261_0.png resize: (501, 328) 1376618880 0.24989556337375213 treat image : temp/1755002428_1021063_1376615910_ff695412a811ec5cc71ade82078c14ce_rle_crop_3912137267_0.png resize: (96, 96) 1376618881 -0.8107642168924821 treat image : temp/1755002428_1021063_1376615910_ff695412a811ec5cc71ade82078c14ce_rle_crop_3912137268_0.png resize: (106, 170) 1376618883 0.23069599901654203 treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d_rle_crop_3912137271_0.png resize: (523, 317) 1376618884 -0.6505863587475812 treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e_rle_crop_3912137276_0.png resize: (134, 112) 1376618885 -3.04862584057997 treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e_rle_crop_3912137278_0.png resize: (107, 97) 1376618887 -2.0856854137731244 treat image : temp/1755002428_1021063_1376615673_a7f4e834c017cdedc06508e65c2cd4f2_rle_crop_3912137290_0.png resize: (168, 155) 1376618888 -0.9490528425228019 treat image : temp/1755002428_1021063_1376615663_6b4a4c06634987de40cf3e3719f1dd1d_rle_crop_3912137295_0.png resize: (373, 312) 1376618889 -1.5316155347711369 treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137297_0.png resize: (79, 216) 1376618890 -1.7441324082792207 treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089_rle_crop_3912137301_0.png resize: (78, 136) 1376618892 -0.7076111167857939 treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089_rle_crop_3912137302_0.png resize: (157, 268) 1376618893 -0.7589196745756969 treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089_rle_crop_3912137303_0.png resize: (187, 135) 1376618894 -0.060431410594740144 treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089_rle_crop_3912137304_0.png resize: (110, 47) 1376618896 -0.703439886485372 treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c_rle_crop_3912137305_0.png resize: (118, 199) 1376618897 -2.7458966106140323 treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c_rle_crop_3912137306_0.png resize: (440, 314) 1376618898 -0.33764697323550313 treat image : temp/1755002428_1021063_1376615357_7f10fe0d8556c0f723e2ab4a5def730d_rle_crop_3912137309_0.png resize: (275, 234) 1376618899 -1.5336397786113967 treat image : temp/1755002428_1021063_1376615357_7f10fe0d8556c0f723e2ab4a5def730d_rle_crop_3912137310_0.png resize: (139, 180) 1376618901 -2.042923717933662 treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225_rle_crop_3912137314_0.png resize: (145, 119) 1376618902 -0.5621912519105468 treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225_rle_crop_3912137315_0.png resize: (1003, 995) 1376618903 0.24482503182737073 treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137244_0.png resize: (55, 57) 1376618915 0.9247537864109057 treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137250_0.png resize: (116, 156) 1376618916 -1.0853735074832749 treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137285_0.png resize: (82, 57) 1376618917 0.2253875630759326 treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137249_0.png resize: (197, 139) 1376618920 -2.089258262284564 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 154 time used for this insertion : 0.018068790435791016 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 154 time used for this insertion : 0.032224416732788086 save missing photos in datou_result : time spend for datou_step_exec : 26.014665842056274 time spend to save output : 0.05518627166748047 total time spend for step 6 : 26.069852113723755 step7:brightness Tue Aug 12 14:42:56 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1755002428_1021063_1376616345_eb179660806c2458f319b164b8fd2875.jpg treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96.jpg treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960.jpg treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1.jpg treat image : temp/1755002428_1021063_1376616338_7551026f552781689006e7774394694a.jpg treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996.jpg treat image : temp/1755002428_1021063_1376616255_8049566e22379de53578dbb77007546c.jpg treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd.jpg treat image : temp/1755002428_1021063_1376616225_ead42c0fa85550f15ac6e22cf35f5093.jpg treat image : temp/1755002428_1021063_1376616190_e2d329b9c22b9a77c65cf740660ca8fe.jpg treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed.jpg treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493.jpg treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a.jpg treat image : temp/1755002428_1021063_1376615939_abf5aa7cdddf642f4c0b187a10079c41.jpg treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490.jpg treat image : temp/1755002428_1021063_1376615933_32dd881aa87254d1fa8b7bcaa584226c.jpg treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba.jpg treat image : temp/1755002428_1021063_1376615910_ff695412a811ec5cc71ade82078c14ce.jpg treat image : temp/1755002428_1021063_1376615907_363af079b186585feedc904addf0c4a9.jpg treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d.jpg treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e.jpg treat image : temp/1755002428_1021063_1376615901_6e8399b10cf80e614a82e6929be5dc3f.jpg treat image : temp/1755002428_1021063_1376615680_373642d8658b39a3eb8595a55c315bd5.jpg treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362.jpg treat image : temp/1755002428_1021063_1376615673_a7f4e834c017cdedc06508e65c2cd4f2.jpg treat image : temp/1755002428_1021063_1376615668_a45ea13eecfdf999f3cec4b8155578a7.jpg treat image : temp/1755002428_1021063_1376615663_6b4a4c06634987de40cf3e3719f1dd1d.jpg treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5.jpg treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089.jpg treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c.jpg treat image : temp/1755002428_1021063_1376615357_7f10fe0d8556c0f723e2ab4a5def730d.jpg treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225.jpg treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137195_0.png treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137198_0.png treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137202_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137211_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137212_0.png treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137217_0.png treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137218_0.png treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137225_0.png treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137226_0.png treat image : temp/1755002428_1021063_1376616225_ead42c0fa85550f15ac6e22cf35f5093_rle_crop_3912137237_0.png treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137240_0.png treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493_rle_crop_3912137248_0.png treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137253_0.png treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137258_0.png treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137259_0.png treat image : temp/1755002428_1021063_1376615910_ff695412a811ec5cc71ade82078c14ce_rle_crop_3912137266_0.png treat image : temp/1755002428_1021063_1376615907_363af079b186585feedc904addf0c4a9_rle_crop_3912137269_0.png treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d_rle_crop_3912137272_0.png treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d_rle_crop_3912137273_0.png treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d_rle_crop_3912137274_0.png treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e_rle_crop_3912137275_0.png treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e_rle_crop_3912137277_0.png treat image : temp/1755002428_1021063_1376615901_6e8399b10cf80e614a82e6929be5dc3f_rle_crop_3912137279_0.png treat image : temp/1755002428_1021063_1376615680_373642d8658b39a3eb8595a55c315bd5_rle_crop_3912137281_0.png treat image : temp/1755002428_1021063_1376615680_373642d8658b39a3eb8595a55c315bd5_rle_crop_3912137282_0.png treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137283_0.png treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137286_0.png treat image : temp/1755002428_1021063_1376615673_a7f4e834c017cdedc06508e65c2cd4f2_rle_crop_3912137289_0.png treat image : temp/1755002428_1021063_1376615668_a45ea13eecfdf999f3cec4b8155578a7_rle_crop_3912137292_0.png treat image : temp/1755002428_1021063_1376615663_6b4a4c06634987de40cf3e3719f1dd1d_rle_crop_3912137293_0.png treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137296_0.png treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137298_0.png treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137299_0.png treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137300_0.png treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c_rle_crop_3912137307_0.png treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c_rle_crop_3912137308_0.png treat image : temp/1755002428_1021063_1376615357_7f10fe0d8556c0f723e2ab4a5def730d_rle_crop_3912137311_0.png treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225_rle_crop_3912137312_0.png treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225_rle_crop_3912137313_0.png treat image : temp/1755002428_1021063_1376616255_8049566e22379de53578dbb77007546c_rle_crop_3912137230_0.png treat image : temp/1755002428_1021063_1376616225_ead42c0fa85550f15ac6e22cf35f5093_rle_crop_3912137236_0.png treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493_rle_crop_3912137245_0.png treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137260_0.png treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba_rle_crop_3912137262_0.png treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba_rle_crop_3912137263_0.png treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba_rle_crop_3912137264_0.png treat image : temp/1755002428_1021063_1376615901_6e8399b10cf80e614a82e6929be5dc3f_rle_crop_3912137280_0.png treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137284_0.png treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137287_0.png treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137288_0.png treat image : temp/1755002428_1021063_1376615673_a7f4e834c017cdedc06508e65c2cd4f2_rle_crop_3912137291_0.png treat image : temp/1755002428_1021063_1376615663_6b4a4c06634987de40cf3e3719f1dd1d_rle_crop_3912137294_0.png treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137197_0.png treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137200_0.png treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137224_0.png treat image : temp/1755002428_1021063_1376615939_abf5aa7cdddf642f4c0b187a10079c41_rle_crop_3912137254_0.png treat image : temp/1755002428_1021063_1376615939_abf5aa7cdddf642f4c0b187a10079c41_rle_crop_3912137255_0.png treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137257_0.png treat image : temp/1755002428_1021063_1376615912_5d8b19c3974b0643c09ae5c720107cba_rle_crop_3912137265_0.png treat image : temp/1755002428_1021063_1376615907_363af079b186585feedc904addf0c4a9_rle_crop_3912137270_0.png treat image : temp/1755002428_1021063_1376616345_eb179660806c2458f319b164b8fd2875_rle_crop_3912137194_0.png treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137196_0.png treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137199_0.png treat image : temp/1755002428_1021063_1376616343_eb60c6ff0e078380da33484cdf3a7a96_rle_crop_3912137201_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137203_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137204_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137205_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137206_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137207_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137208_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137209_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137210_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137213_0.png treat image : temp/1755002428_1021063_1376616340_52d7a4e4658a1d040e2d8abe476cd960_rle_crop_3912137214_0.png treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137215_0.png treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137216_0.png treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137219_0.png treat image : temp/1755002428_1021063_1376616339_992d7fc68463c9b93cb6f1baeb2b19b1_rle_crop_3912137220_0.png treat image : temp/1755002428_1021063_1376616338_7551026f552781689006e7774394694a_rle_crop_3912137221_0.png treat image : temp/1755002428_1021063_1376616338_7551026f552781689006e7774394694a_rle_crop_3912137222_0.png treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137223_0.png treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137227_0.png treat image : temp/1755002428_1021063_1376616336_7d9ca93dae5e4d339418eb9e28de7996_rle_crop_3912137228_0.png treat image : temp/1755002428_1021063_1376616255_8049566e22379de53578dbb77007546c_rle_crop_3912137229_0.png treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137231_0.png treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137232_0.png treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137233_0.png treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137234_0.png treat image : temp/1755002428_1021063_1376616247_24a17bf472d313b99798fad16e2230fd_rle_crop_3912137235_0.png treat image : temp/1755002428_1021063_1376616190_e2d329b9c22b9a77c65cf740660ca8fe_rle_crop_3912137238_0.png treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137239_0.png treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137241_0.png treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137242_0.png treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137243_0.png treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493_rle_crop_3912137246_0.png treat image : temp/1755002428_1021063_1376616121_0ab7ac92400248bd41cfe95f796e8493_rle_crop_3912137247_0.png treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137251_0.png treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137252_0.png treat image : temp/1755002428_1021063_1376615939_abf5aa7cdddf642f4c0b187a10079c41_rle_crop_3912137256_0.png treat image : temp/1755002428_1021063_1376615935_1f2050d0915a716343954df972c0a490_rle_crop_3912137261_0.png treat image : temp/1755002428_1021063_1376615910_ff695412a811ec5cc71ade82078c14ce_rle_crop_3912137267_0.png treat image : temp/1755002428_1021063_1376615910_ff695412a811ec5cc71ade82078c14ce_rle_crop_3912137268_0.png treat image : temp/1755002428_1021063_1376615906_f9a090953a2bf483a30d38f070fae10d_rle_crop_3912137271_0.png treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e_rle_crop_3912137276_0.png treat image : temp/1755002428_1021063_1376615905_1ba3ce0c01d0e6d7841d78f2a150d90e_rle_crop_3912137278_0.png treat image : temp/1755002428_1021063_1376615673_a7f4e834c017cdedc06508e65c2cd4f2_rle_crop_3912137290_0.png treat image : temp/1755002428_1021063_1376615663_6b4a4c06634987de40cf3e3719f1dd1d_rle_crop_3912137295_0.png treat image : temp/1755002428_1021063_1376615636_a588a7e9790978247d6b408d0b6e51a5_rle_crop_3912137297_0.png treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089_rle_crop_3912137301_0.png treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089_rle_crop_3912137302_0.png treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089_rle_crop_3912137303_0.png treat image : temp/1755002428_1021063_1376615433_9ba98c93fecf9f94c6e640766695d089_rle_crop_3912137304_0.png treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c_rle_crop_3912137305_0.png treat image : temp/1755002428_1021063_1376615400_451aaf3d18cd835258aec99d5f93797c_rle_crop_3912137306_0.png treat image : temp/1755002428_1021063_1376615357_7f10fe0d8556c0f723e2ab4a5def730d_rle_crop_3912137309_0.png treat image : temp/1755002428_1021063_1376615357_7f10fe0d8556c0f723e2ab4a5def730d_rle_crop_3912137310_0.png treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225_rle_crop_3912137314_0.png treat image : temp/1755002428_1021063_1376615325_10abadff93a036b3d96387016faec225_rle_crop_3912137315_0.png treat image : temp/1755002428_1021063_1376616156_cecc6481d0d4840169059e54ba6235ed_rle_crop_3912137244_0.png treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137250_0.png treat image : temp/1755002428_1021063_1376615677_b47f6e88fedf38ae7790799daa764362_rle_crop_3912137285_0.png treat image : temp/1755002428_1021063_1376615988_5a34606028a4082bd0c3e439a562586a_rle_crop_3912137249_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 154 time used for this insertion : 0.018525123596191406 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 154 time used for this insertion : 0.031002044677734375 save missing photos in datou_result : time spend for datou_step_exec : 7.629671335220337 time spend to save output : 0.05585646629333496 total time spend for step 7 : 7.685527801513672 step8:velours_tree Tue Aug 12 14:43:04 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.09058332443237305 time spend to save output : 3.8623809814453125e-05 total time spend for step 8 : 0.0906219482421875 step9:send_mail_cod Tue Aug 12 14:43:04 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P25915180_12-08-2025_14_43_04.pdf 25915281 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette259152811755002584 25915282 imagette259152821755002585 25915283 imagette259152831755002585 25915284 change filename to text .change filename to text .change filename to text .imagette259152841755002585 25915285 change filename to text .imagette259152851755002585 25915286 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette259152861755002585 25915287 imagette259152871755002586 25915288 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette259152881755002586 25915289 imagette259152891755002587 25915290 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette259152901755002587 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=25915180 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/25915280,25915281,25915282,25915283,25915284,25915285,25915286,25915287,25915288,25915289,25915290?tags=environnement,papier,flou,pehd,autre,pet_fonce,pet_clair,background,carton,mal_croppe,metal args[1376616345] : ((1376616345, 1.3678124973524015, 492688767), (1376616345, 0.5581584756491994, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616343] : ((1376616343, 2.7483875309323076, 492688767), (1376616343, 0.4792862620117052, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616340] : ((1376616340, 1.7781591264735965, 492688767), (1376616340, 0.743575084422329, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616339] : ((1376616339, 2.5296744098632713, 492688767), (1376616339, 0.3728037370506675, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616338] : ((1376616338, 0.789852865277291, 492688767), (1376616338, 0.8481564506187402, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616336] : ((1376616336, 3.977393841840734, 492688767), (1376616336, 0.6308607960839308, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616255] : ((1376616255, 0.4574945920813751, 492688767), (1376616255, 0.7199226451403051, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616247] : ((1376616247, 2.1094321425233478, 492688767), (1376616247, 0.5395384477211598, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616225] : ((1376616225, 1.1030952925543622, 492688767), (1376616225, 0.6650284100667048, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616190] : ((1376616190, 1.8472318561573158, 492688767), (1376616190, 0.46648067438169294, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616156] : ((1376616156, 0.11795734540366531, 492688767), (1376616156, 0.47911083961784817, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376616121] : ((1376616121, -2.86325021768498, 492609224), (1376616121, 0.37027099325214774, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615988] : ((1376615988, -3.6931196348050146, 492609224), (1376615988, 0.3647674819217144, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615939] : ((1376615939, 0.5398208005083787, 492688767), (1376615939, 0.523048317335369, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615935] : ((1376615935, -0.051077777492504965, 492688767), (1376615935, 0.2920529994793965, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615933] : ((1376615933, 0.4067163189740742, 492688767), (1376615933, 0.334353624946272, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615912] : ((1376615912, 0.9258607613406578, 492688767), (1376615912, 0.5587863571912097, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615910] : ((1376615910, 0.3834055549562788, 492688767), (1376615910, 0.4599012824123494, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615907] : ((1376615907, 0.7404548500047579, 492688767), (1376615907, 0.512917080754255, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615906] : ((1376615906, -0.04808053514134563, 492688767), (1376615906, 0.777185886800979, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615905] : ((1376615905, -3.9595207802598558, 492609224), (1376615905, 0.31371218091570335, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615901] : ((1376615901, 1.8610503654734172, 492688767), (1376615901, 0.7279091973367439, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615680] : ((1376615680, 0.2091622960434755, 492688767), (1376615680, 0.2573463062001756, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615677] : ((1376615677, 0.3388924488030405, 492688767), (1376615677, 0.36430553093223883, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615673] : ((1376615673, 1.7086942679766672, 492688767), (1376615673, 0.5324524659408743, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615668] : ((1376615668, 0.2024731908591126, 492688767), (1376615668, 0.5485423061302553, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615663] : ((1376615663, 1.3870671611371004, 492688767), (1376615663, 0.8662648868660462, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615636] : ((1376615636, 0.955947786499488, 492688767), (1376615636, 0.5443108730096623, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615433] : ((1376615433, 0.9839343945695964, 492688767), (1376615433, 1.023279959307712, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615400] : ((1376615400, 1.8722194538548635, 492688767), (1376615400, 0.8939634442149942, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615357] : ((1376615357, 0.021510936326406774, 492688767), (1376615357, 0.5690509679722371, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com args[1376615325] : ((1376615325, -0.07726434110938954, 492688767), (1376615325, 0.7811175779564342, 2107752395), '0.054940908926504634') We are sending mail with results at report@fotonower.com refus_total : 0.054940908926504634 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=25915180 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25915180_12-08-2025_14_43_04.pdf results_Auto_P25915180_12-08-2025_14_43_04.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25915180_12-08-2025_14_43_04.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','25915180','results_Auto_P25915180_12-08-2025_14_43_04.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25915180_12-08-2025_14_43_04.pdf','pdf','','0.38','0.054940908926504634') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/25915180

https://www.fotonower.com/image?json=false&list_photos_id=1376616345
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.3678124973524015)
https://www.fotonower.com/image?json=false&list_photos_id=1376616343
La photo est trop floue, merci de reprendre une photo.(avec le score = 2.7483875309323076)
https://www.fotonower.com/image?json=false&list_photos_id=1376616340
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.7781591264735965)
https://www.fotonower.com/image?json=false&list_photos_id=1376616339
La photo est trop floue, merci de reprendre une photo.(avec le score = 2.5296744098632713)
https://www.fotonower.com/image?json=false&list_photos_id=1376616338
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376616336
La photo est trop floue, merci de reprendre une photo.(avec le score = 3.977393841840734)
https://www.fotonower.com/image?json=false&list_photos_id=1376616255
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376616247
La photo est trop floue, merci de reprendre une photo.(avec le score = 2.1094321425233478)
https://www.fotonower.com/image?json=false&list_photos_id=1376616225
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.1030952925543622)
https://www.fotonower.com/image?json=false&list_photos_id=1376616190
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.8472318561573158)
https://www.fotonower.com/image?json=false&list_photos_id=1376616156
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376616121
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615988
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615939
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615935
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615933
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615912
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615910
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615907
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615906
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615905
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615901
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.8610503654734172)
https://www.fotonower.com/image?json=false&list_photos_id=1376615680
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615677
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615673
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.7086942679766672)
https://www.fotonower.com/image?json=false&list_photos_id=1376615668
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615663
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.3870671611371004)
https://www.fotonower.com/image?json=false&list_photos_id=1376615636
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615433
La photo est trop floue, merci de reprendre une photo.(avec le score = 0.9839343945695964)
https://www.fotonower.com/image?json=false&list_photos_id=1376615400
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.8722194538548635)
https://www.fotonower.com/image?json=false&list_photos_id=1376615357
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376615325
Bravo, la photo est bien prise.

Dans ces conditions,le taux de refus est: 5.49%
Veuillez trouver les photos des contaminants.

exemples de contaminants: papier: https://www.fotonower.com/view/25915281?limit=200
exemples de contaminants: autre: https://www.fotonower.com/view/25915284?limit=200
exemples de contaminants: pet_fonce: https://www.fotonower.com/view/25915285?limit=200
exemples de contaminants: pet_clair: https://www.fotonower.com/view/25915286?limit=200
exemples de contaminants: carton: https://www.fotonower.com/view/25915288?limit=200
exemples de contaminants: metal: https://www.fotonower.com/view/25915290?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25915180_12-08-2025_14_43_04.pdf.

Lien vers velours :https://www.fotonower.com/velours/25915280,25915281,25915282,25915283,25915284,25915285,25915286,25915287,25915288,25915289,25915290?tags=environnement,papier,flou,pehd,autre,pet_fonce,pet_clair,background,carton,mal_croppe,metal.


L'équipe Fotonower 202 b'' Server: nginx Date: Tue, 12 Aug 2025 12:43:12 GMT Content-Length: 0 Connection: close X-Message-Id: QX8UyiBzSu6A3NdyJ3YR8A Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1376616345, 1376616343, 1376616340, 1376616339, 1376616338, 1376616336, 1376616255, 1376616247, 1376616225, 1376616190, 1376616156, 1376616121, 1376615988, 1376615939, 1376615935, 1376615933, 1376615912, 1376615910, 1376615907, 1376615906, 1376615905, 1376615901, 1376615680, 1376615677, 1376615673, 1376615668, 1376615663, 1376615636, 1376615433, 1376615400, 1376615357, 1376615325] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616345', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616343', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616340', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616339', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616338', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616336', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616255', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616247', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616225', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616190', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616156', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616121', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615988', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615939', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615935', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615933', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615912', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615910', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615907', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615906', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615905', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615901', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615680', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615677', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615673', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615668', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615663', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615636', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615433', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615400', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615357', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615325', None, None, None, None, None, '3512843') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 32 time used for this insertion : 0.017572641372680664 save_final save missing photos in datou_result : time spend for datou_step_exec : 7.959237813949585 time spend to save output : 0.017981767654418945 total time spend for step 9 : 7.977219581604004 step10:split_time_score Tue Aug 12 14:43:12 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('14', 32),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 12082025 25915180 Nombre de photos uploadées : 32 / 23040 (0%) 12082025 25915180 Nombre de photos taguées (types de déchets): 0 / 32 (0%) 12082025 25915180 Nombre de photos taguées (volume) : 0 / 32 (0%) elapsed_time : load_data_split_time_score 2.1457672119140625e-06 elapsed_time : order_list_meta_photo_and_scores 6.4373016357421875e-06 ???????????????????????????????? elapsed_time : fill_and_build_computed_from_old_data 0.0014204978942871094 Catched exception ! Connect or reconnect ! Catched exception ! Connect or reconnect ! elapsed_time : insert_dashboard_record_day_entry 0.19495224952697754 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.08963755666473765 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25907306_12-08-2025_12_02_14.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25907306 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25907306 AND mptpi.`type`=3594 To do Qualite : 0.2198127786351166 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25908856_12-08-2025_12_01_48.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25908856 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25908856 AND mptpi.`type`=3594 To do Qualite : 0.11567090446566358 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25908860_12-08-2025_13_02_05.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25908860 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25908860 AND mptpi.`type`=3594 To do Qualite : 0.03421356577932099 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25908864_12-08-2025_11_41_35.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25908864 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25908864 AND mptpi.`type`=3594 To do Qualite : 0.09985528962742503 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25907315_12-08-2025_12_41_49.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25907315 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25907315 AND mptpi.`type`=3594 To do Qualite : 0.24285493827160495 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25907317_12-08-2025_13_21_45.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25907317 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25907317 AND mptpi.`type`=3594 To do Qualite : 0.021655971995279587 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25907323_12-08-2025_11_22_07.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25907323 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25907323 AND mptpi.`type`=3594 To do Qualite : 0.06390883349867725 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25907331_12-08-2025_12_21_43.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25907331 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25907331 AND mptpi.`type`=3594 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25915170 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25915171 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25915172 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25915175 order by id desc limit 1 Qualite : 0.054940908926504634 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25915180_12-08-2025_14_43_04.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25915180 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25915180 AND mptpi.`type`=3594 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'12082025': {'nb_upload': 32, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1376616345, 1376616343, 1376616340, 1376616339, 1376616338, 1376616336, 1376616255, 1376616247, 1376616225, 1376616190, 1376616156, 1376616121, 1376615988, 1376615939, 1376615935, 1376615933, 1376615912, 1376615910, 1376615907, 1376615906, 1376615905, 1376615901, 1376615680, 1376615677, 1376615673, 1376615668, 1376615663, 1376615636, 1376615433, 1376615400, 1376615357, 1376615325] Looping around the photos to save general results len do output : 1 /25915180Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616345', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616343', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616340', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616339', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616338', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616336', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616255', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616247', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616225', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616190', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616156', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376616121', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615988', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615939', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615935', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615933', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615912', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615910', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615907', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615906', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615905', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615901', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615680', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615677', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615673', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615668', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615663', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615636', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615433', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615400', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615357', None, None, None, None, None, '3512843') ('3318', None, None, None, None, None, None, None, '3512843') ('3318', '25915180', '1376615325', None, None, None, None, None, '3512843') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 33 time used for this insertion : 0.018408536911010742 save_final save missing photos in datou_result : time spend for datou_step_exec : 4.656914949417114 time spend to save output : 0.01874089241027832 total time spend for step 10 : 4.675655841827393 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 32 set_done_treatment 90.10user 41.73system 2:51.86elapsed 76%CPU (0avgtext+0avgdata 3528528maxresident)k 504792inputs+36904outputs (13major+2891477minor)pagefaults 0swaps