python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120
import MySQLdb succeeded
Import error (python version)
['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 3750635
load datou : 3318
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case. It is better to keep an order compatible with the step ids when there are no sons, so a lexical order : (number_son, step_id)
All sons are already in current list ! (printed 9 times)
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
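The reordering described above (topological order over the step graph, breaking ties lexically by `(number_son, step_id)`, with a cycle check equivalent to `checkNoCycle`) can be sketched as follows. This is a minimal reimplementation for illustration; the names `reorder_steps` and `deps` are hypothetical and the real datou data structures differ.

```python
import heapq

def reorder_steps(steps, deps):
    """Topologically order step ids, breaking ties lexically by
    (number_of_sons, step_id) as the log describes.

    steps : iterable of step ids
    deps  : dict mapping step_id -> set of parent step_ids it depends on
    (illustrative names; the real datou structures are not shown in the log)
    """
    steps = list(steps)
    children = {s: set() for s in steps}
    indegree = {s: 0 for s in steps}
    for s, parents in deps.items():
        for p in parents:
            children[p].add(s)
            indegree[s] += 1
    # ready steps, kept ordered by the lexical key (number_of_sons, step_id)
    ready = [(len(children[s]), s) for s in steps if indegree[s] == 0]
    heapq.heapify(ready)
    order = []
    while ready:
        _, s = heapq.heappop(ready)
        order.append(s)
        for c in sorted(children[s]):
            indegree[c] -= 1
            if indegree[c] == 0:
                heapq.heappush(ready, (len(children[c]), c))
    if len(order) != len(steps):
        raise ValueError("cycle detected in the datou graph")  # checkNoCycle analogue
    return order
```

With no dependencies the heap key falls back to step id, which gives the "order compatible with the id of steps" behaviour the log asks for.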
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of outputs/inputs types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
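The count-consistency rule behind these messages is simple: more used than defined is a hard WARNING, fewer used than defined is tolerated (optional inputs, unused outputs). A sketch of that logic, with a hypothetical `check_nb_io` helper (the real `checkConsistencyNbInputNbOutput` code is not shown in the log):

```python
def check_nb_io(step_id, name, used_in, used_out, def_in, def_out):
    """Reproduce the count-consistency messages from the log (a sketch;
    the real check lives in checkConsistencyNbInputNbOutput)."""
    msgs = []
    if used_in < def_in:
        msgs.append(f"Step {step_id} {name} has fewer inputs used ({used_in}) than in "
                    f"the step definition ({def_in}) : maybe we manage optional inputs !")
    elif used_in > def_in:
        msgs.append(f"WARNING : number of inputs for step {step_id} {name} is not "
                    f"consistent : {used_in} used against {def_in} in the step definition !")
    if used_out < def_out:
        msgs.append(f"Step {step_id} {name} has fewer outputs used ({used_out}) than in "
                    f"the step definition ({def_out}) : some outputs may not be used !")
    elif used_out > def_out:
        msgs.append(f"WARNING : number of outputs for step {step_id} {name} is not "
                    f"consistent : {used_out} used against {def_out} in the step definition !")
    return msgs
```

For example, step 8092 crop_condition above (1 input used of 2 defined, 4 outputs used of 3 defined) yields exactly one "optional inputs" note and one output WARNING.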
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0)
Tried to parse :
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
[(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed, should we ?
photo path was removed, should we ?
[ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed, should we ?
photo path was removed, should we ?
photo id (may be local or global) was removed, should we ?
photo path was removed, should we ?
(x0, y0, x1, y1) was removed, should we ?
photo path was removed, should we ?
data as text was removed, should we ?
[ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed, should we ?
None was removed, should we ?
data as text was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo id (may be local or global) was removed, should we ?
data as text was removed, should we ?
data as text was removed, should we ?
data as text was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
None was removed, should we ?
data as a number was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
data as text was removed, should we ?
None was removed, should we ?
data as text was removed, should we ?
[ptf_id0,ptf_id1...] was removed, should we ?
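The "can't parse json string Expecting value: line 1 column 1 (char 0)" message above is what `json.loads` raises on an empty string; the pipeline evidently catches it and falls back to an empty `list_input_json`. A minimal sketch of that tolerant parse (the helper name `parse_json_or_default` is hypothetical):

```python
import json

def parse_json_or_default(raw, default=None):
    """json.loads with the failure mode seen in the log: an empty or
    non-JSON string raises 'Expecting value: line 1 column 1 (char 0)',
    which we catch and turn into a default value (a sketch)."""
    if default is None:
        default = []
    try:
        return json.loads(raw)
    except (json.JSONDecodeError, TypeError) as exc:
        print(f"ERROR or WARNING : can't parse json string {exc}")
        print(f"Tried to parse : {raw!r}")
        return default
```

`parse_json_or_default("")` reproduces the two log lines and returns `[]`, matching the later `list_input_json: []` output.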
FOUND : 1
Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39))
load thcls
load THCL from format json or kwargs
add thcl : 2847 in CacheModelConfig
load pdts
add pdt : 5275 in CacheModelConfig
Running datou job : batch_current
TODO datou_current to load, to do : maybe to take outside batchDatouExec
updating current state to 1
list_input_json: []
Current got : datou_id : 3318, datou_cur_ids : ['3371491'] with mtr_portfolio_ids : ['25403226'] and first list_photo_ids : []
new path : /proc/3750635/
Inside batchDatouExec : verbose : 0
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case. It is better to keep an order compatible with the step ids when there are no sons, so a lexical order : (number_son, step_id)
All sons are already in current list ! (printed 9 times)
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of outputs/inputs types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score
over limit max, limiting to limit_max 40
list_input_json : []
origin We have 1 , BFBFBFBFBFBFBF
we have 0 photos missing in the step downloads : photo missing : []
try to delete the photos missing in DB
length of list_filenames : 7 ; length of list_pids : 7 ; length of list_args : 7
time to download the photos : 1.197357416152954
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
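The "we have 0 photos missing ... photo missing : []" bookkeeping above amounts to a set difference between requested and downloaded photo ids. A tiny sketch (the function name `missing_photos` is hypothetical; the real download code is not shown):

```python
def missing_photos(requested_ids, downloaded_ids):
    """Which photo ids failed to download, preserving request order
    (a sketch of the 'photos missing in the step downloads' check)."""
    got = set(downloaded_ids)
    missing = [p for p in requested_ids if p not in got]
    print(f"we have {len(missing)} photos missing in the step downloads : "
          f"photo missing : {missing}")
    return missing
```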
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 10
step1:mask_detect Mon Jul 28 14:20:31 2025
VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 10512
max_wait_temp : 1 max_wait : 0 gpu_flag : 0
2025-07-28 14:20:34.990118: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2025-07-28 14:20:35.023544: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493035000 Hz
2025-07-28 14:20:35.026158: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f04e8000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2025-07-28 14:20:35.026247: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2025-07-28 14:20:35.032110: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2025-07-28 14:20:35.261444: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x32dba710 initialized for platform CUDA (this does not guarantee that XLA will be used).
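The "check gpu memory ... free memory gpu now : 10512" gate before loading TensorFlow could be implemented by querying nvidia-smi; what the actual datou code does here is not shown in the log, so the following is only a plausible sketch. The `raw` parameter lets the parsing be exercised without a GPU:

```python
import subprocess

def free_gpu_memory_mib(raw=None):
    """Free GPU memory in MiB via nvidia-smi (a sketch of what the
    'check gpu memory' step could do; the real code is not shown).
    Pass `raw` to parse canned nvidia-smi output, e.g. in tests."""
    if raw is None:
        raw = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.free",
             "--format=csv,noheader,nounits"], text=True)
    # one line per GPU; keep the first device, as the log shows a single GPU
    return int(raw.strip().splitlines()[0])
```

A caller would then compare the returned value against a threshold and set something like the log's `gpu_flag`, waiting up to `max_wait` before giving up.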
Devices:
2025-07-28 14:20:35.261553: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5
2025-07-28 14:20:35.263068: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-07-28 14:20:35.263689: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-07-28 14:20:35.266605: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-07-28 14:20:35.270607: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-07-28 14:20:35.271386: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-07-28 14:20:35.274105: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-07-28 14:20:35.275485: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-07-28 14:20:35.280279: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-07-28 14:20:35.283228: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-07-28 14:20:35.283324: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-07-28 14:20:35.285722: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-07-28 14:20:35.285740: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-07-28 14:20:35.285749: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-07-28 14:20:35.289523: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9597 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead.
2025-07-28 14:20:35.691791: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-07-28 14:20:35.691888: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-07-28 14:20:35.691908: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-07-28 14:20:35.691926: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-07-28 14:20:35.691943: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-07-28 14:20:35.691960: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-07-28 14:20:35.691977: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-07-28 14:20:35.691994: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-07-28 14:20:35.693278: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-07-28 14:20:35.694486: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-07-28 14:20:35.694526: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-07-28 14:20:35.694544: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-07-28 14:20:35.694560: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-07-28 14:20:35.694576: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-07-28 14:20:35.694592: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-07-28 14:20:35.694608: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-07-28 14:20:35.694624: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-07-28 14:20:35.695924: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-07-28 14:20:35.695963: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-07-28 14:20:35.695974: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-07-28 14:20:35.695984: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-07-28 14:20:35.697324: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9597 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0,
compute capability: 7.5)
Using TensorFlow backend.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version.
Instructions for updating: box_ind is deprecated, use box_indices instead
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
Inside mask_sub_process
Inside mask_detect
About to load cache.load_thcl_param
To do loadFromThcl(), then load ParamDescType : thcl2847
thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}]
thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}
Update svm_hashtag_type_desc : 5275
FOUND : 1
Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39))
{'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0}
list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement']
Configurations:
BACKBONE                       resnet101
BACKBONE_SHAPES                [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]]
BACKBONE_STRIDES               [4, 8, 16, 32, 64]
BATCH_SIZE                     1
BBOX_STD_DEV                   [0.1 0.1 0.2 0.2]
DETECTION_MAX_INSTANCES        100
DETECTION_MIN_CONFIDENCE       0.3
DETECTION_NMS_THRESHOLD        0.3
GPU_COUNT                      1
IMAGES_PER_GPU                 1
IMAGE_MAX_DIM                  640
IMAGE_MIN_DIM                  640
IMAGE_PADDING                  True
IMAGE_SHAPE                    [640 640 3]
LEARNING_MOMENTUM              0.9
LEARNING_RATE                  0.001
LOSS_WEIGHTS                   {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0}
MASK_POOL_SIZE                 14
MASK_SHAPE                     [28, 28]
MAX_GT_INSTANCES               100
MEAN_PIXEL                     [123.7 116.8 103.9]
MINI_MASK_SHAPE                (56, 56)
NAME                           learn_RUBBIA_REFUS_AMIENS_23
NUM_CLASSES                    9
POOL_SIZE                      7
POST_NMS_ROIS_INFERENCE        1000
POST_NMS_ROIS_TRAINING         2000
ROI_POSITIVE_RATIO             0.33
RPN_ANCHOR_RATIOS              [0.5, 1, 2]
RPN_ANCHOR_SCALES              (16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE              1
RPN_BBOX_STD_DEV               [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD              0.7
RPN_TRAIN_ANCHORS_PER_IMAGE    256
STEPS_PER_EPOCH                1000
TRAIN_ROIS_PER_IMAGE           200
USE_MINI_MASK                  True
USE_RPN_ROIS                   True
VALIDATION_STEPS               50
WEIGHT_DECAY                   0.0001
model_param file didn't exist
model_name : learn_RUBBIA_REFUS_AMIENS_23
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files present in s3 : ['mask_model.h5']
files missing in s3 : []
2025-07-28 14:20:49.224154: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-07-28 14:20:49.432979: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23
/data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5
size_local : 256009536 ; size in s3 : 256009536
create time local : 2021-08-09 09:43:22 ; create time in s3 : 2021-08-06 18:54:04
mask_model.h5 already exists and didn't need an update
list_images length : 7
NEW PHOTO
Processing 1 images
image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 17) min: 0.00000 max: 3840.00000
number of objects found : 24
NEW PHOTO
Processing 1 images
image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 17) min: 0.00000 max: 3840.00000
number of objects found : 39
NEW PHOTO
Processing 1 images
image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 17) min: 0.00000 max: 3840.00000
number of objects found : 47
NEW PHOTO
Processing 1 images
image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 17) min: 0.00000 max: 3840.00000
number of objects found : 52
NEW PHOTO
Processing 1 images
image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 17) min: 0.00000 max: 3840.00000
number of objects found : 44
NEW PHOTO
Processing 1 images
image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 17) min: 0.00000 max: 3840.00000
number of objects found : 38
NEW PHOTO
Processing 1 images
image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 17) min: 0.00000 max: 3840.00000
number of objects found : 30
Detection mask done !
Trying to reset tf kernel 3751374
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 5223
tf kernel not reset
sub process len(results) : 7 len(list_Values) 0
None
max_time_sub_proc : 3600
parent process len(results) : 7 len(list_Values) 0
process is alive
finish correctly or not : True
after detect
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 10293
list_Values should be empty []
To do loadFromThcl(), then load ParamDescType : thcl2847
Caught exception ! Connect or reconnect !
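The molded_images range printed above (min -123.70000, max 151.10000) is plain mean-pixel subtraction with the `MEAN_PIXEL [123.7 116.8 103.9]` from the Configurations dump: a 0-valued pixel maps to 0 - 123.7 = -123.7, and a 255-valued blue channel maps to 255 - 103.9 = 151.1. A sketch (resizing to `IMAGE_SHAPE [640 640 3]`, which the real molding also does, is omitted here):

```python
import numpy as np

MEAN_PIXEL = np.array([123.7, 116.8, 103.9])  # from the Configurations dump

def mold_pixels(image):
    # mean-pixel subtraction only; the real pipeline also resizes the
    # image to IMAGE_SHAPE [640 640 3], which this sketch omits
    return image.astype(np.float64) - MEAN_PIXEL

# a uint8 image spanning 0..255 reproduces the molded range from the log
img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = 255
molded = mold_pixels(img)
print(molded.min(), molded.max())  # -123.7 and 151.1 (up to float rounding)
```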
thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}]
thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}
Update svm_hashtag_type_desc : 5275
['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement']
time to compute the mask position with numpy : 0.0009331703186035156 ; nb_pixel_total : 12606 ; time to create 1 rle with old method : 0.01661992073059082 ; length of segment : 145
time to compute the mask position with numpy : 0.0004210472106933594 ; nb_pixel_total : 20564 ; time to create 1 rle with old method : 0.02687215805053711 ; length of segment : 189
time to compute the mask position with numpy : 0.0022802352905273438 ; nb_pixel_total : 91940 ; time to create 1 rle with old method : 0.11050868034362793 ; length of segment : 435
time to compute the mask position with numpy : 0.0014462471008300781 ; nb_pixel_total : 71141 ; time to create 1 rle with old method : 0.08116292953491211 ; length of segment : 476
time to compute the mask position with numpy : 0.0016791820526123047 ; nb_pixel_total : 85843 ; time to create 1 rle with old method : 0.10831332206726074 ; length of segment : 447
time to compute the mask position with numpy : 0.0017139911651611328 ; nb_pixel_total : 82263 ; time to create 1 rle with old method : 0.09694290161132812 ; length of segment : 699
time to compute the mask position with numpy : 0.0010001659393310547 ; nb_pixel_total : 41217 ; time to create 1 rle with old method : 0.058183908462524414 ; length of segment : 250
time to compute the mask position with numpy : 0.00124359130859375 ; nb_pixel_total : 66541 ; time to create 1 rle with old method : 0.0820157527923584 ; length of segment : 336
time to compute the mask position with numpy : 0.0003952980041503906 ; nb_pixel_total : 14793 ; time to create 1 rle with old method : 0.01888132095336914 ; length of segment : 173
time to compute the mask position with numpy : 0.0010666847229003906 ; nb_pixel_total : 39743 ; time to create 1 rle with old method : 0.05272507667541504 ; length of segment : 321
time to compute the mask position with numpy : 0.0005633831024169922 ; nb_pixel_total : 22976 ; time to create 1 rle with old method : 0.027143239974975586 ; length of segment : 172
time to compute the mask position with numpy : 0.0007007122039794922 ; nb_pixel_total : 35410 ; time to create 1 rle with old method : 0.041133880615234375 ; length of segment : 235
time to compute the mask position with numpy : 0.0017957687377929688 ; nb_pixel_total : 90421 ; time to create 1 rle with old method : 0.1120462417602539 ; length of segment : 407
time to compute the mask position with numpy : 0.0010333061218261719 ; nb_pixel_total : 54019 ; time to create 1 rle with old method : 0.06075143814086914 ; length of segment : 368
time to compute the mask position with numpy : 0.0004177093505859375 ; nb_pixel_total : 14979 ; time to create 1 rle with old method : 0.01746845245361328 ; length of segment : 216
time to compute the mask position with numpy : 0.0002617835998535156 ; nb_pixel_total : 11023 ; time to create 1 rle with old method : 0.012915611267089844 ; length of segment : 112
time to compute the mask position with numpy : 0.00025653839111328125 ; nb_pixel_total : 7915 ; time to create 1 rle with old method : 0.009349822998046875 ; length of segment : 111
time to compute the mask position with numpy : 0.00030994415283203125 ; nb_pixel_total : 12047 ; time to create 1 rle with old method : 0.014219999313354492 ; length of segment : 143
time to compute the mask position with numpy : 0.0009806156158447266 ; nb_pixel_total : 37391 ; time to create 1 rle with old method : 0.04296135902404785 ; length of segment : 326
time to compute the mask position with numpy : 0.00033020973205566406 ; nb_pixel_total : 9622 ; time to create 1 rle with old method : 0.011565685272216797 ; length of segment : 126
time to compute the mask position with numpy : 0.0007429122924804688 ; nb_pixel_total : 28250 ; time to create 1 rle with old method : 0.033492326736450195 ; length of segment : 194
time to compute the mask position with numpy : 0.0014371871948242188 ; nb_pixel_total : 42735 ; time to create 1 rle with old method : 0.05400705337524414 ; length of segment : 549
time to compute the mask position with numpy : 0.007241964340209961 ; nb_pixel_total : 243694 ; time to create 1 rle with new method : 0.01908397674560547 ; length of segment : 875
time to compute the mask position with numpy : 0.0005962848663330078 ; nb_pixel_total : 24523 ; time to create 1 rle with old method : 0.0354459285736084 ; length of segment : 280
time to compute the mask position with numpy : 0.0005686283111572266 ; nb_pixel_total : 28228 ; time to create 1 rle with old method : 0.03071761131286621 ; length of segment : 299
time to compute the mask position with numpy : 0.00026917457580566406 ; nb_pixel_total : 11301 ; time to create 1 rle with old method : 0.01276087760925293 ; length of segment : 97
time to compute the mask position with numpy : 0.00040984153747558594 ; nb_pixel_total : 21353 ; time to create 1 rle with old method : 0.02436542510986328 ; length of segment : 222
time to compute the mask position with numpy : 0.000408172607421875 ; nb_pixel_total : 15431 ; time to create 1 rle with old method : 0.01751255989074707 ; length of segment : 215
time to compute the mask position with numpy : 0.003244638442993164 ; nb_pixel_total : 182355 ; time to create 1 rle with new method : 0.012303352355957031 ; length of segment : 867
time to compute the mask position with numpy : 0.0009958744049072266 ; nb_pixel_total : 61611 ; time to create 1 rle with old method : 0.06964612007141113 ; length of segment : 296
time to compute the mask position with numpy : 0.00036334991455078125 ; nb_pixel_total : 18453 ; time to create 1 rle with old method : 0.020775556564331055 ; length of segment : 121
time to compute the mask position with numpy : 0.0014803409576416016 ; nb_pixel_total : 34791 ; time to create 1 rle with old method : 0.03849482536315918 ; length of segment : 270
time to compute the mask position with numpy : 0.0006678104400634766 ; nb_pixel_total : 23675 ; time to create 1 rle with old method : 0.027341842651367188 ; length of segment : 197
time to compute the mask position with numpy : 0.0012526512145996094 ; nb_pixel_total : 38707 ; time to create 1 rle with old method : 0.043245792388916016 ; length of segment : 245
time to compute the mask position with numpy : 0.0012602806091308594 ; nb_pixel_total : 26638 ; time to create 1 rle with old method : 0.03066563606262207 ; length of segment : 248
time to compute the mask position with numpy : 0.0032863616943359375 ; nb_pixel_total : 72088 ; time to create 1 rle with old method : 0.07951068878173828 ; length of segment : 699
time to compute the mask position with numpy : 0.004240512847900391 ; nb_pixel_total : 92005 ; time to create 1 rle with old method : 0.1015467643737793 ; length of segment : 401
time to compute the mask position with numpy : 0.0010716915130615234 ; nb_pixel_total : 23169 ; time to create 1 rle with old method : 0.025539159774780273 ; length of segment : 195
time to compute the mask position with numpy : 0.0009591579437255859 ; nb_pixel_total : 18672 ; time to create 1 rle with old method : 0.02118206024169922 ; length of segment : 231
time to compute the mask position with numpy : 0.0026116371154785156 ; nb_pixel_total : 51300 ; time to create 1 rle with old method : 0.06126761436462402 ; length of segment : 284
time to compute the mask position with numpy : 0.00484466552734375 ; nb_pixel_total : 111362 ; time to create 1 rle with old method : 0.12191224098205566 ; length of segment : 410
time to compute the mask position with numpy : 0.0018486976623535156 ; nb_pixel_total : 43974 ; time to create 1 rle with old method : 0.049187660217285156 ; length of segment : 243
time to compute the mask position with numpy : 0.001955747604370117 ; nb_pixel_total : 49322 ; time to create 1 rle with old method : 0.05601191520690918 ; length of segment : 264
time to compute the mask position with numpy : 0.00047135353088378906 ; nb_pixel_total : 8157 ; time to create 1 rle with old method : 0.009357213973999023 ; length of segment : 111
time to compute the mask position with numpy : 0.0002307891845703125 ; nb_pixel_total : 9221 ; time to create 1 rle with old method : 0.010802507400512695 ; length of segment : 73
time to compute the mask position with numpy : 0.0006871223449707031 ; nb_pixel_total : 15855 ; time to create 1 rle with old method : 0.018151044845581055 ; length of segment : 118
time to compute the mask position with numpy : 0.004200935363769531 ; nb_pixel_total : 132903 ; time to create 1 rle with old method : 0.14545869827270508 ; length of segment : 366
time to compute the mask position with numpy : 0.0015344619750976562 ; nb_pixel_total : 24772 ; time to create 1 rle with old method : 0.02762627601623535 ; length of segment : 414
time to compute the mask position with numpy : 0.0034439563751220703 ; nb_pixel_total : 51372 ; time to create 1 rle with old method : 0.05789375305175781 ; length of segment : 453
time to compute the mask position with numpy : 0.005103588104248047 ; nb_pixel_total : 104871 ; time to create 1 rle with old method : 0.11727261543273926 ; length of segment : 607
time to compute the mask position with numpy : 0.005239009857177734 ; nb_pixel_total : 115379 ; time to create 1 rle with old method : 0.1296236515045166 ; length of segment : 410
time to compute the mask position with numpy : 0.018028736114501953 ; nb_pixel_total : 268737 ; time to create 1 rle with new
method : 0.03143954277038574 length of segment : 833 time for calcul the mask position with numpy : 0.0012676715850830078 nb_pixel_total : 31066 time to create 1 rle with old method : 0.037091732025146484 length of segment : 236 time for calcul the mask position with numpy : 0.0018055438995361328 nb_pixel_total : 37523 time to create 1 rle with old method : 0.0433659553527832 length of segment : 286 time for calcul the mask position with numpy : 0.0014150142669677734 nb_pixel_total : 31624 time to create 1 rle with old method : 0.03653311729431152 length of segment : 253 time for calcul the mask position with numpy : 0.0014142990112304688 nb_pixel_total : 24853 time to create 1 rle with old method : 0.028927326202392578 length of segment : 267 time for calcul the mask position with numpy : 0.005715131759643555 nb_pixel_total : 286053 time to create 1 rle with new method : 0.027933359146118164 length of segment : 717 time for calcul the mask position with numpy : 0.001344442367553711 nb_pixel_total : 22706 time to create 1 rle with old method : 0.02685856819152832 length of segment : 258 time for calcul the mask position with numpy : 0.0009741783142089844 nb_pixel_total : 17533 time to create 1 rle with old method : 0.020655393600463867 length of segment : 257 time for calcul the mask position with numpy : 0.004730701446533203 nb_pixel_total : 105598 time to create 1 rle with old method : 0.12015748023986816 length of segment : 673 time for calcul the mask position with numpy : 0.0012824535369873047 nb_pixel_total : 26336 time to create 1 rle with old method : 0.03128314018249512 length of segment : 152 time for calcul the mask position with numpy : 0.002073526382446289 nb_pixel_total : 35741 time to create 1 rle with old method : 0.041383981704711914 length of segment : 297 time for calcul the mask position with numpy : 0.0020966529846191406 nb_pixel_total : 53558 time to create 1 rle with old method : 0.061804771423339844 length of segment : 257 time for calcul 
the mask position with numpy : 0.0023260116577148438 nb_pixel_total : 50402 time to create 1 rle with old method : 0.05832529067993164 length of segment : 299 time for calcul the mask position with numpy : 0.0027043819427490234 nb_pixel_total : 35390 time to create 1 rle with old method : 0.041306257247924805 length of segment : 375 time for calcul the mask position with numpy : 0.002585172653198242 nb_pixel_total : 69106 time to create 1 rle with old method : 0.07846689224243164 length of segment : 310 time for calcul the mask position with numpy : 0.01314401626586914 nb_pixel_total : 314287 time to create 1 rle with new method : 0.025746822357177734 length of segment : 1452 time for calcul the mask position with numpy : 0.0021390914916992188 nb_pixel_total : 38237 time to create 1 rle with old method : 0.04286456108093262 length of segment : 227 time for calcul the mask position with numpy : 0.0013954639434814453 nb_pixel_total : 38106 time to create 1 rle with old method : 0.04285001754760742 length of segment : 259 time for calcul the mask position with numpy : 0.0007562637329101562 nb_pixel_total : 21437 time to create 1 rle with old method : 0.024617671966552734 length of segment : 241 time for calcul the mask position with numpy : 0.004794120788574219 nb_pixel_total : 69587 time to create 1 rle with old method : 0.07699036598205566 length of segment : 559 time for calcul the mask position with numpy : 0.0015211105346679688 nb_pixel_total : 51845 time to create 1 rle with old method : 0.05729866027832031 length of segment : 273 time for calcul the mask position with numpy : 0.0005443096160888672 nb_pixel_total : 12297 time to create 1 rle with old method : 0.014386177062988281 length of segment : 141 time for calcul the mask position with numpy : 0.0019881725311279297 nb_pixel_total : 43611 time to create 1 rle with old method : 0.04868197441101074 length of segment : 352 time for calcul the mask position with numpy : 0.0010843276977539062 nb_pixel_total : 
29264 time to create 1 rle with old method : 0.03335833549499512 length of segment : 355 time for calcul the mask position with numpy : 0.004271268844604492 nb_pixel_total : 99957 time to create 1 rle with old method : 0.11195039749145508 length of segment : 546 time for calcul the mask position with numpy : 0.0026645660400390625 nb_pixel_total : 44525 time to create 1 rle with old method : 0.05066180229187012 length of segment : 395 time for calcul the mask position with numpy : 0.0005178451538085938 nb_pixel_total : 12121 time to create 1 rle with old method : 0.013857603073120117 length of segment : 133 time for calcul the mask position with numpy : 0.0015854835510253906 nb_pixel_total : 22535 time to create 1 rle with old method : 0.02608013153076172 length of segment : 312 time for calcul the mask position with numpy : 0.003434896469116211 nb_pixel_total : 74357 time to create 1 rle with old method : 0.0810856819152832 length of segment : 457 time for calcul the mask position with numpy : 0.0010302066802978516 nb_pixel_total : 15101 time to create 1 rle with old method : 0.016977310180664062 length of segment : 198 time for calcul the mask position with numpy : 0.0007023811340332031 nb_pixel_total : 31103 time to create 1 rle with old method : 0.036034345626831055 length of segment : 184 time for calcul the mask position with numpy : 0.0016109943389892578 nb_pixel_total : 32978 time to create 1 rle with old method : 0.04484200477600098 length of segment : 315 time for calcul the mask position with numpy : 0.0013806819915771484 nb_pixel_total : 21004 time to create 1 rle with old method : 0.02467942237854004 length of segment : 432 time for calcul the mask position with numpy : 0.0025949478149414062 nb_pixel_total : 60835 time to create 1 rle with old method : 0.0701594352722168 length of segment : 288 time for calcul the mask position with numpy : 0.002924680709838867 nb_pixel_total : 57809 time to create 1 rle with old method : 0.06664395332336426 length of 
segment : 305 time for calcul the mask position with numpy : 0.0012404918670654297 nb_pixel_total : 24308 time to create 1 rle with old method : 0.02780914306640625 length of segment : 251 time for calcul the mask position with numpy : 0.0015494823455810547 nb_pixel_total : 38633 time to create 1 rle with old method : 0.0446622371673584 length of segment : 220 time for calcul the mask position with numpy : 0.0013718605041503906 nb_pixel_total : 29027 time to create 1 rle with old method : 0.03307533264160156 length of segment : 318 time for calcul the mask position with numpy : 0.0031397342681884766 nb_pixel_total : 86432 time to create 1 rle with old method : 0.09553742408752441 length of segment : 600 time for calcul the mask position with numpy : 0.0006721019744873047 nb_pixel_total : 15535 time to create 1 rle with old method : 0.018175125122070312 length of segment : 111 time for calcul the mask position with numpy : 0.0016293525695800781 nb_pixel_total : 29710 time to create 1 rle with old method : 0.03349566459655762 length of segment : 315 time for calcul the mask position with numpy : 0.0014765262603759766 nb_pixel_total : 38176 time to create 1 rle with old method : 0.04305124282836914 length of segment : 218 time for calcul the mask position with numpy : 0.0008306503295898438 nb_pixel_total : 19315 time to create 1 rle with old method : 0.022082090377807617 length of segment : 204 time for calcul the mask position with numpy : 0.0024852752685546875 nb_pixel_total : 73810 time to create 1 rle with old method : 0.08139967918395996 length of segment : 376 time for calcul the mask position with numpy : 0.004910945892333984 nb_pixel_total : 97142 time to create 1 rle with old method : 0.10593891143798828 length of segment : 393 time for calcul the mask position with numpy : 0.0015797615051269531 nb_pixel_total : 34532 time to create 1 rle with old method : 0.03796267509460449 length of segment : 345 time for calcul the mask position with numpy : 
0.0015060901641845703 nb_pixel_total : 28896 time to create 1 rle with old method : 0.03347158432006836 length of segment : 184 time for calcul the mask position with numpy : 0.0029447078704833984 nb_pixel_total : 79336 time to create 1 rle with old method : 0.08671736717224121 length of segment : 314 time for calcul the mask position with numpy : 0.000171661376953125 nb_pixel_total : 6460 time to create 1 rle with old method : 0.007378339767456055 length of segment : 143 time for calcul the mask position with numpy : 0.0032720565795898438 nb_pixel_total : 70654 time to create 1 rle with old method : 0.07649469375610352 length of segment : 434 time for calcul the mask position with numpy : 0.001495361328125 nb_pixel_total : 38729 time to create 1 rle with old method : 0.041673898696899414 length of segment : 281 time for calcul the mask position with numpy : 0.0018398761749267578 nb_pixel_total : 50267 time to create 1 rle with old method : 0.055036306381225586 length of segment : 453 time for calcul the mask position with numpy : 0.0009779930114746094 nb_pixel_total : 25711 time to create 1 rle with old method : 0.02899622917175293 length of segment : 147 time for calcul the mask position with numpy : 0.008881568908691406 nb_pixel_total : 155322 time to create 1 rle with new method : 0.018713951110839844 length of segment : 620 time for calcul the mask position with numpy : 0.002125263214111328 nb_pixel_total : 54389 time to create 1 rle with old method : 0.060648202896118164 length of segment : 283 time for calcul the mask position with numpy : 0.0014066696166992188 nb_pixel_total : 36159 time to create 1 rle with old method : 0.04076099395751953 length of segment : 327 time for calcul the mask position with numpy : 0.00039196014404296875 nb_pixel_total : 9481 time to create 1 rle with old method : 0.010959148406982422 length of segment : 78 time for calcul the mask position with numpy : 0.005419492721557617 nb_pixel_total : 172981 time to create 1 rle with new 
method : 0.008521080017089844 length of segment : 620 time for calcul the mask position with numpy : 0.0006597042083740234 nb_pixel_total : 11867 time to create 1 rle with old method : 0.013760089874267578 length of segment : 121 time for calcul the mask position with numpy : 0.0005612373352050781 nb_pixel_total : 13127 time to create 1 rle with old method : 0.015491247177124023 length of segment : 119 time for calcul the mask position with numpy : 0.0019893646240234375 nb_pixel_total : 31511 time to create 1 rle with old method : 0.03417348861694336 length of segment : 352 time for calcul the mask position with numpy : 0.0050506591796875 nb_pixel_total : 159178 time to create 1 rle with new method : 0.010203123092651367 length of segment : 577 time for calcul the mask position with numpy : 0.0010099411010742188 nb_pixel_total : 33300 time to create 1 rle with old method : 0.035968780517578125 length of segment : 207 time for calcul the mask position with numpy : 0.0011479854583740234 nb_pixel_total : 17278 time to create 1 rle with old method : 0.019778013229370117 length of segment : 224 time for calcul the mask position with numpy : 0.0032036304473876953 nb_pixel_total : 41463 time to create 1 rle with old method : 0.046416282653808594 length of segment : 695 time for calcul the mask position with numpy : 0.0011990070343017578 nb_pixel_total : 22596 time to create 1 rle with old method : 0.025768518447875977 length of segment : 192 time for calcul the mask position with numpy : 0.0013508796691894531 nb_pixel_total : 31468 time to create 1 rle with old method : 0.03540396690368652 length of segment : 322 time for calcul the mask position with numpy : 0.0023207664489746094 nb_pixel_total : 49042 time to create 1 rle with old method : 0.057364702224731445 length of segment : 404 time for calcul the mask position with numpy : 0.0005829334259033203 nb_pixel_total : 18241 time to create 1 rle with old method : 0.022092342376708984 length of segment : 131 time for 
calcul the mask position with numpy : 0.010164022445678711 nb_pixel_total : 199042 time to create 1 rle with new method : 0.025400400161743164 length of segment : 1153 time for calcul the mask position with numpy : 0.0007836818695068359 nb_pixel_total : 15099 time to create 1 rle with old method : 0.01660752296447754 length of segment : 175 time for calcul the mask position with numpy : 0.0022149085998535156 nb_pixel_total : 67908 time to create 1 rle with old method : 0.07544088363647461 length of segment : 330 time for calcul the mask position with numpy : 0.0013341903686523438 nb_pixel_total : 34351 time to create 1 rle with old method : 0.03719186782836914 length of segment : 302 time for calcul the mask position with numpy : 0.0010249614715576172 nb_pixel_total : 24161 time to create 1 rle with old method : 0.026555299758911133 length of segment : 223 time for calcul the mask position with numpy : 0.0012729167938232422 nb_pixel_total : 39089 time to create 1 rle with old method : 0.04333758354187012 length of segment : 298 time for calcul the mask position with numpy : 0.002379179000854492 nb_pixel_total : 37322 time to create 1 rle with old method : 0.04179191589355469 length of segment : 703 time for calcul the mask position with numpy : 0.004770517349243164 nb_pixel_total : 59709 time to create 1 rle with old method : 0.06661057472229004 length of segment : 484 time for calcul the mask position with numpy : 0.004935264587402344 nb_pixel_total : 149613 time to create 1 rle with old method : 0.1624009609222412 length of segment : 438 time for calcul the mask position with numpy : 0.001859903335571289 nb_pixel_total : 50574 time to create 1 rle with old method : 0.05521821975708008 length of segment : 498 time for calcul the mask position with numpy : 0.0015261173248291016 nb_pixel_total : 48044 time to create 1 rle with old method : 0.05393695831298828 length of segment : 241 time for calcul the mask position with numpy : 0.0007746219635009766 nb_pixel_total 
: 38043 time to create 1 rle with old method : 0.04225420951843262 length of segment : 249
time for calcul the mask position with numpy : 0.0009319782257080078 nb_pixel_total : 34029 time to create 1 rle with old method : 0.04090261459350586 length of segment : 347
time for calcul the mask position with numpy : 0.0024094581604003906 nb_pixel_total : 96509 time to create 1 rle with old method : 0.10625362396240234 length of segment : 304
time spent for convertir_results : 14.264567136764526
Inside saveOutput : final : False verbose : 0
eke 12-6-18 : saveMask needs to be cleaned for the new output !
Number saved : None
batch 1 Loaded 134 chid ids of type : 3594
Number RLEs to save : 45159
save missing photos in datou_result :
time spent for datou_step_exec : 88.66638994216919
time spent to save output : 2.792534828186035
total time spent for step 1 : 91.45892477035522
step2:crop_condition Mon Jul 28 14:22:03 2025
VR 17-11-17 : for now, only for linear exec dependency trees, some outputs go to fill the inputs of the next step
VR 22-3-18 : we now test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases
VR 22-3-18 : but we use the first code path for the first step (id = -1), built in the code of datou_exec
VR 22-3-18 : we should manage the first-step case here instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could perhaps be interpreted correctly with a default behavior
Some of the work done when a step executes could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
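The timings above compare an "old method" and a "new method" for turning a mask into an RLE, with the mask positions computed in numpy (the "new method" is consistently faster on large masks). A minimal vectorized run-length encoder in that spirit; this is an illustrative sketch, not the actual datou code, and the function name and `[start, length]` output layout are assumptions:

```python
import numpy as np

def mask_to_rle(mask: np.ndarray) -> list:
    """Run-length encode a binary mask (illustrative sketch).

    Returns [start, length, start, length, ...] over the
    column-major flattened mask, using vectorized numpy ops
    instead of a per-pixel Python loop (the slow 'old method')."""
    flat = mask.flatten(order="F")
    # indices where the value changes, plus the two array ends
    changes = np.flatnonzero(flat[1:] != flat[:-1]) + 1
    bounds = np.concatenate(([0], changes, [flat.size]))
    rle = []
    for start, end in zip(bounds[:-1], bounds[1:]):
        if flat[start]:  # keep only runs of foreground pixels
            rle.extend([int(start), int(end - start)])
    return rle
```

The numpy work is O(nb_pixel_total) with no Python-level per-pixel loop, which matches the log's pattern of the vectorized path winning as `nb_pixel_total` grows.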
VR 22-3-18 : for now we do not clean the datou structure correctly
Loading chi in step crop with photo_hashtag_type : 3594
Loading chi in step crop for list_pids : 7 !
batch 1 Loaded 134 chid ids of type : 3594
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
begin to crop the class : papier
param for this class : {'min_score': 0.7}
filter for class : papier
hashtag_id of this class : 492668766
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 110
About to insert : list_path_to_insert length 110 new photo from crops !
About to upload 110 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 110 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1753705352_3750635
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! 
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! 
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !
[the same message repeated many times; trimmed]
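The repeated message above names a bulk-insert strategy, strat_bulk_insert : ignore_different_from_first. The log does not show its implementation, so the sketch below is only one plausible reading: keep the rows shaped like the first row of the batch, drop the rest (hence "This is a hack !"), then insert the batch in one executemany round trip. It uses sqlite3 to stay self-contained (the real code uses MySQLdb), and the table and column names are invented.

```python
import sqlite3

def ignore_different_from_first(rows):
    """Hypothetical reading of the strategy named in the log: keep only
    rows with the same number of fields as the first row of the batch."""
    if not rows:
        return []
    width = len(rows[0])
    return [r for r in rows if len(r) == width]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE photo_upload (photo_id INTEGER, portfolio_id INTEGER)")

batch = [
    (1374136574, 3736932),
    (1374136542, 3736932, "extra-field"),  # malformed row, silently dropped
    (1374136518, 3736932),
]
clean = ignore_different_from_first(batch)
# one round trip for the whole batch, as a bulk insert would do
conn.executemany("INSERT INTO photo_upload VALUES (?, ?)", clean)
count = conn.execute("SELECT COUNT(*) FROM photo_upload").fetchone()[0]
# count == 2
```

Silently dropping mismatched rows trades correctness for throughput, which is presumably why the log flags it as a hack.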
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !  [same message repeated; trimmed]
we have uploaded 110 photos in the portfolio 3736932
time to upload the photos, Elapsed time : 27.59818410873413
we have finished the crop for the class : papier
begin to crop the class : carton
param for this class : {'min_score': 0.7}
filter for class : carton
hashtag_id of this class : 492774966
we have both polygon and rles Next one !  [×14]
map_result returned by crop_photo_return_map_crop : length : 14
About to insert : list_path_to_insert length 14
new photos from crops ! About to upload 14 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 14 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1753705385_3750635
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !  [×3]
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !  [×11]
we have uploaded 14 photos in the portfolio 3736932
time to upload the photos, Elapsed time : 3.829005002975464
we have finished the crop for the class : carton
begin to crop the class : metal
param for this class : {'min_score': 0.7}
filter for class : metal
hashtag_id of this class : 492628673
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 1
About to insert : list_path_to_insert length 1
new photo from crops ! About to upload 1 photo
upload in portfolio : 3736932
init cache_photo without model_param
we have 1 photo to upload
uploaded to storage server : ovh
folder_temporaire : temp/1753705389_3750635
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !
we have uploaded 1 photo in the portfolio 3736932
time to upload the photos, Elapsed time : 0.8209350109100342
we have finished the crop for the class : metal
begin to crop the class : pet_clair
param for this class : {'min_score': 0.7}
filter for class : pet_clair
hashtag_id of this class : 2107755846
we have both polygon and rles Next one !  [×8]
map_result returned by crop_photo_return_map_crop : length : 8
About to insert : list_path_to_insert length 8
new photos from crops ! About to upload 8 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 8 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1753705394_3750635
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !  [×8]
we have uploaded 8 photos in the portfolio 3736932
time to upload the photos, Elapsed time : 2.8907809257507324
we have finished the crop for the class : pet_clair
begin to crop the class : autre
param for this class : {'min_score': 0.7}
filter for class : autre
hashtag_id of this class : 494826614
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 1
About to insert : list_path_to_insert length 1
new photo from crops ! About to upload 1 photo
upload in portfolio : 3736932
init cache_photo without model_param
we have 1 photo to upload
uploaded to storage server : ovh
folder_temporaire : temp/1753705399_3750635
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !
we have uploaded 1 photo in the portfolio 3736932
time to upload the photos, Elapsed time : 0.5442538261413574
we have finished the crop for the class : autre
begin to crop the class : pehd
param for this class : {'min_score': 0.7}
filter for class : pehd
hashtag_id of this class : 628944319
begin to crop the class : pet_fonce
param for this class : {'min_score': 0.7}
filter for class : pet_fonce
hashtag_id of this class : 2107755900
delete rles from all chi
we have 0 chi objects containing the rles  [×7]
Inside saveOutput : final : False, verbose : 0
saveOutput not yet implemented for datou_step.type : crop_condition, we use saveGeneral
[1374136574, 1374136542, 1374136518, 1374136494, 1374136490, 1374136432, 1374136237]
Looping around the photos to save general results
len of output : 134
/1374161197 : Didn't retrieve data .  [×3]
/1374161199 : Didn't retrieve data .  [×3]
Didn't retrieve data .  [×3 for each of the following photo ids]
/1374161200 /1374161201 /1374161202 /1374161204 /1374161205 /1374161206 /1374161208 /1374161209 /1374161210 /1374161212 /1374161213 /1374161214 /1374161215 /1374161217 /1374161218 /1374161219 /1374161221 /1374161222 /1374161223 /1374161224 /1374161226 /1374161227 /1374161228 /1374161230 /1374161231
/1374161232 /1374161234 /1374161235 /1374161236 /1374161238 /1374161239 /1374161240 /1374161241 /1374161243 /1374161244 /1374161245 /1374161247 /1374161248 /1374161249 /1374161250 /1374161252 /1374161253 /1374161254 /1374161255 /1374161256 /1374161257 /1374161258 /1374161259 /1374161260 /1374161262
/1374161264 /1374161266 /1374161267 /1374161268 /1374161269 /1374161270 /1374161271 /1374161272 /1374161273 /1374161274 /1374161275 /1374161276 /1374161277 /1374161278 /1374161279 /1374161280 /1374161281 /1374161282 /1374161283 /1374161284 /1374161285 /1374161286 /1374161287 /1374161288 /1374161289
/1374161290 /1374161291 /1374161292 /1374161293 /1374161294 /1374161295 /1374161297 /1374161298 /1374161299 /1374161301 /1374161302 /1374161303 /1374161304 /1374161306 /1374161307 /1374161308 /1374161310 /1374161312 /1374161313 /1374161314 /1374161316 /1374161317 /1374161318 /1374161320 /1374161321
/1374161322 /1374161324 /1374161325 /1374161326 /1374161327 /1374161329 /1374161330 /1374161331 /1374161365 /1374161366 /1374161368 /1374161369 /1374161370 /1374161372 /1374161373 /1374161374 /1374161376 /1374161377 /1374161378 /1374161379 /1374161381 /1374161382 /1374161395 /1374161458 /1374161460
/1374161461 /1374161462 /1374161464 /1374161465 /1374161466 /1374161468 /1374161485 : Didn't retrieve data .  [×3 per photo]
before output type
Here is an output not treated by saveGeneral :  [×3]
Managing all output in save final without adding information in the mtr_datou_result
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136574', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136542', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136518', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136494', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136490', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136432', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136237', None, None, None, None, None, '3371491')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 409
time used for this insertion : 0.03074932098388672
save_final : save missing photos in datou_result
time spent for datou_step_exec :
76.37329602241516
time spent to save output : 0.03583192825317383
total time spent for step 2 : 76.40912795066833
step3 : rle_unique_nms_with_priority, Mon Jul 28 14:23:19 2025
VR 17-11-17 : for now, only for linear exec dependency trees, some outputs go to fill the inputs of the next step
VR 22-3-18 : we now test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step, id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage the first-step case here instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could perhaps be interpreted correctly with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
complete output_args for input 0
We expect there is only one output; this branch is used while not all outputs are tuples or arrays  [×7]
VR 22-3-18 : for now we do not clean the datou structure correctly
Begin step rle-unique-nms
batch 1 Loaded 134 chid ids of type : 3594
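The VR notes above describe wiring each step's outputs into the inputs of the next step for a linear dependency tree, with a special branch for steps whose single output is not a tuple or array. A minimal sketch of that idea, with invented step functions (the real datou_prepare_output_input is not shown in the log):

```python
# Sketch of output-to-input chaining for a linear execution tree.
# Step callables and their behavior are hypothetical illustrations.

def run_pipeline(steps, initial_inputs):
    """Run steps in order, feeding each step's outputs to the next."""
    values = initial_inputs
    for step in steps:
        values = step(values)
        # Normalise a single bare output to a tuple, mirroring the log's
        # "we expect there is only one output" branch
        if not isinstance(values, (tuple, list)):
            values = (values,)
    return values

steps = [
    lambda inp: inp[0] * 2,        # single bare output, gets wrapped
    lambda inp: (inp[0] + 1, "ok"),
]
result = run_pipeline(steps, (5,))
# result == (11, "ok")
```

A real version would also have to handle missing dependency information, which the log notes is currently not managed.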
nb_obj : 8  nb_hashtags : 3
time to prepare the origin masks : 3.547344207763672
time to compute the mask position with numpy : 0.41538071632385254  nb_pixel_total : 7822285  time to create 1 rle with new method : 0.47966933250427246
time to compute the mask position with numpy : 0.028049230575561523  nb_pixel_total : 66541  time to create 1 rle with old method : 0.08365416526794434
time to compute the mask position with numpy : 0.028982877731323242  nb_pixel_total : 41217  time to create 1 rle with old method : 0.046332359313964844
time to compute the mask position with numpy : 0.026363372802734375  nb_pixel_total : 82263  time to create 1 rle with old method : 0.09385442733764648
time to compute the mask position with numpy : 0.02777266502380371  nb_pixel_total : 85843  time to create 1 rle with old method : 0.11802053451538086
time to compute the mask position with numpy : 0.02746748924255371  nb_pixel_total : 71141  time to create 1 rle with old method : 0.08327150344848633
time to compute the mask position with numpy : 0.02820587158203125  nb_pixel_total : 91940  time to create 1 rle with old method : 0.10629940032958984
time to compute the mask position with numpy : 0.028272628784179688  nb_pixel_total : 20564  time to create 1 rle with old method : 0.02486896514892578
time to compute the mask position with numpy : 0.02708148956298828  nb_pixel_total : 12606  time to create 1 rle with old method : 0.015559911727905273
create new chi : 1.7307281494140625
time to delete rle : 0.024363994598388672
batch 1 Loaded 17 chid ids of type : 3594
Number RLEs to save : 8114
TO DO : save crop sub photo not yet done !
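The log above interleaves many hand-timed sections ("time to prepare the origin masks", "time to create 1 rle", ...). A small helper like the one below (hypothetical, not from the pipeline) keeps such measurements uniform instead of scattering start/stop calls through the code:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label, sink):
    """Record the elapsed wall-clock time of the `with` body into sink[label]."""
    start = time.time()
    yield
    sink[label] = time.time() - start

timings = {}
with timed("prepare origin masks", timings):
    sum(range(100000))  # stand-in for the real mask preparation work
# timings["prepare origin masks"] now holds the elapsed seconds
```

Collected into one dict, the timings can be printed in a single summary line per batch rather than inline with the data.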
save time : 0.48839735984802246
nb_obj : 25  nb_hashtags : 3
time to prepare the origin masks : 7.490589380264282
time to compute the mask position with numpy : 1.0172333717346191  nb_pixel_total : 7208286  time to create 1 rle with new method : 1.4024033546447754
time to compute the mask position with numpy : 0.03612065315246582  nb_pixel_total : 23675  time to create 1 rle with old method : 0.026297330856323242
time to compute the mask position with numpy : 0.03429555892944336  nb_pixel_total : 34791  time to create 1 rle with old method : 0.04363250732421875
time to compute the mask position with numpy : 0.03674960136413574  nb_pixel_total : 18453  time to create 1 rle with old method : 0.023072004318237305
time to compute the mask position with numpy : 0.04462623596191406  nb_pixel_total : 61611  time to create 1 rle with old method : 0.06925439834594727
time to compute the mask position with numpy : 0.03268122673034668  nb_pixel_total : 181730  time to create 1 rle with new method : 0.8552882671356201
time to compute the mask position with numpy : 0.03052353858947754  nb_pixel_total : 15431  time to create 1 rle with old method : 0.026401519775390625
time to compute the mask position with numpy : 0.0294036865234375  nb_pixel_total : 21353  time to create 1 rle with old method : 0.02476644515991211
time to compute the mask position with numpy : 0.026215553283691406  nb_pixel_total : 11301  time to create 1 rle with old method : 0.012501955032348633
time to compute the mask position with numpy : 0.02788543701171875  nb_pixel_total : 28228  time to create 1 rle with old method : 0.03206157684326172
time to compute the mask position with numpy : 0.027818679809570312  nb_pixel_total : 24523  time to create 1 rle with old method : 0.028809785842895508
time to compute the mask position with numpy : 0.036454200744628906  nb_pixel_total : 243694  time to create 1 rle with new method : 0.8538632392883301
time to compute the mask position with numpy : 0.03159785270690918  nb_pixel_total : 42735  time to create 1 rle with old method : 0.07232022285461426
time to compute the mask position with numpy : 0.034247398376464844  nb_pixel_total : 28250  time to create 1 rle with old method : 0.04985404014587402
time to compute the mask position with numpy : 0.04264068603515625  nb_pixel_total : 9622  time to create 1 rle with old method : 0.011156082153320312
time to compute the mask position with numpy : 0.04485368728637695  nb_pixel_total : 37391  time to create 1 rle with old method : 0.043209075927734375
time to compute the mask position with numpy : 0.0431516170501709  nb_pixel_total : 12047  time to create 1 rle with old method : 0.014181852340698242
time to compute the mask position with numpy : 0.0440974235534668  nb_pixel_total : 7915  time to create 1 rle with old method : 0.009248495101928711
time to compute the mask position with numpy : 0.04283642768859863  nb_pixel_total : 11023  time to create 1 rle with old method : 0.015175104141235352
time to compute the mask position with numpy : 0.054154157638549805  nb_pixel_total : 14979  time to create 1 rle with old method : 0.020343780517578125
time to compute the mask position with numpy : 0.04233741760253906  nb_pixel_total : 54019  time to create 1 rle with old method : 0.06139802932739258
time to compute the mask position with numpy : 0.039247989654541016  nb_pixel_total : 90421  time to create 1 rle with old method : 0.10483098030090332
time to compute the mask position with numpy : 0.03693580627441406  nb_pixel_total : 35410  time to create 1 rle with old method : 0.04205489158630371
time to compute the mask position with numpy : 0.03873944282531738  nb_pixel_total : 22976  time to create 1 rle with old method : 0.027016639709472656
time to compute the mask position with numpy : 0.0412595272064209  nb_pixel_total : 39743  time to create 1 rle with old method : 0.053566694259643555
time to compute the mask position with numpy : 0.04961037635803223  nb_pixel_total : 14793  time to create 1 rle with old method : 0.01706242561340332
create new chi : 6.01895546913147
time to delete rle : 0.002897977828979492
batch 1 Loaded 51 chid ids of type : 3594
Number RLEs to save : 16452
TO DO : save crop sub photo not yet done !
save time : 0.981464147567749
nb_obj : 26  nb_hashtags : 3
time to prepare the origin masks : 5.031678199768066
time to compute the mask position with numpy : 0.6591048240661621  nb_pixel_total : 6845459  time to create 1 rle with new method : 0.934633731842041
time to compute the mask position with numpy : 0.03524899482727051  nb_pixel_total : 92005  time to create 1 rle with old method : 0.10289120674133301
time to compute the mask position with numpy : 0.033298492431640625  nb_pixel_total : 23169  time to create 1 rle with old method : 0.02542853355407715
time to compute the mask position with numpy : 0.03346085548400879  nb_pixel_total : 22706  time to create 1 rle with old method : 0.025207042694091797
time to compute the mask position with numpy : 0.03499794006347656  nb_pixel_total : 115379  time to create 1 rle with old method : 0.13281679153442383
time to compute the mask position with numpy : 0.03464961051940918  nb_pixel_total : 38707  time to create 1 rle with old method : 0.04381203651428223
time to compute the mask position with numpy : 0.03476428985595703  nb_pixel_total : 72088  time to create 1 rle with old method : 0.0842142105102539
time to compute the mask position with numpy : 0.035593271255493164  nb_pixel_total : 37523  time to create 1 rle with old method : 0.0420992374420166
time to compute the mask position with numpy : 0.03448367118835449  nb_pixel_total : 31066  time to create 1 rle with old method : 0.03478741645812988
time to compute the mask position with numpy : 0.034590721130371094  nb_pixel_total : 51372  time to create 1 rle with old method : 0.05740046501159668
time to compute the mask position with numpy : 0.03430461883544922  nb_pixel_total : 17533  time to create 1 rle with old method : 0.019815683364868164
time to compute the mask position with numpy : 0.034269094467163086  nb_pixel_total : 31624  time to create 1 rle with old method : 0.03554964065551758
time to compute the mask position with numpy : 0.03427743911743164  nb_pixel_total : 18672  time to create 1 rle with old method : 0.02082061767578125
time to compute the mask position with numpy : 0.0337069034576416  nb_pixel_total : 43974  time to create 1 rle with old method : 0.04883837699890137
time to compute the mask position with numpy : 0.03666234016418457  nb_pixel_total : 111362  time to create 1 rle with old method : 0.12695693969726562
time to compute the mask position with numpy : 0.036406517028808594  nb_pixel_total : 24772  time to create 1 rle with old method : 0.027958393096923828
time to compute the mask position with numpy : 0.03563523292541504  nb_pixel_total : 268137  time to create 1 rle with new method : 0.6641948223114014
time to compute the mask position with numpy : 0.034639835357666016  nb_pixel_total : 49322  time to create 1 rle with old method : 0.05550122261047363
time to compute the mask position with numpy : 0.0345911979675293  nb_pixel_total : 24853  time to create 1 rle with old method : 0.028105974197387695
time to compute the mask position with numpy : 0.03440546989440918  nb_pixel_total : 25732  time to create 1 rle with old method : 0.03148937225341797
time to compute the mask position with numpy : 0.0346527099609375  nb_pixel_total : 26638  time to create 1 rle with old method : 0.030089616775512695
time to compute the mask position with numpy : 0.03619122505187988  nb_pixel_total : 9221  time to create 1 rle with old method : 0.010616302490234375
time to compute the mask position with numpy : 0.03445029258728027  nb_pixel_total : 51300  time to create 1 rle with old method : 0.05777907371520996
time to compute the mask position with numpy : 0.03482842445373535  nb_pixel_total : 104871  time to create 1 rle with old method : 0.11848855018615723
time to compute the mask position with numpy : 0.03525495529174805  nb_pixel_total : 132903  time to create 1 rle with old method : 0.149003267288208
time to compute the mask position with numpy : 0.03463435173034668  nb_pixel_total : 15855  time to create 1 rle with old method : 0.01796746253967285
time to compute the mask position with numpy : 0.03454875946044922  nb_pixel_total : 8157  time to create 1 rle with old method : 0.009416580200195312
create new chi : 4.576979398727417
time to delete rle : 0.002710580825805664
batch 1 Loaded 53 chid ids of type : 3594
Number RLEs to save : 19658
TO DO : save crop sub photo not yet done !
save time : 1.1919102668762207
nb_obj : 28  nb_hashtags : 3
time to prepare the origin masks : 4.735236167907715
time to compute the mask position with numpy : 0.5402753353118896  nb_pixel_total : 6841575  time to create 1 rle with new method : 0.8466639518737793
time to compute the mask position with numpy : 0.0355226993560791  nb_pixel_total : 69587  time to create 1 rle with old method : 0.08320236206054688
time to compute the mask position with numpy : 0.03879547119140625  nb_pixel_total : 24308  time to create 1 rle with old method : 0.037837982177734375
time to compute the mask position with numpy : 0.034957170486450195  nb_pixel_total : 57809  time to create 1 rle with old method : 0.06547045707702637
time to compute the mask position with numpy : 0.03507590293884277  nb_pixel_total : 38237  time to create 1 rle with old method : 0.04355144500732422
time to compute the mask position with numpy : 0.03492164611816406  nb_pixel_total : 31103  time to create 1 rle with old method : 0.03542828559875488
time to compute the mask position with numpy : 0.03454875946044922  nb_pixel_total : 53558  time to create 1 rle with old method : 0.06090998649597168
time to compute the mask position with numpy : 0.03776144981384277  nb_pixel_total : 60835  time to create 1 rle with old method : 0.07609987258911133
time to compute the mask position with numpy : 0.03455924987792969  nb_pixel_total : 69106  time to create 1 rle with old method : 0.0780489444732666
time to compute the mask position with numpy : 0.03495621681213379  nb_pixel_total : 99957  time to create 1 rle with old method : 0.11375021934509277
time to compute the mask position with numpy : 0.0341191291809082  nb_pixel_total : 74357  time to create 1 rle with old method : 0.08416008949279785
time to compute the mask position with numpy : 0.03419137001037598  nb_pixel_total : 105598  time to create 1 rle with old method : 0.11856889724731445
time to compute the mask position with numpy : 0.035472869873046875  nb_pixel_total : 296757  time to create 1 rle with new method : 0.7030315399169922
time to compute the mask position with numpy : 0.0348355770111084  nb_pixel_total : 43611  time to create 1 rle with old method : 0.04934811592102051
time to compute the mask position with numpy : 0.0341644287109375  nb_pixel_total : 51845  time to create 1 rle with old method : 0.06044745445251465
time to compute the mask position with numpy : 0.03407120704650879  nb_pixel_total : 38106  time to create 1 rle with old method : 0.04315495491027832
time to compute the mask position with numpy : 0.03450322151184082  nb_pixel_total : 21429  time to create 1 rle with old method : 0.024347305297851562
time to compute the mask position with numpy : 0.03430914878845215  nb_pixel_total : 8226  time to create 1 rle with old method : 0.00950932502746582
time to compute the mask position with numpy : 0.035932064056396484  nb_pixel_total : 26336  time to create 1 rle with old method : 0.029979944229125977
time to compute the mask position with numpy : 0.033946990966796875  nb_pixel_total : 35741  time to create 1 rle with old method : 0.039829254150390625
time to compute the mask position with numpy : 0.03464365005493164  nb_pixel_total : 32978  time to create 1 rle with old method : 0.03981423377990723
time to compute the mask position with numpy : 0.034082889556884766  nb_pixel_total : 35390  time to create 1
rle with old method : 0.03970193862915039 time for calcul the mask position with numpy : 0.03405570983886719 nb_pixel_total : 22535 time to create 1 rle with old method : 0.0251309871673584 time for calcul the mask position with numpy : 0.036067962646484375 nb_pixel_total : 50402 time to create 1 rle with old method : 0.057000160217285156 time for calcul the mask position with numpy : 0.03461408615112305 nb_pixel_total : 44525 time to create 1 rle with old method : 0.050966739654541016 time for calcul the mask position with numpy : 0.034372568130493164 nb_pixel_total : 21004 time to create 1 rle with old method : 0.024288177490234375 time for calcul the mask position with numpy : 0.03436636924743652 nb_pixel_total : 15101 time to create 1 rle with old method : 0.017191410064697266 time for calcul the mask position with numpy : 0.034458160400390625 nb_pixel_total : 12297 time to create 1 rle with old method : 0.014185428619384766 time for calcul the mask position with numpy : 0.033969879150390625 nb_pixel_total : 12087 time to create 1 rle with old method : 0.013698339462280273 create new chi : 4.473533391952515 time to delete rle : 0.004634857177734375 batch 1 Loaded 57 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++Number RLEs to save : 21215 TO DO : save crop sub photo not yet done ! 
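The timing lines above compare an "old" and a "new" method for building an RLE from a numpy mask. As a rough illustration of what such an encoder does, here is a minimal vectorised run-length encoder; the exact RLE layout this pipeline stores (row- vs column-major order, offset convention) is not visible in the log, so the function name and output format are assumptions:

```python
import numpy as np

def rle_encode(mask: np.ndarray) -> list:
    """Run-length encode a binary mask as [start, length, start, length, ...].

    Generic sketch: column-major (COCO-style) flattening and 0-based
    starts are assumptions, not the pipeline's actual format.
    """
    flat = mask.flatten(order="F")
    # Pad with zeros so every run of ones has a detectable start and end.
    padded = np.concatenate([[0], flat, [0]])
    edges = np.flatnonzero(padded[1:] != padded[:-1])
    starts, ends = edges[::2], edges[1::2]
    return np.column_stack([starts, ends - starts]).ravel().tolist()

mask = np.zeros((4, 4), dtype=np.uint8)
mask[1:3, 1:3] = 1
print(rle_encode(mask))  # [5, 2, 9, 2]
```

A fully vectorised encoder like this is one plausible reason the "new method" beats the "old method" on large masks (e.g. 6845459 pixels in 0.93 s vs extrapolated per-pixel loops), though the log does not say what each method actually does.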
save time : 1.3159840106964111
nb_obj : 18 nb_hashtags : 3
time to prepare the origin masks : 7.283308029174805
time to compute the mask position with numpy : 0.3345317840576172 nb_pixel_total : 7441410 time to create 1 rle with new method : 1.5860610008239746
time to compute the mask position with numpy : 0.04155445098876953 nb_pixel_total : 90625 time to create 1 rle with old method : 0.10100746154785156
time to compute the mask position with numpy : 0.035787343978881836 nb_pixel_total : 25711 time to create 1 rle with old method : 0.02884960174560547
time to compute the mask position with numpy : 0.03893136978149414 nb_pixel_total : 50267 time to create 1 rle with old method : 0.056342363357543945
time to compute the mask position with numpy : 0.03354191780090332 nb_pixel_total : 38729 time to create 1 rle with old method : 0.04379010200500488
time to compute the mask position with numpy : 0.04061770439147949 nb_pixel_total : 70654 time to create 1 rle with old method : 0.0815744400024414
time to compute the mask position with numpy : 0.03883624076843262 nb_pixel_total : 6460 time to create 1 rle with old method : 0.008655071258544922
time to compute the mask position with numpy : 0.03736591339111328 nb_pixel_total : 79336 time to create 1 rle with old method : 0.08967733383178711
time to compute the mask position with numpy : 0.040628671646118164 nb_pixel_total : 28896 time to create 1 rle with old method : 0.03536176681518555
time to compute the mask position with numpy : 0.034601449966430664 nb_pixel_total : 34532 time to create 1 rle with old method : 0.04247260093688965
time to compute the mask position with numpy : 0.028667688369750977 nb_pixel_total : 97142 time to create 1 rle with old method : 0.12137341499328613
time to compute the mask position with numpy : 0.025463581085205078 nb_pixel_total : 73810 time to create 1 rle with old method : 0.0805521011352539
time to compute the mask position with numpy : 0.023659706115722656 nb_pixel_total : 19315 time to create 1 rle with old method : 0.020868778228759766
time to compute the mask position with numpy : 0.024380922317504883 nb_pixel_total : 38176 time to create 1 rle with old method : 0.04205942153930664
time to compute the mask position with numpy : 0.029419898986816406 nb_pixel_total : 29710 time to create 1 rle with old method : 0.032805681228637695
time to compute the mask position with numpy : 0.03896498680114746 nb_pixel_total : 15535 time to create 1 rle with old method : 0.017360448837280273
time to compute the mask position with numpy : 0.0400998592376709 nb_pixel_total : 86432 time to create 1 rle with old method : 0.0969536304473877
time to compute the mask position with numpy : 0.03431963920593262 nb_pixel_total : 29027 time to create 1 rle with old method : 0.0329897403717041
time to compute the mask position with numpy : 0.038735389709472656 nb_pixel_total : 38633 time to create 1 rle with old method : 0.04356741905212402
create new chi : 3.5662994384765625
time to delete rle : 0.002295255661010742
batch 1 Loaded 37 chid ids of type : 3594
Number RLEs to save : 13299
TO DO : save crop sub photo not yet done !
save time : 0.8165268898010254
nb_obj : 17 nb_hashtags : 3
time to prepare the origin masks : 6.387237071990967
time to compute the mask position with numpy : 0.3666853904724121 nb_pixel_total : 7385632 time to create 1 rle with new method : 0.44389915466308594
time to compute the mask position with numpy : 0.04445290565490723 nb_pixel_total : 15099 time to create 1 rle with old method : 0.01821756362915039
time to compute the mask position with numpy : 0.04945492744445801 nb_pixel_total : 191588 time to create 1 rle with new method : 0.5657141208648682
time to compute the mask position with numpy : 0.04064655303955078 nb_pixel_total : 18241 time to create 1 rle with old method : 0.020630359649658203
time to compute the mask position with numpy : 0.03963589668273926 nb_pixel_total : 49042 time to create 1 rle with old method : 0.05632615089416504
time to compute the mask position with numpy : 0.04110097885131836 nb_pixel_total : 31468 time to create 1 rle with old method : 0.03617596626281738
time to compute the mask position with numpy : 0.040604591369628906 nb_pixel_total : 22596 time to create 1 rle with old method : 0.025625228881835938
time to compute the mask position with numpy : 0.03229546546936035 nb_pixel_total : 41463 time to create 1 rle with old method : 0.046750783920288086
time to compute the mask position with numpy : 0.025601863861083984 nb_pixel_total : 17278 time to create 1 rle with old method : 0.01940131187438965
time to compute the mask position with numpy : 0.02554607391357422 nb_pixel_total : 33300 time to create 1 rle with old method : 0.037656307220458984
time to compute the mask position with numpy : 0.0273435115814209 nb_pixel_total : 159178 time to create 1 rle with new method : 0.5105175971984863
time to compute the mask position with numpy : 0.025541067123413086 nb_pixel_total : 31511 time to create 1 rle with old method : 0.0358889102935791
time to compute the mask position with numpy : 0.028463125228881836 nb_pixel_total : 13127 time to create 1 rle with old method : 0.01533651351928711
time to compute the mask position with numpy : 0.03750133514404297 nb_pixel_total : 11867 time to create 1 rle with old method : 0.013466835021972656
time to compute the mask position with numpy : 0.04231405258178711 nb_pixel_total : 172981 time to create 1 rle with new method : 0.4709925651550293
time to compute the mask position with numpy : 0.04350018501281738 nb_pixel_total : 9481 time to create 1 rle with old method : 0.011113882064819336
time to compute the mask position with numpy : 0.04489874839782715 nb_pixel_total : 36159 time to create 1 rle with old method : 0.05835127830505371
time to compute the mask position with numpy : 0.04161643981933594 nb_pixel_total : 54389 time to create 1 rle with old method : 0.0627293586730957
create new chi : 3.5686182975769043
time to delete rle : 0.002192974090576172
batch 1 Loaded 35 chid ids of type : 3594
Number RLEs to save : 13685
TO DO : save crop sub photo not yet done !
save time : 0.8336024284362793
nb_obj : 12 nb_hashtags : 3
time to prepare the origin masks : 6.631157159805298
time to compute the mask position with numpy : 0.7340941429138184 nb_pixel_total : 7622029 time to create 1 rle with new method : 1.1938226222991943
time to compute the mask position with numpy : 0.05155658721923828 nb_pixel_total : 96509 time to create 1 rle with old method : 0.11940836906433105
time to compute the mask position with numpy : 0.04477953910827637 nb_pixel_total : 27272 time to create 1 rle with old method : 0.031845808029174805
time to compute the mask position with numpy : 0.03848838806152344 nb_pixel_total : 38043 time to create 1 rle with old method : 0.04366135597229004
time to compute the mask position with numpy : 0.039305925369262695 nb_pixel_total : 48044 time to create 1 rle with old method : 0.05474567413330078
time to compute the mask position with numpy : 0.029928207397460938 nb_pixel_total : 50574 time to create 1 rle with old method : 0.0611262321472168
time to compute the mask position with numpy : 0.028467655181884766 nb_pixel_total : 149613 time to create 1 rle with old method : 0.18511056900024414
time to compute the mask position with numpy : 0.030115365982055664 nb_pixel_total : 59709 time to create 1 rle with old method : 0.07121038436889648
time to compute the mask position with numpy : 0.04360318183898926 nb_pixel_total : 37098 time to create 1 rle with old method : 0.04392075538635254
time to compute the mask position with numpy : 0.04361414909362793 nb_pixel_total : 39089 time to create 1 rle with old method : 0.04423379898071289
time to compute the mask position with numpy : 0.040535926818847656 nb_pixel_total : 24161 time to create 1 rle with old method : 0.02779531478881836
time to compute the mask position with numpy : 0.04339241981506348 nb_pixel_total : 34351 time to create 1 rle with old method : 0.03913450241088867
time to compute the mask position with numpy : 0.03547215461730957 nb_pixel_total : 67908 time to create 1 rle with old method : 0.07893562316894531
create new chi : 3.3330576419830322
time to delete rle : 0.0024182796478271484
batch 1 Loaded 25 chid ids of type : 3594
Number RLEs to save : 10905
TO DO : save crop sub photo not yet done !
save time : 0.6900460720062256
map_output_result : {1374136574: (0.0, 'Should be the crop_list due to order', 0), 1374136542: (0.0, 'Should be the crop_list due to order', 0), 1374136518: (0.0, 'Should be the crop_list due to order', 0), 1374136494: (0.0, 'Should be the crop_list due to order', 0), 1374136490: (0.0, 'Should be the crop_list due to order', 0), 1374136432: (0.0, 'Should be the crop_list due to order', 0), 1374136237: (0.0, 'Should be the crop_list due to order', 0)}
End step rle-unique-nms
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority, we use saveGeneral
[1374136574, 1374136542, 1374136518, 1374136494, 1374136490, 1374136432, 1374136237]
Looping over the photos to save general results
len of output : 7
/1374136574. Didn't retrieve data .
/1374136542. Didn't retrieve data .
/1374136518. Didn't retrieve data .
/1374136494. Didn't retrieve data .
/1374136490. Didn't retrieve data .
/1374136432. Didn't retrieve data .
/1374136237. Didn't retrieve data .
before output type
Used above
Here is an output not treated by saveGeneral : Managing all output in save final without adding information in mtr_datou_result
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136574', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136542', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136518', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136494', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136490', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136432', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136237', None, None, None, None, None, '3371491')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 21
time used for this insertion : 0.015182733535766602
save_final : save missing photos in datou_result
time spent for datou_step_exec : 76.01144194602966
time spent to save output : 0.015609025955200195
total time spent for step 3 : 76.02705097198486
step4:ventilate_hashtags_in_portfolio Mon Jul 28 14:24:35 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of the step could be done earlier, when the tree of execution is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : for now we do not clean the datou structure correctly
beginning of datou step ventilate_hashtags_in_portfolio : To implement !
Iterating over portfolio : 25403226
get user id for portfolio 25403226
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25403226 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pehd','pet_fonce','environnement','carton','papier','autre','metal','mal_croppe','background','flou','pet_clair')) AND mptpi.`min_score`=0.5
To do
To do
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25403226 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pehd','pet_fonce','environnement','carton','papier','autre','metal','mal_croppe','background','flou','pet_clair')) AND mptpi.`min_score`=0.5
To do
Caught exception ! Connect or reconnect !
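The repeated "Caught exception ! Connect or reconnect !" lines suggest a query wrapper that reconnects and retries when the database connection drops. A minimal sketch of that pattern; `get_cursor` and `reconnect` are illustrative caller-supplied callables, not the actual helpers in script_for_cron.py, and an in-memory SQLite database stands in for the MySQLdb connection the real script uses:

```python
import sqlite3
import time

def execute_with_reconnect(get_cursor, reconnect, query, params=(),
                           retries=3, delay=0.0):
    """Run a query; on failure, reconnect and retry up to `retries` times.

    The real script would catch MySQLdb.OperationalError specifically;
    we catch broadly here only to keep the sketch backend-agnostic.
    """
    last_exc = None
    for _ in range(retries):
        try:
            cur = get_cursor()
            cur.execute(query, params)
            return cur.fetchall()
        except Exception as exc:
            print("Caught exception ! Connect or reconnect !")
            print(exc)
            last_exc = exc
            reconnect()       # re-open the connection before retrying
            time.sleep(delay)
    raise last_exc

# Demo: the first cursor acquisition fails, the retry succeeds.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.execute("INSERT INTO t VALUES (1)")
state = {"fail_once": True}

def get_cursor():
    if state["fail_once"]:
        state["fail_once"] = False
        raise sqlite3.OperationalError("server has gone away")
    return conn.cursor()

rows = execute_with_reconnect(get_cursor, lambda: None, "SELECT x FROM t")
print(rows)  # [(1,)]
```

Note that reconnecting does not fix the 1064 errors below: those are syntax errors in the query text itself, so every retry fails identically.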
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
Caught exception ! Connect or reconnect !
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3")
To do ! Use context local managing function !
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25403226 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pehd','pet_fonce','environnement','carton','papier','autre','metal','mal_croppe','background','flou','pet_clair')) AND mptpi.`min_score`=0.5
To do
link used in velours : https://www.fotonower.com/velours/25406570,25406571,25406572,25406573,25406574,25406575,25406576,25406577,25406578,25406579,25406580?tags=pehd,pet_fonce,environnement,carton,papier,autre,metal,mal_croppe,background,flou,pet_clair
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio, we use saveGeneral
[1374136574, 1374136542, 1374136518, 1374136494, 1374136490, 1374136432, 1374136237]
Looping over the photos to save general results
len of output : 1
/25403226.
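The 1064 errors above all complain about the syntax "near ')\n and cspi.crop_hashtag_id = chi.id'", which is the classic symptom of interpolating an empty Python list into an `IN (...)` clause: the query ends up containing `IN ()`, which MySQL rejects. A small guard, sketched here with an assumed helper name (`in_clause` does not exist in the script; `cspi.crop_hashtag_id` is taken from the error message):

```python
def in_clause(column, values):
    """Build a safe `col IN (...)` fragment plus its parameter list.

    An empty value list would otherwise render `IN ()`, which MySQL
    rejects with error 1064, exactly as in the log above.
    """
    if not values:
        # Nothing can match an empty set; emit a constant-false predicate
        # instead of invalid SQL.
        return "FALSE", []
    placeholders = ", ".join(["%s"] * len(values))
    return f"{column} IN ({placeholders})", list(values)

frag, params = in_clause("cspi.crop_hashtag_id", [])
print(frag)  # FALSE
frag, params = in_clause("cspi.crop_hashtag_id", [3, 7])
print(frag)  # cspi.crop_hashtag_id IN (%s, %s)
```

Because the retry loop only reconnects, it cannot recover from this: the same malformed query is re-sent on every attempt, which is why the identical error repeats eleven times.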
before output type
Here is an output not treated by saveGeneral : Managing all output in save final without adding information in mtr_datou_result
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136574', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136542', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136518', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136494', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136490', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136432', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136237', None, None, None, None, None, '3371491')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 8
time used for this insertion : 0.01624917984008789
save_final : save missing photos in datou_result
time spent for datou_step_exec : 1.7180719375610352
time spent to save output : 0.01656031608581543
total time spent for step 4 : 1.7346322536468506
step5:final Mon Jul 28 14:24:37 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of the step could be done earlier, when the tree of execution is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
complete output_args for input 2
VR 22-3-18 : for now we do not clean the datou structure correctly
Beginning of datou step final !
Caught exception ! Connect or reconnect !
Inside saveOutput : final : False verbose : 0
original output for save of step final : {1374136574: ('0.11873973489858906',), 1374136542: ('0.11873973489858906',), 1374136518: ('0.11873973489858906',), 1374136494: ('0.11873973489858906',), 1374136490: ('0.11873973489858906',), 1374136432: ('0.11873973489858906',), 1374136237: ('0.11873973489858906',)}
new output for save of step final : {1374136574: ('0.11873973489858906',), 1374136542: ('0.11873973489858906',), 1374136518: ('0.11873973489858906',), 1374136494: ('0.11873973489858906',), 1374136490: ('0.11873973489858906',), 1374136432: ('0.11873973489858906',), 1374136237: ('0.11873973489858906',)}
[1374136574, 1374136542, 1374136518, 1374136494, 1374136490, 1374136432, 1374136237]
Looping over the photos to save general results
len of output : 7
/1374136574. Didn't retrieve data .
/1374136542. Didn't retrieve data .
/1374136518. Didn't retrieve data .
/1374136494. Didn't retrieve data .
/1374136490. Didn't retrieve data .
/1374136432. Didn't retrieve data .
/1374136237. Didn't retrieve data .
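The "begin to insert list_values into mtr_datou_result" lines, followed by insertion times around 15 ms, suggest the accumulated row tuples are written in one bulk insert rather than row by row. A sketch of that pattern with `executemany`; SQLite stands in for the MySQLdb connection, and the four-column schema is a simplification (the real mtr_datou_result rows in the log have nine fields):

```python
import sqlite3

# Assumed, simplified schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE mtr_datou_result (
    datou_id TEXT, portfolio_id TEXT, photo_id TEXT, exec_id TEXT)""")

list_values = [
    ("3318", None, None, "3371491"),
    ("3318", "25403226", "1374136574", "3371491"),
    ("3318", "25403226", "1374136542", "3371491"),
]
# A single executemany round-trip instead of a loop of single INSERTs
# is the usual way to keep bulk-insertion times in the millisecond range.
conn.executemany(
    "INSERT INTO mtr_datou_result VALUES (?, ?, ?, ?)", list_values)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM mtr_datou_result").fetchone()[0])  # 3
```

With MySQLdb the placeholder style would be `%s` instead of `?`, but the call shape is the same.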
before output type
Used above
Used above
Managing all output in save final without adding information in mtr_datou_result
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136574', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136542', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136518', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136494', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136490', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136432', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136237', None, None, None, None, None, '3371491')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 21
time used for this insertion : 0.014531373977661133
save_final : save missing photos in datou_result
time spent for datou_step_exec : 0.1258695125579834
time spent to save output : 0.015228986740112305
total time spent for step 5 : 0.1410984992980957
step6:blur_detection Mon Jul 28 14:24:37 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of the step could be done earlier, when the tree of execution is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : for now we do not clean the datou structure correctly
inside step blur_detection
method: ratio and variance
treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb.jpg resize: (2160, 3840) 1374136574 -7.123152260828501
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3.jpg resize: (2160, 3840) 1374136542 -7.240451899213289
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6.jpg resize: (2160, 3840) 1374136518 -7.325555813427854
treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110.jpg resize: (2160, 3840) 1374136494 -7.235752338579301
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c.jpg resize: (2160, 3840) 1374136490 -7.342090319683126
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e.jpg resize: (2160, 3840) 1374136432 -7.081893333424807
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2.jpg resize: (2160, 3840) 1374136237 -7.368336082492026
treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483463_0.png resize: (336, 322) 1374161197 -1.1650513042005348
treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483457_0.png resize: (179, 171) 1374161199 -2.4829316153473147
treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483458_0.png resize: (419, 357) 1374161200 -3.2039221132711524
treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483461_0.png resize: (497, 369) 1374161201 -3.4424257978074166
treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483460_0.png resize: (406, 337) 1374161202 -4.748508039520945
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483475_0.png resize: (126, 117) 1374161204 -2.007313413627321
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483467_0.png resize: (233, 214) 1374161205 -4.774675944758545
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483488_0.png resize: (193, 258) 1374161206 -4.244406628359765
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483464_0.png resize: (173, 172) 1374161208 -2.4307792932249064
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483466_0.png resize: (172, 213) 1374161209 -1.8006357372735973
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483481_0.png resize: (93, 194) 1374161210 -4.121638216695594
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483473_0.png resize: (132, 128) 1374161212 -3.7804810270399503
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483480_0.png resize: (215, 206) 1374161213 -1.978126423939982
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483468_0.png resize: (379, 377) 1374161214 -3.976661465455081
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483471_0.png resize: (111, 121) 1374161215 -3.5451647472393444
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483465_0.png resize: (305, 272) 1374161217 -2.5958758082973605
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483476_0.png resize: (194, 236) 1374161218 -3.09029960228014
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483470_0.png resize: (198, 129) 1374161219 -3.704831663869723
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483483_0.png resize: (205, 121) 1374161221 -3.9848893945635577
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483469_0.png resize: (285, 289) 1374161222 -0.11953824845511214
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483472_0.png resize: (108, 108) 1374161223 1.1247395109065412
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483479_0.png resize: (278, 127) 1374161224 -3.452693802694122
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483486_0.png resize: (119, 193) 1374161226 -2.8798787194660136
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483484_0.png resize: (558, 566) 1374161227 -4.282461879414503
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483485_0.png resize: (289, 282) 1374161228 -4.861355678106319
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483482_0.png resize: (215, 118) 1374161230 -1.762008527272082
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483477_0.png resize: (383, 251) 1374161231 -3.92219992072179
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483478_0.png resize: (697, 551) 1374161232 -5.703499554437016
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483513_0.png resize: (258, 121) 1374161234 -3.433490849874244
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483512_0.png resize: (700, 666) 1374161235 -4.391308466861521
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483494_0.png resize: (231, 111) 1374161236 -2.746073552648587
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483491_0.png resize: (401, 273) 1374161238 -4.645623714959299
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483493_0.png resize: (195, 197) 1374161239 -1.238129704450898
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483498_0.png resize: (248, 289) 1374161240 -4.568583146973037
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483492_0.png resize: (387, 473) 1374161241 0.9344677945192076
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483497_0.png resize: (236, 281) 1374161243 2.3853368480272676
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483500_0.png resize: (73, 179) 1374161244 1.252401862227186
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483495_0.png resize: (284, 283) 1374161245 -4.6082450530818955
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483501_0.png resize: (117, 182) 1374161247 -3.514942067745766
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483496_0.png resize: (398, 496) 1374161248 -4.965591330853415
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483489_0.png resize: (240, 180)
1374161249 -1.8854705793174793 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483506_0.png resize: (410, 467) 1374161250 -3.9268340457184476 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483510_0.png resize: (250, 198) 1374161252 -4.80958519616041 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483499_0.png resize: (111, 111) 1374161253 -3.084524306965318 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483508_0.png resize: (216, 224) 1374161254 -4.4314648072836125 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483503_0.png resize: (216, 236) 1374161255 -2.6794431651706145 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483504_0.png resize: (451, 340) 1374161256 -3.394090212132589 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483490_0.png resize: (248, 181) 1374161257 -4.598923911457501 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483514_0.png resize: (257, 96) 1374161258 -4.386786377766166 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483532_0.png resize: (357, 226) 1374161259 -5.084336882779786 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483534_0.png resize: (253, 206) 1374161260 -2.9826898416830283 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483518_0.png resize: (245, 286) 1374161262 0.2354994395194732 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483529_0.png resize: (351, 178) 1374161264 -5.042494166969297 treat image : 
temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483523_0.png resize: (201, 281) 1374161266 -3.062715324717694 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483531_0.png resize: (451, 264) 1374161267 -4.41672385397571 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483541_0.png resize: (291, 342) 1374161268 0.5923223082240946 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483524_0.png resize: (235, 268) 1374161269 -1.6557253123570723 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483527_0.png resize: (252, 235) 1374161270 -3.421566476389618 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483521_0.png resize: (273, 353) 1374161271 0.27352318255498304 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483516_0.png resize: (150, 238) 1374161272 0.06272059583075064 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483538_0.png resize: (315, 177) 1374161273 -4.932829319572846 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483515_0.png resize: (463, 352) 1374161274 -2.5543369363049915 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483528_0.png resize: (127, 112) 1374161275 -2.817476359284579 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483519_0.png resize: (288, 317) 1374161276 -1.1121260958628099 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483525_0.png resize: (241, 163) 1374161277 -3.3108741410948306 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483533_0.png resize: (128, 
131) 1374161278 -4.59247481957652 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483522_0.png resize: (820, 732) 1374161279 -4.2298459710853065 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483517_0.png resize: (297, 187) 1374161280 -3.0817514009232014 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483539_0.png resize: (287, 211) 1374161281 -3.2352949995808546 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483530_0.png resize: (339, 194) 1374161282 -3.671968652419242 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483542_0.png resize: (250, 137) 1374161283 -3.385222039211539 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483536_0.png resize: (198, 107) 1374161284 -4.372483531535936 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483520_0.png resize: (375, 176) 1374161285 -2.571299893210722 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483535_0.png resize: (457, 253) 1374161286 -3.530590171080128 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483560_0.png resize: (591, 538) 1374161287 -3.1589808749027437 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483550_0.png resize: (362, 292) 1374161288 -2.323076490102177 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483543_0.png resize: (213, 287) 1374161289 1.8180943239785718 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483549_0.png resize: (204, 118) 1374161290 -1.9456539749604975 treat image : 
temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483544_0.png resize: (306, 182) 1374161291 -0.7568766356834279 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483545_0.png resize: (551, 252) 1374161292 -2.1187422447328217 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483559_0.png resize: (147, 235) 1374161293 -4.179545091978629 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483555_0.png resize: (141, 68) 1374161294 -2.612501708876747 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483553_0.png resize: (184, 312) 1374161295 1.0493163697433396 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483557_0.png resize: (268, 193) 1374161297 -4.591583630294456 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483551_0.png resize: (248, 696) 1374161298 -3.4067330505214883 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483548_0.png resize: (214, 239) 1374161299 1.955143275760175 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483558_0.png resize: (269, 269) 1374161301 -4.7335218616463495 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483546_0.png resize: (110, 235) 1374161302 -0.9208391818939934 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483552_0.png resize: (289, 197) 1374161303 -2.3612999573152984 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483576_0.png resize: (904, 547) 1374161304 -3.6043020798030483 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483570_0.png resize: (246, 
156) 1374161306 -3.2652608291003107 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483561_0.png resize: (275, 327) 1374161307 -2.75865216907306 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483577_0.png resize: (175, 162) 1374161308 -3.9026637207515744 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483567_0.png resize: (332, 186) 1374161310 -4.400573919810701 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483563_0.png resize: (73, 165) 1374161312 2.4346547764821618 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483562_0.png resize: (310, 159) 1374161313 -2.3129707170393923 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483573_0.png resize: (315, 172) 1374161314 -3.2004471066814286 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483572_0.png resize: (189, 262) 1374161316 -1.07365007004603 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483568_0.png resize: (276, 902) 1374161317 -4.86074118838805 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483564_0.png resize: (528, 460) 1374161318 -4.6087026925135675 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483571_0.png resize: (410, 253) 1374161320 -3.8600609659840868 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483585_0.png resize: (474, 201) 1374161321 -4.966276343938353 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483586_0.png resize: (226, 323) 1374161322 -3.9664065730335545 treat image : 
temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483582_0.png resize: (412, 310) 1374161324 -3.6882095249868625 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483578_0.png resize: (323, 289) 1374161325 -4.2656426967739405 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483579_0.png resize: (294, 186) 1374161326 -3.0848645473356036 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483587_0.png resize: (249, 189) 1374161327 2.5496715176562477 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483581_0.png resize: (287, 188) 1374161329 -3.5363897721254434 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483584_0.png resize: (428, 505) 1374161330 -4.13447964710112 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483589_0.png resize: (289, 385) 1374161331 -1.7247859118894562 treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483462_0.png resize: (234, 262) 1374161365 -3.8566845673502415 treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483456_0.png resize: (144, 145) 1374161366 -2.8093982449781865 treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483474_0.png resize: (304, 266) 1374161368 -3.800753818530522 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483507_0.png resize: (747, 624) 1374161369 -4.399257549434393 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483509_0.png resize: (257, 235) 1374161370 -4.658317878542389 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483537_0.png resize: (184, 
235) 1374161372 -2.7119553183082266 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483540_0.png resize: (288, 358) 1374161373 -3.501879747364533 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483547_0.png resize: (280, 175) 1374161374 -4.506348697513414 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483574_0.png resize: (388, 214) 1374161376 -4.600879629242422 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483569_0.png resize: (207, 213) 1374161377 -3.963662581345566 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483575_0.png resize: (130, 183) 1374161378 -2.7775146091234646 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483566_0.png resize: (118, 167) 1374161379 -3.0749536519264193 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483588_0.png resize: (293, 220) 1374161381 -4.8469950018834975 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483580_0.png resize: (223, 206) 1374161382 -4.7645952278794645 treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483565_0.png resize: (121, 143) 1374161395 -4.523849058864772 treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483459_0.png resize: (408, 287) 1374161458 -5.162888661714039 treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483487_0.png resize: (219, 253) 1374161460 -4.34748835549583 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483502_0.png resize: (320, 553) 1374161461 -4.566427378655191 treat image : 
temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483505_0.png resize: (382, 525) 1374161462 -2.7907562332995157 treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483511_0.png resize: (259, 137) 1374161464 -4.244581747724055 treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483526_0.png resize: (429, 327) 1374161465 -4.2076958865216945 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483556_0.png resize: (396, 334) 1374161466 -4.333183595265751 treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483554_0.png resize: (206, 520) 1374161468 -3.0160335458761374 treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483583_0.png resize: (484, 456) 1374161485 -2.8389087438764378 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_values in save_photo_hashtag_id_thcl_score : 141 time used for this insertion : 0.0198209285736084 begin to insert list_values into photo_hahstag_ids : length of list_values in save_photo_hashtag_id_type : 141 time used for this insertion : 0.03408932685852051 save missing photos in datou_result : time spent for datou_step_exec : 34.917341232299805 time spent to save output : 0.05977058410644531 total time spent for step 6 : 34.97711181640625 step7:brightness Mon Jul 28 14:25:12 2025 VR 17-11-17 : now, only for the linear exec dependencies tree, some outputs go to fill the inputs of the next step VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is fully tested, clean, and works in both cases VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of 
building this step before datou_exec Currently we do not manage missing dependency information, which could perhaps be interpreted correctly with default behavior Some of the work done at step execution could be done earlier, when the execution tree is built and the dependencies between steps are analysed We should have a FATAL ERROR, but same_nb_input_output==True : this should be an optional input ! VR 22-3-18 : For now we do not clean the datou structure correctly inside step compute brightness treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb.jpg treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3.jpg treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6.jpg treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110.jpg treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c.jpg treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e.jpg treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2.jpg treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483463_0.png treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483457_0.png treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483458_0.png treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483461_0.png treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483460_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483475_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483467_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483488_0.png treat image : 
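The brightness step above only logs the file names it treats; its scoring formula is not shown anywhere in this trace. One common per-image brightness score is the mean luma, sketched below under the assumption of an RGB array with values in [0, 1] and the standard Rec. 601 weights (the function name is hypothetical):

```python
import numpy as np

def brightness_score(rgb):
    """Mean luma of an RGB image in [0, 1], using Rec. 601 weights.
    The actual formula used by the 'brightness' step is not shown
    in the log; this is only one common choice."""
    rgb = np.asarray(rgb, dtype=np.float64)
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return float(luma.mean())
```

An all-black image scores 0.0 and an all-white image scores 1.0, which makes the score easy to threshold downstream.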
temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483464_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483466_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483481_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483473_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483480_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483468_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483471_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483465_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483476_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483470_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483483_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483469_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483472_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483479_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483486_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483484_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483485_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483482_0.png treat image : 
temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483477_0.png treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483478_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483513_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483512_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483494_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483491_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483493_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483498_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483492_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483497_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483500_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483495_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483501_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483496_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483489_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483506_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483510_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483499_0.png treat image : 
temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483508_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483503_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483504_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483490_0.png treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483514_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483532_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483534_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483518_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483529_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483523_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483531_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483541_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483524_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483527_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483521_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483516_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483538_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483515_0.png treat image : 
temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483528_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483519_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483525_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483533_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483522_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483517_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483539_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483530_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483542_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483536_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483520_0.png treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483535_0.png treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483560_0.png treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483550_0.png treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483543_0.png treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483549_0.png treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483544_0.png treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483545_0.png treat image : 
temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483559_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483555_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483553_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483557_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483551_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483548_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483558_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483546_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483552_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483576_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483570_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483561_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483577_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483567_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483563_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483562_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483573_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483572_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483568_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483564_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483571_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483585_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483586_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483582_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483578_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483579_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483587_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483581_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483584_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483589_0.png
treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483462_0.png
treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483456_0.png
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483474_0.png
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483507_0.png
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483509_0.png
treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483537_0.png
treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483540_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483547_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483574_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483569_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483575_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483566_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483588_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483580_0.png
treat image : temp/1753705230_3750635_1374136432_0c9512f87c8bf47431870e316783452e_rle_crop_3896483565_0.png
treat image : temp/1753705230_3750635_1374136574_ed4eab6f1155eedd243889c77bf98ddb_rle_crop_3896483459_0.png
treat image : temp/1753705230_3750635_1374136542_6f7917576c8a1644f1d73c1cab9ae2c3_rle_crop_3896483487_0.png
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483502_0.png
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483505_0.png
treat image : temp/1753705230_3750635_1374136518_2d44bc134d380746537cfde5b2c063e6_rle_crop_3896483511_0.png
treat image : temp/1753705230_3750635_1374136494_4d5bdf526edb40f01644407f67edb110_rle_crop_3896483526_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483556_0.png
treat image : temp/1753705230_3750635_1374136490_1202b1f30cf0f8ba3d2b36217a86386c_rle_crop_3896483554_0.png
treat image : temp/1753705230_3750635_1374136237_1bb112845046f3b36b1f290ce87ccfa2_rle_crop_3896483583_0.png
Inside saveOutput : final : False verbose : 0
begin to insert
list_values into class_photo_scores : length of list_values in save_photo_hashtag_id_thcl_score : 141
time used for this insertion : 0.02024245262145996
begin to insert list_values into photo_hashtag_ids : length of list_values in save_photo_hashtag_id_type : 141
time used for this insertion : 0.034380197525024414
save missing photos in datou_result :
time spent for datou_step_exec : 8.687827110290527
time spent to save output : 0.0602571964263916
total time spent for step 7 : 8.748084306716919
step8:velours_tree Mon Jul 28 14:25:21 2025
VR 17-11-17 : for now, only for the linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned up, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case where we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be interpreted correctly with a default behavior
Some of the work done when a step executes could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
complete output_args for input 0
VR 22-3-18 : For now we do not clean up the datou structure correctly
can't find the photo_desc_type
Inside saveOutput : final : False verbose : 0
output is None
No output to save, returning out of save general
time spent for datou_step_exec : 0.3582427501678467
time spent to save output : 0.00014162063598632812
total time spent for step 8 : 0.358384370803833
step9:send_mail_cod Mon Jul 28 14:25:21 2025
complete output_args for input 0
complete output_args for input 1
Inconsistent number of inputs and outputs : a step that parallelizes and handles a bad input by not emitting an output for that data can't be used in tree dependencies of inputs and outputs
complete output_args for input 2
Inconsistent number of inputs and outputs : a step that parallelizes and handles a bad input by not emitting an output for that data can't be used in tree dependencies of inputs and outputs
complete output_args for input 3
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
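The optional-input behavior the log hints at (a warning instead of a fatal error when fewer inputs are used than declared) can be sketched roughly as below; `check_io_counts` and its arguments are hypothetical names for illustration, not the actual Velours functions:

```python
def check_io_counts(step_id, step_name, n_used, n_defined, kind="inputs"):
    """Compare the number of inputs/outputs a step actually uses against
    its definition.  Fewer inputs than declared is tolerated (they may be
    optional); any other mismatch is logged as a WARNING."""
    if n_used == n_defined:
        return True
    if n_used < n_defined and kind == "inputs":
        print(f"Step {step_id} {step_name} has fewer inputs used ({n_used}) "
              f"than in the step definition ({n_defined}) : "
              "maybe we manage optional inputs !")
    else:
        print(f"WARNING : number of {kind} for step {step_id} {step_name} "
              f"is not consistent : {n_used} used against {n_defined} "
              "in the step definition !")
    return False
```

The `False` return only marks the mismatch; as in the log, execution continues.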
VR 22-3-18 : For now we do not clean up the datou structure correctly
in the send_mail_cod step
work_area: /home/admin/workarea/git/Velours/python
in order to get the selector url, please enter the license of selector
results_Auto_P25403226_28-07-2025_14_25_21.pdf
25406570 imagette254065701753705521
25406571 imagette254065711753705521
25406573 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette254065731753705521
25406574 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette254065741753705522
25406575 change filename to text .imagette254065751753705523
25406576 change filename to text .imagette254065761753705524
25406577 imagette254065771753705524
25406578 imagette254065781753705524
25406579 imagette254065791753705524
25406580 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette254065801753705524
SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=25403226 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id;
velour_link :
https://www.fotonower.com/velours/25406570,25406571,25406572,25406573,25406574,25406575,25406576,25406577,25406578,25406579,25406580?tags=pehd,pet_fonce,environnement,carton,papier,autre,metal,mal_croppe,background,flou,pet_clair
args[1374136574] : ((1374136574, -7.123152260828501, 492609224), (1374136574, 1.4683037359201978, 2107752395), '0.11873973489858906')
We are sending mail with results at report@fotonower.com
args[1374136542] : ((1374136542, -7.240451899213289, 492609224), (1374136542, 0.8079762703563851, 2107752395), '0.11873973489858906')
We are sending mail with results at report@fotonower.com
args[1374136518] : ((1374136518, -7.325555813427854, 492609224), (1374136518, 1.1208796521165327, 2107752395), '0.11873973489858906')
We are sending mail with results at report@fotonower.com
args[1374136494] : ((1374136494, -7.235752338579301, 492609224), (1374136494, 1.514278494741548, 2107752395), '0.11873973489858906')
We are sending mail with results at report@fotonower.com
args[1374136490] : ((1374136490, -7.342090319683126, 492609224), (1374136490, 1.3253391887157877, 2107752395), '0.11873973489858906')
We are sending mail with results at report@fotonower.com
args[1374136432] : ((1374136432, -7.081893333424807, 492609224), (1374136432, 1.283147175494751, 2107752395), '0.11873973489858906')
We are sending mail with results at report@fotonower.com
args[1374136237] : ((1374136237, -7.368336082492026, 492609224), (1374136237, 1.1822199438954988, 2107752395), '0.11873973489858906')
We are sending mail with results at report@fotonower.com
refus_total : 0.11873973489858906
2022-04-13 10:29:59 0
SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=25403226 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000
start upload file to ovh
https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25403226_28-07-2025_14_25_21.pdf
results_Auto_P25403226_28-07-2025_14_25_21.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25403226_28-07-2025_14_25_21.pdf
start insert file to database
insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','25403226','results_Auto_P25403226_28-07-2025_14_25_21.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25403226_28-07-2025_14_25_21.pdf','pdf','','0.52','0.11873973489858906')
message_in_mail:
Hello,
Please find below the results of the carac on demand service for the portfolio: https://www.fotonower.com/view/25403226

https://www.fotonower.com/image?json=false&list_photos_id=1374136574
Congratulations, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1374136542
Congratulations, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1374136518
Congratulations, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1374136494
Congratulations, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1374136490
Congratulations, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1374136432
Congratulations, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1374136237
Congratulations, the photo is well taken.

Under these conditions, the rejection rate is: 11.87%
Please find the photos of the contaminants.

examples of contaminants: carton: https://www.fotonower.com/view/25406573?limit=200
examples of contaminants: papier: https://www.fotonower.com/view/25406574?limit=200
examples of contaminants: autre: https://www.fotonower.com/view/25406575?limit=200
examples of contaminants: metal: https://www.fotonower.com/view/25406576?limit=200
examples of contaminants: pet_clair: https://www.fotonower.com/view/25406580?limit=200
Please find the PDF report: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25403226_28-07-2025_14_25_21.pdf

Link to velours: https://www.fotonower.com/velours/25406570,25406571,25406572,25406573,25406574,25406575,25406576,25406577,25406578,25406579,25406580?tags=pehd,pet_fonce,environnement,carton,papier,autre,metal,mal_croppe,background,flou,pet_clair


The Fotonower team
202 b''
Server: nginx
Date: Mon, 28 Jul 2025 12:25:29 GMT
Content-Length: 0
Connection: close
X-Message-Id: hGcE2lLWTdCGGHloBfxnww
Access-Control-Allow-Origin: https://sendgrid.api-docs.io
Access-Control-Allow-Methods: POST
Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl
Access-Control-Max-Age: 600
X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html
Strict-Transport-Security: max-age=31536000; includeSubDomains
Content-Security-Policy: frame-ancestors 'none'
Cache-Control: no-cache
X-Content-Type-Options: no-sniff
Referrer-Policy: strict-origin-when-cross-origin
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : send_mail_cod, we use saveGeneral
[1374136574, 1374136542, 1374136518, 1374136494, 1374136490, 1374136432, 1374136237]
Looping around the photos to save general results
len do output : 0
before output type Used above
Managing all output in save final without adding information in the mtr_datou_result
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136574', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136542', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136518', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136494', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136490', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136432', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136237', None, None, None, None,
None, '3371491')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 7
time used for this insertion : 0.013506889343261719
save_final
save missing photos in datou_result :
time spent for datou_step_exec : 7.5693464279174805
time spent to save output : 0.01370096206665039
total time spent for step 9 : 7.583047389984131
step10:split_time_score Mon Jul 28 14:25:29 2025
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
complete output_args for input 1
VR 22-3-18 : For now we do not clean up the datou structure correctly
begin split time score
Caught exception ! Connect or reconnect !
TODO : Insert select and so on
Begin split_port_in_batch_balle
thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}]
thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('12', 7),)
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {}
28072025 25403226 Number of photos uploaded : 7 / 23040 (0%)
28072025 25403226 Number of photos tagged (waste types) : 0 / 7 (0%)
28072025 25403226 Number of photos tagged (volume) : 0 / 7 (0%)
elapsed_time : load_data_split_time_score 2.6226043701171875e-06
elapsed_time : order_list_meta_photo_and_scores 9.298324584960938e-06
elapsed_time : fill_and_build_computed_from_old_data 0.00048661231994628906
Caught exception ! Connect or reconnect !
Caught exception ! Connect or reconnect !
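The repeated "Caught exception ! Connect or reconnect !" lines suggest a query wrapper that reopens the database connection and retries once after a failure. A minimal sketch of that pattern, using a generic connection factory rather than the script's actual MySQLdb handle:

```python
class ReconnectingCursor:
    """Run a query; on any failure, log, reopen the connection, retry once."""

    def __init__(self, connect):
        self.connect = connect          # factory returning a live connection
        self.conn = connect()

    def execute(self, query):
        try:
            return self.conn.execute(query)
        except Exception:
            # Mirrors the "Connect or reconnect !" lines in the log
            print("Caught exception ! Connect or reconnect !")
            self.conn = self.connect()  # drop the dead handle, reopen
            return self.conn.execute(query)
```

With a real MySQLdb connection one would typically catch `MySQLdb.OperationalError` specifically instead of bare `Exception`.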
elapsed_time : insert_dashboard_record_day_entry 0.20969033241271973
We will return after consolidate, but for now we need the day; how to get it? For now it depends on the previous heavy steps
Qualite : 0.0036678978592061414
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25395602_28-07-2025_10_37_26.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25395602 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case
We can either keep the dependence or, better, keep an order compatible with the step ids when steps have no sons, so a lexical order : (number_son, step_id)
All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the inputs/outputs number between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 11452 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Step 11452 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may be unused !
Step 11453 merge_mask_thcl_custom has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition !
WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 11478 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 11455 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 11455 final has fewer outputs used (1) than in the step definition (2) : some outputs may be unused !
Step 11458 send_mail_cod has fewer inputs used (3) than in the step definition (5) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 11449 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11452 doesn't seem to be defined in the database
WARNING : output 1 of step 11449 has datatype=2 whereas input 1 of step 11453 has datatype=7
WARNING : type of output 2 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11454 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 3 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of output 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of input 3 of step 11455 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 11456 has datatype=10 whereas input 3 of step 11458 has datatype=6
WARNING : type of input 5 of step 11458 doesn't seem to be defined in the database
WARNING : output 0 of step 11477 has datatype=11 whereas input 5 of step 11458 has datatype=None
WARNING : output 0 of step 11456 has datatype=10 whereas input 0 of step 11477 has datatype=18
WARNING : type of input 2 of step 11478 doesn't seem to be defined in the database
WARNING : output 1 of step 11454 has datatype=7 whereas input 2 of step 11478 has datatype=None
WARNING : type of output 3 of step 11478 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11456 doesn't seem to be defined in the database
WARNING : output 0 of step 11453 has datatype=1 whereas input 0 of step 11454 has datatype=2
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ?
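The reordering rule described in the graph-check output, a lexical order on (number_son, step_id) so the result stays compatible with the step ids when steps have no sons, can be sketched as a plain sort; the dict layout of `steps` here is an assumption for illustration:

```python
def reorder_steps(steps):
    """Sort steps lexically by (number of sons, step id): among steps
    with the same number of sons (e.g. none), lower step ids come first,
    keeping an order compatible with the step ids."""
    return sorted(steps, key=lambda s: (s["number_son"], s["step_id"]))
```

A full implementation would apply this tie-break inside a topological ordering of the dependency tree; the sort alone only captures the lexical rule the log mentions.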
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25395602 AND mptpi.`type`=3726
To do
Qualite : 0.050278109423888696
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25394246_28-07-2025_09_51_56.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25394246 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25394246 AND mptpi.`type`=3726
To do
Qualite : 0.04542007865825641
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25399179_28-07-2025_11_48_37.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25399179 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type,
       mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at,
       mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25399179 AND mptpi.`type`=3726
To do
Quality: 0.10750159143518523
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25401991_28-07-2025_12_43_56.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25401991 order by id desc limit 1
# VR 17-11-17: to create in DB!
Here we check the datou graph and we reorder steps!
Tree built and cycles checked; now we need to reorder the steps!
We currently have an error because there is no dependency between the last steps in the tile - detect - glue case.
We could keep that dependency, but it is better to keep an order compatible with the step ids when a step has no sons, i.e. a lexical order: (number_son, step_id)
All sons are already in the current list! (repeated 9 times)
DONE, and to test: checkNoCycle!
Here we check the consistency of the number of inputs/outputs between the given ones and the DB!
eke 1-6-18: checkConsistencyNbInputNbOutput should be processed after step reordering!
WARNING: number of outputs for step 7928 mask_detect is not consistent: 3 used against 2 in the step definition!
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2): maybe we handle optional inputs!
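The reordering described above — a dependency-respecting order that, among steps with no remaining dependencies, prefers the lexically smallest (number_son, step_id) key — can be sketched as Kahn's topological sort with a priority queue. This is a reconstruction under stated assumptions (`reorder_steps` and its inputs are hypothetical), not the script's actual code; a failed sort doubles as the checkNoCycle test:

```python
import heapq

def reorder_steps(steps, edges):
    """Topologically order step ids; among ready steps, pick the smallest
    (number_of_sons, step_id) key, as the log describes.
    steps: dict step_id -> number_of_sons; edges: list of (parent, child)."""
    children = {s: [] for s in steps}
    indeg = {s: 0 for s in steps}
    for parent, child in edges:
        children[parent].append(child)
        indeg[child] += 1
    # Start from steps with no pending dependencies, keyed lexically.
    ready = [(steps[s], s) for s in steps if indeg[s] == 0]
    heapq.heapify(ready)
    order = []
    while ready:
        _, s = heapq.heappop(ready)
        order.append(s)
        for c in children[s]:
            indeg[c] -= 1
            if indeg[c] == 0:
                heapq.heappush(ready, (steps[c], c))
    if len(order) != len(steps):
        # Some steps were never freed: the graph contains a cycle.
        raise ValueError("cycle detected (checkNoCycle would fail)")
    return order
```

The lexical tie-break is what keeps otherwise-unrelated steps (the tile - detect - glue case) in an order compatible with their ids.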
WARNING: number of outputs for step 8092 crop_condition is not consistent: 4 used against 3 in the step definition!
WARNING: number of inputs for step 7933 rle_unique_nms_with_priority is not consistent: 2 used against 1 in the step definition!
WARNING: number of outputs for step 7933 rle_unique_nms_with_priority is not consistent: 2 used against 1 in the step definition!
WARNING: number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent: 2 used against 1 in the step definition!
Step 7934 final has fewer inputs used (2) than in the step definition (3): maybe we handle optional inputs!
Step 7934 final has fewer outputs used (1) than in the step definition (2): some outputs may be unused!
WARNING: number of outputs for step 13649 velours_tree is not consistent: 2 used against 1 in the step definition!
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2): maybe we handle optional inputs!
Number of inputs/outputs for each step checked!
Here we check the consistency of output/input types during step connections.
eke 1-6-18: checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput!
We ignore checkConsistencyTypeOutputInput for datou_step final!
WARNING: type of output 1 of step 7935 doesn't seem to be defined in the database!
WARNING: type of input 3 of step 7934 doesn't seem to be defined in the database!
We ignore checkConsistencyTypeOutputInput for datou_step final!
WARNING: type of input 1 of step 7935 doesn't seem to be defined in the database!
WARNING: output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING: type of output 2 of step 7928 doesn't seem to be defined in the database!
WARNING: type of input 2 of step 8092 doesn't seem to be defined in the database!
WARNING: type of output 3 of step 8092 doesn't seem to be defined in the database!
WARNING: type of input 1 of step 7933 doesn't seem to be defined in the database!
WARNING: type of output 2 of step 7928 doesn't seem to be defined in the database!
WARNING: type of input 1 of step 10917 doesn't seem to be defined in the database!
WARNING: type of output 2 of step 7928 doesn't seem to be defined in the database!
WARNING: type of input 1 of step 10918 doesn't seem to be defined in the database!
We ignore checkConsistencyTypeOutputInput for datou_step final!
WARNING: output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING: output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING: type of output 1 of step 13649 doesn't seem to be defined in the database!
WARNING: type of input 5 of step 10916 doesn't seem to be defined in the database!
DataTypes for each output/input checked!
TODO: Duplicate data, are they consistent 3? Duplicate data, are they consistent 4?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type,
       mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at,
       mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25401991 AND mptpi.`type`=3594
To do
Quality: 0.11873973489858906
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25403226_28-07-2025_14_25_21.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25403226 order by id desc limit 1
# VR 17-11-17: to create in DB!
Here we check the datou graph and we reorder steps!
Tree built and cycles checked; now we need to reorder the steps!
We currently have an error because there is no dependency between the last steps in the tile - detect - glue case.
We could keep that dependency, but it is better to keep an order compatible with the step ids when a step has no sons, i.e. a lexical order: (number_son, step_id)
All sons are already in the current list! (repeated 9 times)
DONE, and to test: checkNoCycle!
Here we check the consistency of the number of inputs/outputs between the given ones and the DB!
eke 1-6-18: checkConsistencyNbInputNbOutput should be processed after step reordering!
WARNING: number of outputs for step 7928 mask_detect is not consistent: 3 used against 2 in the step definition!
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2): maybe we handle optional inputs!
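The port-to-port lookup above is issued once per portfolio with only the portfolio id and type changing, so it lends itself to parameter binding rather than string interpolation. A sketch with the MySQLdb API the script already imports (per the log header); `fetch_port_links` and the assumption that `conn` is an open MySQLdb connection are illustrative, not the script's actual code:

```python
# Parameterized version of the mtr_port_to_port_ids lookup from the log.
# MySQLdb-style drivers use %s placeholders bound at execute() time.
PORT_TO_PORT_SQL = """\
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type,
       mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at,
       mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc,
       h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id = mptpi.hashtag_id
  AND mptpi.`mtr_portfolio_id_1` = %s
  AND mptpi.`type` = %s"""

def fetch_port_links(conn, portfolio_id, link_type):
    """Run the lookup for one portfolio; conn is an open MySQLdb connection."""
    cur = conn.cursor()
    cur.execute(PORT_TO_PORT_SQL, (portfolio_id, link_type))
    return cur.fetchall()
```

With this, the two runs in the log become `fetch_port_links(conn, 25401991, 3594)` and `fetch_port_links(conn, 25403226, 3594)`.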
WARNING: number of outputs for step 8092 crop_condition is not consistent: 4 used against 3 in the step definition!
WARNING: number of inputs for step 7933 rle_unique_nms_with_priority is not consistent: 2 used against 1 in the step definition!
WARNING: number of outputs for step 7933 rle_unique_nms_with_priority is not consistent: 2 used against 1 in the step definition!
WARNING: number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent: 2 used against 1 in the step definition!
Step 7934 final has fewer inputs used (2) than in the step definition (3): maybe we handle optional inputs!
Step 7934 final has fewer outputs used (1) than in the step definition (2): some outputs may be unused!
WARNING: number of outputs for step 13649 velours_tree is not consistent: 2 used against 1 in the step definition!
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2): maybe we handle optional inputs!
Number of inputs/outputs for each step checked!
Here we check the consistency of output/input types during step connections.
eke 1-6-18: checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput!
We ignore checkConsistencyTypeOutputInput for datou_step final!
WARNING: type of output 1 of step 7935 doesn't seem to be defined in the database!
WARNING: type of input 3 of step 7934 doesn't seem to be defined in the database!
We ignore checkConsistencyTypeOutputInput for datou_step final!
WARNING: type of input 1 of step 7935 doesn't seem to be defined in the database!
WARNING: output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING: type of output 2 of step 7928 doesn't seem to be defined in the database!
WARNING: type of input 2 of step 8092 doesn't seem to be defined in the database!
WARNING: type of output 3 of step 8092 doesn't seem to be defined in the database!
WARNING: type of input 1 of step 7933 doesn't seem to be defined in the database!
WARNING: type of output 2 of step 7928 doesn't seem to be defined in the database!
WARNING: type of input 1 of step 10917 doesn't seem to be defined in the database!
WARNING: type of output 2 of step 7928 doesn't seem to be defined in the database!
WARNING: type of input 1 of step 10918 doesn't seem to be defined in the database!
We ignore checkConsistencyTypeOutputInput for datou_step final!
WARNING: output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING: output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING: type of output 1 of step 13649 doesn't seem to be defined in the database!
WARNING: type of input 5 of step 10916 doesn't seem to be defined in the database!
DataTypes for each output/input checked!
TODO: Duplicate data, are they consistent 3? Duplicate data, are they consistent 4?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type,
       mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at,
       mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25403226 AND mptpi.`type`=3594
To do
NUMBER BATCH: 0
# DISPLAY ALL COLLECTED DATA: {'28072025': {'nb_upload': 7, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}}
Inside saveOutput: final: True, verbose: 0
saveOutput not yet implemented for datou_step.type: split_time_score; we use saveGeneral
[1374136574, 1374136542, 1374136518, 1374136494, 1374136490, 1374136432, 1374136237]
Looping over the photos to save general results
len of output: 1
/25403226 Didn't retrieve data. before output type
Here is an output not treated by saveGeneral:
Managing all outputs in save final without adding information in the mtr_datou_result
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136574', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136542', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136518', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136494', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136490', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136432', None, None, None, None, None, '3371491')
('3318', None, None, None, None, None, None, None, '3371491')
('3318', '25403226', '1374136237', None, None, None, None, None, '3371491')
begin to insert list_values into mtr_datou_result: length of list_values in save_final: 8
time used for this insertion: 0.019211292266845703
save_final: save missing photos in datou_result:
time spent for datou_step_exec: 0.7121808528900146
time spent to save output: 0.019461393356323242
total time spent for step 10: 0.7316422462463379
caffe_path_current:
About to save! 2
After save, about to update current! ret: 2
len(input) + len(total_photo_id_missing): 7
set_done_treatment
145.01user 108.81system 5:04.11elapsed 83%CPU (0avgtext+0avgdata 4553596maxresident)k
2105424inputs+111472outputs (39831major+13986317minor)pagefaults 0swaps
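The "begin to insert list_values" / "time used for this insertion" lines above correspond to a single batched insert of the tuple rows. A runnable stand-in of that pattern using sqlite3 (so it executes without a MySQL server; MySQLdb would use `%s` placeholders instead of `?`, and the table and columns here are illustrative, not the real mtr_datou_result schema):

```python
import sqlite3
import time

# In-memory stand-in for the datou result table; the real schema is not shown
# in the log, so these three columns are an illustrative guess.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE datou_result (datou_id TEXT, portfolio_id TEXT, photo_id TEXT)")

# Rows shaped like the log's list_values tuples (truncated to three fields).
list_values = [
    ("3318", "25403226", "1374136574"),
    ("3318", "25403226", "1374136542"),
]

# executemany sends the whole batch in one call, which is what makes the
# logged insertion time (~0.019 s for 8 rows) small.
t0 = time.time()
cur.executemany("INSERT INTO datou_result VALUES (?, ?, ?)", list_values)
elapsed = time.time() - t0

cur.execute("SELECT COUNT(*) FROM datou_result")
row_count = cur.fetchone()[0]
```

Batching via `executemany` (rather than one `execute` per photo) keeps the per-row overhead to a single round trip, which matches the sub-hundredth-of-a-second insertion time reported for step 10.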