python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 3508334 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['3370199'] with mtr_portfolio_ids : ['25398994'] and first list_photo_ids : [] new path : /proc/3508334/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , BFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 9 ; length of list_pids : 9 ; length of list_args : 9 time to download the photos : 1.9026601314544678 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Mon Jul 28 13:00:31 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 6956 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-07-28 13:00:35.009941: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-07-28 13:00:35.043411: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493035000 Hz 2025-07-28 13:00:35.045724: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7fb3a0000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-07-28 13:00:35.045795: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-07-28 13:00:35.050444: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-07-28 13:00:35.182283: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x43129e0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-07-28 13:00:35.182337: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-07-28 13:00:35.183063: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-28 13:00:35.183507: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-28 13:00:35.186607: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-28 13:00:35.201534: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-28 13:00:35.202234: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-28 13:00:35.208131: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-28 13:00:35.211076: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-28 13:00:35.222815: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-28 13:00:35.224543: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-28 13:00:35.225081: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-28 13:00:35.225936: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-07-28 13:00:35.225960: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-07-28 13:00:35.225970: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-07-28 13:00:35.228087: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6396 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-07-28 13:00:35.610717: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-28 13:00:35.610801: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-28 13:00:35.610817: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-28 13:00:35.610832: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-28 13:00:35.610846: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-28 13:00:35.610861: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-28 13:00:35.610875: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-28 13:00:35.610889: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-28 13:00:35.611996: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-28 13:00:35.613179: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-28 13:00:35.613223: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-28 13:00:35.613240: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-28 13:00:35.613262: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-28 13:00:35.613291: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-28 13:00:35.613315: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-28 13:00:35.613343: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-28 13:00:35.613372: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-28 13:00:35.614812: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-28 13:00:35.614855: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-07-28 13:00:35.614869: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-07-28 13:00:35.614880: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-07-28 13:00:35.616252: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6396 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-07-28 13:00:47.234459: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-28 13:00:47.510905: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 9 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 24 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 62 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 27 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 21 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 31 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 41 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 32 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 37 NEW PHOTO Processing 1 images image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3840.00000 nb d'objets trouves : 32 Detection mask done ! Trying to reset tf kernel 3509622 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 4251 tf kernel not reseted sub process len(results) : 9 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 9 len(list_Values) 0 process is alive process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 9319 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.008328914642333984 nb_pixel_total : 82246 time to create 1 rle with old method : 0.10184454917907715 length of segment : 389 time for calcul the mask position with numpy : 0.06919145584106445 nb_pixel_total : 361087 time to create 1 rle with new method : 0.13252830505371094 length of segment : 715 time for calcul the mask position with numpy : 0.0035431385040283203 nb_pixel_total : 46718 time to create 1 rle with old method : 0.05370950698852539 length of segment : 291 time for calcul the mask position with numpy : 0.0019152164459228516 nb_pixel_total : 32293 time to create 1 rle with old method : 0.03764629364013672 length of segment : 133 time for calcul the mask position with numpy : 0.005453824996948242 nb_pixel_total : 26226 time to create 1 rle with old method : 0.04130125045776367 length of segment : 224 time for calcul the mask position with numpy : 0.0009877681732177734 nb_pixel_total : 9315 time to create 1 rle with old method : 0.01114654541015625 length of segment : 181 time for calcul the mask position with numpy : 0.0011277198791503906 nb_pixel_total : 14943 time to create 1 rle with old method : 0.01766180992126465 length of segment : 113 time for calcul the mask position with numpy : 0.00019502639770507812 nb_pixel_total : 7504 time to create 1 rle with old method : 0.009297609329223633 length of segment : 113 time for calcul the mask position with numpy : 0.003612518310546875 nb_pixel_total : 32085 time to create 1 rle with old method : 0.03943490982055664 length of segment : 239 time for calcul the mask position with numpy : 0.0025708675384521484 nb_pixel_total : 15771 time to create 1 rle with old method : 0.01809406280517578 length of segment : 229 time for calcul the mask position with numpy : 0.005342245101928711 nb_pixel_total : 54352 time to create 1 rle with old method : 0.06380319595336914 length of segment : 278 time for calcul the mask position with numpy : 0.08104729652404785 nb_pixel_total : 68979 time to create 1 rle with old method : 0.09334373474121094 length of segment : 379 time for calcul the mask position with numpy : 0.015005350112915039 nb_pixel_total : 95982 time to create 1 rle with old method : 0.11643457412719727 length of segment : 429 time for calcul the mask position with numpy : 0.0013203620910644531 nb_pixel_total : 16671 time to create 1 rle with old method : 0.019502639770507812 length of segment : 154 time for calcul the mask position with numpy : 0.1316525936126709 nb_pixel_total : 65861 time to create 1 rle with old method : 0.08675527572631836 length of segment : 368 time for calcul the mask position with numpy : 0.05675673484802246 nb_pixel_total : 27450 time to create 1 rle with old method : 0.03674459457397461 length of segment : 171 time for calcul the mask position with numpy : 0.03860616683959961 nb_pixel_total : 21551 time to create 1 rle with old method : 0.03795289993286133 length of segment : 192 time for calcul the mask position with numpy : 0.04264521598815918 nb_pixel_total : 97714 time to create 1 rle with old method : 0.12295866012573242 length of segment : 331 time for calcul the mask position with numpy : 0.01714944839477539 nb_pixel_total : 53034 time to create 1 rle with old method : 0.06224560737609863 length of segment : 322 time for calcul the mask position with numpy : 0.13317131996154785 nb_pixel_total : 86241 time to create 1 rle with old method : 0.1040341854095459 length of segment : 392 time for calcul the mask position with numpy : 0.0025272369384765625 nb_pixel_total : 26659 time to create 1 rle with old method : 0.031096935272216797 length of segment : 206 time for calcul the mask position with numpy : 0.04082226753234863 nb_pixel_total : 40514 time to create 1 rle with old method : 0.05330634117126465 length of segment : 263 time for calcul the mask position with numpy : 0.01857781410217285 nb_pixel_total : 12434 time to create 1 rle with old method : 0.022183656692504883 length of segment : 132 time for calcul the mask position with numpy : 0.11240863800048828 nb_pixel_total : 105291 time to create 1 rle with old method : 0.13386988639831543 length of segment : 578 time for calcul the mask position with numpy : 0.008643865585327148 nb_pixel_total : 62892 time to create 1 rle with old method : 0.1320493221282959 length of segment : 306 time for calcul the mask position with numpy : 0.05384564399719238 nb_pixel_total : 27369 time to create 1 rle with old method : 0.03676176071166992 length of segment : 251 time for calcul the mask position with numpy : 0.025691509246826172 nb_pixel_total : 60126 time to create 1 rle with old method : 0.08426570892333984 length of segment : 369 time for calcul the mask position with numpy : 0.0015091896057128906 nb_pixel_total : 13290 time to create 1 rle with old method : 0.018786191940307617 length of segment : 156 time for calcul the mask position with numpy : 0.012071847915649414 nb_pixel_total : 26273 time to create 1 rle with old method : 0.035088300704956055 length of segment : 223 time for calcul the mask position with numpy : 0.04150533676147461 nb_pixel_total : 73650 time to create 1 rle with old method : 0.09545111656188965 length of segment : 408 time for calcul the mask position with numpy : 0.26612281799316406 nb_pixel_total : 264332 time to create 1 rle with new method : 0.27143263816833496 length of segment : 946 time for calcul the mask position with numpy : 0.04914450645446777 nb_pixel_total : 24180 time to create 1 rle with old method : 0.033353567123413086 length of segment : 375 time for calcul the mask position with numpy : 0.012607097625732422 nb_pixel_total : 43500 time to create 1 rle with old method : 0.057267189025878906 length of segment : 313 time for calcul the mask position with numpy : 0.0023834705352783203 nb_pixel_total : 29083 time to create 1 rle with old method : 0.04949760437011719 length of segment : 168 time for calcul the mask position with numpy : 0.029980897903442383 nb_pixel_total : 28558 time to create 1 rle with old method : 0.03923201560974121 length of segment : 219 time for calcul the mask position with numpy : 0.0014202594757080078 nb_pixel_total : 26527 time to create 1 rle with old method : 0.03126096725463867 length of segment : 228 time for calcul the mask position with numpy : 0.023747920989990234 nb_pixel_total : 31640 time to create 1 rle with old method : 0.047239065170288086 length of segment : 315 time for calcul the mask position with numpy : 0.013143777847290039 nb_pixel_total : 40061 time to create 1 rle with old method : 0.06575274467468262 length of segment : 270 time for calcul the mask position with numpy : 0.04834580421447754 nb_pixel_total : 54539 time to create 1 rle with old method : 0.06753206253051758 length of segment : 346 time for calcul the mask position with numpy : 0.00912332534790039 nb_pixel_total : 48516 time to create 1 rle with old method : 0.06594181060791016 length of segment : 347 time for calcul the mask position with numpy : 0.0012705326080322266 nb_pixel_total : 8581 time to create 1 rle with old method : 0.014649391174316406 length of segment : 123 time for calcul the mask position with numpy : 0.003340482711791992 nb_pixel_total : 7480 time to create 1 rle with old method : 0.011238336563110352 length of segment : 107 time for calcul the mask position with numpy : 0.010326147079467773 nb_pixel_total : 72388 time to create 1 rle with old method : 0.08553099632263184 length of segment : 427 time for calcul the mask position with numpy : 0.007799625396728516 nb_pixel_total : 91475 time to create 1 rle with old method : 0.13296246528625488 length of segment : 421 time for calcul the mask position with numpy : 0.003834247589111328 nb_pixel_total : 77126 time to create 1 rle with old method : 0.09425592422485352 length of segment : 465 time for calcul the mask position with numpy : 0.002239704132080078 nb_pixel_total : 9463 time to create 1 rle with old method : 0.014277219772338867 length of segment : 99 time for calcul the mask position with numpy : 0.016988754272460938 nb_pixel_total : 40906 time to create 1 rle with old method : 0.06299304962158203 length of segment : 361 time for calcul the mask position with numpy : 0.0022873878479003906 nb_pixel_total : 50476 time to create 1 rle with old method : 0.06226301193237305 length of segment : 288 time for calcul the mask position with numpy : 0.028395414352416992 nb_pixel_total : 126104 time to create 1 rle with old method : 0.1587507724761963 length of segment : 1188 time for calcul the mask position with numpy : 0.002585172653198242 nb_pixel_total : 78826 time to create 1 rle with old method : 0.09740400314331055 length of segment : 304 time for calcul the mask position with numpy : 0.002788066864013672 nb_pixel_total : 24321 time to create 1 rle with old method : 0.02931666374206543 length of segment : 312 time for calcul the mask position with numpy : 0.0024526119232177734 nb_pixel_total : 48517 time to create 1 rle with old method : 0.07043194770812988 length of segment : 301 time for calcul the mask position with numpy : 0.001138925552368164 nb_pixel_total : 11687 time to create 1 rle with old method : 0.0153961181640625 length of segment : 373 time for calcul the mask position with numpy : 0.004420757293701172 nb_pixel_total : 118221 time to create 1 rle with old method : 0.15090537071228027 length of segment : 485 time for calcul the mask position with numpy : 0.0030112266540527344 nb_pixel_total : 46901 time to create 1 rle with old method : 0.05556035041809082 length of segment : 271 time for calcul the mask position with numpy : 0.0021440982818603516 nb_pixel_total : 50327 time to create 1 rle with old method : 0.06858110427856445 length of segment : 277 time for calcul the mask position with numpy : 0.004329681396484375 nb_pixel_total : 137202 time to create 1 rle with old method : 0.1571669578552246 length of segment : 518 time for calcul the mask position with numpy : 0.000392913818359375 nb_pixel_total : 7088 time to create 1 rle with old method : 0.008461236953735352 length of segment : 167 time for calcul the mask position with numpy : 0.0007388591766357422 nb_pixel_total : 22556 time to create 1 rle with old method : 0.029725313186645508 length of segment : 152 time for calcul the mask position with numpy : 0.0016217231750488281 nb_pixel_total : 46109 time to create 1 rle with old method : 0.05367231369018555 length of segment : 392 time for calcul the mask position with numpy : 0.0004162788391113281 nb_pixel_total : 7661 time to create 1 rle with old method : 0.00946497917175293 length of segment : 99 time for calcul the mask position with numpy : 0.004933357238769531 nb_pixel_total : 104635 time to create 1 rle with old method : 0.12129855155944824 length of segment : 573 time for calcul the mask position with numpy : 0.0015728473663330078 nb_pixel_total : 38988 time to create 1 rle with old method : 0.046036481857299805 length of segment : 256 time for calcul the mask position with numpy : 0.005354404449462891 nb_pixel_total : 162259 time to create 1 rle with new method : 0.01111149787902832 length of segment : 706 time for calcul the mask position with numpy : 0.001960277557373047 nb_pixel_total : 36779 time to create 1 rle with old method : 0.04122495651245117 length of segment : 468 time for calcul the mask position with numpy : 0.005088329315185547 nb_pixel_total : 160623 time to create 1 rle with new method : 0.008035659790039062 length of segment : 595 time for calcul the mask position with numpy : 0.0016214847564697266 nb_pixel_total : 23203 time to create 1 rle with old method : 0.026709556579589844 length of segment : 403 time for calcul the mask position with numpy : 0.0016713142395019531 nb_pixel_total : 32744 time to create 1 rle with old method : 0.039899349212646484 length of segment : 381 time for calcul the mask position with numpy : 0.0010273456573486328 nb_pixel_total : 21609 time to create 1 rle with old method : 0.02567124366760254 length of segment : 187 time for calcul the mask position with numpy : 0.0023343563079833984 nb_pixel_total : 35525 time to create 1 rle with old method : 0.04132866859436035 length of segment : 657 time for calcul the mask position with numpy : 0.003741741180419922 nb_pixel_total : 115642 time to create 1 rle with old method : 0.13505887985229492 length of segment : 388 time for calcul the mask position with numpy : 0.002971649169921875 nb_pixel_total : 68068 time to create 1 rle with old method : 0.08463621139526367 length of segment : 521 time for calcul the mask position with numpy : 0.001237630844116211 nb_pixel_total : 32082 time to create 1 rle with old method : 0.0374143123626709 length of segment : 203 time for calcul the mask position with numpy : 0.0035517215728759766 nb_pixel_total : 109438 time to create 1 rle with old method : 0.1253972053527832 length of segment : 857 time for calcul the mask position with numpy : 0.0028727054595947266 nb_pixel_total : 71062 time to create 1 rle with old method : 0.08117961883544922 length of segment : 493 time for calcul the mask position with numpy : 0.004716396331787109 nb_pixel_total : 118250 time to create 1 rle with old method : 0.15601515769958496 length of segment : 324 time for calcul the mask position with numpy : 0.0025687217712402344 nb_pixel_total : 31013 time to create 1 rle with old method : 0.03626298904418945 length of segment : 422 time for calcul the mask position with numpy : 0.0010471343994140625 nb_pixel_total : 19978 time to create 1 rle with old method : 0.023291349411010742 length of segment : 251 time for calcul the mask position with numpy : 0.0013349056243896484 nb_pixel_total : 34860 time to create 1 rle with old method : 0.040097713470458984 length of segment : 221 time for calcul the mask position with numpy : 0.009427309036254883 nb_pixel_total : 175204 time to create 1 rle with new method : 0.014226436614990234 length of segment : 459 time for calcul the mask position with numpy : 0.0009670257568359375 nb_pixel_total : 21056 time to create 1 rle with old method : 0.026189327239990234 length of segment : 136 time for calcul the mask position with numpy : 0.003663301467895508 nb_pixel_total : 58851 time to create 1 rle with old method : 0.0756216049194336 length of segment : 450 time for calcul the mask position with numpy : 0.0008447170257568359 nb_pixel_total : 17122 time to create 1 rle with old method : 0.020125389099121094 length of segment : 150 time for calcul the mask position with numpy : 0.0004062652587890625 nb_pixel_total : 5500 time to create 1 rle with old method : 0.0064678192138671875 length of segment : 139 time for calcul the mask position with numpy : 0.0012860298156738281 nb_pixel_total : 26276 time to create 1 rle with old method : 0.030468225479125977 length of segment : 206 time for calcul the mask position with numpy : 0.0007741451263427734 nb_pixel_total : 19354 time to create 1 rle with old method : 0.022670984268188477 length of segment : 143 time for calcul the mask position with numpy : 0.0019462108612060547 nb_pixel_total : 33332 time to create 1 rle with old method : 0.038251399993896484 length of segment : 368 time for calcul the mask position with numpy : 0.0006821155548095703 nb_pixel_total : 13707 time to create 1 rle with old method : 0.01635265350341797 length of segment : 272 time for calcul the mask position with numpy : 0.002438068389892578 nb_pixel_total : 49249 time to create 1 rle with old method : 0.05656599998474121 length of segment : 284 time for calcul the mask position with numpy : 0.0025327205657958984 nb_pixel_total : 70206 time to create 1 rle with old method : 0.08465862274169922 length of segment : 295 time for calcul the mask position with numpy : 0.0021371841430664062 nb_pixel_total : 52282 time to create 1 rle with old method : 0.060743093490600586 length of segment : 456 time for calcul the mask position with numpy : 0.0008993148803710938 nb_pixel_total : 14010 time to create 1 rle with old method : 0.017032384872436523 length of segment : 127 time for calcul the mask position with numpy : 0.001619577407836914 nb_pixel_total : 40433 time to create 1 rle with old method : 0.04706382751464844 length of segment : 186 time for calcul the mask position with numpy : 0.0006382465362548828 nb_pixel_total : 23493 time to create 1 rle with old method : 0.028218507766723633 length of segment : 126 time for calcul the mask position with numpy : 0.0015799999237060547 nb_pixel_total : 33433 time to create 1 rle with old method : 0.03913092613220215 length of segment : 258 time for calcul the mask position with numpy : 0.0006532669067382812 nb_pixel_total : 13084 time to create 1 rle with old method : 0.015766620635986328 length of segment : 113 time for calcul the mask position with numpy : 0.0007569789886474609 nb_pixel_total : 22538 time to create 1 rle with old method : 0.026942014694213867 length of segment : 113 time for calcul the mask position with numpy : 0.010524272918701172 nb_pixel_total : 241849 time to create 1 rle with new method : 0.03593635559082031 length of segment : 736 time for calcul the mask position with numpy : 0.0011510848999023438 nb_pixel_total : 18811 time to create 1 rle with old method : 0.023449420928955078 length of segment : 155 time for calcul the mask position with numpy : 0.0021347999572753906 nb_pixel_total : 54800 time to create 1 rle with old method : 0.06279921531677246 length of segment : 265 time for calcul the mask position with numpy : 0.0024530887603759766 nb_pixel_total : 47440 time to create 1 rle with old method : 0.0560002326965332 length of segment : 315 time for calcul the mask position with numpy : 0.005306720733642578 nb_pixel_total : 98709 time to create 1 rle with old method : 0.11217832565307617 length of segment : 438 time for calcul the mask position with numpy : 0.0054967403411865234 nb_pixel_total : 111871 time to create 1 rle with old method : 0.12894344329833984 length of segment : 536 time for calcul the mask position with numpy : 0.001718759536743164 nb_pixel_total : 34633 time to create 1 rle with old method : 0.04013657569885254 length of segment : 232 time for calcul the mask position with numpy : 0.0014591217041015625 nb_pixel_total : 21568 time to create 1 rle with old method : 0.02585315704345703 length of segment : 204 time for calcul the mask position with numpy : 0.005028963088989258 nb_pixel_total : 79232 time to create 1 rle with old method : 0.09282159805297852 length of segment : 321 time for calcul the mask position with numpy : 0.0010907649993896484 nb_pixel_total : 13990 time to create 1 rle with old method : 0.016781091690063477 length of segment : 166 time for calcul the mask position with numpy : 0.0013720989227294922 nb_pixel_total : 24645 time to create 1 rle with old method : 0.028723716735839844 length of segment : 223 time for calcul the mask position with numpy : 0.0013339519500732422 nb_pixel_total : 19057 time to create 1 rle with old method : 0.027208805084228516 length of segment : 153 time for calcul the mask position with numpy : 0.0028450489044189453 nb_pixel_total : 29889 time to create 1 rle with old method : 0.038071393966674805 length of segment : 378 time for calcul the mask position with numpy : 0.0016977787017822266 nb_pixel_total : 20824 time to create 1 rle with old method : 0.024523258209228516 length of segment : 178 time for calcul the mask position with numpy : 0.0019578933715820312 nb_pixel_total : 43751 time to create 1 rle with old method : 0.05506491661071777 length of segment : 286 time for calcul the mask position with numpy : 0.0012607574462890625 nb_pixel_total : 26159 time to create 1 rle with old method : 0.03313422203063965 length of segment : 191 time for calcul the mask position with numpy : 0.003175020217895508 nb_pixel_total : 58744 time to create 1 rle with old method : 0.07095527648925781 length of segment : 294 time for calcul the mask position with numpy : 0.0015692710876464844 nb_pixel_total : 19654 time to create 1 rle with old method : 0.025385618209838867 length of segment : 309 time for calcul the mask position with numpy : 0.008356094360351562 nb_pixel_total : 159537 time to create 1 rle with new method : 0.015166044235229492 length of segment : 477 time for calcul the mask position with numpy : 0.005084037780761719 nb_pixel_total : 65034 time to create 1 rle with old method : 0.0837557315826416 length of segment : 522 time for calcul the mask position with numpy : 0.001008749008178711 nb_pixel_total : 9278 time to create 1 rle with old method : 0.013865232467651367 length of segment : 133 time for calcul the mask position with numpy : 0.0032258033752441406 nb_pixel_total : 47743 time to create 1 rle with old method : 0.0548701286315918 length of segment : 277 time for calcul the mask position with numpy : 0.0034339427947998047 nb_pixel_total : 83812 time to create 1 rle with old method : 0.09881973266601562 length of segment : 379 time for calcul the mask position with numpy : 0.001773834228515625 nb_pixel_total : 33603 time to create 1 rle with old method : 0.039354801177978516 length of segment : 230 time for calcul the mask position with numpy : 0.0012750625610351562 nb_pixel_total : 20346 time to create 1 rle with old method : 0.02449631690979004 length of segment : 225 time for calcul the mask position with numpy : 0.0005941390991210938 nb_pixel_total : 15052 time to create 1 rle with old method : 0.01991128921508789 length of segment : 134 time for calcul the mask position with numpy : 0.0014529228210449219 nb_pixel_total : 30513 time to create 1 rle with old method : 0.038820505142211914 length of segment : 271 time for calcul the mask position with numpy : 0.0026116371154785156 nb_pixel_total : 32606 time to create 1 rle with old method : 0.04052591323852539 length of segment : 428 time for calcul the mask position with numpy : 0.0013697147369384766 nb_pixel_total : 20017 time to create 1 rle with old method : 0.024102210998535156 length of segment : 197 time for calcul the mask position with numpy : 0.001585245132446289 nb_pixel_total : 32823 time to create 1 rle with old method : 0.039931535720825195 length of segment : 248 time for calcul the mask position with numpy : 0.0016505718231201172 nb_pixel_total : 21414 time to create 1 rle with old method : 0.026354074478149414 length of segment : 238 time for calcul the mask position with numpy : 0.012111186981201172 nb_pixel_total : 200448 time to create 1 rle with new method : 0.02103590965270996 length of segment : 593 time for calcul the mask position with numpy : 0.009243965148925781 nb_pixel_total : 180857 time to create 1 rle with new method : 0.021922588348388672 length of segment : 656 time for calcul the mask position with numpy : 0.003007650375366211 nb_pixel_total : 46254 time to create 1 rle with old method : 0.0532689094543457 length of segment : 319 time for calcul the mask position with numpy : 0.002596616744995117 nb_pixel_total : 32469 time to create 1 rle with old method : 0.0379338264465332 length of segment : 517 time for calcul the mask position with numpy : 0.0018506050109863281 nb_pixel_total : 21189 time to create 1 rle with old method : 0.024935007095336914 length of segment : 255 time for calcul the mask position with numpy : 0.0020356178283691406 nb_pixel_total : 20329 time to create 1 rle with old method : 0.024199485778808594 length of segment : 380 time for calcul the mask position with numpy : 0.0036144256591796875 nb_pixel_total : 84923 time to create 1 rle with old method : 0.09876680374145508 length of segment : 407 time for calcul the mask position with numpy : 0.0008029937744140625 nb_pixel_total : 13307 time to create 1 rle with old method : 0.01591968536376953 length of segment : 94 time for calcul the mask position with numpy : 0.0010046958923339844 nb_pixel_total : 9494 time to create 1 rle with old method : 0.011637687683105469 length of segment : 165 time for calcul the mask position with numpy : 0.0024938583374023438 nb_pixel_total : 39285 time to create 1 rle with old method : 0.04664969444274902 length of segment : 180 time for calcul the mask position with numpy : 0.008603334426879883 nb_pixel_total : 218833 time to create 1 rle with new method : 0.017820358276367188 length of segment : 563 time for calcul the mask position with numpy : 0.010283231735229492 nb_pixel_total : 149681 time to create 1 rle with old method : 0.1709737777709961 length of segment : 807 time for calcul the mask position with numpy : 0.00040602684020996094 nb_pixel_total : 9853 time to create 1 rle with old method : 0.011460304260253906 length of segment : 150 time for calcul the mask position with numpy : 0.0012521743774414062 nb_pixel_total : 27943 time to create 1 rle with old method : 0.03271055221557617 length of segment : 206 time for calcul the mask position with numpy : 0.0009059906005859375 nb_pixel_total : 17892 time to create 1 rle with old method : 0.02319812774658203 length of segment : 134 time for calcul the mask position with numpy : 0.002728700637817383 nb_pixel_total : 49377 time to create 1 rle with old method : 0.05828666687011719 length of segment : 282 time for calcul the mask position with numpy : 0.0018661022186279297 nb_pixel_total : 64401 time to create 1 rle with old method : 0.07560515403747559 length of segment : 406 time for calcul the mask position with numpy : 0.001667022705078125 nb_pixel_total : 28358 time to create 1 rle with old method : 0.03282046318054199 length of segment : 398 time for calcul the mask position with numpy : 0.0019147396087646484 nb_pixel_total : 36604 time to create 1 rle with old method : 0.04245901107788086 length of segment : 259 time for calcul the mask position with numpy : 0.0006413459777832031 nb_pixel_total : 10622 time to create 1 rle with old method : 0.012653589248657227 length of segment : 121 time spent for convertir_results : 18.079788208007812 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 148 chid ids of type : 3594 Number RLEs to save : 47410 save missing photos in datou_result : time spend for datou_step_exec : 143.17040276527405 time spend to save output : 2.9252288341522217 total time spend for step 1 : 146.09563159942627 step2:crop_condition Mon Jul 28 13:02:57 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 9 ! batch 1 Loaded 148 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 124 About to insert : list_path_to_insert length 124 new photo from crops ! About to upload 124 photos upload in portfolio : 3736932 init cache_photo without model_param we have 124 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753700612_3508334 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 124 photos in the portfolio 3736932 time of upload the photos Elapsed time : 31.526694536209106 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 17 About to insert : list_path_to_insert length 17 new photo from crops ! About to upload 17 photos upload in portfolio : 3736932 init cache_photo without model_param we have 17 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753700648_3508334 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 17 photos in the portfolio 3736932 time of upload the photos Elapsed time : 5.354559421539307 we have finished the crop for the class : carton begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 4 About to insert : list_path_to_insert length 4 new photo from crops ! About to upload 4 photos upload in portfolio : 3736932 init cache_photo without model_param we have 4 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753700657_3508334 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 4 photos in the portfolio 3736932 time of upload the photos Elapsed time : 1.4678287506103516 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 3 About to insert : list_path_to_insert length 3 new photo from crops ! About to upload 3 photos upload in portfolio : 3736932 init cache_photo without model_param we have 3 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753700660_3508334 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 3 photos in the portfolio 3736932 time of upload the photos Elapsed time : 1.0223772525787354 we have finished the crop for the class : autre begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1373990931, 1373990874, 1373990507, 1373990504, 1373990502, 1373990500, 1373990499, 1373990498, 1373990440] Looping around the photos to save general results len do output : 148 /1374148211Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148213Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148214Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148215Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148217Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148218Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148219Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148221Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148222Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148224Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148225Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148226Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148227Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148229Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148230Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148232Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148233Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148234Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148236Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148237Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148238Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148240Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148241Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148242Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148244Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148245Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148246Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148248Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148249Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148250Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148252Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148253Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148254Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148256Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148257Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148258Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148260Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148261Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148262Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148264Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148265Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148266Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148268Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148269Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148270Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148272Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148273Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148274Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148276Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148277Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148278Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148280Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148281Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148282Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148284Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148285Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148286Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148288Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148290Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148291Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148293Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148294Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148295Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148297Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148298Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148299Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148301Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148302Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148303Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148305Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148306Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148307Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148309Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148310Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148311Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148313Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148314Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148315Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148317Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148318Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148319Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148321Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148322Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148324Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148325Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148326Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148328Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148329Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148330Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148332Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148333Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148334Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148336Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148337Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148338Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148340Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148341Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148342Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148344Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148345Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148346Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148348Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148349Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148350Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148352Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148353Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148354Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148356Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148357Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148358Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148360Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148361Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148362Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148364Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148365Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148366Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148368Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148369Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148370Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148372Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148373Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148374Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148376Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148377Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148440Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148441Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148443Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148444Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148445Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148447Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148448Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148450Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148451Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148452Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148454Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148455Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148456Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148458Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148459Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148460Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148462Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148489Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148490Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148492Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148493Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148511Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148513Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374148514Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990931', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990874', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990507', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990504', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990502', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990500', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990499', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990498', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990440', None, None, None, None, None, '3370199') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 453 time used for this insertion : 0.030497312545776367 save_final save missing photos in datou_result : time spend for datou_step_exec : 83.72567772865295 time spend to save output : 0.035109758377075195 total time spend for step 2 : 83.76078748703003 step3:rle_unique_nms_with_priority Mon Jul 28 13:04:21 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 148 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nb_obj : 10 nb_hashtags : 1 time to prepare the origin masks : 6.302049875259399 time for calcul the mask position with numpy : 0.7135674953460693 nb_pixel_total : 7670761 time to create 1 rle with new method : 0.8531253337860107 time for calcul the mask position with numpy : 0.030301332473754883 nb_pixel_total : 15771 time to create 1 rle with old method : 0.020692110061645508 time for calcul the mask position with numpy : 0.03632497787475586 nb_pixel_total : 32085 time to create 1 rle with old method : 0.04116511344909668 time for calcul the mask position with numpy : 0.032737016677856445 nb_pixel_total : 2955 time to create 1 rle with old method : 0.003519773483276367 time for calcul the mask position with numpy : 0.02727365493774414 nb_pixel_total : 14943 time to create 1 rle with old method : 0.01708078384399414 time for calcul the mask position with numpy : 0.028509855270385742 nb_pixel_total : 9315 time to create 1 rle with old method : 0.01160740852355957 time for calcul the mask position with numpy : 0.03324079513549805 nb_pixel_total : 26226 time to create 1 rle with old method : 0.03467440605163574 time for calcul the mask position with numpy : 0.032560110092163086 nb_pixel_total : 32293 time to create 1 rle with old method : 0.03851032257080078 time for calcul the mask position with numpy : 0.028775691986083984 nb_pixel_total : 46718 time to create 1 rle with old method : 0.06554913520812988 time for calcul the mask position with numpy : 0.036306142807006836 nb_pixel_total : 361087 time to create 1 rle with new method : 0.7869534492492676 time for calcul the mask position with numpy : 0.05770730972290039 nb_pixel_total : 82246 time to create 1 rle with old method : 0.09558844566345215 create new chi : 3.116680860519409 time to delete rle : 0.028260469436645508 batch 1 Loaded 21 chid ids of type : 3594 +++++++++++++++Number RLEs to save : 7337 TO DO : save crop sub photo not yet done ! save time : 0.4797861576080322 nb_obj : 33 nb_hashtags : 3 time to prepare the origin masks : 4.863490581512451 time for calcul the mask position with numpy : 1.7076647281646729 nb_pixel_total : 6586321 time to create 1 rle with new method : 0.6130664348602295 time for calcul the mask position with numpy : 0.04401421546936035 nb_pixel_total : 54352 time to create 1 rle with old method : 0.07196974754333496 time for calcul the mask position with numpy : 0.037543296813964844 nb_pixel_total : 29083 time to create 1 rle with old method : 0.0378727912902832 time for calcul the mask position with numpy : 0.03482818603515625 nb_pixel_total : 16671 time to create 1 rle with old method : 0.019038677215576172 time for calcul the mask position with numpy : 0.03516030311584473 nb_pixel_total : 72388 time to create 1 rle with old method : 0.08184170722961426 time for calcul the mask position with numpy : 0.03702688217163086 nb_pixel_total : 95982 time to create 1 rle with old method : 0.114593505859375 time for calcul the mask position with numpy : 0.036427974700927734 nb_pixel_total : 53034 time to create 1 rle with old method : 0.06487798690795898 time for calcul the mask position with numpy : 0.03719472885131836 nb_pixel_total : 48516 time to create 1 rle with old method : 0.057709455490112305 time for calcul the mask position with numpy : 0.03825998306274414 nb_pixel_total : 264332 time to create 1 rle with new method : 0.79386305809021 time for calcul the mask position with numpy : 0.04056406021118164 nb_pixel_total : 68979 time to create 1 rle with old method : 0.1093759536743164 time for calcul the mask position with numpy : 0.03562593460083008 nb_pixel_total : 105291 time to create 1 rle with old method : 0.12534165382385254 time for calcul the mask position with numpy : 0.06678366661071777 nb_pixel_total : 65861 time to create 1 rle with old method : 0.09084463119506836 time for calcul the mask position with numpy : 0.04087543487548828 nb_pixel_total : 54539 time to create 1 rle with old method : 0.07394790649414062 time for calcul the mask position with numpy : 0.04210543632507324 nb_pixel_total : 7480 time to create 1 rle with old method : 0.015103578567504883 time for calcul the mask position with numpy : 0.0380558967590332 nb_pixel_total : 26527 time to create 1 rle with old method : 0.030661582946777344 time for calcul the mask position with numpy : 0.036417484283447266 nb_pixel_total : 62779 time to create 1 rle with old method : 0.0778658390045166 time for calcul the mask position with numpy : 0.054029226303100586 nb_pixel_total : 86241 time to create 1 rle with old method : 0.10737919807434082 time for calcul the mask position with numpy : 0.04021406173706055 nb_pixel_total : 13290 time to create 1 rle with old method : 0.015321493148803711 time for calcul the mask position with numpy : 0.03554415702819824 nb_pixel_total : 40514 time to create 1 rle with old method : 0.04612469673156738 time for calcul the mask position with numpy : 0.03661060333251953 nb_pixel_total : 66297 time to create 1 rle with old method : 0.07408285140991211 time for calcul the mask position with numpy : 0.03563332557678223 nb_pixel_total : 24180 time to create 1 rle with old method : 0.02931809425354004 time for calcul the mask position with numpy : 0.03691840171813965 nb_pixel_total : 27450 time to create 1 rle with old method : 0.03290534019470215 time for calcul the mask position with numpy : 0.03595709800720215 nb_pixel_total : 28385 time to create 1 rle with old method : 0.032233476638793945 time for calcul the mask position with numpy : 0.037464141845703125 nb_pixel_total : 27369 time to create 1 rle with old method : 0.031397342681884766 time for calcul the mask position with numpy : 0.036469459533691406 nb_pixel_total : 21551 time to create 1 rle with old method : 0.024779081344604492 time for calcul the mask position with numpy : 0.0383448600769043 nb_pixel_total : 12434 time to create 1 rle with old method : 0.02000260353088379 time for calcul the mask position with numpy : 0.05051422119140625 nb_pixel_total : 40061 time to create 1 rle with old method : 0.052164316177368164 time for calcul the mask position with numpy : 0.03575921058654785 nb_pixel_total : 8581 time to create 1 rle with old method : 0.010345458984375 time for calcul the mask position with numpy : 0.036637306213378906 nb_pixel_total : 31640 time to create 1 rle with old method : 0.03917574882507324 time for calcul the mask position with numpy : 0.036664485931396484 nb_pixel_total : 26273 time to create 1 rle with old method : 0.03156781196594238 time for calcul the mask position with numpy : 0.03728532791137695 nb_pixel_total : 60126 time to create 1 rle with old method : 0.07229924201965332 time for calcul the mask position with numpy : 0.035288095474243164 nb_pixel_total : 43500 time to create 1 rle with old method : 0.04821920394897461 time for calcul the mask position with numpy : 0.03722524642944336 nb_pixel_total : 97714 time to create 1 rle with old method : 0.1376953125 time for calcul the mask position with numpy : 0.03474164009094238 nb_pixel_total : 26659 time to create 1 rle with old method : 0.03001093864440918 create new chi : 6.56193208694458 time to delete rle : 0.002960205078125 batch 1 Loaded 67 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++Number RLEs to save : 22116 TO DO : save crop sub photo not yet done ! save time : 1.3515808582305908 nb_obj : 8 nb_hashtags : 3 time to prepare the origin masks : 5.091035604476929 time for calcul the mask position with numpy : 1.1989727020263672 nb_pixel_total : 7795734 time to create 1 rle with new method : 0.9732058048248291 time for calcul the mask position with numpy : 0.03566312789916992 nb_pixel_total : 24321 time to create 1 rle with old method : 0.028153657913208008 time for calcul the mask position with numpy : 0.036293983459472656 nb_pixel_total : 78826 time to create 1 rle with old method : 0.09014678001403809 time for calcul the mask position with numpy : 0.031699419021606445 nb_pixel_total : 126073 time to create 1 rle with old method : 0.14400744438171387 time for calcul the mask position with numpy : 0.03481340408325195 nb_pixel_total : 50476 time to create 1 rle with old method : 0.06344199180603027 time for calcul the mask position with numpy : 0.03270578384399414 nb_pixel_total : 40906 time to create 1 rle with old method : 0.05131673812866211 time for calcul the mask position with numpy : 0.03295469284057617 nb_pixel_total : 9463 time to create 1 rle with old method : 0.010736703872680664 time for calcul the mask position with numpy : 0.034159183502197266 nb_pixel_total : 77126 time to create 1 rle with old method : 0.08758687973022461 time for calcul the mask position with numpy : 0.03199601173400879 nb_pixel_total : 91475 time to create 1 rle with old method : 0.10732054710388184 create new chi : 3.0809619426727295 time to delete rle : 0.002053499221801758 batch 1 Loaded 17 chid ids of type : 3594 ++++++++++++++++++Number RLEs to save : 9030 TO DO : save crop sub photo not yet done ! save time : 0.5932645797729492 nb_obj : 8 nb_hashtags : 2 time to prepare the origin masks : 3.795114278793335 time for calcul the mask position with numpy : 0.4436609745025635 nb_pixel_total : 7854385 time to create 1 rle with new method : 1.1447086334228516 time for calcul the mask position with numpy : 0.025462865829467773 nb_pixel_total : 22556 time to create 1 rle with old method : 0.03029322624206543 time for calcul the mask position with numpy : 0.04509735107421875 nb_pixel_total : 4604 time to create 1 rle with old method : 0.005398273468017578 time for calcul the mask position with numpy : 0.04449963569641113 nb_pixel_total : 137202 time to create 1 rle with old method : 0.15503549575805664 time for calcul the mask position with numpy : 0.043550968170166016 nb_pixel_total : 50327 time to create 1 rle with old method : 0.05790901184082031 time for calcul the mask position with numpy : 0.041886329650878906 nb_pixel_total : 46901 time to create 1 rle with old method : 0.051692962646484375 time for calcul the mask position with numpy : 0.04044294357299805 nb_pixel_total : 118221 time to create 1 rle with old method : 0.12960481643676758 time for calcul the mask position with numpy : 0.03443408012390137 nb_pixel_total : 11687 time to create 1 rle with old method : 0.012731313705444336 time for calcul the mask position with numpy : 0.0368044376373291 nb_pixel_total : 48517 time to create 1 rle with old method : 0.053795576095581055 create new chi : 2.446012020111084 time to delete rle : 0.0012669563293457031 batch 1 Loaded 17 chid ids of type : 3594 +++++++++Number RLEs to save : 7144 TO DO : save crop sub photo not yet done ! save time : 0.4736616611480713 nb_obj : 16 nb_hashtags : 3 time to prepare the origin masks : 8.250890493392944 time for calcul the mask position with numpy : 0.933495044708252 nb_pixel_total : 7235856 time to create 1 rle with new method : 1.1961627006530762 time for calcul the mask position with numpy : 0.04428744316101074 nb_pixel_total : 71062 time to create 1 rle with old method : 0.10641717910766602 time for calcul the mask position with numpy : 0.03921103477478027 nb_pixel_total : 109435 time to create 1 rle with old method : 0.12407970428466797 time for calcul the mask position with numpy : 0.03948521614074707 nb_pixel_total : 32082 time to create 1 rle with old method : 0.038778066635131836 time for calcul the mask position with numpy : 0.041168212890625 nb_pixel_total : 68068 time to create 1 rle with old method : 0.07881593704223633 time for calcul the mask position with numpy : 0.03611493110656738 nb_pixel_total : 115642 time to create 1 rle with old method : 0.12963199615478516 time for calcul the mask position with numpy : 0.03876829147338867 nb_pixel_total : 35525 time to create 1 rle with old method : 0.04113578796386719 time for calcul the mask position with numpy : 0.02571892738342285 nb_pixel_total : 21609 time to create 1 rle with old method : 0.024588346481323242 time for calcul the mask position with numpy : 0.026634693145751953 nb_pixel_total : 24864 time to create 1 rle with old method : 0.02903151512145996 time for calcul the mask position with numpy : 0.026505470275878906 nb_pixel_total : 23203 time to create 1 rle with old method : 0.02706313133239746 time for calcul the mask position with numpy : 0.02616739273071289 nb_pixel_total : 160623 time to create 1 rle with new method : 0.800978422164917 time for calcul the mask position with numpy : 0.025432586669921875 nb_pixel_total : 36779 time to create 1 rle with old method : 0.042960166931152344 time for calcul the mask position with numpy : 0.029180049896240234 nb_pixel_total : 162259 time to create 1 rle with new method : 0.8139662742614746 time for calcul the mask position with numpy : 0.025960445404052734 nb_pixel_total : 38988 time to create 1 rle with old method : 0.04387998580932617 time for calcul the mask position with numpy : 0.03968524932861328 nb_pixel_total : 104635 time to create 1 rle with old method : 0.11770892143249512 time for calcul the mask position with numpy : 0.04270625114440918 nb_pixel_total : 7661 time to create 1 rle with old method : 0.008744001388549805 time for calcul the mask position with numpy : 0.041999101638793945 nb_pixel_total : 46109 time to create 1 rle with old method : 0.051102399826049805 create new chi : 5.2683916091918945 time to delete rle : 0.002649068832397461 batch 1 Loaded 33 chid ids of type : 3594 ++++++++++++++++++++++++++++Number RLEs to save : 16339 TO DO : save crop sub photo not yet done ! save time : 0.9855818748474121 nb_obj : 19 nb_hashtags : 3 time to prepare the origin masks : 7.646036624908447 time for calcul the mask position with numpy : 0.6238601207733154 nb_pixel_total : 7489581 time to create 1 rle with new method : 1.1828572750091553 time for calcul the mask position with numpy : 0.04588651657104492 nb_pixel_total : 23321 time to create 1 rle with old method : 0.026554584503173828 time for calcul the mask position with numpy : 0.04160475730895996 nb_pixel_total : 40433 time to create 1 rle with old method : 0.046593666076660156 time for calcul the mask position with numpy : 0.04334425926208496 nb_pixel_total : 14010 time to create 1 rle with old method : 0.0160067081451416 time for calcul the mask position with numpy : 0.04262351989746094 nb_pixel_total : 40116 time to create 1 rle with old method : 0.047917842864990234 time for calcul the mask position with numpy : 0.04194784164428711 nb_pixel_total : 70206 time to create 1 rle with old method : 0.0859222412109375 time for calcul the mask position with numpy : 0.04424476623535156 nb_pixel_total : 49249 time to create 1 rle with old method : 0.06652307510375977 time for calcul the mask position with numpy : 0.042156219482421875 nb_pixel_total : 6688 time to create 1 rle with old method : 0.007750511169433594 time for calcul the mask position with numpy : 0.04099416732788086 nb_pixel_total : 33332 time to create 1 rle with old method : 0.03928995132446289 time for calcul the mask position with numpy : 0.042980194091796875 nb_pixel_total : 19354 time to create 1 rle with old method : 0.021820068359375 time for calcul the mask position with numpy : 0.03756403923034668 nb_pixel_total : 26276 time to create 1 rle with old method : 0.03495979309082031 time for calcul the mask position with numpy : 0.03519892692565918 nb_pixel_total : 5500 time to create 1 rle with old method : 0.006270408630371094 time for calcul the mask position with numpy : 0.03855633735656738 nb_pixel_total : 17122 time to create 1 rle with old method : 0.01949763298034668 time for calcul the mask position with numpy : 0.04376983642578125 nb_pixel_total : 58851 time to create 1 rle with old method : 0.08540177345275879 time for calcul the mask position with numpy : 0.03924751281738281 nb_pixel_total : 21056 time to create 1 rle with old method : 0.023752212524414062 time for calcul the mask position with numpy : 0.03298687934875488 nb_pixel_total : 175204 time to create 1 rle with new method : 1.1418921947479248 time for calcul the mask position with numpy : 0.04288768768310547 nb_pixel_total : 34860 time to create 1 rle with old method : 0.04085731506347656 time for calcul the mask position with numpy : 0.05163908004760742 nb_pixel_total : 19978 time to create 1 rle with old method : 0.022867202758789062 time for calcul the mask position with numpy : 0.049524784088134766 nb_pixel_total : 31013 time to create 1 rle with old method : 0.036306142807006836 time for calcul the mask position with numpy : 0.050193071365356445 nb_pixel_total : 118250 time to create 1 rle with old method : 0.15150046348571777 create new chi : 4.646188020706177 time to delete rle : 0.00426173210144043 batch 1 Loaded 39 chid ids of type : 3594 ++++++++++++++++++++++++Number RLEs to save : 11641 TO DO : save crop sub photo not yet done ! save time : 0.7078330516815186 nb_obj : 13 nb_hashtags : 3 time to prepare the origin masks : 6.73781681060791 time for calcul the mask position with numpy : 0.8671643733978271 nb_pixel_total : 7502656 time to create 1 rle with new method : 0.9733023643493652 time for calcul the mask position with numpy : 0.025679826736450195 nb_pixel_total : 13776 time to create 1 rle with old method : 0.015654802322387695 time for calcul the mask position with numpy : 0.025245189666748047 nb_pixel_total : 79232 time to create 1 rle with old method : 0.08925199508666992 time for calcul the mask position with numpy : 0.026572704315185547 nb_pixel_total : 21568 time to create 1 rle with old method : 0.024357318878173828 time for calcul the mask position with numpy : 0.026147842407226562 nb_pixel_total : 34633 time to create 1 rle with old method : 0.03917050361633301 time for calcul the mask position with numpy : 0.0265047550201416 nb_pixel_total : 111871 time to create 1 rle with old method : 0.12707066535949707 time for calcul the mask position with numpy : 0.026355504989624023 nb_pixel_total : 98709 time to create 1 rle with old method : 0.11916542053222656 time for calcul the mask position with numpy : 0.027632951736450195 nb_pixel_total : 47440 time to create 1 rle with old method : 0.05405783653259277 time for calcul the mask position with numpy : 0.03697681427001953 nb_pixel_total : 54800 time to create 1 rle with old method : 0.08134913444519043 time for calcul the mask position with numpy : 0.03827381134033203 nb_pixel_total : 18811 time to create 1 rle with old method : 0.034449100494384766 time for calcul the mask position with numpy : 0.052430152893066406 nb_pixel_total : 241849 time to create 1 rle with new method : 0.671501874923706 time for calcul the mask position with numpy : 0.04694533348083496 nb_pixel_total : 22538 time to create 1 rle with old method : 0.03452110290527344 time for calcul the mask position with numpy : 0.04424715042114258 nb_pixel_total : 13084 time to create 1 rle with old method : 0.021384716033935547 time for calcul the mask position with numpy : 0.040619611740112305 nb_pixel_total : 33433 time to create 1 rle with old method : 0.04185962677001953 create new chi : 3.7246503829956055 time to delete rle : 0.002257108688354492 batch 1 Loaded 27 chid ids of type : 3594 +++++++++++++++++Number RLEs to save : 9826 TO DO : save crop sub photo not yet done ! save time : 0.6052138805389404 nb_obj : 20 nb_hashtags : 2 time to prepare the origin masks : 10.897428035736084 time for calcul the mask position with numpy : 0.6854860782623291 nb_pixel_total : 7508214 time to create 1 rle with new method : 1.0179994106292725 time for calcul the mask position with numpy : 0.04423666000366211 nb_pixel_total : 32823 time to create 1 rle with old method : 0.041445016860961914 time for calcul the mask position with numpy : 0.051775217056274414 nb_pixel_total : 20017 time to create 1 rle with old method : 0.024529457092285156 time for calcul the mask position with numpy : 0.04677295684814453 nb_pixel_total : 32606 time to create 1 rle with old method : 0.03912520408630371 time for calcul the mask position with numpy : 0.045861005783081055 nb_pixel_total : 30513 time to create 1 rle with old method : 0.038359880447387695 time for calcul the mask position with numpy : 0.04610943794250488 nb_pixel_total : 8151 time to create 1 rle with old method : 0.009709835052490234 time for calcul the mask position with numpy : 0.04490160942077637 nb_pixel_total : 20346 time to create 1 rle with old method : 0.026195287704467773 time for calcul the mask position with numpy : 0.048300743103027344 nb_pixel_total : 33603 time to create 1 rle with old method : 0.04536771774291992 time for calcul the mask position with numpy : 0.04574108123779297 nb_pixel_total : 83812 time to create 1 rle with old method : 0.10103654861450195 time for calcul the mask position with numpy : 0.04303479194641113 nb_pixel_total : 47743 time to create 1 rle with old method : 0.05465841293334961 time for calcul the mask position with numpy : 0.0472865104675293 nb_pixel_total : 9278 time to create 1 rle with old method : 0.013502836227416992 time for calcul the mask position with numpy : 0.03786873817443848 nb_pixel_total : 65034 time to create 1 rle with old method : 0.07890987396240234 time for calcul the mask position with numpy : 0.037661075592041016 nb_pixel_total : 159537 time to create 1 rle with new method : 0.5899937152862549 time for calcul the mask position with numpy : 0.028101444244384766 nb_pixel_total : 19654 time to create 1 rle with old method : 0.02226400375366211 time for calcul the mask position with numpy : 0.027289867401123047 nb_pixel_total : 58744 time to create 1 rle with old method : 0.06599187850952148 time for calcul the mask position with numpy : 0.026530742645263672 nb_pixel_total : 26159 time to create 1 rle with old method : 0.029705047607421875 time for calcul the mask position with numpy : 0.026979446411132812 nb_pixel_total : 43751 time to create 1 rle with old method : 0.050211429595947266 time for calcul the mask position with numpy : 0.027280330657958984 nb_pixel_total : 20824 time to create 1 rle with old method : 0.02390122413635254 time for calcul the mask position with numpy : 0.025350093841552734 nb_pixel_total : 29889 time to create 1 rle with old method : 0.03400015830993652 time for calcul the mask position with numpy : 0.04605388641357422 nb_pixel_total : 19057 time to create 1 rle with old method : 0.021546363830566406 time for calcul the mask position with numpy : 0.04282546043395996 nb_pixel_total : 24645 time to create 1 rle with old method : 0.028073787689208984 create new chi : 3.9391613006591797 time to delete rle : 0.003630399703979492 batch 1 Loaded 41 chid ids of type : 3594 +++++++++++++++++++++++++++++++Number RLEs to save : 13134 TO DO : save crop sub photo not yet done ! save time : 0.8334321975708008 nb_obj : 21 nb_hashtags : 2 time to prepare the origin masks : 10.575986623764038 time for calcul the mask position with numpy : 0.9072163105010986 nb_pixel_total : 7029874 time to create 1 rle with new method : 1.581298589706421 time for calcul the mask position with numpy : 0.042133331298828125 nb_pixel_total : 10622 time to create 1 rle with old method : 0.012107133865356445 time for calcul the mask position with numpy : 0.042896270751953125 nb_pixel_total : 36604 time to create 1 rle with old method : 0.0414729118347168 time for calcul the mask position with numpy : 0.044014930725097656 nb_pixel_total : 28358 time to create 1 rle with old method : 0.03747224807739258 time for calcul the mask position with numpy : 0.04557299613952637 nb_pixel_total : 45394 time to create 1 rle with old method : 0.05168318748474121 time for calcul the mask position with numpy : 0.04298233985900879 nb_pixel_total : 49377 time to create 1 rle with old method : 0.058399200439453125 time for calcul the mask position with numpy : 0.04336905479431152 nb_pixel_total : 17892 time to create 1 rle with old method : 0.02046680450439453 time for calcul the mask position with numpy : 0.04197835922241211 nb_pixel_total : 27943 time to create 1 rle with old method : 0.03432488441467285 time for calcul the mask position with numpy : 0.04235124588012695 nb_pixel_total : 9853 time to create 1 rle with old method : 0.011275053024291992 time for calcul the mask position with numpy : 0.04234433174133301 nb_pixel_total : 149681 time to create 1 rle with old method : 0.17621469497680664 time for calcul the mask position with numpy : 0.044146060943603516 nb_pixel_total : 218833 time to create 1 rle with new method : 1.4633896350860596 time for calcul the mask position with numpy : 0.04034161567687988 nb_pixel_total : 39285 time to create 1 rle with old method : 0.04514479637145996 time for calcul the mask position with numpy : 0.04121661186218262 nb_pixel_total : 9494 time to create 1 rle with old method : 0.01096200942993164 time for calcul the mask position with numpy : 0.040796518325805664 nb_pixel_total : 13307 time to create 1 rle with old method : 0.015503883361816406 time for calcul the mask position with numpy : 0.04405617713928223 nb_pixel_total : 84923 time to create 1 rle with old method : 0.09566593170166016 time for calcul the mask position with numpy : 0.04157543182373047 nb_pixel_total : 20329 time to create 1 rle with old method : 0.02381610870361328 time for calcul the mask position with numpy : 0.04196286201477051 nb_pixel_total : 21189 time to create 1 rle with old method : 0.02454376220703125 time for calcul the mask position with numpy : 0.04440617561340332 nb_pixel_total : 32469 time to create 1 rle with old method : 0.03686332702636719 time for calcul the mask position with numpy : 0.04436969757080078 nb_pixel_total : 46254 time to create 1 rle with old method : 0.05201888084411621 time for calcul the mask position with numpy : 0.04321146011352539 nb_pixel_total : 180857 time to create 1 rle with new method : 0.8828256130218506 time for calcul the mask position with numpy : 0.048316001892089844 nb_pixel_total : 200448 time to create 1 rle with new method : 0.5895795822143555 time for calcul the mask position with numpy : 0.044648170471191406 nb_pixel_total : 21414 time to create 1 rle with old method : 0.026701927185058594 create new chi : 7.25206446647644 time to delete rle : 0.004865407943725586 batch 1 Loaded 43 chid ids of type : 3594 ++++++++++++++++++++++++++++++Number RLEs to save : 16129 TO DO : save crop sub photo not yet done ! save time : 0.9990055561065674 map_output_result : {1373990931: (0.0, 'Should be the crop_list due to order', 0), 1373990874: (0.0, 'Should be the crop_list due to order', 0), 1373990507: (0.0, 'Should be the crop_list due to order', 0), 1373990504: (0.0, 'Should be the crop_list due to order', 0), 1373990502: (0.0, 'Should be the crop_list due to order', 0), 1373990500: (0.0, 'Should be the crop_list due to order', 0), 1373990499: (0.0, 'Should be the crop_list due to order', 0), 1373990498: (0.0, 'Should be the crop_list due to order', 0), 1373990440: (0.0, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1373990931, 1373990874, 1373990507, 1373990504, 1373990502, 1373990500, 1373990499, 1373990498, 1373990440] Looping around the photos to save general results len do output : 9 /1373990931.Didn't retrieve data . /1373990874.Didn't retrieve data . /1373990507.Didn't retrieve data . /1373990504.Didn't retrieve data . /1373990502.Didn't retrieve data . /1373990500.Didn't retrieve data . /1373990499.Didn't retrieve data . /1373990498.Didn't retrieve data . /1373990440.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990931', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990874', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990507', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990504', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990502', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990500', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990499', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990498', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990440', None, None, None, None, None, '3370199') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 27 time used for this insertion : 0.012644052505493164 save_final save missing photos in datou_result : time spend for datou_step_exec : 112.81265830993652 time spend to save output : 0.17447781562805176 total time spend for step 3 : 112.98713612556458 step4:ventilate_hashtags_in_portfolio Mon Jul 28 13:06:14 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 25398994 get user id for portfolio 25398994 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398994 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pehd','papier','autre','environnement','pet_fonce','pet_clair','flou','background','metal','carton','mal_croppe')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398994 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pehd','papier','autre','environnement','pet_fonce','pet_clair','flou','background','metal','carton','mal_croppe')) AND mptpi.`min_score`=0.5 To do To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398994 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pehd','papier','autre','environnement','pet_fonce','pet_clair','flou','background','metal','carton','mal_croppe')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://www.fotonower.com/velours/25403332,25403333,25403334,25403335,25403336,25403337,25403338,25403339,25403340,25403341,25403343?tags=pehd,papier,autre,environnement,pet_fonce,pet_clair,flou,background,metal,carton,mal_croppe Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1373990931, 1373990874, 1373990507, 1373990504, 1373990502, 1373990500, 1373990499, 1373990498, 1373990440] Looping around the photos to save general results len do output : 1 /25398994. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990931', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990874', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990507', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990504', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990502', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990500', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990499', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990498', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990440', None, None, None, None, None, '3370199') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 10 time used for this insertion : 0.01584792137145996 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.8269343376159668 time spend to save output : 0.01629805564880371 total time spend for step 4 : 0.8432323932647705 step5:final Mon Jul 28 13:06:15 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1373990931: ('0.10684877079046641',), 1373990874: ('0.10684877079046641',), 1373990507: ('0.10684877079046641',), 1373990504: ('0.10684877079046641',), 1373990502: ('0.10684877079046641',), 1373990500: ('0.10684877079046641',), 1373990499: ('0.10684877079046641',), 1373990498: ('0.10684877079046641',), 1373990440: ('0.10684877079046641',)} new output for save of step final : {1373990931: ('0.10684877079046641',), 1373990874: ('0.10684877079046641',), 1373990507: ('0.10684877079046641',), 1373990504: ('0.10684877079046641',), 1373990502: ('0.10684877079046641',), 1373990500: ('0.10684877079046641',), 1373990499: ('0.10684877079046641',), 1373990498: ('0.10684877079046641',), 1373990440: ('0.10684877079046641',)} [1373990931, 1373990874, 1373990507, 1373990504, 1373990502, 1373990500, 1373990499, 1373990498, 1373990440] Looping around the photos to save general results len do output : 9 /1373990931.Didn't retrieve data . /1373990874.Didn't retrieve data . /1373990507.Didn't retrieve data . /1373990504.Didn't retrieve data . /1373990502.Didn't retrieve data . /1373990500.Didn't retrieve data . /1373990499.Didn't retrieve data . /1373990498.Didn't retrieve data . /1373990440.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990931', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990874', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990507', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990504', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990502', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990500', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990499', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990498', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990440', None, None, None, None, None, '3370199') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 27 time used for this insertion : 0.01569509506225586 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.14328312873840332 time spend to save output : 0.01651620864868164 total time spend for step 5 : 0.15979933738708496 step6:blur_detection Mon Jul 28 13:06:15 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d.jpg resize: (2160, 3840) 1373990931 -7.244623960675525 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c.jpg resize: (2160, 3840) 1373990874 -7.294672886922219 treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c.jpg resize: (2160, 3840) 1373990507 -7.103635914008244 treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8.jpg resize: (2160, 3840) 1373990504 -7.383323722655824 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a.jpg resize: (2160, 3840) 1373990502 -7.309206310343124 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde.jpg resize: (2160, 3840) 1373990500 -7.306687806362486 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e.jpg resize: (2160, 3840) 1373990499 -7.127740718804342 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1.jpg resize: (2160, 3840) 1373990498 -7.145653117532588 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e.jpg resize: (2160, 3840) 1373990440 -7.297241908750404 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397626_0.png resize: (363, 378) 1374148211 -2.111162964326865 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397627_0.png resize: (703, 713) 1374148213 -4.067387270203963 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397628_0.png resize: (218, 349) 1374148214 -3.174636460974166 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397629_0.png resize: (128, 346) 1374148215 -4.124331442812178 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397630_0.png resize: (195, 194) 1374148217 -3.2740215494381357 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397631_0.png resize: (109, 125) 1374148218 -4.309971850141908 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397632_0.png resize: (110, 178) 1374148219 -1.9675111316773979 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397633_0.png resize: (93, 129) 1374148221 -0.9207804001209258 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397634_0.png resize: (228, 201) 1374148222 -2.220350505874296 treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397635_0.png resize: (229, 105) 1374148224 -4.08386279467437 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397636_0.png resize: (278, 342) 1374148225 -2.1523126282382252 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397637_0.png resize: (373, 340) 1374148226 -0.9537624014810565 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397638_0.png resize: (428, 345) 1374148227 -1.8516595146560022 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397640_0.png resize: (343, 346) 1374148229 -5.103396275122511 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397641_0.png resize: (159, 282) 1374148230 -2.439125306632507 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397642_0.png resize: (191, 143) 1374148232 -2.1670931250646026 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397643_0.png resize: (331, 445) 1374148233 -3.2476216429708904 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397644_0.png resize: (321, 205) 1374148234 -2.949826758748122 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397645_0.png resize: (336, 375) 1374148236 -4.707281394680194 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397646_0.png resize: (205, 235) 1374148237 -0.8223817788555338 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397647_0.png resize: (246, 294) 1374148238 -4.2862315040315195 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397648_0.png resize: (119, 170) 1374148240 -1.9068881276503944 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397649_0.png resize: (453, 363) 1374148241 -2.0280148119487045 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397650_0.png resize: (301, 315) 1374148242 -2.399425465217693 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397651_0.png resize: (207, 266) 1374148244 -2.389785218338448 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397652_0.png resize: (355, 210) 1374148245 -4.516339353910854 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397653_0.png resize: (154, 124) 1374148246 -2.9310109790513676 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397654_0.png resize: (208, 230) 1374148248 -1.7456342757592025 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397655_0.png resize: (373, 331) 1374148249 -4.453503363934154 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397656_0.png resize: (683, 762) 1374148250 -4.735516396792006 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397657_0.png resize: (310, 117) 1374148252 -4.24929180326404 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397658_0.png resize: (246, 233) 1374148253 -5.143791943793372 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397660_0.png resize: (231, 248) 1374148254 -1.5323242378639628 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397661_0.png resize: (196, 201) 1374148256 0.4227807449194637 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397662_0.png resize: (313, 180) 1374148257 -5.111997014065686 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397663_0.png resize: (270, 195) 1374148258 -4.632845544064276 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397664_0.png resize: (343, 262) 1374148260 -5.71884068909741 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397665_0.png resize: (323, 226) 1374148261 -4.634088669118805 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397666_0.png resize: (100, 152) 1374148262 -4.190865667405605 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397667_0.png resize: (107, 97) 1374148264 -4.8291084638151345 treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397669_0.png resize: (420, 394) 1374148265 -2.185834834822123 treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397670_0.png resize: (442, 288) 1374148266 -4.184691854540566 treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397672_0.png resize: (311, 203) 1374148268 -3.771917283315456 treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397673_0.png resize: (247, 341) 1374148269 -1.351914500725453 treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397674_0.png resize: (578, 521) 1374148270 -4.111381180887132 treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397676_0.png resize: (237, 140) 1374148272 -4.175018996773887 treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397677_0.png resize: (301, 282) 1374148273 0.8321247978180237 treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397678_0.png resize: (179, 110) 1374148274 -4.603458939286007 treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397681_0.png resize: (267, 281) 1374148276 -5.603620255097122 treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397682_0.png resize: (517, 364) 1374148277 -2.4101517307741567 treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397683_0.png resize: (167, 62) 1374148278 -3.4000197235181098 treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397684_0.png resize: (148, 189) 1374148280 -2.7688642228065405 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397685_0.png resize: (389, 143) 1374148281 -3.590348396355334 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397686_0.png resize: (93, 103) 1374148282 -2.860888049138002 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397687_0.png resize: (395, 517) 1374148284 -4.78695871054785 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397688_0.png resize: (256, 233) 1374148285 -1.6629598372940677 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397689_0.png resize: (587, 390) 1374148286 -5.077098781685018 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397690_0.png resize: (403, 155) 1374148288 -3.5744475685259527 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397691_0.png resize: (548, 408) 1374148290 -4.225054170601012 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397692_0.png resize: (388, 79) 1374148291 -3.532607404990931 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397693_0.png resize: (287, 261) 1374148293 -3.148683961434399 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397694_0.png resize: (184, 180) 1374148294 -3.1744847368017197 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397695_0.png resize: (369, 255) 1374148295 -2.466207917789484 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397697_0.png resize: (261, 455) 1374148297 -4.052930411969208 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397699_0.png resize: (453, 365) 1374148298 -3.603437257982332 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397700_0.png resize: (493, 244) 1374148299 -2.9188115696535246 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397701_0.png resize: (317, 523) 1374148301 -3.145069210372704 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397702_0.png resize: (199, 390) 1374148302 -1.8568076489888328 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397703_0.png resize: (251, 96) 1374148303 3.0894821126865843 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397705_0.png resize: (446, 564) 1374148305 -3.8071524377962196 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397706_0.png resize: (126, 261) 1374148306 -4.015964548816111 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397707_0.png resize: (334, 255) 1374148307 -4.877696263695182 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397708_0.png resize: (135, 167) 1374148309 -2.8327412448802427 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397709_0.png resize: (139, 57) 1374148310 -2.9518311243073745 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397712_0.png resize: (368, 166) 1374148311 -3.6912734171870163 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397713_0.png resize: (271, 122) 1374148313 -3.8958651292711854 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397714_0.png resize: (279, 281) 1374148314 -1.950816771238554 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397715_0.png resize: (293, 373) 1374148315 -4.4426087800832335 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397716_0.png resize: (444, 143) 1374148317 -4.945222099691542 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397717_0.png resize: (124, 175) 1374148318 0.32127239564878196 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397721_0.png resize: (113, 169) 1374148319 0.015266021721586033 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397722_0.png resize: (112, 235) 1374148321 1.8707083291751376 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397723_0.png resize: (559, 802) 1374148322 -2.0798438106115427 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397724_0.png resize: (155, 183) 1374148324 -4.331950905284701 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397725_0.png resize: (244, 291) 1374148325 -2.5461159083722094 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397726_0.png resize: (286, 335) 1374148326 -2.3493034537213484 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397727_0.png resize: (431, 380) 1374148328 -4.844270102630142 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397729_0.png resize: (162, 328) 1374148329 -3.578400475234353 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397730_0.png resize: (189, 169) 1374148330 -3.66565837832091 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397732_0.png resize: (163, 157) 1374148332 0.015600334394047278 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397733_0.png resize: (223, 163) 1374148333 -3.502475052933918 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397734_0.png resize: (140, 275) 1374148334 -0.7577932707076338 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397735_0.png resize: (260, 206) 1374148336 -4.9720967645251966 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397736_0.png resize: (178, 195) 1374148337 -2.4184203445274353 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397737_0.png resize: (283, 238) 1374148338 -2.231070935042017 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397738_0.png resize: (191, 164) 1374148340 0.5831574591372898 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397739_0.png resize: (294, 301) 1374148341 0.7876920094875383 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397740_0.png resize: (205, 156) 1374148342 -3.192576194369538 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397741_0.png resize: (473, 549) 1374148344 -4.856965773473332 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397743_0.png resize: (133, 115) 1374148345 -3.275319989858267 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397744_0.png resize: (275, 302) 1374148346 -2.2340599832240953 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397745_0.png resize: (377, 281) 1374148348 0.09055919306235898 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397747_0.png resize: (174, 185) 1374148349 -2.9313340847506644 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397748_0.png resize: (134, 183) 1374148350 -3.570054254986534 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397749_0.png resize: (224, 195) 1374148352 -3.052154655125114 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397750_0.png resize: (284, 352) 1374148353 -4.145455214484733 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397751_0.png resize: (196, 145) 1374148354 -4.313804520871569 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397752_0.png resize: (248, 181) 1374148356 -4.597773307795862 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397753_0.png resize: (208, 161) 1374148357 -2.3235950937684056 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397754_0.png resize: (440, 745) 1374148358 -3.2770358106169604 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397755_0.png resize: (628, 522) 1374148360 -4.7013574484518355 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397756_0.png resize: (305, 273) 1374148361 -3.6015760637575105 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397757_0.png resize: (344, 191) 1374148362 -2.841019718443498 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397759_0.png resize: (339, 148) 1374148364 -3.3069812119431425 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397760_0.png resize: (373, 312) 1374148365 -1.5441863098749349 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397761_0.png resize: (91, 188) 1374148366 -2.760790558657552 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397763_0.png resize: (174, 374) 1374148368 -3.545217719878119 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397764_0.png resize: (461, 659) 1374148369 -3.895840152738838 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397765_0.png resize: (554, 595) 1374148370 -3.0982109944034315 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397766_0.png resize: (149, 89) 1374148372 -3.3293324173485717 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397767_0.png resize: (206, 247) 1374148373 -3.9821901535399387 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397770_0.png resize: (343, 292) 1374148374 -4.9354982303226755 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397771_0.png resize: (283, 195) 1374148376 -3.1604678003005433 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397773_0.png resize: (121, 137) 1374148377 -2.834509405391886 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397639_0.png resize: (154, 137) 1374148440 -3.3059574360732427 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397659_0.png resize: (167, 203) 1374148441 -2.8420636910860093 treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397671_0.png resize: (99, 125) 1374148443 -2.1531654379180334 treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397679_0.png resize: (485, 405) 1374148444 -4.567813335044621 treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397680_0.png resize: (264, 265) 1374148445 -4.890210138156019 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397696_0.png resize: (307, 578) 1374148447 -4.08570177564864 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397710_0.png resize: (186, 202) 1374148448 -3.9338275443913466 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397718_0.png resize: (181, 327) 1374148450 -3.9950985173962534 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397720_0.png resize: (257, 185) 1374148451 -4.914904139531052 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397728_0.png resize: (530, 311) 1374148452 -4.592392477092368 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397742_0.png resize: (432, 335) 1374148454 -3.400983366034925 treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397746_0.png resize: (225, 199) 1374148455 -4.932916153387058 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397758_0.png resize: (219, 207) 1374148456 -3.5460376039565022 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397762_0.png resize: (157, 94) 1374148458 -4.6358599806160585 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397768_0.png resize: (120, 220) 1374148459 -3.5801748168549157 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397769_0.png resize: (281, 243) 1374148460 -4.5640696427714085 treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397772_0.png resize: (218, 248) 1374148462 -5.347929746668714 treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397668_0.png resize: (357, 376) 1374148489 -4.496744045735408 treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397675_0.png resize: (301, 362) 1374148490 -4.577786606876531 treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397698_0.png resize: (200, 224) 1374148492 -4.831981659270456 treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397731_0.png resize: (308, 371) 1374148493 -5.551763405777478 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397704_0.png resize: (188, 248) 1374148511 -2.7866522627917067 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397711_0.png resize: (143, 167) 1374148513 -3.3795972164646333 treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397719_0.png resize: (118, 253) 1374148514 -1.571857604095766 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 157 time used for this insertion : 0.021552085876464844 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 157 time used for this insertion : 0.03739285469055176 save missing photos in datou_result : time spend for datou_step_exec : 42.597496032714844 time spend to save output : 0.06472444534301758 total time spend for step 6 : 42.66222047805786 step7:brightness Mon Jul 28 13:06:57 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d.jpg treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c.jpg treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c.jpg treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8.jpg treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a.jpg treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde.jpg treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e.jpg treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1.jpg treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e.jpg treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397626_0.png treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397627_0.png treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397628_0.png treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397629_0.png treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397630_0.png treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397631_0.png treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397632_0.png treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397633_0.png treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397634_0.png treat image : temp/1753700429_3508334_1373990931_61972db76e634c5983e9d3901a5c006d_rle_crop_3896397635_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397636_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397637_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397638_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397640_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397641_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397642_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397643_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397644_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397645_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397646_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397647_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397648_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397649_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397650_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397651_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397652_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397653_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397654_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397655_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397656_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397657_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397658_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397660_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397661_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397662_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397663_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397664_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397665_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397666_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397667_0.png treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397669_0.png treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397670_0.png treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397672_0.png treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397673_0.png treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397674_0.png treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397676_0.png treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397677_0.png treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397678_0.png treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397681_0.png treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397682_0.png treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397683_0.png treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397684_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397685_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397686_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397687_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397688_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397689_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397690_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397691_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397692_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397693_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397694_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397695_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397697_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397699_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397700_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397701_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397702_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397703_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397705_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397706_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397707_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397708_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397709_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397712_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397713_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397714_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397715_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397716_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397717_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397721_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397722_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397723_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397724_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397725_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397726_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397727_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397729_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397730_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397732_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397733_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397734_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397735_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397736_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397737_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397738_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397739_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397740_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397741_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397743_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397744_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397745_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397747_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397748_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397749_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397750_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397751_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397752_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397753_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397754_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397755_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397756_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397757_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397759_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397760_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397761_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397763_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397764_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397765_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397766_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397767_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397770_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397771_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397773_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397639_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397659_0.png treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397671_0.png treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397679_0.png treat image : temp/1753700429_3508334_1373990504_452803fc79cba97f9bd6f0adc33c39f8_rle_crop_3896397680_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397696_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397710_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397718_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397720_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397728_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397742_0.png treat image : temp/1753700429_3508334_1373990498_c1aa155e657ddcf908033bee8b64d9b1_rle_crop_3896397746_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397758_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397762_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397768_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397769_0.png treat image : temp/1753700429_3508334_1373990440_21e675e734f6d92ec410f3763f12a10e_rle_crop_3896397772_0.png treat image : temp/1753700429_3508334_1373990874_ec2d44ef43cc555a04ffc265dbf5609c_rle_crop_3896397668_0.png treat image : temp/1753700429_3508334_1373990507_da7ffde3c0ec609a91b1b9b8dd9c7c8c_rle_crop_3896397675_0.png treat image : temp/1753700429_3508334_1373990502_12c233ba313a93a312b9a90473e7e10a_rle_crop_3896397698_0.png treat image : temp/1753700429_3508334_1373990499_96b5f29ae87194e6b621b21feb9f976e_rle_crop_3896397731_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397704_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397711_0.png treat image : temp/1753700429_3508334_1373990500_c085a4ce5eb4505668ef4500ce5bebde_rle_crop_3896397719_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 157 time used for this insertion : 0.019089221954345703 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 157 time used for this insertion : 0.029298067092895508 save missing photos in datou_result : time spend for datou_step_exec : 9.999656915664673 time spend to save output : 0.05373072624206543 total time spend for step 7 : 10.053387641906738 step8:velours_tree Mon Jul 28 13:07:07 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.4222571849822998 time spend to save output : 4.291534423828125e-05 total time spend for step 8 : 0.4223001003265381 step9:send_mail_cod Mon Jul 28 13:07:08 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P25398994_28-07-2025_13_07_08.pdf 25403332 imagette254033321753700828 25403333 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette254033331753700828 25403334 change filename to text .change filename to text .change filename to text .imagette254033341753700829 25403336 imagette254033361753700830 25403337 change filename to text .change filename to text .change filename to text .change filename to text .imagette254033371753700830 25403338 imagette254033381753700830 25403339 imagette254033391753700830 25403340 imagette254033401753700830 25403341 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette254033411753700830 25403343 imagette254033431753700831 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=25398994 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/25403332,25403333,25403334,25403335,25403336,25403337,25403338,25403339,25403340,25403341,25403343?tags=pehd,papier,autre,environnement,pet_fonce,pet_clair,flou,background,metal,carton,mal_croppe args[1373990931] : ((1373990931, -7.244623960675525, 492609224), (1373990931, 1.1850496433411826, 2107752395), '0.10684877079046641') We are sending mail with results at report@fotonower.com args[1373990874] : ((1373990874, -7.294672886922219, 492609224), (1373990874, 0.8725645838740429, 2107752395), '0.10684877079046641') We are sending mail with results at report@fotonower.com args[1373990507] : ((1373990507, -7.103635914008244, 492609224), (1373990507, 1.3056571640284138, 2107752395), '0.10684877079046641') We are sending mail with results at report@fotonower.com args[1373990504] : ((1373990504, -7.383323722655824, 492609224), (1373990504, 1.055151568751446, 2107752395), '0.10684877079046641') We are sending mail with results at report@fotonower.com args[1373990502] : ((1373990502, -7.309206310343124, 492609224), (1373990502, 0.9211147547971765, 2107752395), '0.10684877079046641') We are sending mail with results at report@fotonower.com args[1373990500] : ((1373990500, -7.306687806362486, 492609224), (1373990500, 1.1826610307972103, 2107752395), '0.10684877079046641') We are sending mail with results at report@fotonower.com args[1373990499] : ((1373990499, -7.127740718804342, 492609224), (1373990499, 0.8101249853645889, 2107752395), '0.10684877079046641') We are sending mail with results at report@fotonower.com args[1373990498] : ((1373990498, -7.145653117532588, 492609224), (1373990498, 1.075254281827784, 2107752395), '0.10684877079046641') We are sending mail with results at report@fotonower.com args[1373990440] : ((1373990440, -7.297241908750404, 492609224), (1373990440, 1.3250874518477322, 2107752395), '0.10684877079046641') We are sending mail with results at report@fotonower.com refus_total : 0.10684877079046641 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=25398994 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398994_28-07-2025_13_07_08.pdf results_Auto_P25398994_28-07-2025_13_07_08.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398994_28-07-2025_13_07_08.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','25398994','results_Auto_P25398994_28-07-2025_13_07_08.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398994_28-07-2025_13_07_08.pdf','pdf','','0.6','0.10684877079046641') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/25398994

https://www.fotonower.com/image?json=false&list_photos_id=1373990931
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1373990874
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1373990507
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1373990504
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1373990502
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1373990500
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1373990499
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1373990498
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1373990440
Bravo, la photo est bien prise.

Dans ces conditions,le taux de refus est: 10.68%
Veuillez trouver les photos des contaminants.

exemples de contaminants: papier: https://www.fotonower.com/view/25403333?limit=200
exemples de contaminants: autre: https://www.fotonower.com/view/25403334?limit=200
exemples de contaminants: pet_clair: https://www.fotonower.com/view/25403337?limit=200
exemples de contaminants: carton: https://www.fotonower.com/view/25403341?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398994_28-07-2025_13_07_08.pdf.

Lien vers velours :https://www.fotonower.com/velours/25403332,25403333,25403334,25403335,25403336,25403337,25403338,25403339,25403340,25403341,25403343?tags=pehd,papier,autre,environnement,pet_fonce,pet_clair,flou,background,metal,carton,mal_croppe.


L'équipe Fotonower 202 b'' Server: nginx Date: Mon, 28 Jul 2025 11:07:15 GMT Content-Length: 0 Connection: close X-Message-Id: f_Ng19tmTpCZJBbhrPXYPg Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1373990931, 1373990874, 1373990507, 1373990504, 1373990502, 1373990500, 1373990499, 1373990498, 1373990440] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990931', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990874', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990507', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990504', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990502', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990500', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990499', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990498', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990440', None, None, None, None, None, '3370199') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 9 time used for this insertion : 0.018123388290405273 save_final save missing photos in datou_result : time spend for datou_step_exec : 7.4175896644592285 time spend to save output : 0.01830601692199707 total time spend for step 9 : 7.435895681381226 step10:split_time_score Mon Jul 28 13:07:15 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('12', 9),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 19072025 25398994 Nombre de photos uploadées : 9 / 23040 (0%) 19072025 25398994 Nombre de photos taguées (types de déchets): 0 / 9 (0%) 19072025 25398994 Nombre de photos taguées (volume) : 0 / 9 (0%) elapsed_time : load_data_split_time_score 1.9073486328125e-06 elapsed_time : order_list_meta_photo_and_scores 7.152557373046875e-06 ????????? elapsed_time : fill_and_build_computed_from_old_data 0.0007140636444091797 Catched exception ! Connect or reconnect ! Catched exception ! Connect or reconnect ! elapsed_time : insert_dashboard_record_day_entry 0.22262096405029297 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.03100127996184855 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398904_28-07-2025_12_32_34.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398904 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398904 AND mptpi.`type`=3726 To do Qualite : 0.01825568852062115 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398937_28-07-2025_12_13_45.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398937 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398937 AND mptpi.`type`=3726 To do Qualite : 0.02289449354025213 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398943_28-07-2025_12_02_50.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398943 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398943 AND mptpi.`type`=3726 To do Qualite : 0.10684877079046641 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398994_28-07-2025_13_07_08.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398994 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398994 AND mptpi.`type`=3594 To do Qualite : 0.03506406923321646 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25398999_28-07-2025_11_56_56.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25398999 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25398999 AND mptpi.`type`=3726 To do Qualite : 0.024266514647614878 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25399030_28-07-2025_11_36_59.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25399030 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25399030 AND mptpi.`type`=3726 To do Qualite : 0.10871546325973407 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25399044_28-07-2025_13_00_33.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25399044 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25399044 AND mptpi.`type`=3594 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'19072025': {'nb_upload': 9, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1373990931, 1373990874, 1373990507, 1373990504, 1373990502, 1373990500, 1373990499, 1373990498, 1373990440] Looping around the photos to save general results len do output : 1 /25398994Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990931', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990874', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990507', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990504', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990502', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990500', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990499', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990498', None, None, None, None, None, '3370199') ('3318', None, None, None, None, None, None, None, '3370199') ('3318', '25398994', '1373990440', None, None, None, None, None, '3370199') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 10 time used for this insertion : 0.01612234115600586 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.8118760585784912 time spend to save output : 0.016340255737304688 total time spend for step 10 : 0.8282163143157959 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 9 set_done_treatment 163.81user 167.95system 6:52.24elapsed 80%CPU (0avgtext+0avgdata 3648308maxresident)k 6124184inputs+126080outputs (191012major+16044046minor)pagefaults 0swaps