python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 2758669 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['3410768'] with mtr_portfolio_ids : ['25543232'] and first list_photo_ids : [] new path : /proc/2758669/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , WARNING: data may be incomplete, need to offset and complete ! BFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 40 ; length of list_pids : 40 ; length of list_args : 40 time to download the photos : 5.341239929199219 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Thu Jul 31 14:40:32 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 8759 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-07-31 14:40:35.531791: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-07-31 14:40:35.559415: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493035000 Hz 2025-07-31 14:40:35.561334: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7fca40000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-07-31 14:40:35.561381: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-07-31 14:40:35.565551: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-07-31 14:40:35.730260: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x2b4e7830 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-07-31 14:40:35.730294: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-07-31 14:40:35.731507: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-31 14:40:35.747704: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-31 14:40:35.751624: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-31 14:40:35.754379: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-31 14:40:35.754840: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-31 14:40:35.757835: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-31 14:40:35.758935: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-31 14:40:35.763577: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-31 14:40:35.764911: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-31 14:40:35.764967: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-31 14:40:35.765688: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-07-31 14:40:35.765703: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-07-31 14:40:35.765727: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-07-31 14:40:35.766923: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 8091 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-07-31 14:40:36.013147: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-31 14:40:36.013228: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-31 14:40:36.013248: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-31 14:40:36.013267: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-31 14:40:36.013329: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-31 14:40:36.013351: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-31 14:40:36.013369: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-31 14:40:36.013387: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-31 14:40:36.014832: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-31 14:40:36.016131: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-31 14:40:36.016172: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-31 14:40:36.016193: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-31 14:40:36.016213: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-31 14:40:36.016233: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-31 14:40:36.016252: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-31 14:40:36.016272: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-31 14:40:36.016291: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-31 14:40:36.017699: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-31 14:40:36.017730: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-07-31 14:40:36.017740: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-07-31 14:40:36.017750: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-07-31 14:40:36.019288: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 8091 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-07-31 14:40:45.646553: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-31 14:40:45.852346: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-31 14:40:47.442917: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:47.443670: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.60G (3865470464 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.264704: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.264800: W tensorflow/core/common_runtime/bfc_allocator.cc:311] Garbage collection: deallocate free memory regions (i.e., allocations) so that we can re-allocate a larger region to avoid OOM due to memory fragmentation. If you see this message frequently, you are running near the threshold of the available device memory and re-allocation may incur great performance overhead. You may try smaller batch sizes to observe the performance impact. Set TF_ENABLE_GPU_GARBAGE_COLLECTION=false if you'd like to disable this feature. 2025-07-31 14:40:48.341601: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.341739: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 3.29GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.342592: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.342623: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 3.29GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.350225: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.350307: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.78GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.351044: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.351066: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.78GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.380563: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.380689: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 19.91MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.380717: W tensorflow/core/kernels/gpu_utils.cc:49] Failed to allocate memory for convolution redzone checking; skipping this check. This is benign and only means that we won't check cudnn for out-of-bounds reads and writes. This message will only be printed once. 2025-07-31 14:40:48.382104: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.382152: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 16.00MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.383570: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.383613: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 16.00MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.391104: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.391180: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 63.85MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.392126: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.392161: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 63.85MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.393085: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.393115: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.26GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:40:48.394007: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.404553: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.405895: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.423960: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.424800: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.425537: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.426293: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.431185: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.431974: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.432699: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.433424: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.434724: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.445978: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.446702: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.457318: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.458594: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.459893: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.461166: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.462397: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:40:48.463634: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 40 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 22.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 21.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 16.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 20.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 22.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 18.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 22.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 24.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 23.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 18.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 21.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 24.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 18.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 20.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 24.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 21.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 8 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 24.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 15.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 19.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 8 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 33.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 Detection mask done ! Trying to reset tf kernel 2759328 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 2377 tf kernel not reseted sub process len(results) : 40 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 40 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 3570 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.00025844573974609375 nb_pixel_total : 4426 time to create 1 rle with old method : 0.0053217411041259766 length of segment : 130 time for calcul the mask position with numpy : 0.04645252227783203 nb_pixel_total : 734495 time to create 1 rle with new method : 0.07083725929260254 length of segment : 976 time for calcul the mask position with numpy : 0.00023651123046875 nb_pixel_total : 5353 time to create 1 rle with old method : 0.005890369415283203 length of segment : 131 time for calcul the mask position with numpy : 0.00035643577575683594 nb_pixel_total : 11508 time to create 1 rle with old method : 0.012637138366699219 length of segment : 147 time for calcul the mask position with numpy : 0.00046133995056152344 nb_pixel_total : 19610 time to create 1 rle with old method : 0.02107405662536621 length of segment : 148 time for calcul the mask position with numpy : 0.00039839744567871094 nb_pixel_total : 10543 time to create 1 rle with old method : 0.011825084686279297 length of segment : 129 time for calcul the mask position with numpy : 0.0002372264862060547 nb_pixel_total : 11546 time to create 1 rle with old method : 0.012546300888061523 length of segment : 166 time for calcul the mask position with numpy : 0.016237258911132812 nb_pixel_total : 755177 time to create 1 rle with new method : 0.06872916221618652 length of segment : 986 time for calcul the mask position with numpy : 0.019501447677612305 nb_pixel_total : 806539 time to create 1 rle with new method : 0.12773966789245605 length of segment : 1669 time for calcul the mask position with numpy : 0.0004112720489501953 nb_pixel_total : 13855 time to create 1 rle with old method : 0.015231609344482422 length of segment : 173 time for calcul the mask position with numpy : 0.0005376338958740234 nb_pixel_total : 39137 time to create 1 rle with old method : 0.04499983787536621 length of segment : 239 time for calcul the mask position with numpy : 0.01309823989868164 nb_pixel_total : 717355 time to create 1 rle with new method : 0.22050833702087402 length of segment : 964 time for calcul the mask position with numpy : 0.21017193794250488 nb_pixel_total : 714426 time to create 1 rle with new method : 0.04420804977416992 length of segment : 968 time for calcul the mask position with numpy : 0.11981892585754395 nb_pixel_total : 759078 time to create 1 rle with new method : 0.05875444412231445 length of segment : 993 time for calcul the mask position with numpy : 0.00020265579223632812 nb_pixel_total : 6977 time to create 1 rle with old method : 0.008000850677490234 length of segment : 129 time for calcul the mask position with numpy : 8.7738037109375e-05 nb_pixel_total : 3378 time to create 1 rle with old method : 0.004033565521240234 length of segment : 81 time for calcul the mask position with numpy : 0.0002918243408203125 nb_pixel_total : 15521 time to create 1 rle with old method : 0.01849055290222168 length of segment : 151 time for calcul the mask position with numpy : 0.0006289482116699219 nb_pixel_total : 19438 time to create 1 rle with old method : 0.02240920066833496 length of segment : 367 time for calcul the mask position with numpy : 0.0007755756378173828 nb_pixel_total : 23336 time to create 1 rle with old method : 0.02718067169189453 length of segment : 249 time for calcul the mask position with numpy : 0.0002560615539550781 nb_pixel_total : 6855 time to create 1 rle with old method : 0.008359193801879883 length of segment : 89 time for calcul the mask position with numpy : 0.0006060600280761719 nb_pixel_total : 16196 time to create 1 rle with old method : 0.019212722778320312 length of segment : 281 time for calcul the mask position with numpy : 0.0005342960357666016 nb_pixel_total : 24691 time to create 1 rle with old method : 0.028708219528198242 length of segment : 175 time for calcul the mask position with numpy : 0.00015425682067871094 nb_pixel_total : 3345 time to create 1 rle with old method : 0.003922224044799805 length of segment : 60 time for calcul the mask position with numpy : 0.012992143630981445 nb_pixel_total : 744206 time to create 1 rle with new method : 0.05279064178466797 length of segment : 987 time for calcul the mask position with numpy : 0.00014066696166992188 nb_pixel_total : 2673 time to create 1 rle with old method : 0.0032739639282226562 length of segment : 48 time for calcul the mask position with numpy : 0.014888763427734375 nb_pixel_total : 723293 time to create 1 rle with new method : 0.05902981758117676 length of segment : 933 time for calcul the mask position with numpy : 0.00025463104248046875 nb_pixel_total : 6419 time to create 1 rle with old method : 0.0074117183685302734 length of segment : 97 time for calcul the mask position with numpy : 0.0001876354217529297 nb_pixel_total : 3772 time to create 1 rle with old method : 0.004662990570068359 length of segment : 82 time for calcul the mask position with numpy : 0.002282381057739258 nb_pixel_total : 109969 time to create 1 rle with old method : 0.12438154220581055 length of segment : 595 time for calcul the mask position with numpy : 0.07810091972351074 nb_pixel_total : 1542869 time to create 1 rle with new method : 0.0896298885345459 length of segment : 1150 time for calcul the mask position with numpy : 0.01226186752319336 nb_pixel_total : 745546 time to create 1 rle with new method : 0.18354010581970215 length of segment : 970 time for calcul the mask position with numpy : 0.0002486705780029297 nb_pixel_total : 16629 time to create 1 rle with old method : 0.019868135452270508 length of segment : 153 time for calcul the mask position with numpy : 0.00022602081298828125 nb_pixel_total : 5987 time to create 1 rle with old method : 0.0069086551666259766 length of segment : 65 time for calcul the mask position with numpy : 0.010563373565673828 nb_pixel_total : 631229 time to create 1 rle with new method : 0.017890214920043945 length of segment : 1163 time for calcul the mask position with numpy : 0.00018072128295898438 nb_pixel_total : 7616 time to create 1 rle with old method : 0.008445978164672852 length of segment : 107 time for calcul the mask position with numpy : 0.00035262107849121094 nb_pixel_total : 11766 time to create 1 rle with old method : 0.013271093368530273 length of segment : 144 time for calcul the mask position with numpy : 0.0005664825439453125 nb_pixel_total : 13091 time to create 1 rle with old method : 0.014842510223388672 length of segment : 178 time for calcul the mask position with numpy : 0.0006353855133056641 nb_pixel_total : 29417 time to create 1 rle with old method : 0.03362298011779785 length of segment : 298 time for calcul the mask position with numpy : 0.0003609657287597656 nb_pixel_total : 4109 time to create 1 rle with old method : 0.0055353641510009766 length of segment : 116 time for calcul the mask position with numpy : 0.00034356117248535156 nb_pixel_total : 12198 time to create 1 rle with old method : 0.014364004135131836 length of segment : 118 time for calcul the mask position with numpy : 0.013950347900390625 nb_pixel_total : 739790 time to create 1 rle with new method : 0.019740819931030273 length of segment : 993 time for calcul the mask position with numpy : 0.0019545555114746094 nb_pixel_total : 105648 time to create 1 rle with old method : 0.11490488052368164 length of segment : 576 time for calcul the mask position with numpy : 0.00027489662170410156 nb_pixel_total : 7549 time to create 1 rle with old method : 0.008829116821289062 length of segment : 111 time for calcul the mask position with numpy : 0.00027441978454589844 nb_pixel_total : 10983 time to create 1 rle with old method : 0.0126800537109375 length of segment : 84 time for calcul the mask position with numpy : 0.0023462772369384766 nb_pixel_total : 114271 time to create 1 rle with old method : 0.12528419494628906 length of segment : 567 time for calcul the mask position with numpy : 0.00013256072998046875 nb_pixel_total : 1850 time to create 1 rle with old method : 0.002286672592163086 length of segment : 47 time for calcul the mask position with numpy : 0.01100301742553711 nb_pixel_total : 754888 time to create 1 rle with new method : 0.01679539680480957 length of segment : 964 time for calcul the mask position with numpy : 0.0014984607696533203 nb_pixel_total : 117170 time to create 1 rle with old method : 0.1258869171142578 length of segment : 534 time for calcul the mask position with numpy : 0.007918357849121094 nb_pixel_total : 463646 time to create 1 rle with new method : 0.016319751739501953 length of segment : 1258 time for calcul the mask position with numpy : 0.0025529861450195312 nb_pixel_total : 97151 time to create 1 rle with old method : 0.1069498062133789 length of segment : 808 time for calcul the mask position with numpy : 0.0005271434783935547 nb_pixel_total : 19224 time to create 1 rle with old method : 0.021910667419433594 length of segment : 114 time for calcul the mask position with numpy : 0.014652013778686523 nb_pixel_total : 751858 time to create 1 rle with new method : 0.022296428680419922 length of segment : 987 time for calcul the mask position with numpy : 0.0009968280792236328 nb_pixel_total : 19591 time to create 1 rle with old method : 0.02343606948852539 length of segment : 218 time for calcul the mask position with numpy : 0.0001773834228515625 nb_pixel_total : 3026 time to create 1 rle with old method : 0.003715038299560547 length of segment : 61 time for calcul the mask position with numpy : 0.00043082237243652344 nb_pixel_total : 9566 time to create 1 rle with old method : 0.011118173599243164 length of segment : 174 time for calcul the mask position with numpy : 0.008107900619506836 nb_pixel_total : 319257 time to create 1 rle with new method : 0.011114835739135742 length of segment : 421 time for calcul the mask position with numpy : 0.0002582073211669922 nb_pixel_total : 5250 time to create 1 rle with old method : 0.00629878044128418 length of segment : 87 time for calcul the mask position with numpy : 0.0006783008575439453 nb_pixel_total : 30398 time to create 1 rle with old method : 0.03490042686462402 length of segment : 186 time for calcul the mask position with numpy : 0.0004329681396484375 nb_pixel_total : 10057 time to create 1 rle with old method : 0.011955022811889648 length of segment : 115 time for calcul the mask position with numpy : 0.0005674362182617188 nb_pixel_total : 7638 time to create 1 rle with old method : 0.009230613708496094 length of segment : 187 time for calcul the mask position with numpy : 0.017977237701416016 nb_pixel_total : 743915 time to create 1 rle with new method : 0.024395227432250977 length of segment : 1002 time for calcul the mask position with numpy : 0.00023174285888671875 nb_pixel_total : 8037 time to create 1 rle with old method : 0.009232044219970703 length of segment : 119 time for calcul the mask position with numpy : 0.00033783912658691406 nb_pixel_total : 10161 time to create 1 rle with old method : 0.01175689697265625 length of segment : 123 time for calcul the mask position with numpy : 0.0022318363189697266 nb_pixel_total : 109359 time to create 1 rle with old method : 0.11927437782287598 length of segment : 539 time for calcul the mask position with numpy : 0.000225067138671875 nb_pixel_total : 7269 time to create 1 rle with old method : 0.008569002151489258 length of segment : 95 time spent for convertir_results : 7.789315223693848 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 99 chid ids of type : 3594 +++++++++++++++++Number RLEs to save : 26975 save missing photos in datou_result : time spend for datou_step_exec : 46.846137285232544 time spend to save output : 1.4591944217681885 total time spend for step 1 : 48.30533170700073 step2:crop_condition Thu Jul 31 14:41:20 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 40 ! batch 1 Loaded 99 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 12 About to insert : list_path_to_insert length 12 new photo from crops ! About to upload 12 photos upload in portfolio : 3736932 init cache_photo without model_param we have 12 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753965682_2758669 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 12 photos in the portfolio 3736932 time of upload the photos Elapsed time : 2.9480817317962646 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 6 About to insert : list_path_to_insert length 6 new photo from crops ! About to upload 6 photos upload in portfolio : 3736932 init cache_photo without model_param we have 6 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753965686_2758669 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 6 photos in the portfolio 3736932 time of upload the photos Elapsed time : 2.0131630897521973 we have finished the crop for the class : carton begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 60 About to insert : list_path_to_insert length 60 new photo from crops ! About to upload 60 photos upload in portfolio : 3736932 init cache_photo without model_param we have 60 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753965743_2758669 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! we have uploaded 60 photos in the portfolio 3736932 time of upload the photos Elapsed time : 14.476406574249268 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 3 About to insert : list_path_to_insert length 3 new photo from crops ! About to upload 3 photos upload in portfolio : 3736932 init cache_photo without model_param we have 3 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753965758_2758669 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first Unexecpected behavior in 07/2025 that can be generalized l287 : type_extension .jpg This is a hack ! we have uploaded 3 photos in the portfolio 3736932 time of upload the photos Elapsed time : 2.289893388748169 we have finished the crop for the class : autre begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1374583980, 1374583967, 1374583965, 1374583877, 1374583874, 1374583774, 1374583750, 1374583724, 1374583698, 1374583671, 1374583645, 1374583005, 1374582695, 1374582671, 1374582646, 1374582554, 1374582524, 1374582520, 1374582517, 1374582515, 1374582514, 1374582506, 1374582438, 1374582435, 1374582428, 1374582403, 1374582391, 1374582384, 1374568368, 1374568367, 1374568366, 1374568364, 1374568361, 1374568357, 1374568345, 1374568334, 1374568326, 1374568324, 1374568320, 1374568317] Looping around the photos to save general results len do output : 81 /1374601915Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601916Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601917Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601918Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601919Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601920Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601921Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601922Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601923Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601924Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601925Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601926Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601929Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601930Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601931Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601932Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601933Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374601934Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602172Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602174Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602175Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602176Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602179Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602180Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602181Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602183Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602185Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602186Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602187Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602189Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602190Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602191Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602193Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602194Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602195Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602196Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602198Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602199Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602200Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602202Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602203Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602204Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602205Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602207Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602208Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602209Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602211Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602212Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602213Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602214Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602216Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602217Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602218Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602220Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602221Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602222Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602223Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602225Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602226Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602227Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602229Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602230Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602231Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602233Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602234Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602235Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602236Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602238Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602239Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602240Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602242Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602243Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602244Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602245Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602247Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602248Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602249Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602251Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602275Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602277Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374602278Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583980', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583967', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583965', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583877', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583874', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583774', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583750', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583724', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583698', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583645', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583005', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582695', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582646', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582554', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582524', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582520', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582517', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582515', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582514', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582506', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582438', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582435', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582428', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582403', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582391', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582384', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568368', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568367', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568366', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568364', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568361', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568357', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568345', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568334', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568326', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568324', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568320', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568317', None, None, None, None, None, '3410768') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 283 time used for this insertion : 0.027761459350585938 save_final save missing photos in datou_result : time spend for datou_step_exec : 79.90522265434265 time spend to save output : 0.03056645393371582 total time spend for step 2 : 79.93578910827637 step3:rle_unique_nms_with_priority Thu Jul 31 14:42:40 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 99 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ No data in photo_id : 1374583980 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.2004861831665039 time for calcul the mask position with numpy : 0.06071162223815918 nb_pixel_total : 1334679 time to create 1 rle with new method : 0.26761651039123535 time for calcul the mask position with numpy : 0.013943672180175781 nb_pixel_total : 734495 time to create 1 rle with new method : 0.2513594627380371 time for calcul the mask position with numpy : 0.0068852901458740234 nb_pixel_total : 4426 time to create 1 rle with old method : 0.005115985870361328 create new chi : 0.6224794387817383 time to delete rle : 0.026318788528442383 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 3292 TO DO : save crop sub photo not yet done ! save time : 0.2169485092163086 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.07547259330749512 time for calcul the mask position with numpy : 0.028812885284423828 nb_pixel_total : 2068247 time to create 1 rle with new method : 0.1498556137084961 time for calcul the mask position with numpy : 0.006356716156005859 nb_pixel_total : 5353 time to create 1 rle with old method : 0.006186723709106445 create new chi : 0.19885897636413574 time to delete rle : 0.00022649765014648438 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1342 TO DO : save crop sub photo not yet done ! save time : 0.11842727661132812 No data in photo_id : 1374583877 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.046851396560668945 time for calcul the mask position with numpy : 0.03436088562011719 nb_pixel_total : 2062092 time to create 1 rle with new method : 0.10908961296081543 time for calcul the mask position with numpy : 0.00652003288269043 nb_pixel_total : 11508 time to create 1 rle with old method : 0.01322317123413086 create new chi : 0.1745905876159668 time to delete rle : 0.0003733634948730469 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1374 TO DO : save crop sub photo not yet done ! save time : 0.11153721809387207 No data in photo_id : 1374583774 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 1.15511155128479 time for calcul the mask position with numpy : 0.13945317268371582 nb_pixel_total : 1276724 time to create 1 rle with new method : 0.11627960205078125 time for calcul the mask position with numpy : 0.01247096061706543 nb_pixel_total : 755177 time to create 1 rle with new method : 0.25035977363586426 time for calcul the mask position with numpy : 0.007526397705078125 nb_pixel_total : 11546 time to create 1 rle with old method : 0.013158559799194336 time for calcul the mask position with numpy : 0.007111549377441406 nb_pixel_total : 10543 time to create 1 rle with old method : 0.011945724487304688 time for calcul the mask position with numpy : 0.006891727447509766 nb_pixel_total : 19610 time to create 1 rle with old method : 0.0221860408782959 create new chi : 0.6044187545776367 time to delete rle : 0.0006673336029052734 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 3938 TO DO : save crop sub photo not yet done ! save time : 0.2372446060180664 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.04050111770629883 time for calcul the mask position with numpy : 0.015542030334472656 nb_pixel_total : 1267061 time to create 1 rle with new method : 0.15185189247131348 time for calcul the mask position with numpy : 0.012434720993041992 nb_pixel_total : 806539 time to create 1 rle with new method : 0.030936241149902344 create new chi : 0.2168416976928711 time to delete rle : 0.0005981922149658203 batch 1 Loaded 3 chid ids of type : 3594 ++Number RLEs to save : 4418 TO DO : save crop sub photo not yet done ! save time : 0.27146148681640625 No data in photo_id : 1374583698 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.05140948295593262 time for calcul the mask position with numpy : 0.0250852108001709 nb_pixel_total : 2020608 time to create 1 rle with new method : 0.24631500244140625 time for calcul the mask position with numpy : 0.010748147964477539 nb_pixel_total : 39137 time to create 1 rle with old method : 0.04417252540588379 time for calcul the mask position with numpy : 0.006485939025878906 nb_pixel_total : 13855 time to create 1 rle with old method : 0.0157470703125 create new chi : 0.3593018054962158 time to delete rle : 0.0004978179931640625 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1904 TO DO : save crop sub photo not yet done ! save time : 0.14646482467651367 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.08305716514587402 time for calcul the mask position with numpy : 0.021886348724365234 nb_pixel_total : 1356245 time to create 1 rle with new method : 0.23696136474609375 time for calcul the mask position with numpy : 0.014278650283813477 nb_pixel_total : 717355 time to create 1 rle with new method : 0.11662578582763672 create new chi : 0.40488457679748535 time to delete rle : 0.0003676414489746094 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 3008 TO DO : save crop sub photo not yet done ! save time : 0.22653794288635254 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.035169363021850586 time for calcul the mask position with numpy : 0.12464022636413574 nb_pixel_total : 1359174 time to create 1 rle with new method : 0.1270732879638672 time for calcul the mask position with numpy : 0.011442899703979492 nb_pixel_total : 714426 time to create 1 rle with new method : 0.24736881256103516 create new chi : 0.5271036624908447 time to delete rle : 0.00054168701171875 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 3016 TO DO : save crop sub photo not yet done ! save time : 0.2078845500946045 nb_obj : 5 nb_hashtags : 3 time to prepare the origin masks : 0.44806575775146484 time for calcul the mask position with numpy : 0.1283881664276123 nb_pixel_total : 1269208 time to create 1 rle with new method : 0.10603833198547363 time for calcul the mask position with numpy : 0.010419607162475586 nb_pixel_total : 19438 time to create 1 rle with old method : 0.022054672241210938 time for calcul the mask position with numpy : 0.009973526000976562 nb_pixel_total : 15521 time to create 1 rle with old method : 0.01758122444152832 time for calcul the mask position with numpy : 0.00678253173828125 nb_pixel_total : 3378 time to create 1 rle with old method : 0.0038988590240478516 time for calcul the mask position with numpy : 0.006746768951416016 nb_pixel_total : 6977 time to create 1 rle with old method : 0.008884429931640625 time for calcul the mask position with numpy : 0.014353513717651367 nb_pixel_total : 759078 time to create 1 rle with new method : 0.17401814460754395 create new chi : 0.5321564674377441 time to delete rle : 0.0025167465209960938 batch 1 Loaded 11 chid ids of type : 3594 +++++Number RLEs to save : 4522 TO DO : save crop sub photo not yet done ! save time : 0.30358338356018066 nb_obj : 3 nb_hashtags : 1 time to prepare the origin masks : 0.06673383712768555 time for calcul the mask position with numpy : 0.024728059768676758 nb_pixel_total : 2027213 time to create 1 rle with new method : 0.12096333503723145 time for calcul the mask position with numpy : 0.00656890869140625 nb_pixel_total : 16196 time to create 1 rle with old method : 0.018218040466308594 time for calcul the mask position with numpy : 0.006232023239135742 nb_pixel_total : 6855 time to create 1 rle with old method : 0.007851600646972656 time for calcul the mask position with numpy : 0.006722688674926758 nb_pixel_total : 23336 time to create 1 rle with old method : 0.026720523834228516 create new chi : 0.22363519668579102 time to delete rle : 0.0005512237548828125 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 2318 TO DO : save crop sub photo not yet done ! save time : 0.1686720848083496 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03750753402709961 time for calcul the mask position with numpy : 0.022366046905517578 nb_pixel_total : 2048909 time to create 1 rle with new method : 0.06322312355041504 time for calcul the mask position with numpy : 0.006694793701171875 nb_pixel_total : 24691 time to create 1 rle with old method : 0.03158879280090332 create new chi : 0.12958407402038574 time to delete rle : 0.000339508056640625 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1430 TO DO : save crop sub photo not yet done ! save time : 0.11163020133972168 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.42974066734313965 time for calcul the mask position with numpy : 0.17940783500671387 nb_pixel_total : 1326049 time to create 1 rle with new method : 0.18499350547790527 time for calcul the mask position with numpy : 0.01956462860107422 nb_pixel_total : 744206 time to create 1 rle with new method : 0.3240985870361328 time for calcul the mask position with numpy : 0.010035991668701172 nb_pixel_total : 3345 time to create 1 rle with old method : 0.004823207855224609 create new chi : 0.7383756637573242 time to delete rle : 0.0011217594146728516 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 3174 TO DO : save crop sub photo not yet done ! save time : 0.22176551818847656 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.04817461967468262 time for calcul the mask position with numpy : 0.15131735801696777 nb_pixel_total : 2070927 time to create 1 rle with new method : 0.13341355323791504 time for calcul the mask position with numpy : 0.006589412689208984 nb_pixel_total : 2673 time to create 1 rle with old method : 0.0030832290649414062 create new chi : 0.3065488338470459 time to delete rle : 0.0002453327178955078 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1176 TO DO : save crop sub photo not yet done ! save time : 0.10187578201293945 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03987431526184082 time for calcul the mask position with numpy : 0.06735801696777344 nb_pixel_total : 1350307 time to create 1 rle with new method : 0.11815500259399414 time for calcul the mask position with numpy : 0.01163339614868164 nb_pixel_total : 723293 time to create 1 rle with new method : 0.42922091484069824 create new chi : 0.6362075805664062 time to delete rle : 0.0003025531768798828 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2946 TO DO : save crop sub photo not yet done ! save time : 0.19843792915344238 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.9327197074890137 time for calcul the mask position with numpy : 0.13769984245300293 nb_pixel_total : 2063409 time to create 1 rle with new method : 0.3089640140533447 time for calcul the mask position with numpy : 0.010757923126220703 nb_pixel_total : 3772 time to create 1 rle with old method : 0.004235267639160156 time for calcul the mask position with numpy : 0.010596036911010742 nb_pixel_total : 6419 time to create 1 rle with old method : 0.007238149642944336 create new chi : 0.48949146270751953 time to delete rle : 0.00037741661071777344 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1438 TO DO : save crop sub photo not yet done ! save time : 0.12604284286499023 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.06766104698181152 time for calcul the mask position with numpy : 0.008911371231079102 nb_pixel_total : 424145 time to create 1 rle with new method : 0.2767326831817627 time for calcul the mask position with numpy : 0.027123212814331055 nb_pixel_total : 1539486 time to create 1 rle with new method : 0.1360948085784912 time for calcul the mask position with numpy : 0.007132530212402344 nb_pixel_total : 109969 time to create 1 rle with old method : 0.11919951438903809 create new chi : 0.5920026302337646 time to delete rle : 0.0006666183471679688 batch 1 Loaded 5 chid ids of type : 3594 +++Number RLEs to save : 4357 TO DO : save crop sub photo not yet done ! save time : 0.2897980213165283 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.05574393272399902 time for calcul the mask position with numpy : 0.05777597427368164 nb_pixel_total : 1311425 time to create 1 rle with new method : 0.2666480541229248 time for calcul the mask position with numpy : 0.0064487457275390625 nb_pixel_total : 16629 time to create 1 rle with old method : 0.01942133903503418 time for calcul the mask position with numpy : 0.014488935470581055 nb_pixel_total : 745546 time to create 1 rle with new method : 0.09998583793640137 create new chi : 0.4742872714996338 time to delete rle : 0.00040149688720703125 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 3326 TO DO : save crop sub photo not yet done ! save time : 0.2370624542236328 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.16968417167663574 time for calcul the mask position with numpy : 0.09462404251098633 nb_pixel_total : 1428768 time to create 1 rle with new method : 0.14901995658874512 time for calcul the mask position with numpy : 0.006922006607055664 nb_pixel_total : 7616 time to create 1 rle with old method : 0.008328914642333984 time for calcul the mask position with numpy : 0.010843753814697266 nb_pixel_total : 631229 time to create 1 rle with new method : 0.1401348114013672 time for calcul the mask position with numpy : 0.006167411804199219 nb_pixel_total : 5987 time to create 1 rle with old method : 0.0067234039306640625 create new chi : 0.43366289138793945 time to delete rle : 0.000843048095703125 batch 1 Loaded 7 chid ids of type : 3594 ++++Number RLEs to save : 3750 TO DO : save crop sub photo not yet done ! save time : 0.2435905933380127 nb_obj : 3 nb_hashtags : 1 time to prepare the origin masks : 0.05453324317932129 time for calcul the mask position with numpy : 0.01888751983642578 nb_pixel_total : 2019493 time to create 1 rle with new method : 0.14462733268737793 time for calcul the mask position with numpy : 0.006670475006103516 nb_pixel_total : 29250 time to create 1 rle with old method : 0.03202080726623535 time for calcul the mask position with numpy : 0.006088972091674805 nb_pixel_total : 13091 time to create 1 rle with old method : 0.01426243782043457 time for calcul the mask position with numpy : 0.0061397552490234375 nb_pixel_total : 11766 time to create 1 rle with old method : 0.013056755065917969 create new chi : 0.25069236755371094 time to delete rle : 0.0003638267517089844 batch 1 Loaded 7 chid ids of type : 3594 +++++++Number RLEs to save : 2279 TO DO : save crop sub photo not yet done ! save time : 0.15116357803344727 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.05449986457824707 time for calcul the mask position with numpy : 0.07176780700683594 nb_pixel_total : 1317503 time to create 1 rle with new method : 0.10761427879333496 time for calcul the mask position with numpy : 0.011050224304199219 nb_pixel_total : 739790 time to create 1 rle with new method : 0.29018425941467285 time for calcul the mask position with numpy : 0.0061147212982177734 nb_pixel_total : 12198 time to create 1 rle with old method : 0.01334381103515625 time for calcul the mask position with numpy : 0.005998373031616211 nb_pixel_total : 4109 time to create 1 rle with old method : 0.0045850276947021484 create new chi : 0.5211117267608643 time to delete rle : 0.0006318092346191406 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 3534 TO DO : save crop sub photo not yet done ! save time : 0.2413787841796875 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.0425567626953125 time for calcul the mask position with numpy : 0.021817922592163086 nb_pixel_total : 1967952 time to create 1 rle with new method : 0.12932991981506348 time for calcul the mask position with numpy : 0.006846189498901367 nb_pixel_total : 105648 time to create 1 rle with old method : 0.11745190620422363 create new chi : 0.28310465812683105 time to delete rle : 0.00034332275390625 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2232 TO DO : save crop sub photo not yet done ! save time : 0.15906620025634766 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.0337224006652832 time for calcul the mask position with numpy : 0.019913673400878906 nb_pixel_total : 2066051 time to create 1 rle with new method : 0.18001794815063477 time for calcul the mask position with numpy : 0.006409168243408203 nb_pixel_total : 7549 time to create 1 rle with old method : 0.00851750373840332 create new chi : 0.22498750686645508 time to delete rle : 0.00026154518127441406 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1302 TO DO : save crop sub photo not yet done ! save time : 0.1169593334197998 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.0365142822265625 time for calcul the mask position with numpy : 0.12444567680358887 nb_pixel_total : 2062617 time to create 1 rle with new method : 0.11644196510314941 time for calcul the mask position with numpy : 0.006248950958251953 nb_pixel_total : 10983 time to create 1 rle with old method : 0.012632608413696289 create new chi : 0.27081966400146484 time to delete rle : 0.0002498626708984375 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1248 TO DO : save crop sub photo not yet done ! save time : 0.11252808570861816 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03390765190124512 time for calcul the mask position with numpy : 0.13176202774047852 nb_pixel_total : 1959329 time to create 1 rle with new method : 0.14330816268920898 time for calcul the mask position with numpy : 0.011887311935424805 nb_pixel_total : 114271 time to create 1 rle with old method : 0.1283245086669922 create new chi : 0.42641162872314453 time to delete rle : 0.000377655029296875 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2214 TO DO : save crop sub photo not yet done ! save time : 0.1654512882232666 nb_obj : 4 nb_hashtags : 1 time to prepare the origin masks : 0.5337193012237549 time for calcul the mask position with numpy : 0.07743430137634277 nb_pixel_total : 1199673 time to create 1 rle with new method : 0.1138615608215332 time for calcul the mask position with numpy : 0.007663249969482422 nb_pixel_total : 117170 time to create 1 rle with old method : 0.13794565200805664 time for calcul the mask position with numpy : 0.007468700408935547 nb_pixel_total : 19 time to create 1 rle with old method : 8.893013000488281e-05 time for calcul the mask position with numpy : 0.016171693801879883 nb_pixel_total : 754888 time to create 1 rle with new method : 0.26865315437316895 time for calcul the mask position with numpy : 0.006737709045410156 nb_pixel_total : 1850 time to create 1 rle with old method : 0.0021796226501464844 create new chi : 0.6487407684326172 time to delete rle : 0.000820159912109375 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 4190 TO DO : save crop sub photo not yet done ! save time : 0.27851176261901855 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.11618304252624512 time for calcul the mask position with numpy : 0.05712747573852539 nb_pixel_total : 1363629 time to create 1 rle with new method : 0.3263521194458008 time for calcul the mask position with numpy : 0.008707523345947266 nb_pixel_total : 246325 time to create 1 rle with new method : 0.2973923683166504 time for calcul the mask position with numpy : 0.013440608978271484 nb_pixel_total : 463646 time to create 1 rle with new method : 0.03017592430114746 create new chi : 0.7418391704559326 time to delete rle : 0.0007507801055908203 batch 1 Loaded 6 chid ids of type : 3594 +++++Number RLEs to save : 5284 TO DO : save crop sub photo not yet done ! save time : 0.3063240051269531 nb_obj : 3 nb_hashtags : 1 time to prepare the origin masks : 0.17845916748046875 time for calcul the mask position with numpy : 0.018554210662841797 nb_pixel_total : 1924241 time to create 1 rle with new method : 0.22300386428833008 time for calcul the mask position with numpy : 0.006350517272949219 nb_pixel_total : 19224 time to create 1 rle with old method : 0.02146625518798828 time for calcul the mask position with numpy : 0.0061893463134765625 nb_pixel_total : 32984 time to create 1 rle with old method : 0.03729820251464844 time for calcul the mask position with numpy : 0.00672459602355957 nb_pixel_total : 97151 time to create 1 rle with old method : 0.10750722885131836 create new chi : 0.4372291564941406 time to delete rle : 0.0008149147033691406 batch 1 Loaded 8 chid ids of type : 3594 +++++Number RLEs to save : 3575 TO DO : save crop sub photo not yet done ! save time : 0.22273635864257812 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.1553332805633545 time for calcul the mask position with numpy : 0.015825271606445312 nb_pixel_total : 1312251 time to create 1 rle with new method : 0.19634270668029785 time for calcul the mask position with numpy : 0.006659269332885742 nb_pixel_total : 9491 time to create 1 rle with old method : 0.012017488479614258 time for calcul the mask position with numpy : 0.015671491622924805 nb_pixel_total : 751858 time to create 1 rle with new method : 0.09466814994812012 create new chi : 0.34976625442504883 time to delete rle : 0.0007004737854003906 batch 1 Loaded 6 chid ids of type : 3594 ++Number RLEs to save : 4220 TO DO : save crop sub photo not yet done ! save time : 0.24692273139953613 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.09319067001342773 time for calcul the mask position with numpy : 0.11285805702209473 nb_pixel_total : 2054009 time to create 1 rle with new method : 0.265688419342041 time for calcul the mask position with numpy : 0.006752729415893555 nb_pixel_total : 19591 time to create 1 rle with old method : 0.023284435272216797 create new chi : 0.41788578033447266 time to delete rle : 0.00047135353088378906 batch 1 Loaded 4 chid ids of type : 3594 +Number RLEs to save : 1516 TO DO : save crop sub photo not yet done ! save time : 0.13048624992370605 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.43883204460144043 time for calcul the mask position with numpy : 0.07858943939208984 nb_pixel_total : 1743444 time to create 1 rle with new method : 0.5059113502502441 time for calcul the mask position with numpy : 0.00733184814453125 nb_pixel_total : 1099 time to create 1 rle with old method : 0.0022253990173339844 time for calcul the mask position with numpy : 0.009542226791381836 nb_pixel_total : 316465 time to create 1 rle with new method : 0.0966958999633789 time for calcul the mask position with numpy : 0.006291389465332031 nb_pixel_total : 9566 time to create 1 rle with old method : 0.011218070983886719 time for calcul the mask position with numpy : 0.006703853607177734 nb_pixel_total : 3026 time to create 1 rle with old method : 0.0035178661346435547 create new chi : 0.7365484237670898 time to delete rle : 0.0005450248718261719 batch 1 Loaded 11 chid ids of type : 3594 ++++Number RLEs to save : 2778 TO DO : save crop sub photo not yet done ! save time : 0.19148755073547363 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.10042667388916016 time for calcul the mask position with numpy : 0.021371841430664062 nb_pixel_total : 2037952 time to create 1 rle with new method : 0.028733253479003906 time for calcul the mask position with numpy : 0.006207704544067383 nb_pixel_total : 30398 time to create 1 rle with old method : 0.033745527267456055 time for calcul the mask position with numpy : 0.006102800369262695 nb_pixel_total : 5250 time to create 1 rle with old method : 0.0058786869049072266 create new chi : 0.1023247241973877 time to delete rle : 0.0002932548522949219 batch 1 Loaded 6 chid ids of type : 3594 ++Number RLEs to save : 1626 TO DO : save crop sub photo not yet done ! save time : 0.13449382781982422 nb_obj : 7 nb_hashtags : 3 time to prepare the origin masks : 0.6226060390472412 time for calcul the mask position with numpy : 0.05765104293823242 nb_pixel_total : 1291747 time to create 1 rle with new method : 0.2403249740600586 time for calcul the mask position with numpy : 0.006903409957885742 nb_pixel_total : 11910 time to create 1 rle with old method : 0.01989150047302246 time for calcul the mask position with numpy : 0.006991147994995117 nb_pixel_total : 8037 time to create 1 rle with old method : 0.009192943572998047 time for calcul the mask position with numpy : 0.011792421340942383 nb_pixel_total : 743915 time to create 1 rle with new method : 0.18747735023498535 time for calcul the mask position with numpy : 0.006473541259765625 nb_pixel_total : 147 time to create 1 rle with old method : 0.00027060508728027344 time for calcul the mask position with numpy : 0.006195545196533203 nb_pixel_total : 149 time to create 1 rle with old method : 0.0002758502960205078 time for calcul the mask position with numpy : 0.0062847137451171875 nb_pixel_total : 7638 time to create 1 rle with old method : 0.008756637573242188 time for calcul the mask position with numpy : 0.006526470184326172 nb_pixel_total : 10057 time to create 1 rle with old method : 0.01134181022644043 create new chi : 0.5990116596221924 time to delete rle : 0.0007784366607666016 batch 1 Loaded 16 chid ids of type : 3594 +++++++Number RLEs to save : 4354 TO DO : save crop sub photo not yet done ! save time : 0.27983665466308594 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.055349111557006836 time for calcul the mask position with numpy : 0.02024555206298828 nb_pixel_total : 2063439 time to create 1 rle with new method : 0.02860283851623535 time for calcul the mask position with numpy : 0.006221294403076172 nb_pixel_total : 10161 time to create 1 rle with old method : 0.011525630950927734 create new chi : 0.06682944297790527 time to delete rle : 0.00024771690368652344 batch 1 Loaded 4 chid ids of type : 3594 +Number RLEs to save : 1326 TO DO : save crop sub photo not yet done ! save time : 0.10198354721069336 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.20406126976013184 time for calcul the mask position with numpy : 0.18138575553894043 nb_pixel_total : 1956972 time to create 1 rle with new method : 0.14803004264831543 time for calcul the mask position with numpy : 0.0066411495208740234 nb_pixel_total : 7269 time to create 1 rle with old method : 0.008278608322143555 time for calcul the mask position with numpy : 0.00795292854309082 nb_pixel_total : 109359 time to create 1 rle with old method : 0.12929630279541016 create new chi : 0.49069666862487793 time to delete rle : 0.0006825923919677734 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 2348 TO DO : save crop sub photo not yet done ! save time : 0.15285301208496094 No data in photo_id : 1374568320 No data in photo_id : 1374568317 map_output_result : {1374583980: (0.0, 'Should be the crop_list due to order', 0.0), 1374583967: (0.0, 'Should be the crop_list due to order', 0), 1374583965: (0.0, 'Should be the crop_list due to order', 0), 1374583877: (0.0, 'Should be the crop_list due to order', 0.0), 1374583874: (0.0, 'Should be the crop_list due to order', 0), 1374583774: (0.0, 'Should be the crop_list due to order', 0.0), 1374583750: (0.0, 'Should be the crop_list due to order', 0), 1374583724: (0.0, 'Should be the crop_list due to order', 0), 1374583698: (0.0, 'Should be the crop_list due to order', 0.0), 1374583671: (0.0, 'Should be the crop_list due to order', 0), 1374583645: (0.0, 'Should be the crop_list due to order', 0), 1374583005: (0.0, 'Should be the crop_list due to order', 0), 1374582695: (0.0, 'Should be the crop_list due to order', 0), 1374582671: (0.0, 'Should be the crop_list due to order', 0), 1374582646: (0.0, 'Should be the crop_list due to order', 0), 1374582554: (0.0, 'Should be the crop_list due to order', 0), 1374582524: (0.0, 'Should be the crop_list due to order', 0), 1374582520: (0.0, 'Should be the crop_list due to order', 0), 1374582517: (0.0, 'Should be the crop_list due to order', 0), 1374582515: (0.0, 'Should be the crop_list due to order', 0), 1374582514: (0.0, 'Should be the crop_list due to order', 0), 1374582506: (0.0, 'Should be the crop_list due to order', 0), 1374582438: (0.0, 'Should be the crop_list due to order', 0), 1374582435: (0.0, 'Should be the crop_list due to order', 0), 1374582428: (0.0, 'Should be the crop_list due to order', 0), 1374582403: (0.0, 'Should be the crop_list due to order', 0), 1374582391: (0.0, 'Should be the crop_list due to order', 0), 1374582384: (0.0, 'Should be the crop_list due to order', 0), 1374568368: (0.0, 'Should be the crop_list due to order', 0), 1374568367: (0.0, 'Should be the crop_list due to order', 0), 1374568366: (0.0, 'Should be the crop_list due to order', 0), 1374568364: (0.0, 'Should be the crop_list due to order', 0), 1374568361: (0.0, 'Should be the crop_list due to order', 0), 1374568357: (0.0, 'Should be the crop_list due to order', 0), 1374568345: (0.0, 'Should be the crop_list due to order', 0), 1374568334: (0.0, 'Should be the crop_list due to order', 0), 1374568326: (0.0, 'Should be the crop_list due to order', 0), 1374568324: (0.0, 'Should be the crop_list due to order', 0), 1374568320: (0.0, 'Should be the crop_list due to order', 0.0), 1374568317: (0.0, 'Should be the crop_list due to order', 0.0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1374583980, 1374583967, 1374583965, 1374583877, 1374583874, 1374583774, 1374583750, 1374583724, 1374583698, 1374583671, 1374583645, 1374583005, 1374582695, 1374582671, 1374582646, 1374582554, 1374582524, 1374582520, 1374582517, 1374582515, 1374582514, 1374582506, 1374582438, 1374582435, 1374582428, 1374582403, 1374582391, 1374582384, 1374568368, 1374568367, 1374568366, 1374568364, 1374568361, 1374568357, 1374568345, 1374568334, 1374568326, 1374568324, 1374568320, 1374568317] Looping around the photos to save general results len do output : 40 /1374583980.Didn't retrieve data . /1374583967.Didn't retrieve data . /1374583965.Didn't retrieve data . /1374583877.Didn't retrieve data . /1374583874.Didn't retrieve data . /1374583774.Didn't retrieve data . /1374583750.Didn't retrieve data . /1374583724.Didn't retrieve data . /1374583698.Didn't retrieve data . /1374583671.Didn't retrieve data . /1374583645.Didn't retrieve data . /1374583005.Didn't retrieve data . /1374582695.Didn't retrieve data . /1374582671.Didn't retrieve data . /1374582646.Didn't retrieve data . /1374582554.Didn't retrieve data . /1374582524.Didn't retrieve data . /1374582520.Didn't retrieve data . /1374582517.Didn't retrieve data . /1374582515.Didn't retrieve data . /1374582514.Didn't retrieve data . /1374582506.Didn't retrieve data . /1374582438.Didn't retrieve data . /1374582435.Didn't retrieve data . /1374582428.Didn't retrieve data . /1374582403.Didn't retrieve data . /1374582391.Didn't retrieve data . /1374582384.Didn't retrieve data . /1374568368.Didn't retrieve data . /1374568367.Didn't retrieve data . /1374568366.Didn't retrieve data . /1374568364.Didn't retrieve data . /1374568361.Didn't retrieve data . /1374568357.Didn't retrieve data . /1374568345.Didn't retrieve data . /1374568334.Didn't retrieve data . /1374568326.Didn't retrieve data . /1374568324.Didn't retrieve data . /1374568320.Didn't retrieve data . /1374568317.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583980', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583967', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583965', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583877', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583874', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583774', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583750', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583724', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583698', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583645', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583005', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582695', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582646', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582554', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582524', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582520', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582517', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582515', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582514', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582506', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582438', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582435', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582428', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582403', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582391', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582384', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568368', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568367', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568366', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568364', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568361', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568357', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568345', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568334', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568326', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568324', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568320', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568317', None, None, None, None, None, '3410768') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 120 time used for this insertion : 0.017958879470825195 save_final save missing photos in datou_result : time spend for datou_step_exec : 28.59896945953369 time spend to save output : 0.019043445587158203 total time spend for step 3 : 28.61801290512085 step4:ventilate_hashtags_in_portfolio Thu Jul 31 14:43:09 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 25543232 get user id for portfolio 25543232 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543232 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('flou','metal','environnement','autre','papier','carton','pet_fonce','mal_croppe','background','pehd','pet_clair')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543232 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('flou','metal','environnement','autre','papier','carton','pet_fonce','mal_croppe','background','pehd','pet_clair')) AND mptpi.`min_score`=0.5 To do Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543232 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('flou','metal','environnement','autre','papier','carton','pet_fonce','mal_croppe','background','pehd','pet_clair')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://www.fotonower.com/velours/25544981,25544982,25544983,25544984,25544985,25544986,25544987,25544988,25544989,25544990,25544991?tags=flou,metal,environnement,autre,papier,carton,pet_fonce,mal_croppe,background,pehd,pet_clair Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1374583980, 1374583967, 1374583965, 1374583877, 1374583874, 1374583774, 1374583750, 1374583724, 1374583698, 1374583671, 1374583645, 1374583005, 1374582695, 1374582671, 1374582646, 1374582554, 1374582524, 1374582520, 1374582517, 1374582515, 1374582514, 1374582506, 1374582438, 1374582435, 1374582428, 1374582403, 1374582391, 1374582384, 1374568368, 1374568367, 1374568366, 1374568364, 1374568361, 1374568357, 1374568345, 1374568334, 1374568326, 1374568324, 1374568320, 1374568317] Looping around the photos to save general results len do output : 1 /25543232. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583980', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583967', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583965', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583877', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583874', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583774', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583750', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583724', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583698', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583645', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583005', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582695', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582646', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582554', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582524', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582520', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582517', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582515', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582514', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582506', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582438', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582435', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582428', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582403', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582391', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582384', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568368', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568367', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568366', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568364', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568361', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568357', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568345', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568334', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568326', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568324', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568320', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568317', None, None, None, None, None, '3410768') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 41 time used for this insertion : 0.018059730529785156 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.6845018863677979 time spend to save output : 0.018503189086914062 total time spend for step 4 : 1.703005075454712 step5:final Thu Jul 31 14:43:11 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1374583980: ('0.1775103777154558',), 1374583967: ('0.1775103777154558',), 1374583965: ('0.1775103777154558',), 1374583877: ('0.1775103777154558',), 1374583874: ('0.1775103777154558',), 1374583774: ('0.1775103777154558',), 1374583750: ('0.1775103777154558',), 1374583724: ('0.1775103777154558',), 1374583698: ('0.1775103777154558',), 1374583671: ('0.1775103777154558',), 1374583645: ('0.1775103777154558',), 1374583005: ('0.1775103777154558',), 1374582695: ('0.1775103777154558',), 1374582671: ('0.1775103777154558',), 1374582646: ('0.1775103777154558',), 1374582554: ('0.1775103777154558',), 1374582524: ('0.1775103777154558',), 1374582520: ('0.1775103777154558',), 1374582517: ('0.1775103777154558',), 1374582515: ('0.1775103777154558',), 1374582514: ('0.1775103777154558',), 1374582506: ('0.1775103777154558',), 1374582438: ('0.1775103777154558',), 1374582435: ('0.1775103777154558',), 1374582428: ('0.1775103777154558',), 1374582403: ('0.1775103777154558',), 1374582391: ('0.1775103777154558',), 1374582384: ('0.1775103777154558',), 1374568368: ('0.1775103777154558',), 1374568367: ('0.1775103777154558',), 1374568366: ('0.1775103777154558',), 1374568364: ('0.1775103777154558',), 1374568361: ('0.1775103777154558',), 1374568357: ('0.1775103777154558',), 1374568345: ('0.1775103777154558',), 1374568334: ('0.1775103777154558',), 1374568326: ('0.1775103777154558',), 1374568324: ('0.1775103777154558',), 1374568320: ('0.1775103777154558',), 1374568317: ('0.1775103777154558',)} new output for save of step final : {1374583980: ('0.1775103777154558',), 1374583967: ('0.1775103777154558',), 1374583965: ('0.1775103777154558',), 1374583877: ('0.1775103777154558',), 1374583874: ('0.1775103777154558',), 1374583774: ('0.1775103777154558',), 1374583750: ('0.1775103777154558',), 1374583724: ('0.1775103777154558',), 1374583698: ('0.1775103777154558',), 1374583671: ('0.1775103777154558',), 1374583645: ('0.1775103777154558',), 1374583005: ('0.1775103777154558',), 1374582695: ('0.1775103777154558',), 1374582671: ('0.1775103777154558',), 1374582646: ('0.1775103777154558',), 1374582554: ('0.1775103777154558',), 1374582524: ('0.1775103777154558',), 1374582520: ('0.1775103777154558',), 1374582517: ('0.1775103777154558',), 1374582515: ('0.1775103777154558',), 1374582514: ('0.1775103777154558',), 1374582506: ('0.1775103777154558',), 1374582438: ('0.1775103777154558',), 1374582435: ('0.1775103777154558',), 1374582428: ('0.1775103777154558',), 1374582403: ('0.1775103777154558',), 1374582391: ('0.1775103777154558',), 1374582384: ('0.1775103777154558',), 1374568368: ('0.1775103777154558',), 1374568367: ('0.1775103777154558',), 1374568366: ('0.1775103777154558',), 1374568364: ('0.1775103777154558',), 1374568361: ('0.1775103777154558',), 1374568357: ('0.1775103777154558',), 1374568345: ('0.1775103777154558',), 1374568334: ('0.1775103777154558',), 1374568326: ('0.1775103777154558',), 1374568324: ('0.1775103777154558',), 1374568320: ('0.1775103777154558',), 1374568317: ('0.1775103777154558',)} [1374583980, 1374583967, 1374583965, 1374583877, 1374583874, 1374583774, 1374583750, 1374583724, 1374583698, 1374583671, 1374583645, 1374583005, 1374582695, 1374582671, 1374582646, 1374582554, 1374582524, 1374582520, 1374582517, 1374582515, 1374582514, 1374582506, 1374582438, 1374582435, 1374582428, 1374582403, 1374582391, 1374582384, 1374568368, 1374568367, 1374568366, 1374568364, 1374568361, 1374568357, 1374568345, 1374568334, 1374568326, 1374568324, 1374568320, 1374568317] Looping around the photos to save general results len do output : 40 /1374583980.Didn't retrieve data . /1374583967.Didn't retrieve data . /1374583965.Didn't retrieve data . /1374583877.Didn't retrieve data . /1374583874.Didn't retrieve data . /1374583774.Didn't retrieve data . /1374583750.Didn't retrieve data . /1374583724.Didn't retrieve data . /1374583698.Didn't retrieve data . /1374583671.Didn't retrieve data . /1374583645.Didn't retrieve data . /1374583005.Didn't retrieve data . /1374582695.Didn't retrieve data . /1374582671.Didn't retrieve data . /1374582646.Didn't retrieve data . /1374582554.Didn't retrieve data . /1374582524.Didn't retrieve data . /1374582520.Didn't retrieve data . /1374582517.Didn't retrieve data . /1374582515.Didn't retrieve data . /1374582514.Didn't retrieve data . /1374582506.Didn't retrieve data . /1374582438.Didn't retrieve data . /1374582435.Didn't retrieve data . /1374582428.Didn't retrieve data . /1374582403.Didn't retrieve data . /1374582391.Didn't retrieve data . /1374582384.Didn't retrieve data . /1374568368.Didn't retrieve data . /1374568367.Didn't retrieve data . /1374568366.Didn't retrieve data . /1374568364.Didn't retrieve data . /1374568361.Didn't retrieve data . /1374568357.Didn't retrieve data . /1374568345.Didn't retrieve data . /1374568334.Didn't retrieve data . /1374568326.Didn't retrieve data . /1374568324.Didn't retrieve data . /1374568320.Didn't retrieve data . /1374568317.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583980', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583967', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583965', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583877', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583874', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583774', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583750', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583724', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583698', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583645', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583005', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582695', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582646', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582554', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582524', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582520', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582517', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582515', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582514', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582506', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582438', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582435', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582428', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582403', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582391', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582384', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568368', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568367', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568366', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568364', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568361', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568357', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568345', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568334', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568326', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568324', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568320', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568317', None, None, None, None, None, '3410768') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 120 time used for this insertion : 0.019142866134643555 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.11984467506408691 time spend to save output : 0.020796537399291992 total time spend for step 5 : 0.1406412124633789 step6:blur_detection Thu Jul 31 14:43:11 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1753965627_2758669_1374583980_f38417260015c28b587ed5be2b65e101.jpg resize: (1080, 1920) 1374583980 -4.1425063775002045 treat image : temp/1753965627_2758669_1374583967_fa6c6595a11526583914c31472ea3d0c.jpg resize: (1080, 1920) 1374583967 -2.4024284860521274 treat image : temp/1753965627_2758669_1374583965_7a756f7a9d3bd524f26733ca12b7dcfd.jpg resize: (1080, 1920) 1374583965 -2.8948370714886904 treat image : temp/1753965627_2758669_1374583877_548e89888ecc8bf4f03ee57479b6e4f7.jpg resize: (1080, 1920) 1374583877 -2.6035974210736366 treat image : temp/1753965627_2758669_1374583874_5e2eb3cf8da94d68bc6a1cd314048a50.jpg resize: (1080, 1920) 1374583874 -2.252930280394346 treat image : temp/1753965627_2758669_1374583774_b28b6557092741488ab5323658710c79.jpg resize: (1080, 1920) 1374583774 -1.6712080657856205 treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5.jpg resize: (1080, 1920) 1374583750 -2.1606819522010157 treat image : temp/1753965627_2758669_1374583724_cd99c4bddeac8d110c3422949a506e27.jpg resize: (1080, 1920) 1374583724 -2.6058968770009754 treat image : temp/1753965627_2758669_1374583698_35fbf121c2b35a6879151d1561369807.jpg resize: (1080, 1920) 1374583698 -2.322306202080638 treat image : temp/1753965627_2758669_1374583671_4e343bbefbd1daa5ec7116cf215e9474.jpg resize: (1080, 1920) 1374583671 -2.223741077791181 treat image : temp/1753965627_2758669_1374583645_36d6d97bb6ad61b422864460c18c270c.jpg resize: (1080, 1920) 1374583645 -2.556720776748674 treat image : temp/1753965627_2758669_1374583005_8d7eed679f9f7ed859106c547ed25acd.jpg resize: (1080, 1920) 1374583005 -2.180510455667319 treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3.jpg resize: (1080, 1920) 1374582695 -2.90743210881374 treat image : temp/1753965627_2758669_1374582671_a6e3176659d44869827a4d581eaf5a36.jpg resize: (1080, 1920) 1374582671 -2.8119261665862987 treat image : temp/1753965627_2758669_1374582646_4862f6d4dced5c06eeb2d70c774a20a6.jpg resize: (1080, 1920) 1374582646 -2.992178474475019 treat image : temp/1753965627_2758669_1374582554_2121047f642e85b5956b4821d7ef2442.jpg resize: (1080, 1920) 1374582554 -2.578535478786695 treat image : temp/1753965627_2758669_1374582524_8c1a3688f19f894bb505518855c5e3e4.jpg resize: (1080, 1920) 1374582524 -4.365171541584825 treat image : temp/1753965627_2758669_1374582520_332797921667030df818fea2b226bdd9.jpg resize: (1080, 1920) 1374582520 0.670271265699891 treat image : temp/1753965627_2758669_1374582517_946baed0941b0b9bd68f488a957b57fb.jpg resize: (1080, 1920) 1374582517 -1.8446123166310753 treat image : temp/1753965627_2758669_1374582515_85e02c6c6efd102a98ed58ae3d8e5ed1.jpg resize: (1080, 1920) 1374582515 -0.1081580950153327 treat image : temp/1753965627_2758669_1374582514_067b3048da51963a311b6e64f77f597d.jpg resize: (1080, 1920) 1374582514 -2.64337186436381 treat image : temp/1753965627_2758669_1374582506_f2c07fab43d83f3e22cded5839508783.jpg resize: (1080, 1920) 1374582506 -1.531488162302333 treat image : temp/1753965627_2758669_1374582438_ec19a3369c909bae591276beabf933cb.jpg resize: (1080, 1920) 1374582438 -1.461370166009664 treat image : temp/1753965627_2758669_1374582435_b363c00da8c01827463b8007af17f0c2.jpg resize: (1080, 1920) 1374582435 -2.3868581796958845 treat image : temp/1753965627_2758669_1374582428_11e3a4e4e0fbe27daa3b5942f7528e9d.jpg resize: (1080, 1920) 1374582428 -2.868043656826242 treat image : temp/1753965627_2758669_1374582403_5209c15a51dd5a5ba141885e68d3e7fd.jpg resize: (1080, 1920) 1374582403 -1.3515454418003325 treat image : temp/1753965627_2758669_1374582391_208d4666ae8c3b91c7f9d7df9cf20c6f.jpg resize: (1080, 1920) 1374582391 -1.9359086760337236 treat image : temp/1753965627_2758669_1374582384_d2e656c37230bb7d8464fcdf94ffc666.jpg resize: (1080, 1920) 1374582384 -2.9227587137656887 treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63.jpg resize: (1080, 1920) 1374568368 -2.2049223601133217 treat image : temp/1753965627_2758669_1374568367_6427090c1b1628a55b4f183045c74988.jpg resize: (1080, 1920) 1374568367 -1.6449544072872513 treat image : temp/1753965627_2758669_1374568366_4ca4c3bb514b5be376e5f9c3a99cf9e8.jpg resize: (1080, 1920) 1374568366 -2.2857906163953667 treat image : temp/1753965627_2758669_1374568364_e3bf2a419e939b8ccf475f1ad307bf61.jpg resize: (1080, 1920) 1374568364 -2.316148521883746 treat image : temp/1753965627_2758669_1374568361_147aa32695db7c4033e8b36f752e0b48.jpg resize: (1080, 1920) 1374568361 -2.775507418886225 treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d.jpg resize: (1080, 1920) 1374568357 -2.4582478492380337 treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361.jpg resize: (1080, 1920) 1374568345 -1.693200284639881 treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a.jpg resize: (1080, 1920) 1374568334 -2.5542210250691517 treat image : temp/1753965627_2758669_1374568326_073bd397f186ca3e19dbea809cda64ab.jpg resize: (1080, 1920) 1374568326 -2.2691698548136023 treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed.jpg resize: (1080, 1920) 1374568324 -2.6931379522236334 treat image : temp/1753965627_2758669_1374568320_be58b5b26762c9aae7278ec6d6699b52.jpg resize: (1080, 1920) 1374568320 -1.4990965213293093 treat image : temp/1753965627_2758669_1374568317_3e2f13dc516cfeae649819bb73e9786e.jpg resize: (1080, 1920) 1374568317 -2.361538697203884 treat image : temp/1753965627_2758669_1374583967_fa6c6595a11526583914c31472ea3d0c_rle_crop_3899309022_0.png resize: (126, 81) 1374601915 -2.139002252663061 treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309036_0.png resize: (129, 78) 1374601916 -1.7000158532610388 treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309039_0.png resize: (367, 208) 1374601917 -2.1983903658493817 treat image : temp/1753965627_2758669_1374582524_8c1a3688f19f894bb505518855c5e3e4_rle_crop_3899309046_0.png resize: (41, 76) 1374601918 -2.0557177982554546 treat image : temp/1753965627_2758669_1374582517_946baed0941b0b9bd68f488a957b57fb_rle_crop_3899309048_0.png resize: (97, 88) 1374601919 -0.9157890685171546 treat image : temp/1753965627_2758669_1374582517_946baed0941b0b9bd68f488a957b57fb_rle_crop_3899309049_0.png resize: (82, 69) 1374601920 -0.5819709078303188 treat image : temp/1753965627_2758669_1374582435_b363c00da8c01827463b8007af17f0c2_rle_crop_3899309060_0.png resize: (116, 102) 1374601921 -1.838970475958053 treat image : temp/1753965627_2758669_1374582403_5209c15a51dd5a5ba141885e68d3e7fd_rle_crop_3899309064_0.png resize: (106, 105) 1374601922 0.7952220027289818 treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_rle_crop_3899150959_0.png resize: (174, 85) 1374601923 -1.920021258079624 treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_bib_crop_3899151468_0.jpg resize: (174, 82) 1374601924 20.0 treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_bib_crop_3899151499_0.jpg resize: (181, 114) 1374601925 20.0 treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_rle_crop_3899309076_0.png resize: (187, 113) 1374601926 -2.0385194671561964 treat image : temp/1753965627_2758669_1374583874_5e2eb3cf8da94d68bc6a1cd314048a50_rle_crop_3899309025_0.png resize: (147, 91) 1374601929 0.6825462112240144 treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5_rle_crop_3899309027_0.png resize: (126, 109) 1374601930 -0.47597451498656046 treat image : temp/1753965627_2758669_1374582506_f2c07fab43d83f3e22cded5839508783_rle_crop_3899309056_0.png resize: (105, 102) 1374601931 -0.9495433231190374 treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_bib_crop_3899151498_0.jpg resize: (115, 112) 1374601932 20.0 treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_rle_crop_3899309075_0.png resize: (115, 114) 1374601933 -1.0804896425452204 treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_rle_crop_3899309078_0.png resize: (119, 81) 1374601934 1.551528591730961 treat image : temp/1753965627_2758669_1374583967_fa6c6595a11526583914c31472ea3d0c_rle_crop_3899309023_0.png resize: (972, 970) 1374602172 -0.20704173822560593 treat image : temp/1753965627_2758669_1374583965_7a756f7a9d3bd524f26733ca12b7dcfd_rle_crop_3899309024_0.png resize: (131, 58) 1374602174 -2.3286907293487973 treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5_rle_crop_3899309026_0.png resize: (148, 192) 1374602175 -2.9788269097686135 treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5_rle_crop_3899309028_0.png resize: (166, 98) 1374602176 -2.7870010391576896 treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5_rle_crop_3899309029_0.png resize: (982, 1060) 1374602179 -0.1571373341772211 treat image : temp/1753965627_2758669_1374583724_cd99c4bddeac8d110c3422949a506e27_rle_crop_3899309030_0.png resize: (975, 1135) 1374602180 -1.8537773219176443 treat image : temp/1753965627_2758669_1374583671_4e343bbefbd1daa5ec7116cf215e9474_rle_crop_3899309031_0.png resize: (173, 117) 1374602181 -3.103778937347453 treat image : temp/1753965627_2758669_1374583671_4e343bbefbd1daa5ec7116cf215e9474_rle_crop_3899309032_0.png resize: (227, 207) 1374602183 -0.5378477841855958 treat image : temp/1753965627_2758669_1374583645_36d6d97bb6ad61b422864460c18c270c_rle_crop_3899309033_0.png resize: (950, 971) 1374602185 -0.1173788524089107 treat image : temp/1753965627_2758669_1374583005_8d7eed679f9f7ed859106c547ed25acd_rle_crop_3899309034_0.png resize: (954, 973) 1374602186 -0.025943714711900937 treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309035_0.png resize: (986, 1061) 1374602187 -0.6542157054450926 treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309038_0.png resize: (137, 171) 1374602189 -1.8816422123924539 treat image : temp/1753965627_2758669_1374582671_a6e3176659d44869827a4d581eaf5a36_rle_crop_3899309040_0.png resize: (244, 208) 1374602190 -2.0548261268908106 treat image : temp/1753965627_2758669_1374582671_a6e3176659d44869827a4d581eaf5a36_rle_crop_3899309041_0.png resize: (88, 104) 1374602191 -2.985055693484295 treat image : temp/1753965627_2758669_1374582671_a6e3176659d44869827a4d581eaf5a36_rle_crop_3899309042_0.png resize: (277, 122) 1374602193 -2.633889289442254 treat image : temp/1753965627_2758669_1374582646_4862f6d4dced5c06eeb2d70c774a20a6_rle_crop_3899309043_0.png resize: (175, 208) 1374602194 -5.011820579071958 treat image : temp/1753965627_2758669_1374582554_2121047f642e85b5956b4821d7ef2442_rle_crop_3899309044_0.png resize: (60, 65) 1374602195 -1.7199118735383903 treat image : temp/1753965627_2758669_1374582554_2121047f642e85b5956b4821d7ef2442_rle_crop_3899309045_0.png resize: (977, 990) 1374602196 -0.38567956729758535 treat image : temp/1753965627_2758669_1374582520_332797921667030df818fea2b226bdd9_rle_crop_3899309047_0.png resize: (933, 987) 1374602198 1.7075796947809603 treat image : temp/1753965627_2758669_1374582515_85e02c6c6efd102a98ed58ae3d8e5ed1_rle_crop_3899309050_0.png resize: (517, 362) 1374602199 -0.7492007276350211 treat image : temp/1753965627_2758669_1374582515_85e02c6c6efd102a98ed58ae3d8e5ed1_rle_crop_3899309051_0.png resize: (973, 1738) 1374602200 -1.0442885521271363 treat image : temp/1753965627_2758669_1374582514_067b3048da51963a311b6e64f77f597d_rle_crop_3899309052_0.png resize: (963, 1013) 1374602202 -0.42029134304676324 treat image : temp/1753965627_2758669_1374582514_067b3048da51963a311b6e64f77f597d_rle_crop_3899309053_0.png resize: (149, 141) 1374602203 -1.9912353835082992 treat image : temp/1753965627_2758669_1374582506_f2c07fab43d83f3e22cded5839508783_rle_crop_3899309054_0.png resize: (65, 116) 1374602204 -2.4443470098039994 treat image : temp/1753965627_2758669_1374582506_f2c07fab43d83f3e22cded5839508783_rle_crop_3899309055_0.png resize: (886, 952) 1374602205 -0.5803780979574839 treat image : temp/1753965627_2758669_1374582438_ec19a3369c909bae591276beabf933cb_rle_crop_3899309057_0.png resize: (144, 118) 1374602207 -3.3041273196666054 treat image : temp/1753965627_2758669_1374582438_ec19a3369c909bae591276beabf933cb_rle_crop_3899309058_0.png resize: (165, 125) 1374602208 -1.6029402179076768 treat image : temp/1753965627_2758669_1374582438_ec19a3369c909bae591276beabf933cb_rle_crop_3899309059_0.png resize: (208, 309) 1374602209 -3.0028843049536165 treat image : temp/1753965627_2758669_1374582435_b363c00da8c01827463b8007af17f0c2_rle_crop_3899309061_0.png resize: (117, 176) 1374602211 -2.9002480489612306 treat image : temp/1753965627_2758669_1374582435_b363c00da8c01827463b8007af17f0c2_rle_crop_3899309062_0.png resize: (987, 972) 1374602212 -0.24517899250992542 treat image : temp/1753965627_2758669_1374582428_11e3a4e4e0fbe27daa3b5942f7528e9d_rle_crop_3899309063_0.png resize: (518, 316) 1374602213 -0.11056435171882117 treat image : temp/1753965627_2758669_1374582391_208d4666ae8c3b91c7f9d7df9cf20c6f_rle_crop_3899309065_0.png resize: (82, 198) 1374602214 -2.199762163570526 treat image : temp/1753965627_2758669_1374582384_d2e656c37230bb7d8464fcdf94ffc666_rle_crop_3899309066_0.png resize: (563, 339) 1374602216 0.008620191550284552 treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63_bib_crop_3899151446_0.jpg resize: (47, 50) 1374602217 2.0669295131224787 treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63_rle_crop_3899309067_0.png resize: (47, 50) 1374602218 0.32409928117427006 treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63_rle_crop_3899309068_0.png resize: (959, 1051) 1374602220 -0.10051256595958753 treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63_rle_crop_3899309069_0.png resize: (519, 348) 1374602221 -0.028639623502948696 treat image : temp/1753965627_2758669_1374568367_6427090c1b1628a55b4f183045c74988_bib_crop_3899151448_0.jpg resize: (911, 953) 1374602222 -0.05701630802393499 treat image : temp/1753965627_2758669_1374568367_6427090c1b1628a55b4f183045c74988_rle_crop_3899309070_0.png resize: (910, 937) 1374602223 -0.2728792213172218 treat image : temp/1753965627_2758669_1374568366_4ca4c3bb514b5be376e5f9c3a99cf9e8_bib_crop_3899151450_0.jpg resize: (536, 361) 1374602225 6.740439789667493 treat image : temp/1753965627_2758669_1374568366_4ca4c3bb514b5be376e5f9c3a99cf9e8_rle_crop_3899309071_0.png resize: (528, 360) 1374602226 -1.6749136287934299 treat image : temp/1753965627_2758669_1374568366_4ca4c3bb514b5be376e5f9c3a99cf9e8_rle_crop_3899309072_0.png resize: (112, 208) 1374602227 -4.1676882308309935 treat image : temp/1753965627_2758669_1374568364_e3bf2a419e939b8ccf475f1ad307bf61_bib_crop_3899151452_0.jpg resize: (935, 1032) 1374602229 -2.3862783725808523 treat image : temp/1753965627_2758669_1374568364_e3bf2a419e939b8ccf475f1ad307bf61_rle_crop_3899309073_0.png resize: (981, 1037) 1374602230 0.16768354407613748 treat image : temp/1753965627_2758669_1374568361_147aa32695db7c4033e8b36f752e0b48_rle_crop_3899150957_0.png resize: (209, 143) 1374602231 -2.577020042474754 treat image : temp/1753965627_2758669_1374568361_147aa32695db7c4033e8b36f752e0b48_bib_crop_3899151454_0.jpg resize: (209, 142) 1374602233 -0.34823142668105156 treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_bib_crop_3899151467_0.jpg resize: (418, 1109) 1374602234 -4.437803121160207 treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_rle_crop_3899309074_0.png resize: (420, 1107) 1374602235 -4.3702622356527 treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361_rle_crop_3899150961_0.png resize: (87, 92) 1374602236 -2.5088825849739584 treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361_rle_crop_3899150962_0.png resize: (176, 226) 1374602238 -1.3203599505086954 treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361_bib_crop_3899151482_0.jpg resize: (174, 225) 1374602239 2.4155421423925083 treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361_bib_crop_3899151483_0.jpg resize: (86, 91) 1374602240 -0.5159381146438692 treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_bib_crop_3899151497_0.jpg resize: (105, 157) 1374602242 -1.1402603970734855 treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_rle_crop_3899309077_0.png resize: (989, 1009) 1374602243 0.15399775969322457 treat image : temp/1753965627_2758669_1374568326_073bd397f186ca3e19dbea809cda64ab_rle_crop_3899150966_0.png resize: (122, 107) 1374602244 -3.1963141613605046 treat image : temp/1753965627_2758669_1374568326_073bd397f186ca3e19dbea809cda64ab_bib_crop_3899151501_0.jpg resize: (121, 105) 1374602245 -2.7527103428309627 treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed_rle_crop_3899150967_0.png resize: (535, 368) 1374602247 0.19828453452057826 treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed_rle_crop_3899150968_0.png resize: (95, 92) 1374602248 -5.527688486985872 treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed_bib_crop_3899151513_0.jpg resize: (94, 91) 1374602249 -5.467838384167777 treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed_bib_crop_3899151514_0.jpg resize: (534, 367) 1374602251 5.574361845052642 treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309037_0.png resize: (81, 49) 1374602275 3.3347803382981196 treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_rle_crop_3899150958_0.png resize: (61, 66) 1374602277 0.158931034205425 treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_bib_crop_3899151469_0.jpg resize: (60, 65) 1374602278 1.8910284084784352 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 121 time used for this insertion : 0.01824188232421875 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 121 time used for this insertion : 0.03086400032043457 save missing photos in datou_result : time spend for datou_step_exec : 39.85104274749756 time spend to save output : 0.053899288177490234 total time spend for step 6 : 39.90494203567505 step7:brightness Thu Jul 31 14:43:51 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1753965627_2758669_1374583980_f38417260015c28b587ed5be2b65e101.jpg treat image : temp/1753965627_2758669_1374583967_fa6c6595a11526583914c31472ea3d0c.jpg treat image : temp/1753965627_2758669_1374583965_7a756f7a9d3bd524f26733ca12b7dcfd.jpg treat image : temp/1753965627_2758669_1374583877_548e89888ecc8bf4f03ee57479b6e4f7.jpg treat image : temp/1753965627_2758669_1374583874_5e2eb3cf8da94d68bc6a1cd314048a50.jpg treat image : temp/1753965627_2758669_1374583774_b28b6557092741488ab5323658710c79.jpg treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5.jpg treat image : temp/1753965627_2758669_1374583724_cd99c4bddeac8d110c3422949a506e27.jpg treat image : temp/1753965627_2758669_1374583698_35fbf121c2b35a6879151d1561369807.jpg treat image : temp/1753965627_2758669_1374583671_4e343bbefbd1daa5ec7116cf215e9474.jpg treat image : temp/1753965627_2758669_1374583645_36d6d97bb6ad61b422864460c18c270c.jpg treat image : temp/1753965627_2758669_1374583005_8d7eed679f9f7ed859106c547ed25acd.jpg treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3.jpg treat image : temp/1753965627_2758669_1374582671_a6e3176659d44869827a4d581eaf5a36.jpg treat image : temp/1753965627_2758669_1374582646_4862f6d4dced5c06eeb2d70c774a20a6.jpg treat image : temp/1753965627_2758669_1374582554_2121047f642e85b5956b4821d7ef2442.jpg treat image : temp/1753965627_2758669_1374582524_8c1a3688f19f894bb505518855c5e3e4.jpg treat image : temp/1753965627_2758669_1374582520_332797921667030df818fea2b226bdd9.jpg treat image : temp/1753965627_2758669_1374582517_946baed0941b0b9bd68f488a957b57fb.jpg treat image : temp/1753965627_2758669_1374582515_85e02c6c6efd102a98ed58ae3d8e5ed1.jpg treat image : temp/1753965627_2758669_1374582514_067b3048da51963a311b6e64f77f597d.jpg treat image : temp/1753965627_2758669_1374582506_f2c07fab43d83f3e22cded5839508783.jpg treat image : temp/1753965627_2758669_1374582438_ec19a3369c909bae591276beabf933cb.jpg treat image : temp/1753965627_2758669_1374582435_b363c00da8c01827463b8007af17f0c2.jpg treat image : temp/1753965627_2758669_1374582428_11e3a4e4e0fbe27daa3b5942f7528e9d.jpg treat image : temp/1753965627_2758669_1374582403_5209c15a51dd5a5ba141885e68d3e7fd.jpg treat image : temp/1753965627_2758669_1374582391_208d4666ae8c3b91c7f9d7df9cf20c6f.jpg treat image : temp/1753965627_2758669_1374582384_d2e656c37230bb7d8464fcdf94ffc666.jpg treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63.jpg treat image : temp/1753965627_2758669_1374568367_6427090c1b1628a55b4f183045c74988.jpg treat image : temp/1753965627_2758669_1374568366_4ca4c3bb514b5be376e5f9c3a99cf9e8.jpg treat image : temp/1753965627_2758669_1374568364_e3bf2a419e939b8ccf475f1ad307bf61.jpg treat image : temp/1753965627_2758669_1374568361_147aa32695db7c4033e8b36f752e0b48.jpg treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d.jpg treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361.jpg treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a.jpg treat image : temp/1753965627_2758669_1374568326_073bd397f186ca3e19dbea809cda64ab.jpg treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed.jpg treat image : temp/1753965627_2758669_1374568320_be58b5b26762c9aae7278ec6d6699b52.jpg treat image : temp/1753965627_2758669_1374568317_3e2f13dc516cfeae649819bb73e9786e.jpg treat image : temp/1753965627_2758669_1374583967_fa6c6595a11526583914c31472ea3d0c_rle_crop_3899309022_0.png treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309036_0.png treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309039_0.png treat image : temp/1753965627_2758669_1374582524_8c1a3688f19f894bb505518855c5e3e4_rle_crop_3899309046_0.png treat image : temp/1753965627_2758669_1374582517_946baed0941b0b9bd68f488a957b57fb_rle_crop_3899309048_0.png treat image : temp/1753965627_2758669_1374582517_946baed0941b0b9bd68f488a957b57fb_rle_crop_3899309049_0.png treat image : temp/1753965627_2758669_1374582435_b363c00da8c01827463b8007af17f0c2_rle_crop_3899309060_0.png treat image : temp/1753965627_2758669_1374582403_5209c15a51dd5a5ba141885e68d3e7fd_rle_crop_3899309064_0.png treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_rle_crop_3899150959_0.png treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_bib_crop_3899151468_0.jpg treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_bib_crop_3899151499_0.jpg treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_rle_crop_3899309076_0.png treat image : temp/1753965627_2758669_1374583874_5e2eb3cf8da94d68bc6a1cd314048a50_rle_crop_3899309025_0.png treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5_rle_crop_3899309027_0.png treat image : temp/1753965627_2758669_1374582506_f2c07fab43d83f3e22cded5839508783_rle_crop_3899309056_0.png treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_bib_crop_3899151498_0.jpg treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_rle_crop_3899309075_0.png treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_rle_crop_3899309078_0.png treat image : temp/1753965627_2758669_1374583967_fa6c6595a11526583914c31472ea3d0c_rle_crop_3899309023_0.png treat image : temp/1753965627_2758669_1374583965_7a756f7a9d3bd524f26733ca12b7dcfd_rle_crop_3899309024_0.png treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5_rle_crop_3899309026_0.png treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5_rle_crop_3899309028_0.png treat image : temp/1753965627_2758669_1374583750_fe0b09f43d1d9567dec0141af4da84d5_rle_crop_3899309029_0.png treat image : temp/1753965627_2758669_1374583724_cd99c4bddeac8d110c3422949a506e27_rle_crop_3899309030_0.png treat image : temp/1753965627_2758669_1374583671_4e343bbefbd1daa5ec7116cf215e9474_rle_crop_3899309031_0.png treat image : temp/1753965627_2758669_1374583671_4e343bbefbd1daa5ec7116cf215e9474_rle_crop_3899309032_0.png treat image : temp/1753965627_2758669_1374583645_36d6d97bb6ad61b422864460c18c270c_rle_crop_3899309033_0.png treat image : temp/1753965627_2758669_1374583005_8d7eed679f9f7ed859106c547ed25acd_rle_crop_3899309034_0.png treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309035_0.png treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309038_0.png treat image : temp/1753965627_2758669_1374582671_a6e3176659d44869827a4d581eaf5a36_rle_crop_3899309040_0.png treat image : temp/1753965627_2758669_1374582671_a6e3176659d44869827a4d581eaf5a36_rle_crop_3899309041_0.png treat image : temp/1753965627_2758669_1374582671_a6e3176659d44869827a4d581eaf5a36_rle_crop_3899309042_0.png treat image : temp/1753965627_2758669_1374582646_4862f6d4dced5c06eeb2d70c774a20a6_rle_crop_3899309043_0.png treat image : temp/1753965627_2758669_1374582554_2121047f642e85b5956b4821d7ef2442_rle_crop_3899309044_0.png treat image : temp/1753965627_2758669_1374582554_2121047f642e85b5956b4821d7ef2442_rle_crop_3899309045_0.png treat image : temp/1753965627_2758669_1374582520_332797921667030df818fea2b226bdd9_rle_crop_3899309047_0.png treat image : temp/1753965627_2758669_1374582515_85e02c6c6efd102a98ed58ae3d8e5ed1_rle_crop_3899309050_0.png treat image : temp/1753965627_2758669_1374582515_85e02c6c6efd102a98ed58ae3d8e5ed1_rle_crop_3899309051_0.png treat image : temp/1753965627_2758669_1374582514_067b3048da51963a311b6e64f77f597d_rle_crop_3899309052_0.png treat image : temp/1753965627_2758669_1374582514_067b3048da51963a311b6e64f77f597d_rle_crop_3899309053_0.png treat image : temp/1753965627_2758669_1374582506_f2c07fab43d83f3e22cded5839508783_rle_crop_3899309054_0.png treat image : temp/1753965627_2758669_1374582506_f2c07fab43d83f3e22cded5839508783_rle_crop_3899309055_0.png treat image : temp/1753965627_2758669_1374582438_ec19a3369c909bae591276beabf933cb_rle_crop_3899309057_0.png treat image : temp/1753965627_2758669_1374582438_ec19a3369c909bae591276beabf933cb_rle_crop_3899309058_0.png treat image : temp/1753965627_2758669_1374582438_ec19a3369c909bae591276beabf933cb_rle_crop_3899309059_0.png treat image : temp/1753965627_2758669_1374582435_b363c00da8c01827463b8007af17f0c2_rle_crop_3899309061_0.png treat image : temp/1753965627_2758669_1374582435_b363c00da8c01827463b8007af17f0c2_rle_crop_3899309062_0.png treat image : temp/1753965627_2758669_1374582428_11e3a4e4e0fbe27daa3b5942f7528e9d_rle_crop_3899309063_0.png treat image : temp/1753965627_2758669_1374582391_208d4666ae8c3b91c7f9d7df9cf20c6f_rle_crop_3899309065_0.png treat image : temp/1753965627_2758669_1374582384_d2e656c37230bb7d8464fcdf94ffc666_rle_crop_3899309066_0.png treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63_bib_crop_3899151446_0.jpg treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63_rle_crop_3899309067_0.png treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63_rle_crop_3899309068_0.png treat image : temp/1753965627_2758669_1374568368_6d83dba734c2b9fffd10daa1cde59e63_rle_crop_3899309069_0.png treat image : temp/1753965627_2758669_1374568367_6427090c1b1628a55b4f183045c74988_bib_crop_3899151448_0.jpg treat image : temp/1753965627_2758669_1374568367_6427090c1b1628a55b4f183045c74988_rle_crop_3899309070_0.png treat image : temp/1753965627_2758669_1374568366_4ca4c3bb514b5be376e5f9c3a99cf9e8_bib_crop_3899151450_0.jpg treat image : temp/1753965627_2758669_1374568366_4ca4c3bb514b5be376e5f9c3a99cf9e8_rle_crop_3899309071_0.png treat image : temp/1753965627_2758669_1374568366_4ca4c3bb514b5be376e5f9c3a99cf9e8_rle_crop_3899309072_0.png treat image : temp/1753965627_2758669_1374568364_e3bf2a419e939b8ccf475f1ad307bf61_bib_crop_3899151452_0.jpg treat image : temp/1753965627_2758669_1374568364_e3bf2a419e939b8ccf475f1ad307bf61_rle_crop_3899309073_0.png treat image : temp/1753965627_2758669_1374568361_147aa32695db7c4033e8b36f752e0b48_rle_crop_3899150957_0.png treat image : temp/1753965627_2758669_1374568361_147aa32695db7c4033e8b36f752e0b48_bib_crop_3899151454_0.jpg treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_bib_crop_3899151467_0.jpg treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_rle_crop_3899309074_0.png treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361_rle_crop_3899150961_0.png treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361_rle_crop_3899150962_0.png treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361_bib_crop_3899151482_0.jpg treat image : temp/1753965627_2758669_1374568345_bddf134071376c3d1df51d1898648361_bib_crop_3899151483_0.jpg treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_bib_crop_3899151497_0.jpg treat image : temp/1753965627_2758669_1374568334_f6520716ce98efeb28782d616fe9a06a_rle_crop_3899309077_0.png treat image : temp/1753965627_2758669_1374568326_073bd397f186ca3e19dbea809cda64ab_rle_crop_3899150966_0.png treat image : temp/1753965627_2758669_1374568326_073bd397f186ca3e19dbea809cda64ab_bib_crop_3899151501_0.jpg treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed_rle_crop_3899150967_0.png treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed_rle_crop_3899150968_0.png treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed_bib_crop_3899151513_0.jpg treat image : temp/1753965627_2758669_1374568324_2ced5bdf3799ea8e47fe1470f77012ed_bib_crop_3899151514_0.jpg treat image : temp/1753965627_2758669_1374582695_056248648fe33af94d072c38d16a9ea3_rle_crop_3899309037_0.png treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_rle_crop_3899150958_0.png treat image : temp/1753965627_2758669_1374568357_1f5e661a9ab12360bbf9c49baab4b79d_bib_crop_3899151469_0.jpg Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 121 time used for this insertion : 0.018359899520874023 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 121 time used for this insertion : 0.029176712036132812 save missing photos in datou_result : time spend for datou_step_exec : 11.356911420822144 time spend to save output : 0.05295228958129883 total time spend for step 7 : 11.409863710403442 step8:velours_tree Thu Jul 31 14:44:02 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.09271454811096191 time spend to save output : 6.0558319091796875e-05 total time spend for step 8 : 0.09277510643005371 step9:send_mail_cod Thu Jul 31 14:44:02 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P25543232_31-07-2025_14_44_02.pdf 25544981 imagette255449811753965842 25544982 imagette255449821753965842 25544984 change filename to text .change filename to text .change filename to text .imagette255449841753965842 25544985 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette255449851753965842 25544986 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette255449861753965844 25544987 imagette255449871753965844 25544988 imagette255449881753965844 25544989 imagette255449891753965844 25544990 imagette255449901753965844 25544991 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette255449911753965844 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=25543232 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/25544981,25544982,25544983,25544984,25544985,25544986,25544987,25544988,25544989,25544990,25544991?tags=flou,metal,environnement,autre,papier,carton,pet_fonce,mal_croppe,background,pehd,pet_clair args[1374583980] : ((1374583980, -4.1425063775002045, 492609224), (1374583980, 0.4356182251489362, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583967] : ((1374583967, -2.4024284860521274, 492609224), (1374583967, 0.6005759993675656, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583965] : ((1374583965, -2.8948370714886904, 492609224), (1374583965, 0.4814859205141672, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583877] : ((1374583877, -2.6035974210736366, 492609224), (1374583877, 0.48063161391616743, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583874] : ((1374583874, -2.252930280394346, 492609224), (1374583874, 0.4029381604548252, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583774] : ((1374583774, -1.6712080657856205, 492688767), (1374583774, 0.4837327419185861, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583750] : ((1374583750, -2.1606819522010157, 492609224), (1374583750, 0.40514057963678407, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583724] : ((1374583724, -2.6058968770009754, 492609224), (1374583724, 0.35578243799423975, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583698] : ((1374583698, -2.322306202080638, 492609224), (1374583698, 0.33779596940235906, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583671] : ((1374583671, -2.223741077791181, 492609224), (1374583671, 0.36788793474613457, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583645] : ((1374583645, -2.556720776748674, 492609224), (1374583645, 0.6473587069235766, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374583005] : ((1374583005, -2.180510455667319, 492609224), (1374583005, 0.6053289361813189, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582695] : ((1374582695, -2.90743210881374, 492609224), (1374582695, 0.46977783264808637, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582671] : ((1374582671, -2.8119261665862987, 492609224), (1374582671, 0.4999596164234897, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582646] : ((1374582646, -2.992178474475019, 492609224), (1374582646, 0.5922092804471186, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582554] : ((1374582554, -2.578535478786695, 492609224), (1374582554, 0.3877022828994014, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582524] : ((1374582524, -4.365171541584825, 492609224), (1374582524, 0.32231507676021437, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582520] : ((1374582520, 0.670271265699891, 492688767), (1374582520, 0.3359552632569527, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582517] : ((1374582517, -1.8446123166310753, 492688767), (1374582517, 0.4196369820161598, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582515] : ((1374582515, -0.1081580950153327, 492688767), (1374582515, 0.31236512860616955, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582514] : ((1374582514, -2.64337186436381, 492609224), (1374582514, 0.581683837779465, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582506] : ((1374582506, -1.531488162302333, 492688767), (1374582506, 0.5186258626949556, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582438] : ((1374582438, -1.461370166009664, 492688767), (1374582438, 0.5611613267947944, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582435] : ((1374582435, -2.3868581796958845, 492609224), (1374582435, 0.4972077681808284, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582428] : ((1374582428, -2.868043656826242, 492609224), (1374582428, 0.6530440811024041, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582403] : ((1374582403, -1.3515454418003325, 492688767), (1374582403, 0.578462805683765, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582391] : ((1374582391, -1.9359086760337236, 492688767), (1374582391, 0.45440714469937865, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374582384] : ((1374582384, -2.9227587137656887, 492609224), (1374582384, 0.557797886666583, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568368] : ((1374568368, -2.2049223601133217, 492609224), (1374568368, 0.46894693902388357, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568367] : ((1374568367, -1.6449544072872513, 492688767), (1374568367, 0.5038126121425491, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568366] : ((1374568366, -2.2857906163953667, 492609224), (1374568366, 0.4649930692809077, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568364] : ((1374568364, -2.316148521883746, 492609224), (1374568364, 0.471256902671915, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568361] : ((1374568361, -2.775507418886225, 492609224), (1374568361, 0.4260902563906776, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568357] : ((1374568357, -2.4582478492380337, 492609224), (1374568357, 0.4816739578683957, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568345] : ((1374568345, -1.693200284639881, 492688767), (1374568345, 0.3264439460989924, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568334] : ((1374568334, -2.5542210250691517, 492609224), (1374568334, 0.3822882630668186, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568326] : ((1374568326, -2.2691698548136023, 492609224), (1374568326, 0.3808558062900621, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568324] : ((1374568324, -2.6931379522236334, 492609224), (1374568324, 0.547057238887637, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568320] : ((1374568320, -1.4990965213293093, 492688767), (1374568320, 0.7141989820813819, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com args[1374568317] : ((1374568317, -2.361538697203884, 492609224), (1374568317, 0.7329016217812432, 2107752395), '0.1775103777154558') We are sending mail with results at report@fotonower.com refus_total : 0.1775103777154558 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=25543232 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543232_31-07-2025_14_44_02.pdf results_Auto_P25543232_31-07-2025_14_44_02.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543232_31-07-2025_14_44_02.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','25543232','results_Auto_P25543232_31-07-2025_14_44_02.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543232_31-07-2025_14_44_02.pdf','pdf','','0.65','0.1775103777154558') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/25543232

https://www.fotonower.com/image?json=false&list_photos_id=1374583980
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583967
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583965
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583877
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583874
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583774
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583750
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583724
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583698
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583671
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583645
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374583005
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582695
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582671
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582646
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582554
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582524
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582520
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582517
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582515
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582514
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582506
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582438
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582435
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582428
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582403
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582391
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374582384
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568368
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568367
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568366
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568364
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568361
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568357
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568345
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568334
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568326
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568324
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568320
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374568317
Bravo, la photo est bien prise.

Dans ces conditions,le taux de refus est: 17.75%
Veuillez trouver les photos des contaminants.

exemples de contaminants: autre: https://www.fotonower.com/view/25544984?limit=200
exemples de contaminants: papier: https://www.fotonower.com/view/25544985?limit=200
exemples de contaminants: carton: https://www.fotonower.com/view/25544986?limit=200
exemples de contaminants: pet_clair: https://www.fotonower.com/view/25544991?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543232_31-07-2025_14_44_02.pdf.

Lien vers velours :https://www.fotonower.com/velours/25544981,25544982,25544983,25544984,25544985,25544986,25544987,25544988,25544989,25544990,25544991?tags=flou,metal,environnement,autre,papier,carton,pet_fonce,mal_croppe,background,pehd,pet_clair.


L'équipe Fotonower 202 b'' Server: nginx Date: Thu, 31 Jul 2025 12:44:11 GMT Content-Length: 0 Connection: close X-Message-Id: 2JJNBlzDQP6-ZaV-H_htfQ Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1374583980, 1374583967, 1374583965, 1374583877, 1374583874, 1374583774, 1374583750, 1374583724, 1374583698, 1374583671, 1374583645, 1374583005, 1374582695, 1374582671, 1374582646, 1374582554, 1374582524, 1374582520, 1374582517, 1374582515, 1374582514, 1374582506, 1374582438, 1374582435, 1374582428, 1374582403, 1374582391, 1374582384, 1374568368, 1374568367, 1374568366, 1374568364, 1374568361, 1374568357, 1374568345, 1374568334, 1374568326, 1374568324, 1374568320, 1374568317] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583980', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583967', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583965', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583877', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583874', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583774', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583750', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583724', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583698', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583645', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583005', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582695', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582646', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582554', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582524', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582520', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582517', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582515', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582514', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582506', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582438', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582435', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582428', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582403', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582391', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582384', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568368', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568367', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568366', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568364', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568361', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568357', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568345', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568334', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568326', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568324', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568320', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568317', None, None, None, None, None, '3410768') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 40 time used for this insertion : 0.017306804656982422 save_final save missing photos in datou_result : time spend for datou_step_exec : 8.366587162017822 time spend to save output : 0.017631053924560547 total time spend for step 9 : 8.384218215942383 step10:split_time_score Thu Jul 31 14:44:11 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('10', 52),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 31072025 25543232 Nombre de photos uploadées : 52 / 23040 (0%) 31072025 25543232 Nombre de photos taguées (types de déchets): 0 / 52 (0%) 31072025 25543232 Nombre de photos taguées (volume) : 0 / 52 (0%) elapsed_time : load_data_split_time_score 4.76837158203125e-06 elapsed_time : order_list_meta_photo_and_scores 1.3113021850585938e-05 ???????????????????????????????????????????????????? elapsed_time : fill_and_build_computed_from_old_data 0.0025548934936523438 Catched exception ! Connect or reconnect ! Catched exception ! Connect or reconnect ! elapsed_time : insert_dashboard_record_day_entry 0.20004916191101074 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.1288892103909465 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25530216_31-07-2025_08_21_35.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25530216 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25530216 AND mptpi.`type`=3594 To do Qualite : 0.0400941679526749 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25532093_31-07-2025_09_51_52.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25532093 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25532093 AND mptpi.`type`=3594 To do Qualite : 0.017316454475308645 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25532109_31-07-2025_09_41_23.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25532109 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25532109 AND mptpi.`type`=3594 To do Qualite : 0.045261622299382735 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25532112_31-07-2025_09_31_06.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25532112 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25532112 AND mptpi.`type`=3594 To do Qualite : 0.10603395061728398 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25537191_31-07-2025_11_41_29.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25537191 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25537191 AND mptpi.`type`=3594 To do Qualite : 0.1775103777154558 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543232_31-07-2025_14_44_02.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25543232 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543232 AND mptpi.`type`=3594 To do Qualite : 0.2067458164544753 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543235_31-07-2025_14_31_40.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25543235 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543235 AND mptpi.`type`=3594 To do Qualite : 0.15670513974622782 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543266_31-07-2025_14_21_39.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25543266 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543266 AND mptpi.`type`=3594 To do Qualite : 0.11439766589506177 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543287_31-07-2025_14_12_54.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25543287 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543287 AND mptpi.`type`=3594 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'31072025': {'nb_upload': 52, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1374583980, 1374583967, 1374583965, 1374583877, 1374583874, 1374583774, 1374583750, 1374583724, 1374583698, 1374583671, 1374583645, 1374583005, 1374582695, 1374582671, 1374582646, 1374582554, 1374582524, 1374582520, 1374582517, 1374582515, 1374582514, 1374582506, 1374582438, 1374582435, 1374582428, 1374582403, 1374582391, 1374582384, 1374568368, 1374568367, 1374568366, 1374568364, 1374568361, 1374568357, 1374568345, 1374568334, 1374568326, 1374568324, 1374568320, 1374568317] Looping around the photos to save general results len do output : 1 /25543232Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583980', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583967', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583965', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583877', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583874', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583774', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583750', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583724', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583698', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583645', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374583005', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582695', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582671', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582646', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582554', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582524', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582520', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582517', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582515', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582514', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582506', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582438', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582435', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582428', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582403', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582391', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374582384', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568368', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568367', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568366', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568364', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568361', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568357', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568345', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568334', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568326', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568324', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568320', None, None, None, None, None, '3410768') ('3318', None, None, None, None, None, None, None, '3410768') ('3318', '25543232', '1374568317', None, None, None, None, None, '3410768') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 41 time used for this insertion : 0.01862812042236328 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.8936927318572998 time spend to save output : 0.019140958786010742 total time spend for step 10 : 0.9128336906433105 caffe_path_current : About to save ! 2 After save, about to update current ! update_current_state 146.29user 51.99system 3:47.89elapsed 87%CPU (0avgtext+0avgdata 3118908maxresident)k 254232inputs+79080outputs (8450major+3573671minor)pagefaults 0swaps