python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 2800533 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['3534279'] with mtr_portfolio_ids : ['25980071'] and first list_photo_ids : [] new path : /proc/2800533/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , WARNING: data may be incomplete, need to offset and complete ! BFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 40 ; length of list_pids : 40 ; length of list_args : 40 time to download the photos : 5.224339485168457 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Thu Aug 14 14:30:34 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 10395 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-08-14 14:30:37.282017: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-08-14 14:30:37.312589: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz 2025-08-14 14:30:37.314214: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f3f50000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-08-14 14:30:37.314274: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-08-14 14:30:37.317318: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-08-14 14:30:37.429592: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x42da6610 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-08-14 14:30:37.429648: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-08-14 14:30:37.430894: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-08-14 14:30:37.431319: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-14 14:30:37.434101: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-14 14:30:37.436922: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-08-14 14:30:37.437392: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-08-14 14:30:37.439730: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-08-14 14:30:37.440967: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-08-14 14:30:37.445872: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-14 14:30:37.447382: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-08-14 14:30:37.447455: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-14 14:30:37.448240: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-08-14 14:30:37.448256: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-08-14 14:30:37.448280: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-08-14 14:30:37.449646: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9485 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-08-14 14:30:37.712222: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-08-14 14:30:37.712392: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-14 14:30:37.712418: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-14 14:30:37.712438: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-08-14 14:30:37.712456: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-08-14 14:30:37.712475: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-08-14 14:30:37.712493: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-08-14 14:30:37.712511: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-14 14:30:37.714042: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-08-14 14:30:37.715735: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-08-14 14:30:37.715787: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-14 14:30:37.715806: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-14 14:30:37.715824: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-08-14 14:30:37.715841: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-08-14 14:30:37.715858: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-08-14 14:30:37.715875: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-08-14 14:30:37.715892: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-14 14:30:37.717439: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-08-14 14:30:37.717481: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-08-14 14:30:37.717492: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-08-14 14:30:37.717501: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-08-14 14:30:37.718880: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9485 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-08-14 14:30:45.820620: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-14 14:30:46.056447: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 40 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 9.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 16.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 16.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 17.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 7.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 19.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 20.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 13.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 14.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 10.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 24.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 4.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 21.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 23.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 20.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 10 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 14.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 8 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 15.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 16.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 23.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 10 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 Detection mask done ! Trying to reset tf kernel 2801213 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 136 tf kernel not reseted sub process len(results) : 40 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 40 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 6618 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.00021719932556152344 nb_pixel_total : 7859 time to create 1 rle with old method : 0.009123563766479492 length of segment : 113 time for calcul the mask position with numpy : 0.0001399517059326172 nb_pixel_total : 5383 time to create 1 rle with old method : 0.006464719772338867 length of segment : 123 time for calcul the mask position with numpy : 0.0020902156829833984 nb_pixel_total : 115359 time to create 1 rle with old method : 0.1570730209350586 length of segment : 546 time for calcul the mask position with numpy : 0.00019478797912597656 nb_pixel_total : 8356 time to create 1 rle with old method : 0.009695768356323242 length of segment : 133 time for calcul the mask position with numpy : 0.00013113021850585938 nb_pixel_total : 4416 time to create 1 rle with old method : 0.005269527435302734 length of segment : 72 time for calcul the mask position with numpy : 0.0002646446228027344 nb_pixel_total : 12719 time to create 1 rle with old method : 0.015004634857177734 length of segment : 195 time for calcul the mask position with numpy : 0.0002491474151611328 nb_pixel_total : 8998 time to create 1 rle with old method : 0.010735511779785156 length of segment : 79 time for calcul the mask position with numpy : 0.015618085861206055 nb_pixel_total : 729861 time to create 1 rle with new method : 0.026287078857421875 length of segment : 976 time for calcul the mask position with numpy : 0.012435674667358398 nb_pixel_total : 668617 time to create 1 rle with new method : 0.025191783905029297 length of segment : 993 time for calcul the mask position with numpy : 0.0022203922271728516 nb_pixel_total : 93918 time to create 1 rle with old method : 0.10797762870788574 length of segment : 516 time for calcul the mask position with numpy : 0.0004506111145019531 nb_pixel_total : 11058 time to create 1 rle with old method : 0.012471437454223633 length of segment : 175 time for calcul the mask position with numpy : 0.00020170211791992188 nb_pixel_total : 6596 time to create 1 rle with old method : 0.007767200469970703 length of segment : 73 time for calcul the mask position with numpy : 0.012880325317382812 nb_pixel_total : 806553 time to create 1 rle with new method : 0.020617008209228516 length of segment : 1564 time for calcul the mask position with numpy : 0.0002696514129638672 nb_pixel_total : 6772 time to create 1 rle with old method : 0.007746696472167969 length of segment : 110 time for calcul the mask position with numpy : 0.000213623046875 nb_pixel_total : 4323 time to create 1 rle with old method : 0.004930973052978516 length of segment : 117 time for calcul the mask position with numpy : 0.00039887428283691406 nb_pixel_total : 10737 time to create 1 rle with old method : 0.011796712875366211 length of segment : 184 time for calcul the mask position with numpy : 0.0021677017211914062 nb_pixel_total : 116107 time to create 1 rle with old method : 0.12622761726379395 length of segment : 537 time for calcul the mask position with numpy : 0.00019049644470214844 nb_pixel_total : 8809 time to create 1 rle with old method : 0.009895086288452148 length of segment : 137 time for calcul the mask position with numpy : 0.0006275177001953125 nb_pixel_total : 21756 time to create 1 rle with old method : 0.024901390075683594 length of segment : 140 time for calcul the mask position with numpy : 0.0004601478576660156 nb_pixel_total : 19396 time to create 1 rle with old method : 0.021798372268676758 length of segment : 168 time for calcul the mask position with numpy : 0.0005185604095458984 nb_pixel_total : 23287 time to create 1 rle with old method : 0.027218341827392578 length of segment : 153 time for calcul the mask position with numpy : 0.0004494190216064453 nb_pixel_total : 10847 time to create 1 rle with old method : 0.012015342712402344 length of segment : 180 time for calcul the mask position with numpy : 0.000499725341796875 nb_pixel_total : 9286 time to create 1 rle with old method : 0.012057304382324219 length of segment : 167 time for calcul the mask position with numpy : 0.00015664100646972656 nb_pixel_total : 4112 time to create 1 rle with old method : 0.005181550979614258 length of segment : 59 time for calcul the mask position with numpy : 0.0002810955047607422 nb_pixel_total : 7104 time to create 1 rle with old method : 0.008490800857543945 length of segment : 119 time for calcul the mask position with numpy : 0.00016570091247558594 nb_pixel_total : 3032 time to create 1 rle with old method : 0.0038611888885498047 length of segment : 60 time for calcul the mask position with numpy : 0.0001995563507080078 nb_pixel_total : 6221 time to create 1 rle with old method : 0.007722377777099609 length of segment : 100 time for calcul the mask position with numpy : 0.0003991127014160156 nb_pixel_total : 11314 time to create 1 rle with old method : 0.0138702392578125 length of segment : 145 time for calcul the mask position with numpy : 0.00033783912658691406 nb_pixel_total : 10510 time to create 1 rle with old method : 0.012518644332885742 length of segment : 131 time for calcul the mask position with numpy : 0.0002846717834472656 nb_pixel_total : 13433 time to create 1 rle with old method : 0.01555490493774414 length of segment : 146 time for calcul the mask position with numpy : 0.00018787384033203125 nb_pixel_total : 2212 time to create 1 rle with old method : 0.0027222633361816406 length of segment : 94 time for calcul the mask position with numpy : 0.00037360191345214844 nb_pixel_total : 8727 time to create 1 rle with old method : 0.01036524772644043 length of segment : 247 time for calcul the mask position with numpy : 0.013210535049438477 nb_pixel_total : 771571 time to create 1 rle with new method : 0.03033590316772461 length of segment : 989 time for calcul the mask position with numpy : 0.00023174285888671875 nb_pixel_total : 5026 time to create 1 rle with old method : 0.00619196891784668 length of segment : 83 time for calcul the mask position with numpy : 0.0002796649932861328 nb_pixel_total : 4487 time to create 1 rle with old method : 0.005496978759765625 length of segment : 116 time for calcul the mask position with numpy : 0.0004603862762451172 nb_pixel_total : 11245 time to create 1 rle with old method : 0.013525009155273438 length of segment : 183 time for calcul the mask position with numpy : 0.0128936767578125 nb_pixel_total : 725189 time to create 1 rle with new method : 0.025667428970336914 length of segment : 951 time for calcul the mask position with numpy : 0.0001220703125 nb_pixel_total : 4290 time to create 1 rle with old method : 0.005265951156616211 length of segment : 70 time for calcul the mask position with numpy : 0.0003414154052734375 nb_pixel_total : 12652 time to create 1 rle with old method : 0.015439271926879883 length of segment : 107 time for calcul the mask position with numpy : 0.00018405914306640625 nb_pixel_total : 10704 time to create 1 rle with old method : 0.012606620788574219 length of segment : 173 time for calcul the mask position with numpy : 0.00019168853759765625 nb_pixel_total : 9253 time to create 1 rle with old method : 0.011133432388305664 length of segment : 118 time for calcul the mask position with numpy : 0.00032329559326171875 nb_pixel_total : 11415 time to create 1 rle with old method : 0.013386011123657227 length of segment : 183 time for calcul the mask position with numpy : 0.005479335784912109 nb_pixel_total : 300036 time to create 1 rle with new method : 0.01642131805419922 length of segment : 1851 time for calcul the mask position with numpy : 0.0016679763793945312 nb_pixel_total : 94806 time to create 1 rle with old method : 0.1093902587890625 length of segment : 561 time for calcul the mask position with numpy : 0.0004744529724121094 nb_pixel_total : 22720 time to create 1 rle with old method : 0.02687358856201172 length of segment : 217 time for calcul the mask position with numpy : 0.0002567768096923828 nb_pixel_total : 12947 time to create 1 rle with old method : 0.015419721603393555 length of segment : 181 time for calcul the mask position with numpy : 0.0002377033233642578 nb_pixel_total : 8654 time to create 1 rle with old method : 0.010457277297973633 length of segment : 136 time for calcul the mask position with numpy : 0.0017049312591552734 nb_pixel_total : 111623 time to create 1 rle with old method : 0.12728500366210938 length of segment : 554 time for calcul the mask position with numpy : 0.0002772808074951172 nb_pixel_total : 10931 time to create 1 rle with old method : 0.013122797012329102 length of segment : 187 time for calcul the mask position with numpy : 0.00019550323486328125 nb_pixel_total : 11603 time to create 1 rle with old method : 0.014079809188842773 length of segment : 132 time for calcul the mask position with numpy : 0.008725166320800781 nb_pixel_total : 712756 time to create 1 rle with new method : 0.015877485275268555 length of segment : 911 time for calcul the mask position with numpy : 0.00012493133544921875 nb_pixel_total : 5469 time to create 1 rle with old method : 0.006352663040161133 length of segment : 117 time for calcul the mask position with numpy : 0.00017118453979492188 nb_pixel_total : 10007 time to create 1 rle with old method : 0.011475563049316406 length of segment : 156 time for calcul the mask position with numpy : 0.0001671314239501953 nb_pixel_total : 9627 time to create 1 rle with old method : 0.011074304580688477 length of segment : 104 time for calcul the mask position with numpy : 0.0001747608184814453 nb_pixel_total : 7569 time to create 1 rle with old method : 0.00915670394897461 length of segment : 109 time for calcul the mask position with numpy : 0.0016660690307617188 nb_pixel_total : 118223 time to create 1 rle with old method : 0.13038873672485352 length of segment : 553 time for calcul the mask position with numpy : 0.0001461505889892578 nb_pixel_total : 8115 time to create 1 rle with old method : 0.009369373321533203 length of segment : 83 time for calcul the mask position with numpy : 9.632110595703125e-05 nb_pixel_total : 2114 time to create 1 rle with old method : 0.002597808837890625 length of segment : 32 time for calcul the mask position with numpy : 0.0003440380096435547 nb_pixel_total : 12019 time to create 1 rle with old method : 0.013502120971679688 length of segment : 129 time for calcul the mask position with numpy : 0.012967348098754883 nb_pixel_total : 799891 time to create 1 rle with new method : 0.019638538360595703 length of segment : 1185 time for calcul the mask position with numpy : 0.0002510547637939453 nb_pixel_total : 10120 time to create 1 rle with old method : 0.017625808715820312 length of segment : 159 time for calcul the mask position with numpy : 0.0004391670227050781 nb_pixel_total : 18813 time to create 1 rle with old method : 0.021887540817260742 length of segment : 118 time for calcul the mask position with numpy : 0.00032520294189453125 nb_pixel_total : 11556 time to create 1 rle with old method : 0.013169288635253906 length of segment : 106 time for calcul the mask position with numpy : 0.00946950912475586 nb_pixel_total : 420099 time to create 1 rle with new method : 0.021799564361572266 length of segment : 1739 time for calcul the mask position with numpy : 0.0002372264862060547 nb_pixel_total : 3297 time to create 1 rle with old method : 0.0038437843322753906 length of segment : 78 time for calcul the mask position with numpy : 0.01157522201538086 nb_pixel_total : 667617 time to create 1 rle with new method : 0.020284652709960938 length of segment : 978 time for calcul the mask position with numpy : 0.0020673274993896484 nb_pixel_total : 108156 time to create 1 rle with old method : 0.14857125282287598 length of segment : 540 time for calcul the mask position with numpy : 0.00026988983154296875 nb_pixel_total : 10596 time to create 1 rle with old method : 0.012295722961425781 length of segment : 152 time for calcul the mask position with numpy : 0.0002987384796142578 nb_pixel_total : 8545 time to create 1 rle with old method : 0.010273933410644531 length of segment : 102 time for calcul the mask position with numpy : 0.0022733211517333984 nb_pixel_total : 104651 time to create 1 rle with old method : 0.11753344535827637 length of segment : 559 time for calcul the mask position with numpy : 0.00025725364685058594 nb_pixel_total : 6429 time to create 1 rle with old method : 0.007611274719238281 length of segment : 93 time for calcul the mask position with numpy : 0.00044465065002441406 nb_pixel_total : 8565 time to create 1 rle with old method : 0.009964704513549805 length of segment : 171 time for calcul the mask position with numpy : 0.0003981590270996094 nb_pixel_total : 11507 time to create 1 rle with old method : 0.013386011123657227 length of segment : 183 time for calcul the mask position with numpy : 0.0005018711090087891 nb_pixel_total : 15777 time to create 1 rle with old method : 0.018397092819213867 length of segment : 162 time for calcul the mask position with numpy : 0.0023653507232666016 nb_pixel_total : 105034 time to create 1 rle with old method : 0.11883878707885742 length of segment : 548 time for calcul the mask position with numpy : 0.011861801147460938 nb_pixel_total : 745257 time to create 1 rle with new method : 0.019275188446044922 length of segment : 937 time for calcul the mask position with numpy : 0.00011873245239257812 nb_pixel_total : 1825 time to create 1 rle with old method : 0.002367258071899414 length of segment : 65 time for calcul the mask position with numpy : 0.0005521774291992188 nb_pixel_total : 20437 time to create 1 rle with old method : 0.023534536361694336 length of segment : 199 time for calcul the mask position with numpy : 0.00037670135498046875 nb_pixel_total : 5905 time to create 1 rle with old method : 0.006998777389526367 length of segment : 143 time for calcul the mask position with numpy : 0.0001277923583984375 nb_pixel_total : 2264 time to create 1 rle with old method : 0.00284576416015625 length of segment : 48 time for calcul the mask position with numpy : 0.00012946128845214844 nb_pixel_total : 4009 time to create 1 rle with old method : 0.004957437515258789 length of segment : 72 time for calcul the mask position with numpy : 0.00031685829162597656 nb_pixel_total : 10666 time to create 1 rle with old method : 0.012724637985229492 length of segment : 94 time for calcul the mask position with numpy : 0.018220186233520508 nb_pixel_total : 1039167 time to create 1 rle with new method : 0.033498287200927734 length of segment : 1374 time for calcul the mask position with numpy : 0.0003635883331298828 nb_pixel_total : 22451 time to create 1 rle with old method : 0.028448820114135742 length of segment : 156 time for calcul the mask position with numpy : 0.0005464553833007812 nb_pixel_total : 24165 time to create 1 rle with old method : 0.0287778377532959 length of segment : 153 time for calcul the mask position with numpy : 0.010890722274780273 nb_pixel_total : 633358 time to create 1 rle with new method : 0.02225804328918457 length of segment : 927 time for calcul the mask position with numpy : 0.011023998260498047 nb_pixel_total : 714270 time to create 1 rle with new method : 0.01609945297241211 length of segment : 926 time spent for convertir_results : 6.429731607437134 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 87 chid ids of type : 3594 Number RLEs to save : 29505 save missing photos in datou_result : time spend for datou_step_exec : 38.805996894836426 time spend to save output : 1.7746210098266602 total time spend for step 1 : 40.580617904663086 step2:crop_condition Thu Aug 14 14:31:15 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 40 ! batch 1 Loaded 87 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 24 About to insert : list_path_to_insert length 24 new photo from crops ! About to upload 24 photos upload in portfolio : 3736932 init cache_photo without model_param we have 24 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755174677_2800533 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 24 photos in the portfolio 3736932 time of upload the photos Elapsed time : 5.350218057632446 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 21 About to insert : list_path_to_insert length 21 new photo from crops ! About to upload 21 photos upload in portfolio : 3736932 init cache_photo without model_param we have 21 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755174694_2800533 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 21 photos in the portfolio 3736932 time of upload the photos Elapsed time : 4.851412773132324 we have finished the crop for the class : carton begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 3 About to insert : list_path_to_insert length 3 new photo from crops ! About to upload 3 photos upload in portfolio : 3736932 init cache_photo without model_param we have 3 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755174700_2800533 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 3 photos in the portfolio 3736932 time of upload the photos Elapsed time : 1.8875389099121094 we have finished the crop for the class : metal begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 32 About to insert : list_path_to_insert length 32 new photo from crops ! About to upload 32 photos upload in portfolio : 3736932 init cache_photo without model_param we have 32 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755174735_2800533 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 32 photos in the portfolio 3736932 time of upload the photos Elapsed time : 8.161721229553223 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 7 About to insert : list_path_to_insert length 7 new photo from crops ! About to upload 7 photos upload in portfolio : 3736932 init cache_photo without model_param we have 7 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1755174745_2800533 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 7 photos in the portfolio 3736932 time of upload the photos Elapsed time : 2.5562729835510254 we have finished the crop for the class : autre begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1376968105, 1376968022, 1376967713, 1376967712, 1376967711, 1376967708, 1376967701, 1376967694, 1376967615, 1376967613, 1376967612, 1376967611, 1376967609, 1376967606, 1376967580, 1376967577, 1376967571, 1376967383, 1376967354, 1376967324, 1376967290, 1376967244, 1376967187, 1376967072, 1376967068, 1376967066, 1376967064, 1376967061, 1376967058, 1376967001, 1376966999, 1376966998, 1376966996, 1376966974, 1376966948, 1376966947, 1376966945, 1376966941, 1376966938, 1376966934] Looping around the photos to save general results len do output : 87 /1377020687Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020688Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020689Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020691Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020692Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020693Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020694Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020695Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020696Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020697Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020698Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020699Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020700Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020701Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020702Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020703Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020704Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020705Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020706Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020707Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020708Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020709Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020710Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020711Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020733Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020735Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020736Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020737Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020738Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020739Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020740Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020741Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020742Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020743Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020744Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020745Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020746Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020747Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020748Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020749Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020750Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020751Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020752Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020753Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020754Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020756Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020757Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377020758Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021015Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021017Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021020Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021021Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021022Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021023Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021025Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021026Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021027Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021029Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021030Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021031Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021032Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021034Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021035Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021036Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021038Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021039Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021040Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021041Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021043Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021044Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021045Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021047Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021048Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021049Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021050Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021052Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021053Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021054Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021056Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021058Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021097Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021098Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021099Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021100Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021102Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021103Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1377021104Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968105', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968022', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967713', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967712', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967711', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967708', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967701', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967694', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967615', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967613', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967612', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967611', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967609', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967606', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967580', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967577', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967571', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967383', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967354', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967324', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967290', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967244', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967187', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967072', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967068', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967066', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967064', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967061', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967058', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967001', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966999', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966998', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966996', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966974', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966948', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966947', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966945', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966941', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966938', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966934', None, None, None, None, None, '3534279') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 301 time used for this insertion : 0.0329132080078125 save_final save missing photos in datou_result : time spend for datou_step_exec : 73.05336427688599 time spend to save output : 0.03644919395446777 total time spend for step 2 : 73.08981347084045 step3:rle_unique_nms_with_priority Thu Aug 14 14:32:28 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 87 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nb_obj : 6 nb_hashtags : 4 time to prepare the origin masks : 0.19641828536987305 time for calcul the mask position with numpy : 0.0356595516204834 nb_pixel_total : 1919508 time to create 1 rle with new method : 0.043669700622558594 time for calcul the mask position with numpy : 0.006525516510009766 nb_pixel_total : 12719 time to create 1 rle with old method : 0.014372587203979492 time for calcul the mask position with numpy : 0.006550312042236328 nb_pixel_total : 4416 time to create 1 rle with old method : 0.0049440860748291016 time for calcul the mask position with numpy : 0.007923603057861328 nb_pixel_total : 8356 time to create 1 rle with old method : 0.009367704391479492 time for calcul the mask position with numpy : 0.007613420486450195 nb_pixel_total : 115359 time to create 1 rle with old method : 0.15611648559570312 time for calcul the mask position with numpy : 0.006452798843383789 nb_pixel_total : 5383 time to create 1 rle with old method : 0.006013154983520508 time for calcul the mask position with numpy : 0.006185770034790039 nb_pixel_total : 7859 time to create 1 rle with old method : 0.008914709091186523 create new chi : 0.3251986503601074 time to delete rle : 0.022946596145629883 batch 1 Loaded 13 chid ids of type : 3594 +++++++Number RLEs to save : 3444 TO DO : save crop sub photo not yet done ! save time : 0.2376861572265625 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.054286956787109375 time for calcul the mask position with numpy : 0.01809859275817871 nb_pixel_total : 1334741 time to create 1 rle with new method : 0.03209209442138672 time for calcul the mask position with numpy : 0.014052152633666992 nb_pixel_total : 729861 time to create 1 rle with new method : 0.03254127502441406 time for calcul the mask position with numpy : 0.0069577693939208984 nb_pixel_total : 8998 time to create 1 rle with old method : 0.014917373657226562 create new chi : 0.1192169189453125 time to delete rle : 0.000728607177734375 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 3190 TO DO : save crop sub photo not yet done ! save time : 0.22691822052001953 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03602266311645508 time for calcul the mask position with numpy : 0.015654325485229492 nb_pixel_total : 1404983 time to create 1 rle with new method : 0.02950263023376465 time for calcul the mask position with numpy : 0.01268458366394043 nb_pixel_total : 668617 time to create 1 rle with new method : 0.03600358963012695 create new chi : 0.09425878524780273 time to delete rle : 0.0002987384796142578 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 3066 TO DO : save crop sub photo not yet done ! save time : 0.20973491668701172 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03399252891540527 time for calcul the mask position with numpy : 0.020135879516601562 nb_pixel_total : 1979682 time to create 1 rle with new method : 0.029986858367919922 time for calcul the mask position with numpy : 0.006834506988525391 nb_pixel_total : 93918 time to create 1 rle with old method : 0.10510420799255371 create new chi : 0.16241145133972168 time to delete rle : 0.0004093647003173828 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2112 TO DO : save crop sub photo not yet done ! save time : 0.15501785278320312 No data in photo_id : 1376967711 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.06333732604980469 time for calcul the mask position with numpy : 0.01910114288330078 nb_pixel_total : 1249393 time to create 1 rle with new method : 0.036364078521728516 time for calcul the mask position with numpy : 0.012801647186279297 nb_pixel_total : 806553 time to create 1 rle with new method : 0.03135180473327637 time for calcul the mask position with numpy : 0.006622791290283203 nb_pixel_total : 6596 time to create 1 rle with old method : 0.007442951202392578 time for calcul the mask position with numpy : 0.006674766540527344 nb_pixel_total : 11058 time to create 1 rle with old method : 0.012484550476074219 create new chi : 0.13356590270996094 time to delete rle : 0.0008590221405029297 batch 1 Loaded 7 chid ids of type : 3594 ++++Number RLEs to save : 4704 TO DO : save crop sub photo not yet done ! save time : 0.2959432601928711 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.0418400764465332 time for calcul the mask position with numpy : 0.024834632873535156 nb_pixel_total : 2062505 time to create 1 rle with new method : 0.029990673065185547 time for calcul the mask position with numpy : 0.0064296722412109375 nb_pixel_total : 4323 time to create 1 rle with old method : 0.0049130916595458984 time for calcul the mask position with numpy : 0.006154537200927734 nb_pixel_total : 6772 time to create 1 rle with old method : 0.00764775276184082 create new chi : 0.08025431632995605 time to delete rle : 0.0002639293670654297 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1534 TO DO : save crop sub photo not yet done ! save time : 0.13968443870544434 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.061173200607299805 time for calcul the mask position with numpy : 0.01845574378967285 nb_pixel_total : 1916191 time to create 1 rle with new method : 0.032881736755371094 time for calcul the mask position with numpy : 0.006164073944091797 nb_pixel_total : 21756 time to create 1 rle with old method : 0.024624347686767578 time for calcul the mask position with numpy : 0.006241559982299805 nb_pixel_total : 8809 time to create 1 rle with old method : 0.009912252426147461 time for calcul the mask position with numpy : 0.007120370864868164 nb_pixel_total : 116107 time to create 1 rle with old method : 0.13232660293579102 time for calcul the mask position with numpy : 0.0062410831451416016 nb_pixel_total : 10737 time to create 1 rle with old method : 0.01454019546508789 create new chi : 0.25908422470092773 time to delete rle : 0.0006105899810791016 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 3076 TO DO : save crop sub photo not yet done ! save time : 0.197951078414917 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.04500770568847656 time for calcul the mask position with numpy : 0.020245790481567383 nb_pixel_total : 2030917 time to create 1 rle with new method : 0.029420852661132812 time for calcul the mask position with numpy : 0.00625157356262207 nb_pixel_total : 23287 time to create 1 rle with old method : 0.026628971099853516 time for calcul the mask position with numpy : 0.006167888641357422 nb_pixel_total : 19396 time to create 1 rle with old method : 0.021958351135253906 create new chi : 0.11097025871276855 time to delete rle : 0.0003085136413574219 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1722 TO DO : save crop sub photo not yet done ! save time : 0.11859011650085449 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.0334012508392334 time for calcul the mask position with numpy : 0.021530628204345703 nb_pixel_total : 2062753 time to create 1 rle with new method : 0.02839827537536621 time for calcul the mask position with numpy : 0.006208181381225586 nb_pixel_total : 10847 time to create 1 rle with old method : 0.012105941772460938 create new chi : 0.0684814453125 time to delete rle : 0.00024580955505371094 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1440 TO DO : save crop sub photo not yet done ! save time : 0.11378192901611328 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03270673751831055 time for calcul the mask position with numpy : 0.018662452697753906 nb_pixel_total : 2064314 time to create 1 rle with new method : 0.02818894386291504 time for calcul the mask position with numpy : 0.005892753601074219 nb_pixel_total : 9286 time to create 1 rle with old method : 0.009976387023925781 create new chi : 0.0629425048828125 time to delete rle : 0.00022864341735839844 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1414 TO DO : save crop sub photo not yet done ! save time : 0.10666942596435547 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.031647682189941406 time for calcul the mask position with numpy : 0.019346237182617188 nb_pixel_total : 2069488 time to create 1 rle with new method : 0.030804872512817383 time for calcul the mask position with numpy : 0.006846189498901367 nb_pixel_total : 4112 time to create 1 rle with old method : 0.0057353973388671875 create new chi : 0.06293725967407227 time to delete rle : 0.00024247169494628906 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1198 TO DO : save crop sub photo not yet done ! save time : 0.1023561954498291 nb_obj : 3 nb_hashtags : 3 time to prepare the origin masks : 0.046593666076660156 time for calcul the mask position with numpy : 0.020750761032104492 nb_pixel_total : 2057243 time to create 1 rle with new method : 0.028599977493286133 time for calcul the mask position with numpy : 0.007452249526977539 nb_pixel_total : 6221 time to create 1 rle with old method : 0.007132768630981445 time for calcul the mask position with numpy : 0.00603175163269043 nb_pixel_total : 3032 time to create 1 rle with old method : 0.0034983158111572266 time for calcul the mask position with numpy : 0.006757020950317383 nb_pixel_total : 7104 time to create 1 rle with old method : 0.008083105087280273 create new chi : 0.08864808082580566 time to delete rle : 0.00027942657470703125 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 1638 TO DO : save crop sub photo not yet done ! save time : 0.12756848335266113 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03719592094421387 time for calcul the mask position with numpy : 0.020824193954467773 nb_pixel_total : 2062286 time to create 1 rle with new method : 0.029279470443725586 time for calcul the mask position with numpy : 0.006297111511230469 nb_pixel_total : 11314 time to create 1 rle with old method : 0.01266932487487793 create new chi : 0.06930780410766602 time to delete rle : 0.0002300739288330078 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1370 TO DO : save crop sub photo not yet done ! save time : 0.1148076057434082 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.05642437934875488 time for calcul the mask position with numpy : 0.01910877227783203 nb_pixel_total : 2038718 time to create 1 rle with new method : 0.029926538467407227 time for calcul the mask position with numpy : 0.007236957550048828 nb_pixel_total : 8727 time to create 1 rle with old method : 0.01631021499633789 time for calcul the mask position with numpy : 0.007018327713012695 nb_pixel_total : 2212 time to create 1 rle with old method : 0.0041484832763671875 time for calcul the mask position with numpy : 0.010260581970214844 nb_pixel_total : 13433 time to create 1 rle with old method : 0.015417098999023438 time for calcul the mask position with numpy : 0.006063222885131836 nb_pixel_total : 10510 time to create 1 rle with old method : 0.011703968048095703 create new chi : 0.1283407211303711 time to delete rle : 0.00040984153747558594 batch 1 Loaded 9 chid ids of type : 3594 ++++++Number RLEs to save : 2316 TO DO : save crop sub photo not yet done ! save time : 0.15928912162780762 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.036771535873413086 time for calcul the mask position with numpy : 0.0167849063873291 nb_pixel_total : 1302029 time to create 1 rle with new method : 0.029333829879760742 time for calcul the mask position with numpy : 0.011525630950927734 nb_pixel_total : 771571 time to create 1 rle with new method : 0.03015875816345215 create new chi : 0.0883326530456543 time to delete rle : 0.0005137920379638672 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 3058 TO DO : save crop sub photo not yet done ! save time : 0.1995258331298828 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.04294085502624512 time for calcul the mask position with numpy : 0.020799636840820312 nb_pixel_total : 2064087 time to create 1 rle with new method : 0.028196334838867188 time for calcul the mask position with numpy : 0.00600886344909668 nb_pixel_total : 4487 time to create 1 rle with old method : 0.005135774612426758 time for calcul the mask position with numpy : 0.006240129470825195 nb_pixel_total : 5026 time to create 1 rle with old method : 0.005846738815307617 create new chi : 0.07249331474304199 time to delete rle : 0.00033020973205566406 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1478 TO DO : save crop sub photo not yet done ! save time : 0.11809253692626953 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.033910274505615234 time for calcul the mask position with numpy : 0.019551992416381836 nb_pixel_total : 2062355 time to create 1 rle with new method : 0.028609514236450195 time for calcul the mask position with numpy : 0.006084442138671875 nb_pixel_total : 11245 time to create 1 rle with old method : 0.012683868408203125 create new chi : 0.06717801094055176 time to delete rle : 0.0002503395080566406 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1446 TO DO : save crop sub photo not yet done ! save time : 0.11045527458190918 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03694295883178711 time for calcul the mask position with numpy : 0.01495981216430664 nb_pixel_total : 1348411 time to create 1 rle with new method : 0.02921295166015625 time for calcul the mask position with numpy : 0.011069059371948242 nb_pixel_total : 725189 time to create 1 rle with new method : 0.028595447540283203 create new chi : 0.08424925804138184 time to delete rle : 0.00029850006103515625 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2982 TO DO : save crop sub photo not yet done ! save time : 0.2051534652709961 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.0390629768371582 time for calcul the mask position with numpy : 0.02255702018737793 nb_pixel_total : 2056658 time to create 1 rle with new method : 0.029509305953979492 time for calcul the mask position with numpy : 0.006130218505859375 nb_pixel_total : 12652 time to create 1 rle with old method : 0.014173030853271484 time for calcul the mask position with numpy : 0.006139039993286133 nb_pixel_total : 4290 time to create 1 rle with old method : 0.004882097244262695 create new chi : 0.08364987373352051 time to delete rle : 0.0003266334533691406 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1434 TO DO : save crop sub photo not yet done ! save time : 0.11981940269470215 nb_obj : 6 nb_hashtags : 2 time to prepare the origin masks : 0.08008337020874023 time for calcul the mask position with numpy : 0.016788482666015625 nb_pixel_total : 1632978 time to create 1 rle with new method : 0.03945660591125488 time for calcul the mask position with numpy : 0.006353855133056641 nb_pixel_total : 22720 time to create 1 rle with old method : 0.02587747573852539 time for calcul the mask position with numpy : 0.007035970687866211 nb_pixel_total : 94806 time to create 1 rle with old method : 0.10607004165649414 time for calcul the mask position with numpy : 0.00781869888305664 nb_pixel_total : 300036 time to create 1 rle with new method : 0.03135561943054199 time for calcul the mask position with numpy : 0.0062732696533203125 nb_pixel_total : 3103 time to create 1 rle with old method : 0.0035965442657470703 time for calcul the mask position with numpy : 0.006075620651245117 nb_pixel_total : 9253 time to create 1 rle with old method : 0.010481834411621094 time for calcul the mask position with numpy : 0.006075859069824219 nb_pixel_total : 10704 time to create 1 rle with old method : 0.012141942977905273 create new chi : 0.2885627746582031 time to delete rle : 0.0010998249053955078 batch 1 Loaded 13 chid ids of type : 3594 +++++++++++++++++++++++Number RLEs to save : 7187 TO DO : save crop sub photo not yet done ! save time : 0.4169349670410156 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03277277946472168 time for calcul the mask position with numpy : 0.019975900650024414 nb_pixel_total : 2060653 time to create 1 rle with new method : 0.028570175170898438 time for calcul the mask position with numpy : 0.0060956478118896484 nb_pixel_total : 12947 time to create 1 rle with old method : 0.01455378532409668 create new chi : 0.06943225860595703 time to delete rle : 0.00025916099548339844 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1442 TO DO : save crop sub photo not yet done ! save time : 0.11340928077697754 nb_obj : 3 nb_hashtags : 1 time to prepare the origin masks : 0.05246853828430176 time for calcul the mask position with numpy : 0.018915176391601562 nb_pixel_total : 1942392 time to create 1 rle with new method : 0.029315710067749023 time for calcul the mask position with numpy : 0.0061054229736328125 nb_pixel_total : 10931 time to create 1 rle with old method : 0.012340545654296875 time for calcul the mask position with numpy : 0.006684064865112305 nb_pixel_total : 111623 time to create 1 rle with old method : 0.1258695125579834 time for calcul the mask position with numpy : 0.0060613155364990234 nb_pixel_total : 8654 time to create 1 rle with old method : 0.009909629821777344 create new chi : 0.21564054489135742 time to delete rle : 0.0004963874816894531 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 2834 TO DO : save crop sub photo not yet done ! save time : 0.19003725051879883 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.042171478271484375 time for calcul the mask position with numpy : 0.015120506286621094 nb_pixel_total : 1349241 time to create 1 rle with new method : 0.02989339828491211 time for calcul the mask position with numpy : 0.01476287841796875 nb_pixel_total : 712756 time to create 1 rle with new method : 0.031018972396850586 time for calcul the mask position with numpy : 0.006187915802001953 nb_pixel_total : 11603 time to create 1 rle with old method : 0.01326894760131836 create new chi : 0.11069965362548828 time to delete rle : 0.0005211830139160156 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 3166 TO DO : save crop sub photo not yet done ! save time : 0.22223711013793945 nb_obj : 6 nb_hashtags : 3 time to prepare the origin masks : 0.07503247261047363 time for calcul the mask position with numpy : 0.01856064796447754 nb_pixel_total : 1914590 time to create 1 rle with new method : 0.03745865821838379 time for calcul the mask position with numpy : 0.006198406219482422 nb_pixel_total : 8115 time to create 1 rle with old method : 0.00915837287902832 time for calcul the mask position with numpy : 0.006693840026855469 nb_pixel_total : 118223 time to create 1 rle with old method : 0.1320352554321289 time for calcul the mask position with numpy : 0.00626826286315918 nb_pixel_total : 7569 time to create 1 rle with old method : 0.008521318435668945 time for calcul the mask position with numpy : 0.006030082702636719 nb_pixel_total : 9627 time to create 1 rle with old method : 0.010839462280273438 time for calcul the mask position with numpy : 0.0060236454010009766 nb_pixel_total : 10007 time to create 1 rle with old method : 0.011358499526977539 time for calcul the mask position with numpy : 0.0060884952545166016 nb_pixel_total : 5469 time to create 1 rle with old method : 0.006208658218383789 create new chi : 0.27433323860168457 time to delete rle : 0.0005693435668945312 batch 1 Loaded 13 chid ids of type : 3594 ++++++Number RLEs to save : 3324 TO DO : save crop sub photo not yet done ! save time : 0.21223020553588867 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03455829620361328 time for calcul the mask position with numpy : 0.019677400588989258 nb_pixel_total : 2071486 time to create 1 rle with new method : 0.028568744659423828 time for calcul the mask position with numpy : 0.006016969680786133 nb_pixel_total : 2114 time to create 1 rle with old method : 0.0024192333221435547 create new chi : 0.05688929557800293 time to delete rle : 0.00021505355834960938 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1144 TO DO : save crop sub photo not yet done ! save time : 0.10146903991699219 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.05205869674682617 time for calcul the mask position with numpy : 0.014544010162353516 nb_pixel_total : 1262442 time to create 1 rle with new method : 0.030094146728515625 time for calcul the mask position with numpy : 0.006357908248901367 nb_pixel_total : 9465 time to create 1 rle with old method : 0.01083064079284668 time for calcul the mask position with numpy : 0.012031793594360352 nb_pixel_total : 789674 time to create 1 rle with new method : 0.0285184383392334 time for calcul the mask position with numpy : 0.006131887435913086 nb_pixel_total : 12019 time to create 1 rle with old method : 0.013550758361816406 create new chi : 0.12261795997619629 time to delete rle : 0.0006248950958251953 batch 1 Loaded 7 chid ids of type : 3594 +++++++Number RLEs to save : 3925 TO DO : save crop sub photo not yet done ! save time : 0.25070810317993164 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.034995317459106445 time for calcul the mask position with numpy : 0.021062374114990234 nb_pixel_total : 2054787 time to create 1 rle with new method : 0.02837371826171875 time for calcul the mask position with numpy : 0.006142616271972656 nb_pixel_total : 18813 time to create 1 rle with old method : 0.02112126350402832 create new chi : 0.07692480087280273 time to delete rle : 0.00023818016052246094 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1316 TO DO : save crop sub photo not yet done ! save time : 0.11257648468017578 No data in photo_id : 1376967058 No data in photo_id : 1376967001 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.04523134231567383 time for calcul the mask position with numpy : 0.0173647403717041 nb_pixel_total : 1641945 time to create 1 rle with new method : 0.029985427856445312 time for calcul the mask position with numpy : 0.008774280548095703 nb_pixel_total : 420099 time to create 1 rle with new method : 0.028667688369750977 time for calcul the mask position with numpy : 0.006226539611816406 nb_pixel_total : 11556 time to create 1 rle with old method : 0.013135433197021484 create new chi : 0.10476350784301758 time to delete rle : 0.0007688999176025391 batch 1 Loaded 5 chid ids of type : 3594 ++++Number RLEs to save : 4770 TO DO : save crop sub photo not yet done ! save time : 0.3173859119415283 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.034020423889160156 time for calcul the mask position with numpy : 0.021133899688720703 nb_pixel_total : 2070303 time to create 1 rle with new method : 0.029667139053344727 time for calcul the mask position with numpy : 0.006191253662109375 nb_pixel_total : 3297 time to create 1 rle with old method : 0.003730297088623047 create new chi : 0.06094551086425781 time to delete rle : 0.00023818016052246094 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1236 TO DO : save crop sub photo not yet done ! save time : 0.10592412948608398 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.05627846717834473 time for calcul the mask position with numpy : 0.015850543975830078 nb_pixel_total : 1287231 time to create 1 rle with new method : 0.030214786529541016 time for calcul the mask position with numpy : 0.00597381591796875 nb_pixel_total : 10596 time to create 1 rle with old method : 0.011943340301513672 time for calcul the mask position with numpy : 0.006769657135009766 nb_pixel_total : 108156 time to create 1 rle with old method : 0.12330317497253418 time for calcul the mask position with numpy : 0.013799190521240234 nb_pixel_total : 667617 time to create 1 rle with new method : 0.028990745544433594 create new chi : 0.23746013641357422 time to delete rle : 0.0006728172302246094 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 4420 TO DO : save crop sub photo not yet done ! save time : 0.2726724147796631 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.043268680572509766 time for calcul the mask position with numpy : 0.019034147262573242 nb_pixel_total : 1960404 time to create 1 rle with new method : 0.02989649772644043 time for calcul the mask position with numpy : 0.006714582443237305 nb_pixel_total : 104651 time to create 1 rle with old method : 0.12065005302429199 time for calcul the mask position with numpy : 0.006846427917480469 nb_pixel_total : 8545 time to create 1 rle with old method : 0.009593725204467773 create new chi : 0.19316673278808594 time to delete rle : 0.0005345344543457031 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 2402 TO DO : save crop sub photo not yet done ! save time : 0.1654644012451172 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.04131960868835449 time for calcul the mask position with numpy : 0.020015478134155273 nb_pixel_total : 2058606 time to create 1 rle with new method : 0.030322790145874023 time for calcul the mask position with numpy : 0.006579160690307617 nb_pixel_total : 8565 time to create 1 rle with old method : 0.009733915328979492 time for calcul the mask position with numpy : 0.006085395812988281 nb_pixel_total : 6429 time to create 1 rle with old method : 0.007234811782836914 create new chi : 0.0804286003112793 time to delete rle : 0.0002837181091308594 batch 1 Loaded 5 chid ids of type : 3594 +++Number RLEs to save : 1608 TO DO : save crop sub photo not yet done ! save time : 0.11932682991027832 nb_obj : 5 nb_hashtags : 2 time to prepare the origin masks : 0.07290124893188477 time for calcul the mask position with numpy : 0.01415872573852539 nb_pixel_total : 1194200 time to create 1 rle with new method : 0.03397035598754883 time for calcul the mask position with numpy : 0.006294965744018555 nb_pixel_total : 1825 time to create 1 rle with old method : 0.002125978469848633 time for calcul the mask position with numpy : 0.011678934097290039 nb_pixel_total : 745257 time to create 1 rle with new method : 0.027996301651000977 time for calcul the mask position with numpy : 0.006617546081542969 nb_pixel_total : 105034 time to create 1 rle with old method : 0.1167752742767334 time for calcul the mask position with numpy : 0.006499767303466797 nb_pixel_total : 15777 time to create 1 rle with old method : 0.0177609920501709 time for calcul the mask position with numpy : 0.006247997283935547 nb_pixel_total : 11507 time to create 1 rle with old method : 0.013049602508544922 create new chi : 0.26389408111572266 time to delete rle : 0.0007593631744384766 batch 1 Loaded 11 chid ids of type : 3594 +++++++Number RLEs to save : 4870 TO DO : save crop sub photo not yet done ! save time : 0.29460883140563965 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03328394889831543 time for calcul the mask position with numpy : 0.019370555877685547 nb_pixel_total : 2053163 time to create 1 rle with new method : 0.028596878051757812 time for calcul the mask position with numpy : 0.006136178970336914 nb_pixel_total : 20437 time to create 1 rle with old method : 0.023101329803466797 create new chi : 0.0774540901184082 time to delete rle : 0.0002601146697998047 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1478 TO DO : save crop sub photo not yet done ! save time : 0.11583423614501953 nb_obj : 5 nb_hashtags : 3 time to prepare the origin masks : 0.07475066184997559 time for calcul the mask position with numpy : 0.013393163681030273 nb_pixel_total : 1017924 time to create 1 rle with new method : 0.03146982192993164 time for calcul the mask position with numpy : 0.013184547424316406 nb_pixel_total : 1032832 time to create 1 rle with new method : 0.031986236572265625 time for calcul the mask position with numpy : 0.0061397552490234375 nb_pixel_total : 10666 time to create 1 rle with old method : 0.011914491653442383 time for calcul the mask position with numpy : 0.006079435348510742 nb_pixel_total : 4009 time to create 1 rle with old method : 0.004561185836791992 time for calcul the mask position with numpy : 0.006029605865478516 nb_pixel_total : 2264 time to create 1 rle with old method : 0.0025649070739746094 time for calcul the mask position with numpy : 0.0060617923736572266 nb_pixel_total : 5905 time to create 1 rle with old method : 0.006637096405029297 create new chi : 0.14140582084655762 time to delete rle : 0.0007491111755371094 batch 1 Loaded 12 chid ids of type : 3594 ++++++Number RLEs to save : 4533 TO DO : save crop sub photo not yet done ! save time : 0.2824282646179199 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.04505419731140137 time for calcul the mask position with numpy : 0.015933513641357422 nb_pixel_total : 1416077 time to create 1 rle with new method : 0.02940821647644043 time for calcul the mask position with numpy : 0.010523557662963867 nb_pixel_total : 633358 time to create 1 rle with new method : 0.028376102447509766 time for calcul the mask position with numpy : 0.00615692138671875 nb_pixel_total : 24165 time to create 1 rle with old method : 0.02728414535522461 create new chi : 0.11814117431640625 time to delete rle : 0.0005469322204589844 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 3240 TO DO : save crop sub photo not yet done ! save time : 0.21491098403930664 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03472447395324707 time for calcul the mask position with numpy : 0.015523910522460938 nb_pixel_total : 1359330 time to create 1 rle with new method : 0.029147624969482422 time for calcul the mask position with numpy : 0.011187314987182617 nb_pixel_total : 714270 time to create 1 rle with new method : 0.02860713005065918 create new chi : 0.0848703384399414 time to delete rle : 0.0002880096435546875 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2932 TO DO : save crop sub photo not yet done ! save time : 0.2023317813873291 map_output_result : {1376968105: (0.0, 'Should be the crop_list due to order', 0), 1376968022: (0.0, 'Should be the crop_list due to order', 0), 1376967713: (0.0, 'Should be the crop_list due to order', 0), 1376967712: (0.0, 'Should be the crop_list due to order', 0), 1376967711: (0.0, 'Should be the crop_list due to order', 0.0), 1376967708: (0.0, 'Should be the crop_list due to order', 0), 1376967701: (0.0, 'Should be the crop_list due to order', 0), 1376967694: (0.0, 'Should be the crop_list due to order', 0), 1376967615: (0.0, 'Should be the crop_list due to order', 0), 1376967613: (0.0, 'Should be the crop_list due to order', 0), 1376967612: (0.0, 'Should be the crop_list due to order', 0), 1376967611: (0.0, 'Should be the crop_list due to order', 0), 1376967609: (0.0, 'Should be the crop_list due to order', 0), 1376967606: (0.0, 'Should be the crop_list due to order', 0), 1376967580: (0.0, 'Should be the crop_list due to order', 0), 1376967577: (0.0, 'Should be the crop_list due to order', 0), 1376967571: (0.0, 'Should be the crop_list due to order', 0), 1376967383: (0.0, 'Should be the crop_list due to order', 0), 1376967354: (0.0, 'Should be the crop_list due to order', 0), 1376967324: (0.0, 'Should be the crop_list due to order', 0), 1376967290: (0.0, 'Should be the crop_list due to order', 0), 1376967244: (0.0, 'Should be the crop_list due to order', 0), 1376967187: (0.0, 'Should be the crop_list due to order', 0), 1376967072: (0.0, 'Should be the crop_list due to order', 0), 1376967068: (0.0, 'Should be the crop_list due to order', 0), 1376967066: (0.0, 'Should be the crop_list due to order', 0), 1376967064: (0.0, 'Should be the crop_list due to order', 0), 1376967061: (0.0, 'Should be the crop_list due to order', 0), 1376967058: (0.0, 'Should be the crop_list due to order', 0.0), 1376967001: (0.0, 'Should be the crop_list due to order', 0.0), 1376966999: (0.0, 'Should be the crop_list due to order', 0), 1376966998: (0.0, 'Should be the crop_list due to order', 0), 1376966996: (0.0, 'Should be the crop_list due to order', 0), 1376966974: (0.0, 'Should be the crop_list due to order', 0), 1376966948: (0.0, 'Should be the crop_list due to order', 0), 1376966947: (0.0, 'Should be the crop_list due to order', 0), 1376966945: (0.0, 'Should be the crop_list due to order', 0), 1376966941: (0.0, 'Should be the crop_list due to order', 0), 1376966938: (0.0, 'Should be the crop_list due to order', 0), 1376966934: (0.0, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1376968105, 1376968022, 1376967713, 1376967712, 1376967711, 1376967708, 1376967701, 1376967694, 1376967615, 1376967613, 1376967612, 1376967611, 1376967609, 1376967606, 1376967580, 1376967577, 1376967571, 1376967383, 1376967354, 1376967324, 1376967290, 1376967244, 1376967187, 1376967072, 1376967068, 1376967066, 1376967064, 1376967061, 1376967058, 1376967001, 1376966999, 1376966998, 1376966996, 1376966974, 1376966948, 1376966947, 1376966945, 1376966941, 1376966938, 1376966934] Looping around the photos to save general results len do output : 40 /1376968105.Didn't retrieve data . /1376968022.Didn't retrieve data . /1376967713.Didn't retrieve data . /1376967712.Didn't retrieve data . /1376967711.Didn't retrieve data . /1376967708.Didn't retrieve data . /1376967701.Didn't retrieve data . /1376967694.Didn't retrieve data . /1376967615.Didn't retrieve data . /1376967613.Didn't retrieve data . /1376967612.Didn't retrieve data . /1376967611.Didn't retrieve data . /1376967609.Didn't retrieve data . /1376967606.Didn't retrieve data . /1376967580.Didn't retrieve data . /1376967577.Didn't retrieve data . /1376967571.Didn't retrieve data . /1376967383.Didn't retrieve data . /1376967354.Didn't retrieve data . /1376967324.Didn't retrieve data . /1376967290.Didn't retrieve data . /1376967244.Didn't retrieve data . /1376967187.Didn't retrieve data . /1376967072.Didn't retrieve data . /1376967068.Didn't retrieve data . /1376967066.Didn't retrieve data . /1376967064.Didn't retrieve data . /1376967061.Didn't retrieve data . /1376967058.Didn't retrieve data . /1376967001.Didn't retrieve data . /1376966999.Didn't retrieve data . /1376966998.Didn't retrieve data . /1376966996.Didn't retrieve data . /1376966974.Didn't retrieve data . /1376966948.Didn't retrieve data . /1376966947.Didn't retrieve data . /1376966945.Didn't retrieve data . /1376966941.Didn't retrieve data . /1376966938.Didn't retrieve data . /1376966934.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968105', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968022', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967713', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967712', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967711', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967708', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967701', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967694', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967615', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967613', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967612', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967611', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967609', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967606', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967580', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967577', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967571', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967383', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967354', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967324', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967290', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967244', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967187', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967072', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967068', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967066', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967064', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967061', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967058', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967001', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966999', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966998', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966996', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966974', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966948', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966947', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966945', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966941', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966938', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966934', None, None, None, None, None, '3534279') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 120 time used for this insertion : 0.018778562545776367 save_final save missing photos in datou_result : time spend for datou_step_exec : 14.428518295288086 time spend to save output : 0.020426273345947266 total time spend for step 3 : 14.448944568634033 step4:ventilate_hashtags_in_portfolio Thu Aug 14 14:32:42 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 25980071 get user id for portfolio 25980071 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25980071 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','flou','environnement','autre','pet_fonce','metal','pehd','background','carton','pet_clair','mal_croppe')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25980071 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','flou','environnement','autre','pet_fonce','metal','pehd','background','carton','pet_clair','mal_croppe')) AND mptpi.`min_score`=0.5 To do To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25980071 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','flou','environnement','autre','pet_fonce','metal','pehd','background','carton','pet_clair','mal_croppe')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://www.fotonower.com/velours/25984950,25984951,25984953,25984954,25984955,25984956,25984957,25984958,25984959,25984960,25984961?tags=papier,flou,environnement,autre,pet_fonce,metal,pehd,background,carton,pet_clair,mal_croppe Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1376968105, 1376968022, 1376967713, 1376967712, 1376967711, 1376967708, 1376967701, 1376967694, 1376967615, 1376967613, 1376967612, 1376967611, 1376967609, 1376967606, 1376967580, 1376967577, 1376967571, 1376967383, 1376967354, 1376967324, 1376967290, 1376967244, 1376967187, 1376967072, 1376967068, 1376967066, 1376967064, 1376967061, 1376967058, 1376967001, 1376966999, 1376966998, 1376966996, 1376966974, 1376966948, 1376966947, 1376966945, 1376966941, 1376966938, 1376966934] Looping around the photos to save general results len do output : 1 /25980071. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968105', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968022', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967713', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967712', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967711', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967708', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967701', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967694', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967615', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967613', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967612', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967611', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967609', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967606', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967580', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967577', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967571', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967383', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967354', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967324', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967290', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967244', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967187', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967072', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967068', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967066', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967064', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967061', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967058', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967001', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966999', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966998', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966996', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966974', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966948', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966947', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966945', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966941', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966938', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966934', None, None, None, None, None, '3534279') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 41 time used for this insertion : 0.017200231552124023 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.6450517177581787 time spend to save output : 0.017613887786865234 total time spend for step 4 : 0.662665605545044 step5:final Thu Aug 14 14:32:43 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1376968105: ('0.1361061197916667',), 1376968022: ('0.1361061197916667',), 1376967713: ('0.1361061197916667',), 1376967712: ('0.1361061197916667',), 1376967711: ('0.1361061197916667',), 1376967708: ('0.1361061197916667',), 1376967701: ('0.1361061197916667',), 1376967694: ('0.1361061197916667',), 1376967615: ('0.1361061197916667',), 1376967613: ('0.1361061197916667',), 1376967612: ('0.1361061197916667',), 1376967611: ('0.1361061197916667',), 1376967609: ('0.1361061197916667',), 1376967606: ('0.1361061197916667',), 1376967580: ('0.1361061197916667',), 1376967577: ('0.1361061197916667',), 1376967571: ('0.1361061197916667',), 1376967383: ('0.1361061197916667',), 1376967354: ('0.1361061197916667',), 1376967324: ('0.1361061197916667',), 1376967290: ('0.1361061197916667',), 1376967244: ('0.1361061197916667',), 1376967187: ('0.1361061197916667',), 1376967072: ('0.1361061197916667',), 1376967068: ('0.1361061197916667',), 1376967066: ('0.1361061197916667',), 1376967064: ('0.1361061197916667',), 1376967061: ('0.1361061197916667',), 1376967058: ('0.1361061197916667',), 1376967001: ('0.1361061197916667',), 1376966999: ('0.1361061197916667',), 1376966998: ('0.1361061197916667',), 1376966996: ('0.1361061197916667',), 1376966974: ('0.1361061197916667',), 1376966948: ('0.1361061197916667',), 1376966947: ('0.1361061197916667',), 1376966945: ('0.1361061197916667',), 1376966941: ('0.1361061197916667',), 1376966938: ('0.1361061197916667',), 1376966934: ('0.1361061197916667',)} new output for save of step final : {1376968105: ('0.1361061197916667',), 1376968022: ('0.1361061197916667',), 1376967713: ('0.1361061197916667',), 1376967712: ('0.1361061197916667',), 1376967711: ('0.1361061197916667',), 1376967708: ('0.1361061197916667',), 1376967701: ('0.1361061197916667',), 1376967694: ('0.1361061197916667',), 1376967615: ('0.1361061197916667',), 1376967613: ('0.1361061197916667',), 1376967612: ('0.1361061197916667',), 1376967611: ('0.1361061197916667',), 1376967609: ('0.1361061197916667',), 1376967606: ('0.1361061197916667',), 1376967580: ('0.1361061197916667',), 1376967577: ('0.1361061197916667',), 1376967571: ('0.1361061197916667',), 1376967383: ('0.1361061197916667',), 1376967354: ('0.1361061197916667',), 1376967324: ('0.1361061197916667',), 1376967290: ('0.1361061197916667',), 1376967244: ('0.1361061197916667',), 1376967187: ('0.1361061197916667',), 1376967072: ('0.1361061197916667',), 1376967068: ('0.1361061197916667',), 1376967066: ('0.1361061197916667',), 1376967064: ('0.1361061197916667',), 1376967061: ('0.1361061197916667',), 1376967058: ('0.1361061197916667',), 1376967001: ('0.1361061197916667',), 1376966999: ('0.1361061197916667',), 1376966998: ('0.1361061197916667',), 1376966996: ('0.1361061197916667',), 1376966974: ('0.1361061197916667',), 1376966948: ('0.1361061197916667',), 1376966947: ('0.1361061197916667',), 1376966945: ('0.1361061197916667',), 1376966941: ('0.1361061197916667',), 1376966938: ('0.1361061197916667',), 1376966934: ('0.1361061197916667',)} [1376968105, 1376968022, 1376967713, 1376967712, 1376967711, 1376967708, 1376967701, 1376967694, 1376967615, 1376967613, 1376967612, 1376967611, 1376967609, 1376967606, 1376967580, 1376967577, 1376967571, 1376967383, 1376967354, 1376967324, 1376967290, 1376967244, 1376967187, 1376967072, 1376967068, 1376967066, 1376967064, 1376967061, 1376967058, 1376967001, 1376966999, 1376966998, 1376966996, 1376966974, 1376966948, 1376966947, 1376966945, 1376966941, 1376966938, 1376966934] Looping around the photos to save general results len do output : 40 /1376968105.Didn't retrieve data . /1376968022.Didn't retrieve data . /1376967713.Didn't retrieve data . /1376967712.Didn't retrieve data . /1376967711.Didn't retrieve data . /1376967708.Didn't retrieve data . /1376967701.Didn't retrieve data . /1376967694.Didn't retrieve data . /1376967615.Didn't retrieve data . /1376967613.Didn't retrieve data . /1376967612.Didn't retrieve data . /1376967611.Didn't retrieve data . /1376967609.Didn't retrieve data . /1376967606.Didn't retrieve data . /1376967580.Didn't retrieve data . /1376967577.Didn't retrieve data . /1376967571.Didn't retrieve data . /1376967383.Didn't retrieve data . /1376967354.Didn't retrieve data . /1376967324.Didn't retrieve data . /1376967290.Didn't retrieve data . /1376967244.Didn't retrieve data . /1376967187.Didn't retrieve data . /1376967072.Didn't retrieve data . /1376967068.Didn't retrieve data . /1376967066.Didn't retrieve data . /1376967064.Didn't retrieve data . /1376967061.Didn't retrieve data . /1376967058.Didn't retrieve data . /1376967001.Didn't retrieve data . /1376966999.Didn't retrieve data . /1376966998.Didn't retrieve data . /1376966996.Didn't retrieve data . /1376966974.Didn't retrieve data . /1376966948.Didn't retrieve data . /1376966947.Didn't retrieve data . /1376966945.Didn't retrieve data . /1376966941.Didn't retrieve data . /1376966938.Didn't retrieve data . /1376966934.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968105', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968022', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967713', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967712', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967711', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967708', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967701', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967694', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967615', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967613', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967612', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967611', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967609', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967606', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967580', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967577', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967571', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967383', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967354', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967324', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967290', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967244', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967187', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967072', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967068', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967066', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967064', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967061', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967058', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967001', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966999', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966998', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966996', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966974', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966948', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966947', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966945', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966941', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966938', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966934', None, None, None, None, None, '3534279') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 120 time used for this insertion : 0.01761007308959961 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.12633705139160156 time spend to save output : 0.019269227981567383 total time spend for step 5 : 0.14560627937316895 step6:blur_detection Thu Aug 14 14:32:43 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f.jpg resize: (1080, 1920) 1376968105 -1.453735304071575 treat image : temp/1755174629_2800533_1376968022_f313d20332dc08dcd360591a99a4b476.jpg resize: (1080, 1920) 1376968022 -2.661230618419301 treat image : temp/1755174629_2800533_1376967713_693235fd1c79c88850b0361f0e368ad9.jpg resize: (1080, 1920) 1376967713 -2.4409175604540785 treat image : temp/1755174629_2800533_1376967712_2b4885e1dd93aef5bc05fa9d06ac40d1.jpg resize: (1080, 1920) 1376967712 -3.0380921889118992 treat image : temp/1755174629_2800533_1376967711_95b76cb1fe72a0edc0612b9dd34fd287.jpg resize: (1080, 1920) 1376967711 -3.0109384677739075 treat image : temp/1755174629_2800533_1376967708_aaa974d76523b7c697305e67b25613f6.jpg resize: (1080, 1920) 1376967708 -1.9809111775718573 treat image : temp/1755174629_2800533_1376967701_eb66dc1966882ea8c820aadee130ac4e.jpg resize: (1080, 1920) 1376967701 -2.664499687099964 treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6.jpg resize: (1080, 1920) 1376967694 -4.413156449529452 treat image : temp/1755174629_2800533_1376967615_c8f268649f3407109bac7e25a1e06538.jpg resize: (1080, 1920) 1376967615 -2.776490626478827 treat image : temp/1755174629_2800533_1376967613_47b7ceb17e52edefddad4c73cabebfd9.jpg resize: (1080, 1920) 1376967613 -2.3457003088601494 treat image : temp/1755174629_2800533_1376967612_963973cd7710b680d54ba67b8889ce1d.jpg resize: (1080, 1920) 1376967612 -1.1338895182667097 treat image : temp/1755174629_2800533_1376967611_1f4e7f7b0de901ce5347392d0908a548.jpg resize: (1080, 1920) 1376967611 -1.9656136340277102 treat image : temp/1755174629_2800533_1376967609_f4b96a0a5c83248dbbcdce57132dfb0d.jpg resize: (1080, 1920) 1376967609 -2.3606115399846646 treat image : temp/1755174629_2800533_1376967606_f98bde0b3995eccf7042648034508a44.jpg resize: (1080, 1920) 1376967606 -2.352950231696972 treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b.jpg resize: (1080, 1920) 1376967580 -2.429872833206447 treat image : temp/1755174629_2800533_1376967577_a9864c893ced678500274e71b30a5360.jpg resize: (1080, 1920) 1376967577 -2.266441441402648 treat image : temp/1755174629_2800533_1376967571_2cea1f5a4bada9b1c909d480806cc019.jpg resize: (1080, 1920) 1376967571 -2.3935238007511015 treat image : temp/1755174629_2800533_1376967383_e076f6e97ca804e645b84ad71e7312f9.jpg resize: (1080, 1920) 1376967383 -1.697818437764015 treat image : temp/1755174629_2800533_1376967354_15453e2b97ced75c2908c9e5ca11e48f.jpg resize: (1080, 1920) 1376967354 -1.6143165640722696 treat image : temp/1755174629_2800533_1376967324_2ae0f24a476159432199511ab2d2ccf0.jpg resize: (1080, 1920) 1376967324 -1.0940294525079512 treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7.jpg resize: (1080, 1920) 1376967290 -2.257495127438786 treat image : temp/1755174629_2800533_1376967244_ef77b6c2bb2aa5ba629fd91f3c357180.jpg resize: (1080, 1920) 1376967244 -2.84722052704298 treat image : temp/1755174629_2800533_1376967187_b71b898d817f3211a085b297660b5a3b.jpg resize: (1080, 1920) 1376967187 -4.292563641241433 treat image : temp/1755174629_2800533_1376967072_54e785443d557e0ffc21ab9fa9f69425.jpg resize: (1080, 1920) 1376967072 -2.265135240729659 treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df.jpg resize: (1080, 1920) 1376967068 -1.4576169632143088 treat image : temp/1755174629_2800533_1376967066_3d0c31e98a95d056472313792460d8a2.jpg resize: (1080, 1920) 1376967066 -2.9852882386654924 treat image : temp/1755174629_2800533_1376967064_9ba47ea687a1b34b3efa3a6f3a59dbf9.jpg resize: (1080, 1920) 1376967064 -1.4880208374713462 treat image : temp/1755174629_2800533_1376967061_3fbc2d5f59b3d1902877d91a251bcb66.jpg resize: (1080, 1920) 1376967061 -0.4114915100108082 treat image : temp/1755174629_2800533_1376967058_8ee3cc496cbc70fe3c676b7b4b7fd44a.jpg resize: (1080, 1920) 1376967058 -2.3487997506666525 treat image : temp/1755174629_2800533_1376967001_188793f947e1a748ae66dca37bf3e3cc.jpg resize: (1080, 1920) 1376967001 -2.3866468226153006 treat image : temp/1755174629_2800533_1376966999_a3d02769b25bd5679e11956252246fa1.jpg resize: (1080, 1920) 1376966999 -2.1292307297048993 treat image : temp/1755174629_2800533_1376966998_fa17c97c278cf4a830a66ba7b5b553aa.jpg resize: (1080, 1920) 1376966998 -2.407039463449419 treat image : temp/1755174629_2800533_1376966996_f7f820771aed9eb3af35f475f92cdb71.jpg resize: (1080, 1920) 1376966996 -4.393763214925184 treat image : temp/1755174629_2800533_1376966974_38327837cae519d84840e61806597c03.jpg resize: (1080, 1920) 1376966974 -1.5812874523644398 treat image : temp/1755174629_2800533_1376966948_8581d11cdeb706fdb41db20e73d90651.jpg resize: (1080, 1920) 1376966948 -1.62795888676795 treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1.jpg resize: (1080, 1920) 1376966947 -2.426540078797956 treat image : temp/1755174629_2800533_1376966945_24c0dcafd8e6aa3caaad992333857e85.jpg resize: (1080, 1920) 1376966945 -2.317180179038744 treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc.jpg resize: (1080, 1920) 1376966941 -2.2063045784716313 treat image : temp/1755174629_2800533_1376966938_0ecfa41dc4e05718d92d61701b0aee63.jpg resize: (1080, 1920) 1376966938 -3.042428773057051 treat image : temp/1755174629_2800533_1376966934_9e74f9babde629b32edc9db8aa39f27e.jpg resize: (1080, 1920) 1376966934 -2.636886526870398 treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814054_0.png resize: (123, 90) 1377020687 -1.971878477421054 treat image : temp/1755174629_2800533_1376967708_aaa974d76523b7c697305e67b25613f6_rle_crop_3914814063_0.png resize: (175, 110) 1377020688 -0.5050128793142309 treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6_rle_crop_3914814070_0.png resize: (118, 98) 1377020689 -0.837099927941128 treat image : temp/1755174629_2800533_1376967613_47b7ceb17e52edefddad4c73cabebfd9_rle_crop_3914814074_0.png resize: (179, 110) 1377020691 -0.5035705154722404 treat image : temp/1755174629_2800533_1376967612_963973cd7710b680d54ba67b8889ce1d_rle_crop_3914814075_0.png resize: (167, 102) 1377020692 -0.543354594782242 treat image : temp/1755174629_2800533_1376967609_f4b96a0a5c83248dbbcdce57132dfb0d_rle_crop_3914814079_0.png resize: (100, 100) 1377020693 -3.0663613447646223 treat image : temp/1755174629_2800533_1376967606_f98bde0b3995eccf7042648034508a44_rle_crop_3914814080_0.png resize: (145, 114) 1377020694 -1.5938420275066496 treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b_rle_crop_3914814084_0.png resize: (204, 117) 1377020695 -2.9093568440980686 treat image : temp/1755174629_2800533_1376967571_2cea1f5a4bada9b1c909d480806cc019_rle_crop_3914814086_0.png resize: (83, 87) 1377020696 -5.640422033333365 treat image : temp/1755174629_2800533_1376967571_2cea1f5a4bada9b1c909d480806cc019_rle_crop_3914814087_0.png resize: (116, 97) 1377020697 -2.2772193346794345 treat image : temp/1755174629_2800533_1376967383_e076f6e97ca804e645b84ad71e7312f9_rle_crop_3914814088_0.png resize: (182, 115) 1377020698 -0.5821244718729226 treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814092_0.png resize: (173, 88) 1377020699 -3.745166092927463 treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814093_0.png resize: (113, 125) 1377020700 -1.2755236305395286 treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814096_0.png resize: (534, 357) 1377020701 -0.8857354170195775 treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814097_0.png resize: (141, 341) 1377020702 -3.1366675857271256 treat image : temp/1755174629_2800533_1376967244_ef77b6c2bb2aa5ba629fd91f3c357180_rle_crop_3914814098_0.png resize: (180, 124) 1377020703 -0.45180585350847186 treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814104_0.png resize: (109, 81) 1377020704 -2.0154822410930633 treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814105_0.png resize: (152, 97) 1377020705 -1.3646153060819233 treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814107_0.png resize: (108, 97) 1377020706 -0.6533666894418433 treat image : temp/1755174629_2800533_1376966948_8581d11cdeb706fdb41db20e73d90651_rle_crop_3914814124_0.png resize: (171, 106) 1377020707 -1.8939591174846235 treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814125_0.png resize: (132, 136) 1377020708 -2.3620451831249722 treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814129_0.png resize: (60, 68) 1377020709 -3.169788849212653 treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814131_0.png resize: (143, 71) 1377020710 -1.8829119224397166 treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814133_0.png resize: (72, 82) 1377020711 -1.4383986930455095 treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814053_0.png resize: (113, 100) 1377020733 -1.1595502344782131 treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814056_0.png resize: (121, 91) 1377020735 -0.5842986691943917 treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814058_0.png resize: (186, 123) 1377020736 -2.0674281485508264 treat image : temp/1755174629_2800533_1376967713_693235fd1c79c88850b0361f0e368ad9_rle_crop_3914814061_0.png resize: (993, 953) 1377020737 -0.28468223238475965 treat image : temp/1755174629_2800533_1376967701_eb66dc1966882ea8c820aadee130ac4e_rle_crop_3914814066_0.png resize: (110, 83) 1377020738 -0.7141232674337176 treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6_rle_crop_3914814068_0.png resize: (184, 92) 1377020739 -1.8240506171303097 treat image : temp/1755174629_2800533_1376967609_f4b96a0a5c83248dbbcdce57132dfb0d_rle_crop_3914814077_0.png resize: (119, 106) 1377020740 -1.8737821276781104 treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b_rle_crop_3914814081_0.png resize: (128, 108) 1377020741 -0.8139571187661172 treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b_rle_crop_3914814082_0.png resize: (145, 113) 1377020742 -0.1168525667524323 treat image : temp/1755174629_2800533_1376967324_2ae0f24a476159432199511ab2d2ccf0_rle_crop_3914814090_0.png resize: (70, 95) 1377020743 -1.8153143414893445 treat image : temp/1755174629_2800533_1376967072_54e785443d557e0ffc21ab9fa9f69425_rle_crop_3914814102_0.png resize: (131, 110) 1377020744 -0.05586473537183862 treat image : temp/1755174629_2800533_1376967064_9ba47ea687a1b34b3efa3a6f3a59dbf9_rle_crop_3914814111_0.png resize: (129, 114) 1377020745 0.2276626340010066 treat image : temp/1755174629_2800533_1376967064_9ba47ea687a1b34b3efa3a6f3a59dbf9_rle_crop_3914814113_0.png resize: (138, 107) 1377020746 -1.9619095851680497 treat image : temp/1755174629_2800533_1376967061_3fbc2d5f59b3d1902877d91a251bcb66_rle_crop_3914814114_0.png resize: (118, 200) 1377020747 -0.8295818631660534 treat image : temp/1755174629_2800533_1376966999_a3d02769b25bd5679e11956252246fa1_rle_crop_3914814115_0.png resize: (105, 141) 1377020748 -1.5767045927165486 treat image : temp/1755174629_2800533_1376966999_a3d02769b25bd5679e11956252246fa1_rle_crop_3914814116_0.png resize: (983, 872) 1377020749 -0.9282915920171746 treat image : temp/1755174629_2800533_1376966996_f7f820771aed9eb3af35f475f92cdb71_rle_crop_3914814118_0.png resize: (970, 981) 1377020750 -0.7219852819384148 treat image : temp/1755174629_2800533_1376966996_f7f820771aed9eb3af35f475f92cdb71_rle_crop_3914814120_0.png resize: (152, 98) 1377020751 -1.1040054246356557 treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814136_0.png resize: (150, 221) 1377020752 -1.615919688070128 treat image : temp/1755174629_2800533_1376966938_0ecfa41dc4e05718d92d61701b0aee63_rle_crop_3914814137_0.png resize: (140, 241) 1377020753 -1.4634166339572026 treat image : temp/1755174629_2800533_1376966938_0ecfa41dc4e05718d92d61701b0aee63_rle_crop_3914814138_0.png resize: (927, 940) 1377020754 -0.13841860233891484 treat image : temp/1755174629_2800533_1376967611_1f4e7f7b0de901ce5347392d0908a548_rle_crop_3914814076_0.png resize: (59, 92) 1377020756 -0.7798888845467882 treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814109_0.png resize: (83, 132) 1377020757 -1.4413823207039795 treat image : temp/1755174629_2800533_1376967066_3d0c31e98a95d056472313792460d8a2_rle_crop_3914814110_0.png resize: (32, 85) 1377020758 -3.3625415570252586 treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814055_0.png resize: (539, 352) 1377021015 0.42534329928027353 treat image : temp/1755174629_2800533_1376968022_f313d20332dc08dcd360591a99a4b476_rle_crop_3914814059_0.png resize: (78, 168) 1377021017 -1.835949168269624 treat image : temp/1755174629_2800533_1376968022_f313d20332dc08dcd360591a99a4b476_rle_crop_3914814060_0.png resize: (963, 1042) 1377021020 -0.3630098535712459 treat image : temp/1755174629_2800533_1376967712_2b4885e1dd93aef5bc05fa9d06ac40d1_rle_crop_3914814062_0.png resize: (505, 357) 1377021021 -0.4249356195762827 treat image : temp/1755174629_2800533_1376967708_aaa974d76523b7c697305e67b25613f6_rle_crop_3914814064_0.png resize: (70, 141) 1377021022 -1.8763576100022747 treat image : temp/1755174629_2800533_1376967708_aaa974d76523b7c697305e67b25613f6_rle_crop_3914814065_0.png resize: (848, 1404) 1377021023 -1.5976319112378254 treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6_rle_crop_3914814069_0.png resize: (535, 336) 1377021025 0.14300433382990474 treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6_rle_crop_3914814071_0.png resize: (135, 231) 1377021026 -3.723257907000542 treat image : temp/1755174629_2800533_1376967615_c8f268649f3407109bac7e25a1e06538_rle_crop_3914814072_0.png resize: (168, 167) 1377021027 -3.87539194858093 treat image : temp/1755174629_2800533_1376967615_c8f268649f3407109bac7e25a1e06538_rle_crop_3914814073_0.png resize: (153, 197) 1377021029 -2.533409714311139 treat image : temp/1755174629_2800533_1376967577_a9864c893ced678500274e71b30a5360_rle_crop_3914814085_0.png resize: (987, 1049) 1377021030 0.1646212592964427 treat image : temp/1755174629_2800533_1376967354_15453e2b97ced75c2908c9e5ca11e48f_rle_crop_3914814089_0.png resize: (948, 1017) 1377021031 0.5529107952031769 treat image : temp/1755174629_2800533_1376967324_2ae0f24a476159432199511ab2d2ccf0_rle_crop_3914814091_0.png resize: (100, 149) 1377021032 0.844561469091031 treat image : temp/1755174629_2800533_1376967187_b71b898d817f3211a085b297660b5a3b_rle_crop_3914814099_0.png resize: (135, 119) 1377021034 -3.760039016106375 treat image : temp/1755174629_2800533_1376967187_b71b898d817f3211a085b297660b5a3b_rle_crop_3914814100_0.png resize: (545, 362) 1377021035 -0.3592727701902981 treat image : temp/1755174629_2800533_1376967187_b71b898d817f3211a085b297660b5a3b_rle_crop_3914814101_0.png resize: (187, 94) 1377021036 -0.33453279447526635 treat image : temp/1755174629_2800533_1376967072_54e785443d557e0ffc21ab9fa9f69425_rle_crop_3914814103_0.png resize: (911, 1064) 1377021038 -0.38751629917552005 treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814106_0.png resize: (103, 119) 1377021039 -1.2192866564606382 treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814108_0.png resize: (545, 378) 1377021040 0.3206869111329111 treat image : temp/1755174629_2800533_1376967064_9ba47ea687a1b34b3efa3a6f3a59dbf9_rle_crop_3914814112_0.png resize: (980, 1180) 1377021041 -0.025208208181409033 treat image : temp/1755174629_2800533_1376966998_fa17c97c278cf4a830a66ba7b5b553aa_rle_crop_3914814117_0.png resize: (78, 55) 1377021043 -0.635952101871098 treat image : temp/1755174629_2800533_1376966996_f7f820771aed9eb3af35f475f92cdb71_rle_crop_3914814119_0.png resize: (531, 343) 1377021044 -0.1291860167769249 treat image : temp/1755174629_2800533_1376966974_38327837cae519d84840e61806597c03_rle_crop_3914814121_0.png resize: (98, 121) 1377021045 -3.473834453140985 treat image : temp/1755174629_2800533_1376966974_38327837cae519d84840e61806597c03_rle_crop_3914814122_0.png resize: (543, 337) 1377021047 -0.17016863128761156 treat image : temp/1755174629_2800533_1376966948_8581d11cdeb706fdb41db20e73d90651_rle_crop_3914814123_0.png resize: (86, 112) 1377021048 -2.8825737667129054 treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814126_0.png resize: (162, 159) 1377021049 -3.5588522622765737 treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814127_0.png resize: (542, 341) 1377021050 0.13706863924858723 treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814128_0.png resize: (933, 1030) 1377021052 -0.1753171110926841 treat image : temp/1755174629_2800533_1376966945_24c0dcafd8e6aa3caaad992333857e85_rle_crop_3914814130_0.png resize: (194, 136) 1377021053 -2.8298332647370135 treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814134_0.png resize: (89, 149) 1377021054 -3.696701425615989 treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814135_0.png resize: (858, 1598) 1377021056 -0.4213506282344748 treat image : temp/1755174629_2800533_1376966934_9e74f9babde629b32edc9db8aa39f27e_rle_crop_3914814139_0.png resize: (925, 979) 1377021058 0.3642266743893964 treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814057_0.png resize: (70, 83) 1377021097 0.4363540781380195 treat image : temp/1755174629_2800533_1376967701_eb66dc1966882ea8c820aadee130ac4e_rle_crop_3914814067_0.png resize: (117, 51) 1377021098 -1.9870255122992349 treat image : temp/1755174629_2800533_1376967609_f4b96a0a5c83248dbbcdce57132dfb0d_rle_crop_3914814078_0.png resize: (60, 67) 1377021099 -1.9772818838822028 treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b_rle_crop_3914814083_0.png resize: (94, 33) 1377021100 -2.2340612689764128 treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814094_0.png resize: (183, 125) 1377021102 -1.2208193836481864 treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814095_0.png resize: (903, 869) 1377021103 -0.7572093557525772 treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814132_0.png resize: (48, 57) 1377021104 20.0 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 127 time used for this insertion : 0.015410661697387695 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 127 time used for this insertion : 0.02960491180419922 save missing photos in datou_result : time spend for datou_step_exec : 37.2576265335083 time spend to save output : 0.05066251754760742 total time spend for step 6 : 37.30828905105591 step7:brightness Thu Aug 14 14:33:20 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f.jpg treat image : temp/1755174629_2800533_1376968022_f313d20332dc08dcd360591a99a4b476.jpg treat image : temp/1755174629_2800533_1376967713_693235fd1c79c88850b0361f0e368ad9.jpg treat image : temp/1755174629_2800533_1376967712_2b4885e1dd93aef5bc05fa9d06ac40d1.jpg treat image : temp/1755174629_2800533_1376967711_95b76cb1fe72a0edc0612b9dd34fd287.jpg treat image : temp/1755174629_2800533_1376967708_aaa974d76523b7c697305e67b25613f6.jpg treat image : temp/1755174629_2800533_1376967701_eb66dc1966882ea8c820aadee130ac4e.jpg treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6.jpg treat image : temp/1755174629_2800533_1376967615_c8f268649f3407109bac7e25a1e06538.jpg treat image : temp/1755174629_2800533_1376967613_47b7ceb17e52edefddad4c73cabebfd9.jpg treat image : temp/1755174629_2800533_1376967612_963973cd7710b680d54ba67b8889ce1d.jpg treat image : temp/1755174629_2800533_1376967611_1f4e7f7b0de901ce5347392d0908a548.jpg treat image : temp/1755174629_2800533_1376967609_f4b96a0a5c83248dbbcdce57132dfb0d.jpg treat image : temp/1755174629_2800533_1376967606_f98bde0b3995eccf7042648034508a44.jpg treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b.jpg treat image : temp/1755174629_2800533_1376967577_a9864c893ced678500274e71b30a5360.jpg treat image : temp/1755174629_2800533_1376967571_2cea1f5a4bada9b1c909d480806cc019.jpg treat image : temp/1755174629_2800533_1376967383_e076f6e97ca804e645b84ad71e7312f9.jpg treat image : temp/1755174629_2800533_1376967354_15453e2b97ced75c2908c9e5ca11e48f.jpg treat image : temp/1755174629_2800533_1376967324_2ae0f24a476159432199511ab2d2ccf0.jpg treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7.jpg treat image : temp/1755174629_2800533_1376967244_ef77b6c2bb2aa5ba629fd91f3c357180.jpg treat image : temp/1755174629_2800533_1376967187_b71b898d817f3211a085b297660b5a3b.jpg treat image : temp/1755174629_2800533_1376967072_54e785443d557e0ffc21ab9fa9f69425.jpg treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df.jpg treat image : temp/1755174629_2800533_1376967066_3d0c31e98a95d056472313792460d8a2.jpg treat image : temp/1755174629_2800533_1376967064_9ba47ea687a1b34b3efa3a6f3a59dbf9.jpg treat image : temp/1755174629_2800533_1376967061_3fbc2d5f59b3d1902877d91a251bcb66.jpg treat image : temp/1755174629_2800533_1376967058_8ee3cc496cbc70fe3c676b7b4b7fd44a.jpg treat image : temp/1755174629_2800533_1376967001_188793f947e1a748ae66dca37bf3e3cc.jpg treat image : temp/1755174629_2800533_1376966999_a3d02769b25bd5679e11956252246fa1.jpg treat image : temp/1755174629_2800533_1376966998_fa17c97c278cf4a830a66ba7b5b553aa.jpg treat image : temp/1755174629_2800533_1376966996_f7f820771aed9eb3af35f475f92cdb71.jpg treat image : temp/1755174629_2800533_1376966974_38327837cae519d84840e61806597c03.jpg treat image : temp/1755174629_2800533_1376966948_8581d11cdeb706fdb41db20e73d90651.jpg treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1.jpg treat image : temp/1755174629_2800533_1376966945_24c0dcafd8e6aa3caaad992333857e85.jpg treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc.jpg treat image : temp/1755174629_2800533_1376966938_0ecfa41dc4e05718d92d61701b0aee63.jpg treat image : temp/1755174629_2800533_1376966934_9e74f9babde629b32edc9db8aa39f27e.jpg treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814054_0.png treat image : temp/1755174629_2800533_1376967708_aaa974d76523b7c697305e67b25613f6_rle_crop_3914814063_0.png treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6_rle_crop_3914814070_0.png treat image : temp/1755174629_2800533_1376967613_47b7ceb17e52edefddad4c73cabebfd9_rle_crop_3914814074_0.png treat image : temp/1755174629_2800533_1376967612_963973cd7710b680d54ba67b8889ce1d_rle_crop_3914814075_0.png treat image : temp/1755174629_2800533_1376967609_f4b96a0a5c83248dbbcdce57132dfb0d_rle_crop_3914814079_0.png treat image : temp/1755174629_2800533_1376967606_f98bde0b3995eccf7042648034508a44_rle_crop_3914814080_0.png treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b_rle_crop_3914814084_0.png treat image : temp/1755174629_2800533_1376967571_2cea1f5a4bada9b1c909d480806cc019_rle_crop_3914814086_0.png treat image : temp/1755174629_2800533_1376967571_2cea1f5a4bada9b1c909d480806cc019_rle_crop_3914814087_0.png treat image : temp/1755174629_2800533_1376967383_e076f6e97ca804e645b84ad71e7312f9_rle_crop_3914814088_0.png treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814092_0.png treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814093_0.png treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814096_0.png treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814097_0.png treat image : temp/1755174629_2800533_1376967244_ef77b6c2bb2aa5ba629fd91f3c357180_rle_crop_3914814098_0.png treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814104_0.png treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814105_0.png treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814107_0.png treat image : temp/1755174629_2800533_1376966948_8581d11cdeb706fdb41db20e73d90651_rle_crop_3914814124_0.png treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814125_0.png treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814129_0.png treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814131_0.png treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814133_0.png treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814053_0.png treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814056_0.png treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814058_0.png treat image : temp/1755174629_2800533_1376967713_693235fd1c79c88850b0361f0e368ad9_rle_crop_3914814061_0.png treat image : temp/1755174629_2800533_1376967701_eb66dc1966882ea8c820aadee130ac4e_rle_crop_3914814066_0.png treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6_rle_crop_3914814068_0.png treat image : temp/1755174629_2800533_1376967609_f4b96a0a5c83248dbbcdce57132dfb0d_rle_crop_3914814077_0.png treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b_rle_crop_3914814081_0.png treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b_rle_crop_3914814082_0.png treat image : temp/1755174629_2800533_1376967324_2ae0f24a476159432199511ab2d2ccf0_rle_crop_3914814090_0.png treat image : temp/1755174629_2800533_1376967072_54e785443d557e0ffc21ab9fa9f69425_rle_crop_3914814102_0.png treat image : temp/1755174629_2800533_1376967064_9ba47ea687a1b34b3efa3a6f3a59dbf9_rle_crop_3914814111_0.png treat image : temp/1755174629_2800533_1376967064_9ba47ea687a1b34b3efa3a6f3a59dbf9_rle_crop_3914814113_0.png treat image : temp/1755174629_2800533_1376967061_3fbc2d5f59b3d1902877d91a251bcb66_rle_crop_3914814114_0.png treat image : temp/1755174629_2800533_1376966999_a3d02769b25bd5679e11956252246fa1_rle_crop_3914814115_0.png treat image : temp/1755174629_2800533_1376966999_a3d02769b25bd5679e11956252246fa1_rle_crop_3914814116_0.png treat image : temp/1755174629_2800533_1376966996_f7f820771aed9eb3af35f475f92cdb71_rle_crop_3914814118_0.png treat image : temp/1755174629_2800533_1376966996_f7f820771aed9eb3af35f475f92cdb71_rle_crop_3914814120_0.png treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814136_0.png treat image : temp/1755174629_2800533_1376966938_0ecfa41dc4e05718d92d61701b0aee63_rle_crop_3914814137_0.png treat image : temp/1755174629_2800533_1376966938_0ecfa41dc4e05718d92d61701b0aee63_rle_crop_3914814138_0.png treat image : temp/1755174629_2800533_1376967611_1f4e7f7b0de901ce5347392d0908a548_rle_crop_3914814076_0.png treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814109_0.png treat image : temp/1755174629_2800533_1376967066_3d0c31e98a95d056472313792460d8a2_rle_crop_3914814110_0.png treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814055_0.png treat image : temp/1755174629_2800533_1376968022_f313d20332dc08dcd360591a99a4b476_rle_crop_3914814059_0.png treat image : temp/1755174629_2800533_1376968022_f313d20332dc08dcd360591a99a4b476_rle_crop_3914814060_0.png treat image : temp/1755174629_2800533_1376967712_2b4885e1dd93aef5bc05fa9d06ac40d1_rle_crop_3914814062_0.png treat image : temp/1755174629_2800533_1376967708_aaa974d76523b7c697305e67b25613f6_rle_crop_3914814064_0.png treat image : temp/1755174629_2800533_1376967708_aaa974d76523b7c697305e67b25613f6_rle_crop_3914814065_0.png treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6_rle_crop_3914814069_0.png treat image : temp/1755174629_2800533_1376967694_28ada3f478539972969145654b2700a6_rle_crop_3914814071_0.png treat image : temp/1755174629_2800533_1376967615_c8f268649f3407109bac7e25a1e06538_rle_crop_3914814072_0.png treat image : temp/1755174629_2800533_1376967615_c8f268649f3407109bac7e25a1e06538_rle_crop_3914814073_0.png treat image : temp/1755174629_2800533_1376967577_a9864c893ced678500274e71b30a5360_rle_crop_3914814085_0.png treat image : temp/1755174629_2800533_1376967354_15453e2b97ced75c2908c9e5ca11e48f_rle_crop_3914814089_0.png treat image : temp/1755174629_2800533_1376967324_2ae0f24a476159432199511ab2d2ccf0_rle_crop_3914814091_0.png treat image : temp/1755174629_2800533_1376967187_b71b898d817f3211a085b297660b5a3b_rle_crop_3914814099_0.png treat image : temp/1755174629_2800533_1376967187_b71b898d817f3211a085b297660b5a3b_rle_crop_3914814100_0.png treat image : temp/1755174629_2800533_1376967187_b71b898d817f3211a085b297660b5a3b_rle_crop_3914814101_0.png treat image : temp/1755174629_2800533_1376967072_54e785443d557e0ffc21ab9fa9f69425_rle_crop_3914814103_0.png treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814106_0.png treat image : temp/1755174629_2800533_1376967068_774965e66d370ba4743f9997ce8826df_rle_crop_3914814108_0.png treat image : temp/1755174629_2800533_1376967064_9ba47ea687a1b34b3efa3a6f3a59dbf9_rle_crop_3914814112_0.png treat image : temp/1755174629_2800533_1376966998_fa17c97c278cf4a830a66ba7b5b553aa_rle_crop_3914814117_0.png treat image : temp/1755174629_2800533_1376966996_f7f820771aed9eb3af35f475f92cdb71_rle_crop_3914814119_0.png treat image : temp/1755174629_2800533_1376966974_38327837cae519d84840e61806597c03_rle_crop_3914814121_0.png treat image : temp/1755174629_2800533_1376966974_38327837cae519d84840e61806597c03_rle_crop_3914814122_0.png treat image : temp/1755174629_2800533_1376966948_8581d11cdeb706fdb41db20e73d90651_rle_crop_3914814123_0.png treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814126_0.png treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814127_0.png treat image : temp/1755174629_2800533_1376966947_3608c05956332387283c6f2c8229dcc1_rle_crop_3914814128_0.png treat image : temp/1755174629_2800533_1376966945_24c0dcafd8e6aa3caaad992333857e85_rle_crop_3914814130_0.png treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814134_0.png treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814135_0.png treat image : temp/1755174629_2800533_1376966934_9e74f9babde629b32edc9db8aa39f27e_rle_crop_3914814139_0.png treat image : temp/1755174629_2800533_1376968105_429e45ea19eb0389522fd3ff4a66195f_rle_crop_3914814057_0.png treat image : temp/1755174629_2800533_1376967701_eb66dc1966882ea8c820aadee130ac4e_rle_crop_3914814067_0.png treat image : temp/1755174629_2800533_1376967609_f4b96a0a5c83248dbbcdce57132dfb0d_rle_crop_3914814078_0.png treat image : temp/1755174629_2800533_1376967580_8b4d5d106ce04804f688e0956730729b_rle_crop_3914814083_0.png treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814094_0.png treat image : temp/1755174629_2800533_1376967290_ccd1f9865229e62dd455fe9dcfbc18f7_rle_crop_3914814095_0.png treat image : temp/1755174629_2800533_1376966941_d34929070dd4689a6f2a44ade0f4d0fc_rle_crop_3914814132_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 127 time used for this insertion : 0.016481399536132812 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 127 time used for this insertion : 0.026725053787231445 save missing photos in datou_result : time spend for datou_step_exec : 11.259487867355347 time spend to save output : 0.048537492752075195 total time spend for step 7 : 11.308025360107422 step8:velours_tree Thu Aug 14 14:33:31 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.07917499542236328 time spend to save output : 4.482269287109375e-05 total time spend for step 8 : 0.07921981811523438 step9:send_mail_cod Thu Aug 14 14:33:32 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P25980071_14-08-2025_14_33_32.pdf 25984950 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette259849501755174812 25984951 imagette259849511755174813 25984954 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette259849541755174813 25984955 imagette259849551755174813 25984956 change filename to text .change filename to text .change filename to text .imagette259849561755174813 25984957 imagette259849571755174813 25984958 imagette259849581755174813 25984959 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette259849591755174813 25984960 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette259849601755174815 25984961 imagette259849611755174816 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=25980071 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/25984950,25984951,25984953,25984954,25984955,25984956,25984957,25984958,25984959,25984960,25984961?tags=papier,flou,environnement,autre,pet_fonce,metal,pehd,background,carton,pet_clair,mal_croppe args[1376968105] : ((1376968105, -1.453735304071575, 492688767), (1376968105, 0.25055946747535174, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376968022] : ((1376968022, -2.661230618419301, 492609224), (1376968022, 0.5577000436360052, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967713] : ((1376967713, -2.4409175604540785, 492609224), (1376967713, 0.5901743801530768, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967712] : ((1376967712, -3.0380921889118992, 492609224), (1376967712, 0.5366104453590769, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967711] : ((1376967711, -3.0109384677739075, 492609224), (1376967711, 0.6089979852594835, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967708] : ((1376967708, -1.9809111775718573, 492688767), (1376967708, 0.41733554362897324, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967701] : ((1376967701, -2.664499687099964, 492609224), (1376967701, 0.306697165086029, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967694] : ((1376967694, -4.413156449529452, 492609224), (1376967694, 0.4266908754342978, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967615] : ((1376967615, -2.776490626478827, 492609224), (1376967615, 0.5396393413200181, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967613] : ((1376967613, -2.3457003088601494, 492609224), (1376967613, 0.6958418087521495, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967612] : ((1376967612, -1.1338895182667097, 492688767), (1376967612, 0.7524107182084925, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967611] : ((1376967611, -1.9656136340277102, 492688767), (1376967611, 0.621182425666571, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967609] : ((1376967609, -2.3606115399846646, 492609224), (1376967609, 0.5695164414561387, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967606] : ((1376967606, -2.352950231696972, 492609224), (1376967606, 0.4931666655632084, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967580] : ((1376967580, -2.429872833206447, 492609224), (1376967580, 0.5100288629752503, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967577] : ((1376967577, -2.266441441402648, 492609224), (1376967577, 0.7139804849122071, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967571] : ((1376967571, -2.3935238007511015, 492609224), (1376967571, 0.5648099360008636, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967383] : ((1376967383, -1.697818437764015, 492688767), (1376967383, 0.6979623528663434, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967354] : ((1376967354, -1.6143165640722696, 492688767), (1376967354, 0.688164928157054, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967324] : ((1376967324, -1.0940294525079512, 492688767), (1376967324, 0.6519906481689852, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967290] : ((1376967290, -2.257495127438786, 492609224), (1376967290, 0.6359493018216974, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967244] : ((1376967244, -2.84722052704298, 492609224), (1376967244, 0.5635283393697272, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967187] : ((1376967187, -4.292563641241433, 492609224), (1376967187, 0.4085593246079448, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967072] : ((1376967072, -2.265135240729659, 492609224), (1376967072, 0.48948460936969734, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967068] : ((1376967068, -1.4576169632143088, 492688767), (1376967068, 0.5006786769619574, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967066] : ((1376967066, -2.9852882386654924, 492609224), (1376967066, 0.4935594227105296, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967064] : ((1376967064, -1.4880208374713462, 492688767), (1376967064, 0.4754942411676153, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967061] : ((1376967061, -0.4114915100108082, 492688767), (1376967061, 0.45608708259499886, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967058] : ((1376967058, -2.3487997506666525, 492609224), (1376967058, 0.7130612977859265, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376967001] : ((1376967001, -2.3866468226153006, 492609224), (1376967001, 0.39286568081211415, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966999] : ((1376966999, -2.1292307297048993, 492609224), (1376966999, 0.6867230063484981, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966998] : ((1376966998, -2.407039463449419, 492609224), (1376966998, 0.5274715122727252, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966996] : ((1376966996, -4.393763214925184, 492609224), (1376966996, 0.31748722144627783, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966974] : ((1376966974, -1.5812874523644398, 492688767), (1376966974, 0.5800630589785035, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966948] : ((1376966948, -1.62795888676795, 492688767), (1376966948, 0.4899351298313527, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966947] : ((1376966947, -2.426540078797956, 492609224), (1376966947, 0.4363537502019096, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966945] : ((1376966945, -2.317180179038744, 492609224), (1376966945, 0.42700424584054386, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966941] : ((1376966941, -2.2063045784716313, 492609224), (1376966941, 0.38328122698447403, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966938] : ((1376966938, -3.042428773057051, 492609224), (1376966938, 0.4901049039396685, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com args[1376966934] : ((1376966934, -2.636886526870398, 492609224), (1376966934, 0.6100020962237197, 2107752395), '0.1361061197916667') We are sending mail with results at report@fotonower.com refus_total : 0.1361061197916667 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=25980071 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25980071_14-08-2025_14_33_32.pdf results_Auto_P25980071_14-08-2025_14_33_32.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25980071_14-08-2025_14_33_32.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','25980071','results_Auto_P25980071_14-08-2025_14_33_32.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25980071_14-08-2025_14_33_32.pdf','pdf','','1.01','0.1361061197916667') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/25980071

https://www.fotonower.com/image?json=false&list_photos_id=1376968105
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376968022
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967713
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967712
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967711
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967708
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967701
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967694
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967615
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967613
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967612
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967611
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967609
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967606
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967580
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967577
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967571
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967383
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967354
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967324
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967290
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967244
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967187
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967072
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967068
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967066
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967064
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967061
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967058
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376967001
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966999
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966998
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966996
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966974
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966948
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966947
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966945
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966941
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966938
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1376966934
Bravo, la photo est bien prise.

Dans ces conditions,le taux de refus est: 13.61%
Veuillez trouver les photos des contaminants.

exemples de contaminants: papier: https://www.fotonower.com/view/25984950?limit=200
exemples de contaminants: autre: https://www.fotonower.com/view/25984954?limit=200
exemples de contaminants: metal: https://www.fotonower.com/view/25984956?limit=200
exemples de contaminants: carton: https://www.fotonower.com/view/25984959?limit=200
exemples de contaminants: pet_clair: https://www.fotonower.com/view/25984960?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25980071_14-08-2025_14_33_32.pdf.

Lien vers velours :https://www.fotonower.com/velours/25984950,25984951,25984953,25984954,25984955,25984956,25984957,25984958,25984959,25984960,25984961?tags=papier,flou,environnement,autre,pet_fonce,metal,pehd,background,carton,pet_clair,mal_croppe.


L'équipe Fotonower 202 b'' Server: nginx Date: Thu, 14 Aug 2025 12:33:39 GMT Content-Length: 0 Connection: close X-Message-Id: wGokZ4_xScquokKnjzq1dQ Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1376968105, 1376968022, 1376967713, 1376967712, 1376967711, 1376967708, 1376967701, 1376967694, 1376967615, 1376967613, 1376967612, 1376967611, 1376967609, 1376967606, 1376967580, 1376967577, 1376967571, 1376967383, 1376967354, 1376967324, 1376967290, 1376967244, 1376967187, 1376967072, 1376967068, 1376967066, 1376967064, 1376967061, 1376967058, 1376967001, 1376966999, 1376966998, 1376966996, 1376966974, 1376966948, 1376966947, 1376966945, 1376966941, 1376966938, 1376966934] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968105', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968022', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967713', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967712', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967711', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967708', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967701', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967694', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967615', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967613', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967612', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967611', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967609', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967606', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967580', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967577', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967571', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967383', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967354', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967324', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967290', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967244', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967187', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967072', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967068', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967066', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967064', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967061', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967058', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967001', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966999', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966998', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966996', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966974', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966948', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966947', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966945', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966941', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966938', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966934', None, None, None, None, None, '3534279') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 40 time used for this insertion : 0.01618814468383789 save_final save missing photos in datou_result : time spend for datou_step_exec : 7.666201591491699 time spend to save output : 0.01662611961364746 total time spend for step 9 : 7.682827711105347 step10:split_time_score Thu Aug 14 14:33:39 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('09', 53),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 14082025 25980071 Nombre de photos uploadées : 53 / 23040 (0%) 14082025 25980071 Nombre de photos taguées (types de déchets): 0 / 53 (0%) 14082025 25980071 Nombre de photos taguées (volume) : 0 / 53 (0%) elapsed_time : load_data_split_time_score 1.1920928955078125e-06 elapsed_time : order_list_meta_photo_and_scores 9.5367431640625e-06 ????????????????????????????????????????????????????? elapsed_time : fill_and_build_computed_from_old_data 0.002321481704711914 Catched exception ! Connect or reconnect ! Catched exception ! Connect or reconnect ! elapsed_time : insert_dashboard_record_day_entry 0.20239019393920898 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.1280742026748971 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25974582_14-08-2025_10_11_57.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25974582 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25974582 AND mptpi.`type`=3594 To do Qualite : 0.015116423932613166 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25974586_14-08-2025_10_01_29.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25974586 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25974586 AND mptpi.`type`=3594 To do Qualite : 0.1361061197916667 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25980071_14-08-2025_14_33_32.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25980071 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25980071 AND mptpi.`type`=3594 To do Qualite : 0.12013578869047618 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25980101_14-08-2025_13_01_43.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25980101 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25980101 AND mptpi.`type`=3594 To do Qualite : 0.1582214988425926 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25980105_14-08-2025_12_51_32.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25980105 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25980105 AND mptpi.`type`=3594 To do Qualite : 0.16527360973324515 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25980109_14-08-2025_12_41_51.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25980109 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25980109 AND mptpi.`type`=3594 To do Qualite : 0.03430676118827159 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25983542_14-08-2025_14_21_54.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25983542 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25983542 AND mptpi.`type`=3594 To do Qualite : 0.08205182613168727 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25983560_14-08-2025_14_11_41.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25983560 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25983560 AND mptpi.`type`=3594 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'14082025': {'nb_upload': 53, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1376968105, 1376968022, 1376967713, 1376967712, 1376967711, 1376967708, 1376967701, 1376967694, 1376967615, 1376967613, 1376967612, 1376967611, 1376967609, 1376967606, 1376967580, 1376967577, 1376967571, 1376967383, 1376967354, 1376967324, 1376967290, 1376967244, 1376967187, 1376967072, 1376967068, 1376967066, 1376967064, 1376967061, 1376967058, 1376967001, 1376966999, 1376966998, 1376966996, 1376966974, 1376966948, 1376966947, 1376966945, 1376966941, 1376966938, 1376966934] Looping around the photos to save general results len do output : 1 /25980071Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968105', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376968022', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967713', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967712', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967711', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967708', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967701', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967694', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967615', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967613', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967612', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967611', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967609', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967606', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967580', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967577', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967571', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967383', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967354', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967324', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967290', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967244', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967187', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967072', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967068', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967066', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967064', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967061', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967058', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376967001', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966999', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966998', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966996', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966974', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966948', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966947', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966945', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966941', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966938', None, None, None, None, None, '3534279') ('3318', None, None, None, None, None, None, None, '3534279') ('3318', '25980071', '1376966934', None, None, None, None, None, '3534279') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 41 time used for this insertion : 0.016195058822631836 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.827549934387207 time spend to save output : 0.016582012176513672 total time spend for step 10 : 0.8441319465637207 caffe_path_current : About to save ! 2 After save, about to update current ! update_current_state 137.97user 28.65system 3:14.92elapsed 85%CPU (0avgtext+0avgdata 3546276maxresident)k 25600inputs+80864outputs (28major+2148726minor)pagefaults 0swaps