python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 3775605 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['3674434'] with mtr_portfolio_ids : ['26607883'] and first list_photo_ids : [] new path : /proc/3775605/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , BFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 22 ; length of list_pids : 22 ; length of list_args : 22 time to download the photos : 2.885242223739624 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Sat Sep 6 14:20:32 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 6820 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-09-06 14:20:35.140911: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-09-06 14:20:35.172592: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz 2025-09-06 14:20:35.174835: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7fea34000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-09-06 14:20:35.174890: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-09-06 14:20:35.178800: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-09-06 14:20:35.295466: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x32b35110 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-09-06 14:20:35.295511: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-09-06 14:20:35.296761: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-09-06 14:20:35.297164: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-09-06 14:20:35.300308: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-09-06 14:20:35.303200: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-09-06 14:20:35.303688: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-09-06 14:20:35.306354: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-09-06 14:20:35.307328: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-09-06 14:20:35.311394: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-09-06 14:20:35.312648: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-09-06 14:20:35.312720: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-09-06 14:20:35.313355: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-09-06 14:20:35.313371: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-09-06 14:20:35.313379: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-09-06 14:20:35.314455: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6268 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-09-06 14:20:35.570168: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-09-06 14:20:35.570298: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-09-06 14:20:35.570327: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-09-06 14:20:35.570353: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-09-06 14:20:35.570377: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-09-06 14:20:35.570401: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-09-06 14:20:35.570424: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-09-06 14:20:35.570449: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-09-06 14:20:35.572121: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-09-06 14:20:35.573786: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-09-06 14:20:35.573823: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-09-06 14:20:35.573837: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-09-06 14:20:35.573850: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-09-06 14:20:35.573862: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-09-06 14:20:35.573875: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-09-06 14:20:35.573887: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-09-06 14:20:35.573900: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-09-06 14:20:35.574846: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-09-06 14:20:35.574879: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-09-06 14:20:35.574887: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-09-06 14:20:35.574894: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-09-06 14:20:35.575889: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6268 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-09-06 14:20:44.392814: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-09-06 14:20:44.579725: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 22 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 20.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 10 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 22.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 14.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 18 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 17.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 17 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 18.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 8 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 12 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 60.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 16.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 12 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 22.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 12 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 24.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 8 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 8 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 10.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 34.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 44.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 15.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 40.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 8 Detection mask done ! Trying to reset tf kernel 3776252 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 338 tf kernel not reseted sub process len(results) : 22 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 22 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 5627 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.0004315376281738281 nb_pixel_total : 6703 time to create 1 rle with old method : 0.008077859878540039 length of segment : 128 time for calcul the mask position with numpy : 0.000225067138671875 nb_pixel_total : 4869 time to create 1 rle with old method : 0.005772829055786133 length of segment : 86 time for calcul the mask position with numpy : 0.0005640983581542969 nb_pixel_total : 18528 time to create 1 rle with old method : 0.021543025970458984 length of segment : 194 time for calcul the mask position with numpy : 0.00012350082397460938 nb_pixel_total : 1576 time to create 1 rle with old method : 0.0019228458404541016 length of segment : 44 time for calcul the mask position with numpy : 0.0001583099365234375 nb_pixel_total : 4612 time to create 1 rle with old method : 0.005619049072265625 length of segment : 50 time for calcul the mask position with numpy : 0.002532482147216797 nb_pixel_total : 117636 time to create 1 rle with old method : 0.1779019832611084 length of segment : 518 time for calcul the mask position with numpy : 0.0005788803100585938 nb_pixel_total : 13386 time to create 1 rle with old method : 0.01928257942199707 length of segment : 160 time for calcul the mask position with numpy : 0.002192258834838867 nb_pixel_total : 95524 time to create 1 rle with old method : 0.11243247985839844 length of segment : 511 time for calcul the mask position with numpy : 0.0003514289855957031 nb_pixel_total : 6681 time to create 1 rle with old method : 0.008184432983398438 length of segment : 93 time for calcul the mask position with numpy : 0.0005331039428710938 nb_pixel_total : 16492 time to create 1 rle with old method : 0.021914005279541016 length of segment : 163 time for calcul the mask position with numpy : 0.0002455711364746094 nb_pixel_total : 6281 time to create 1 rle with old method : 0.007621049880981445 length of segment : 92 time for calcul the mask position with numpy : 0.016857147216796875 nb_pixel_total : 644854 time to create 1 rle with new method : 0.2342371940612793 length of segment : 914 time for calcul the mask position with numpy : 0.0005834102630615234 nb_pixel_total : 24498 time to create 1 rle with old method : 0.029124021530151367 length of segment : 242 time for calcul the mask position with numpy : 0.00038814544677734375 nb_pixel_total : 6223 time to create 1 rle with old method : 0.007592916488647461 length of segment : 121 time for calcul the mask position with numpy : 0.0002617835998535156 nb_pixel_total : 6897 time to create 1 rle with old method : 0.011678695678710938 length of segment : 86 time for calcul the mask position with numpy : 0.0003669261932373047 nb_pixel_total : 6651 time to create 1 rle with old method : 0.007952213287353516 length of segment : 115 time for calcul the mask position with numpy : 0.00024628639221191406 nb_pixel_total : 3884 time to create 1 rle with old method : 0.004771232604980469 length of segment : 90 time for calcul the mask position with numpy : 0.00018167495727539062 nb_pixel_total : 3411 time to create 1 rle with old method : 0.0042340755462646484 length of segment : 69 time for calcul the mask position with numpy : 0.0002799034118652344 nb_pixel_total : 5593 time to create 1 rle with old method : 0.006902456283569336 length of segment : 94 time for calcul the mask position with numpy : 0.000240325927734375 nb_pixel_total : 4365 time to create 1 rle with old method : 0.005437135696411133 length of segment : 84 time for calcul the mask position with numpy : 0.00028514862060546875 nb_pixel_total : 5796 time to create 1 rle with old method : 0.007127046585083008 length of segment : 93 time for calcul the mask position with numpy : 0.00019240379333496094 nb_pixel_total : 3368 time to create 1 rle with old method : 0.004183292388916016 length of segment : 78 time for calcul the mask position with numpy : 0.0005297660827636719 nb_pixel_total : 16338 time to create 1 rle with old method : 0.019908666610717773 length of segment : 177 time for calcul the mask position with numpy : 0.00037097930908203125 nb_pixel_total : 8128 time to create 1 rle with old method : 0.00966787338256836 length of segment : 165 time for calcul the mask position with numpy : 0.00016617774963378906 nb_pixel_total : 7074 time to create 1 rle with old method : 0.008446693420410156 length of segment : 95 time for calcul the mask position with numpy : 0.00015544891357421875 nb_pixel_total : 2640 time to create 1 rle with old method : 0.003216266632080078 length of segment : 72 time for calcul the mask position with numpy : 0.0005204677581787109 nb_pixel_total : 10313 time to create 1 rle with old method : 0.012365341186523438 length of segment : 186 time for calcul the mask position with numpy : 0.00013947486877441406 nb_pixel_total : 2722 time to create 1 rle with old method : 0.003478527069091797 length of segment : 49 time for calcul the mask position with numpy : 0.000423431396484375 nb_pixel_total : 12306 time to create 1 rle with old method : 0.01472020149230957 length of segment : 262 time for calcul the mask position with numpy : 0.00013256072998046875 nb_pixel_total : 2598 time to create 1 rle with old method : 0.003382444381713867 length of segment : 54 time for calcul the mask position with numpy : 0.0004780292510986328 nb_pixel_total : 9412 time to create 1 rle with old method : 0.011537313461303711 length of segment : 183 time for calcul the mask position with numpy : 0.00032711029052734375 nb_pixel_total : 7830 time to create 1 rle with old method : 0.00910043716430664 length of segment : 178 time for calcul the mask position with numpy : 0.0003364086151123047 nb_pixel_total : 6494 time to create 1 rle with old method : 0.007548809051513672 length of segment : 127 time for calcul the mask position with numpy : 0.0002257823944091797 nb_pixel_total : 4834 time to create 1 rle with old method : 0.005801677703857422 length of segment : 87 time for calcul the mask position with numpy : 0.0006189346313476562 nb_pixel_total : 18188 time to create 1 rle with old method : 0.02228093147277832 length of segment : 210 time for calcul the mask position with numpy : 0.002560138702392578 nb_pixel_total : 109122 time to create 1 rle with old method : 0.14954423904418945 length of segment : 490 time for calcul the mask position with numpy : 0.0002200603485107422 nb_pixel_total : 4975 time to create 1 rle with old method : 0.006109952926635742 length of segment : 57 time for calcul the mask position with numpy : 0.0015742778778076172 nb_pixel_total : 117739 time to create 1 rle with old method : 0.13694262504577637 length of segment : 520 time for calcul the mask position with numpy : 0.0004119873046875 nb_pixel_total : 6682 time to create 1 rle with old method : 0.008180618286132812 length of segment : 132 time for calcul the mask position with numpy : 0.00022792816162109375 nb_pixel_total : 4896 time to create 1 rle with old method : 0.005891323089599609 length of segment : 88 time for calcul the mask position with numpy : 0.00011324882507324219 nb_pixel_total : 1437 time to create 1 rle with old method : 0.0018625259399414062 length of segment : 41 time for calcul the mask position with numpy : 0.0005829334259033203 nb_pixel_total : 18222 time to create 1 rle with old method : 0.020806074142456055 length of segment : 205 time for calcul the mask position with numpy : 0.00019812583923339844 nb_pixel_total : 4846 time to create 1 rle with old method : 0.005827188491821289 length of segment : 54 time for calcul the mask position with numpy : 0.00038242340087890625 nb_pixel_total : 9339 time to create 1 rle with old method : 0.010956525802612305 length of segment : 126 time for calcul the mask position with numpy : 0.000110626220703125 nb_pixel_total : 1575 time to create 1 rle with old method : 0.0019347667694091797 length of segment : 44 time for calcul the mask position with numpy : 0.0002696514129638672 nb_pixel_total : 8172 time to create 1 rle with old method : 0.010154247283935547 length of segment : 79 time for calcul the mask position with numpy : 0.002912282943725586 nb_pixel_total : 100764 time to create 1 rle with old method : 0.12669682502746582 length of segment : 495 time for calcul the mask position with numpy : 0.0002837181091308594 nb_pixel_total : 4213 time to create 1 rle with old method : 0.00514531135559082 length of segment : 87 time for calcul the mask position with numpy : 8.130073547363281e-05 nb_pixel_total : 2432 time to create 1 rle with old method : 0.002947092056274414 length of segment : 50 time for calcul the mask position with numpy : 0.00047326087951660156 nb_pixel_total : 12104 time to create 1 rle with old method : 0.014101028442382812 length of segment : 141 time for calcul the mask position with numpy : 0.00019025802612304688 nb_pixel_total : 3998 time to create 1 rle with old method : 0.004858493804931641 length of segment : 67 time for calcul the mask position with numpy : 0.0005917549133300781 nb_pixel_total : 10163 time to create 1 rle with old method : 0.01256561279296875 length of segment : 117 time for calcul the mask position with numpy : 0.0002434253692626953 nb_pixel_total : 5338 time to create 1 rle with old method : 0.0064580440521240234 length of segment : 80 time for calcul the mask position with numpy : 0.0022716522216796875 nb_pixel_total : 109001 time to create 1 rle with old method : 0.12616562843322754 length of segment : 514 time for calcul the mask position with numpy : 0.00043201446533203125 nb_pixel_total : 11545 time to create 1 rle with old method : 0.014104127883911133 length of segment : 115 time for calcul the mask position with numpy : 0.00028514862060546875 nb_pixel_total : 6038 time to create 1 rle with old method : 0.007340431213378906 length of segment : 124 time for calcul the mask position with numpy : 0.0023365020751953125 nb_pixel_total : 103318 time to create 1 rle with old method : 0.12488675117492676 length of segment : 515 time for calcul the mask position with numpy : 0.0001373291015625 nb_pixel_total : 3843 time to create 1 rle with old method : 0.0044176578521728516 length of segment : 82 time for calcul the mask position with numpy : 0.0002880096435546875 nb_pixel_total : 6066 time to create 1 rle with old method : 0.006810665130615234 length of segment : 125 time for calcul the mask position with numpy : 0.002329587936401367 nb_pixel_total : 109228 time to create 1 rle with old method : 0.12703585624694824 length of segment : 501 time for calcul the mask position with numpy : 0.00011491775512695312 nb_pixel_total : 1985 time to create 1 rle with old method : 0.002424001693725586 length of segment : 49 time for calcul the mask position with numpy : 0.00038623809814453125 nb_pixel_total : 10402 time to create 1 rle with old method : 0.012076616287231445 length of segment : 140 time for calcul the mask position with numpy : 0.0001995563507080078 nb_pixel_total : 3338 time to create 1 rle with old method : 0.00402069091796875 length of segment : 86 time for calcul the mask position with numpy : 0.0003418922424316406 nb_pixel_total : 6068 time to create 1 rle with old method : 0.006918907165527344 length of segment : 123 time for calcul the mask position with numpy : 0.0001499652862548828 nb_pixel_total : 3426 time to create 1 rle with old method : 0.004355669021606445 length of segment : 59 time for calcul the mask position with numpy : 0.00031256675720214844 nb_pixel_total : 10013 time to create 1 rle with old method : 0.011971235275268555 length of segment : 128 time for calcul the mask position with numpy : 0.0019466876983642578 nb_pixel_total : 95537 time to create 1 rle with old method : 0.10885787010192871 length of segment : 476 time for calcul the mask position with numpy : 0.00019669532775878906 nb_pixel_total : 3726 time to create 1 rle with old method : 0.004595041275024414 length of segment : 64 time for calcul the mask position with numpy : 0.00047206878662109375 nb_pixel_total : 13948 time to create 1 rle with old method : 0.01671576499938965 length of segment : 222 time for calcul the mask position with numpy : 0.0003285408020019531 nb_pixel_total : 4888 time to create 1 rle with old method : 0.006003141403198242 length of segment : 128 time for calcul the mask position with numpy : 0.002124786376953125 nb_pixel_total : 104652 time to create 1 rle with old method : 0.11865925788879395 length of segment : 495 time for calcul the mask position with numpy : 0.00045990943908691406 nb_pixel_total : 17957 time to create 1 rle with old method : 0.021192550659179688 length of segment : 216 time for calcul the mask position with numpy : 7.867813110351562e-05 nb_pixel_total : 2074 time to create 1 rle with old method : 0.002504587173461914 length of segment : 64 time for calcul the mask position with numpy : 0.0017652511596679688 nb_pixel_total : 91337 time to create 1 rle with old method : 0.10521435737609863 length of segment : 488 time for calcul the mask position with numpy : 0.00038933753967285156 nb_pixel_total : 15050 time to create 1 rle with old method : 0.017682313919067383 length of segment : 159 time for calcul the mask position with numpy : 0.00013875961303710938 nb_pixel_total : 3607 time to create 1 rle with old method : 0.00454401969909668 length of segment : 57 time for calcul the mask position with numpy : 0.0024933815002441406 nb_pixel_total : 113512 time to create 1 rle with old method : 0.15466642379760742 length of segment : 550 time for calcul the mask position with numpy : 0.00017881393432617188 nb_pixel_total : 3268 time to create 1 rle with old method : 0.003936052322387695 length of segment : 70 time for calcul the mask position with numpy : 0.0008697509765625 nb_pixel_total : 33381 time to create 1 rle with old method : 0.03841400146484375 length of segment : 286 time for calcul the mask position with numpy : 0.0002028942108154297 nb_pixel_total : 8781 time to create 1 rle with old method : 0.01013803482055664 length of segment : 198 time for calcul the mask position with numpy : 0.028276681900024414 nb_pixel_total : 1142785 time to create 1 rle with new method : 0.0853273868560791 length of segment : 1124 time for calcul the mask position with numpy : 0.007441043853759766 nb_pixel_total : 478952 time to create 1 rle with new method : 0.017586708068847656 length of segment : 1386 time spent for convertir_results : 4.982389450073242 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 82 chid ids of type : 3594 Number RLEs to save : 17287 save missing photos in datou_result : time spend for datou_step_exec : 33.564579486846924 time spend to save output : 1.0676534175872803 total time spend for step 1 : 34.632232904434204 step2:crop_condition Sat Sep 6 14:21:07 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 22 ! batch 1 Loaded 82 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 46 About to insert : list_path_to_insert length 46 new photo from crops ! About to upload 46 photos upload in portfolio : 3736932 init cache_photo without model_param we have 46 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1757161269_3775605 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 46 photos in the portfolio 3736932 time of upload the photos Elapsed time : 12.636281490325928 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 15 About to insert : list_path_to_insert length 15 new photo from crops ! About to upload 15 photos upload in portfolio : 3736932 init cache_photo without model_param we have 15 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1757161287_3775605 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 15 photos in the portfolio 3736932 time of upload the photos Elapsed time : 4.15464973449707 we have finished the crop for the class : carton begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 3736932 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1757161291_3775605 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 1 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.5784876346588135 we have finished the crop for the class : metal begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 18 About to insert : list_path_to_insert length 18 new photo from crops ! About to upload 18 photos upload in portfolio : 3736932 init cache_photo without model_param we have 18 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1757161303_3775605 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 18 photos in the portfolio 3736932 time of upload the photos Elapsed time : 4.515480041503906 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 2 About to insert : list_path_to_insert length 2 new photo from crops ! About to upload 2 photos upload in portfolio : 3736932 init cache_photo without model_param we have 2 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1757161308_3775605 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 2 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.7542200088500977 we have finished the crop for the class : autre begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1382037342, 1382037287, 1382037250, 1382037218, 1382037193, 1382036794, 1382036755, 1382036729, 1382036694, 1382036666, 1382036636, 1382036438, 1382036416, 1382036390, 1382036357, 1382036330, 1382036305, 1382035956, 1382035932, 1382035900, 1382035862, 1382035831] Looping around the photos to save general results len do output : 82 /1382068795Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068796Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068797Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068798Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068799Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068800Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068801Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068802Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068803Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068804Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068805Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068806Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068807Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068808Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068809Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068810Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068811Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068812Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068813Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068814Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068815Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068816Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068817Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068818Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068819Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068820Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068821Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068822Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068823Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068824Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068825Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068826Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068827Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068828Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068829Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068830Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068831Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068832Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068833Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068834Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068835Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068836Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068837Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068838Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068839Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068840Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068851Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068852Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068853Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068854Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068855Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068856Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068857Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068858Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068859Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068860Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068861Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068862Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068863Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068864Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068865Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068867Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068874Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068875Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068876Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068877Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068878Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068879Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068880Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068881Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068882Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068883Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068884Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068885Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068886Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068887Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068888Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068889Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068890Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068891Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068892Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1382068893Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037342', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037287', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037250', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037218', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037193', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036794', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036755', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036729', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036694', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036666', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036636', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036438', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036416', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036390', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036357', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036330', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036305', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035956', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035932', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035900', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035862', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035831', None, None, None, None, None, '3674434') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 268 time used for this insertion : 0.04112696647644043 save_final save missing photos in datou_result : time spend for datou_step_exec : 41.886436462402344 time spend to save output : 0.04449057579040527 total time spend for step 2 : 41.93092703819275 step3:rle_unique_nms_with_priority Sat Sep 6 14:21:48 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 82 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nb_obj : 7 nb_hashtags : 3 time to prepare the origin masks : 0.9056029319763184 time for calcul the mask position with numpy : 0.17278480529785156 nb_pixel_total : 1906290 time to create 1 rle with new method : 0.08931422233581543 time for calcul the mask position with numpy : 0.007055997848510742 nb_pixel_total : 13386 time to create 1 rle with old method : 0.015223026275634766 time for calcul the mask position with numpy : 0.0076978206634521484 nb_pixel_total : 117636 time to create 1 rle with old method : 0.13333487510681152 time for calcul the mask position with numpy : 0.007131338119506836 nb_pixel_total : 4612 time to create 1 rle with old method : 0.005671977996826172 time for calcul the mask position with numpy : 0.006696462631225586 nb_pixel_total : 1576 time to create 1 rle with old method : 0.0020952224731445312 time for calcul the mask position with numpy : 0.006781101226806641 nb_pixel_total : 18528 time to create 1 rle with old method : 0.021332502365112305 time for calcul the mask position with numpy : 0.006467103958129883 nb_pixel_total : 4869 time to create 1 rle with old method : 0.005464315414428711 time for calcul the mask position with numpy : 0.006033420562744141 nb_pixel_total : 6703 time to create 1 rle with old method : 0.007710695266723633 create new chi : 0.5121498107910156 time to delete rle : 0.021335124969482422 batch 1 Loaded 15 chid ids of type : 3594 +++++++Number RLEs to save : 3440 TO DO : save crop sub photo not yet done ! save time : 0.2465817928314209 nb_obj : 6 nb_hashtags : 2 time to prepare the origin masks : 0.1782362461090088 time for calcul the mask position with numpy : 0.055320024490356445 nb_pixel_total : 1279270 time to create 1 rle with new method : 0.3315901756286621 time for calcul the mask position with numpy : 0.007145881652832031 nb_pixel_total : 24498 time to create 1 rle with old method : 0.02858138084411621 time for calcul the mask position with numpy : 0.01261448860168457 nb_pixel_total : 644854 time to create 1 rle with new method : 0.19948554039001465 time for calcul the mask position with numpy : 0.0058939456939697266 nb_pixel_total : 6281 time to create 1 rle with old method : 0.007117271423339844 time for calcul the mask position with numpy : 0.0061604976654052734 nb_pixel_total : 16492 time to create 1 rle with old method : 0.018584251403808594 time for calcul the mask position with numpy : 0.006043672561645508 nb_pixel_total : 6681 time to create 1 rle with old method : 0.007559061050415039 time for calcul the mask position with numpy : 0.0071392059326171875 nb_pixel_total : 95524 time to create 1 rle with old method : 0.1106424331665039 create new chi : 0.8116204738616943 time to delete rle : 0.0008258819580078125 batch 1 Loaded 13 chid ids of type : 3594 +++++++Number RLEs to save : 5110 TO DO : save crop sub photo not yet done ! save time : 0.3322746753692627 nb_obj : 4 nb_hashtags : 1 time to prepare the origin masks : 0.06358742713928223 time for calcul the mask position with numpy : 0.02393054962158203 nb_pixel_total : 2049945 time to create 1 rle with new method : 0.03144025802612305 time for calcul the mask position with numpy : 0.0062372684478759766 nb_pixel_total : 3884 time to create 1 rle with old method : 0.0045833587646484375 time for calcul the mask position with numpy : 0.005959987640380859 nb_pixel_total : 6651 time to create 1 rle with old method : 0.0076751708984375 time for calcul the mask position with numpy : 0.006112575531005859 nb_pixel_total : 6897 time to create 1 rle with old method : 0.008426904678344727 time for calcul the mask position with numpy : 0.006536722183227539 nb_pixel_total : 6223 time to create 1 rle with old method : 0.007292270660400391 create new chi : 0.10855603218078613 time to delete rle : 0.0003814697265625 batch 1 Loaded 9 chid ids of type : 3594 +++++Number RLEs to save : 1904 TO DO : save crop sub photo not yet done ! save time : 0.15481042861938477 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.06795620918273926 time for calcul the mask position with numpy : 0.18309903144836426 nb_pixel_total : 2054435 time to create 1 rle with new method : 0.06850814819335938 time for calcul the mask position with numpy : 0.005836009979248047 nb_pixel_total : 5796 time to create 1 rle with old method : 0.006425619125366211 time for calcul the mask position with numpy : 0.006017208099365234 nb_pixel_total : 4365 time to create 1 rle with old method : 0.005102872848510742 time for calcul the mask position with numpy : 0.005893707275390625 nb_pixel_total : 5593 time to create 1 rle with old method : 0.0061435699462890625 time for calcul the mask position with numpy : 0.005711078643798828 nb_pixel_total : 3411 time to create 1 rle with old method : 0.0040285587310791016 create new chi : 0.3004941940307617 time to delete rle : 0.000301361083984375 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 1760 TO DO : save crop sub photo not yet done ! save time : 0.15723180770874023 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.21210265159606934 time for calcul the mask position with numpy : 0.18620967864990234 nb_pixel_total : 2038692 time to create 1 rle with new method : 0.0962982177734375 time for calcul the mask position with numpy : 0.006465911865234375 nb_pixel_total : 7074 time to create 1 rle with old method : 0.008406639099121094 time for calcul the mask position with numpy : 0.006600856781005859 nb_pixel_total : 8128 time to create 1 rle with old method : 0.009431838989257812 time for calcul the mask position with numpy : 0.006097555160522461 nb_pixel_total : 16338 time to create 1 rle with old method : 0.0187985897064209 time for calcul the mask position with numpy : 0.006185770034790039 nb_pixel_total : 3368 time to create 1 rle with old method : 0.004050493240356445 create new chi : 0.359525203704834 time to delete rle : 0.0003807544708251953 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 2110 TO DO : save crop sub photo not yet done ! save time : 0.17179059982299805 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.06921076774597168 time for calcul the mask position with numpy : 0.1853175163269043 nb_pixel_total : 2045619 time to create 1 rle with new method : 0.13224077224731445 time for calcul the mask position with numpy : 0.006410121917724609 nb_pixel_total : 12306 time to create 1 rle with old method : 0.013855695724487305 time for calcul the mask position with numpy : 0.006109476089477539 nb_pixel_total : 2722 time to create 1 rle with old method : 0.003274679183959961 time for calcul the mask position with numpy : 0.006035804748535156 nb_pixel_total : 10313 time to create 1 rle with old method : 0.011609554290771484 time for calcul the mask position with numpy : 0.005822420120239258 nb_pixel_total : 2640 time to create 1 rle with old method : 0.0030803680419921875 create new chi : 0.38469791412353516 time to delete rle : 0.0004413127899169922 batch 1 Loaded 9 chid ids of type : 3594 +++++Number RLEs to save : 2218 TO DO : save crop sub photo not yet done ! save time : 0.16934466361999512 No data in photo_id : 1382036755 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.04330158233642578 time for calcul the mask position with numpy : 0.020287513732910156 nb_pixel_total : 2061590 time to create 1 rle with new method : 0.18845605850219727 time for calcul the mask position with numpy : 0.006282806396484375 nb_pixel_total : 9412 time to create 1 rle with old method : 0.010654449462890625 time for calcul the mask position with numpy : 0.0057981014251708984 nb_pixel_total : 2598 time to create 1 rle with old method : 0.0028412342071533203 create new chi : 0.2411813735961914 time to delete rle : 0.0002434253692626953 batch 1 Loaded 5 chid ids of type : 3594 +++Number RLEs to save : 1554 TO DO : save crop sub photo not yet done ! save time : 0.1358339786529541 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.05926823616027832 time for calcul the mask position with numpy : 0.21967005729675293 nb_pixel_total : 2065770 time to create 1 rle with new method : 0.1639549732208252 time for calcul the mask position with numpy : 0.0068242549896240234 nb_pixel_total : 7830 time to create 1 rle with old method : 0.008890628814697266 create new chi : 0.4102780818939209 time to delete rle : 0.00023221969604492188 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1436 TO DO : save crop sub photo not yet done ! save time : 0.1204988956451416 nb_obj : 6 nb_hashtags : 3 time to prepare the origin masks : 0.3556187152862549 time for calcul the mask position with numpy : 0.15290212631225586 nb_pixel_total : 1920701 time to create 1 rle with new method : 0.2041761875152588 time for calcul the mask position with numpy : 0.005911350250244141 nb_pixel_total : 9286 time to create 1 rle with old method : 0.010555505752563477 time for calcul the mask position with numpy : 0.005789518356323242 nb_pixel_total : 4975 time to create 1 rle with old method : 0.005606174468994141 time for calcul the mask position with numpy : 0.006210803985595703 nb_pixel_total : 109122 time to create 1 rle with old method : 0.11875295639038086 time for calcul the mask position with numpy : 0.00583338737487793 nb_pixel_total : 18188 time to create 1 rle with old method : 0.021244287490844727 time for calcul the mask position with numpy : 0.006372213363647461 nb_pixel_total : 4834 time to create 1 rle with old method : 0.005923748016357422 time for calcul the mask position with numpy : 0.006264686584472656 nb_pixel_total : 6494 time to create 1 rle with old method : 0.007823705673217773 create new chi : 0.5731525421142578 time to delete rle : 0.0006670951843261719 batch 1 Loaded 13 chid ids of type : 3594 ++++++Number RLEs to save : 3646 TO DO : save crop sub photo not yet done ! save time : 0.2695882320404053 nb_obj : 5 nb_hashtags : 2 time to prepare the origin masks : 0.07231497764587402 time for calcul the mask position with numpy : 0.020006895065307617 nb_pixel_total : 2037517 time to create 1 rle with new method : 0.0803980827331543 time for calcul the mask position with numpy : 0.005935192108154297 nb_pixel_total : 4846 time to create 1 rle with old method : 0.0054781436920166016 time for calcul the mask position with numpy : 0.00584864616394043 nb_pixel_total : 18222 time to create 1 rle with old method : 0.020511865615844727 time for calcul the mask position with numpy : 0.005970001220703125 nb_pixel_total : 1437 time to create 1 rle with old method : 0.001653432846069336 time for calcul the mask position with numpy : 0.005715131759643555 nb_pixel_total : 4896 time to create 1 rle with old method : 0.0055348873138427734 time for calcul the mask position with numpy : 0.006470918655395508 nb_pixel_total : 6682 time to create 1 rle with old method : 0.007570028305053711 create new chi : 0.17859101295471191 time to delete rle : 0.00033545494079589844 batch 1 Loaded 11 chid ids of type : 3594 +++++Number RLEs to save : 2120 TO DO : save crop sub photo not yet done ! save time : 0.163468599319458 nb_obj : 8 nb_hashtags : 3 time to prepare the origin masks : 0.25173258781433105 time for calcul the mask position with numpy : 0.1615283489227295 nb_pixel_total : 1931003 time to create 1 rle with new method : 0.07501387596130371 time for calcul the mask position with numpy : 0.005664825439453125 nb_pixel_total : 3998 time to create 1 rle with old method : 0.004221677780151367 time for calcul the mask position with numpy : 0.005919456481933594 nb_pixel_total : 12104 time to create 1 rle with old method : 0.013229846954345703 time for calcul the mask position with numpy : 0.005561351776123047 nb_pixel_total : 2432 time to create 1 rle with old method : 0.002627849578857422 time for calcul the mask position with numpy : 0.0054972171783447266 nb_pixel_total : 4213 time to create 1 rle with old method : 0.00466465950012207 time for calcul the mask position with numpy : 0.0059452056884765625 nb_pixel_total : 100764 time to create 1 rle with old method : 0.10899543762207031 time for calcul the mask position with numpy : 0.005872488021850586 nb_pixel_total : 8172 time to create 1 rle with old method : 0.008784770965576172 time for calcul the mask position with numpy : 0.005705833435058594 nb_pixel_total : 1575 time to create 1 rle with old method : 0.0016946792602539062 time for calcul the mask position with numpy : 0.0059947967529296875 nb_pixel_total : 9339 time to create 1 rle with old method : 0.00993490219116211 create new chi : 0.4467782974243164 time to delete rle : 0.0005271434783935547 batch 1 Loaded 17 chid ids of type : 3594 ++++++++Number RLEs to save : 3258 TO DO : save crop sub photo not yet done ! save time : 0.23766303062438965 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.05354619026184082 time for calcul the mask position with numpy : 0.017620563507080078 nb_pixel_total : 1937553 time to create 1 rle with new method : 0.14843225479125977 time for calcul the mask position with numpy : 0.006077766418457031 nb_pixel_total : 11545 time to create 1 rle with old method : 0.012845754623413086 time for calcul the mask position with numpy : 0.0065157413482666016 nb_pixel_total : 109001 time to create 1 rle with old method : 0.12145733833312988 time for calcul the mask position with numpy : 0.005719900131225586 nb_pixel_total : 5338 time to create 1 rle with old method : 0.005915403366088867 time for calcul the mask position with numpy : 0.005811929702758789 nb_pixel_total : 10163 time to create 1 rle with old method : 0.011301994323730469 create new chi : 0.34212708473205566 time to delete rle : 0.0002658367156982422 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 2732 TO DO : save crop sub photo not yet done ! save time : 0.21177387237548828 nb_obj : 3 nb_hashtags : 3 time to prepare the origin masks : 0.20686841011047363 time for calcul the mask position with numpy : 0.13264751434326172 nb_pixel_total : 1960401 time to create 1 rle with new method : 0.16559100151062012 time for calcul the mask position with numpy : 0.006125688552856445 nb_pixel_total : 3843 time to create 1 rle with old method : 0.004288911819458008 time for calcul the mask position with numpy : 0.006768941879272461 nb_pixel_total : 103318 time to create 1 rle with old method : 0.1121218204498291 time for calcul the mask position with numpy : 0.00610041618347168 nb_pixel_total : 6038 time to create 1 rle with old method : 0.006827831268310547 create new chi : 0.4504857063293457 time to delete rle : 0.00043511390686035156 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 2522 TO DO : save crop sub photo not yet done ! save time : 0.20644855499267578 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.07363367080688477 time for calcul the mask position with numpy : 0.21081852912902832 nb_pixel_total : 1945919 time to create 1 rle with new method : 0.08859777450561523 time for calcul the mask position with numpy : 0.006392955780029297 nb_pixel_total : 10402 time to create 1 rle with old method : 0.012528419494628906 time for calcul the mask position with numpy : 0.005926609039306641 nb_pixel_total : 1985 time to create 1 rle with old method : 0.0023682117462158203 time for calcul the mask position with numpy : 0.0065305233001708984 nb_pixel_total : 109228 time to create 1 rle with old method : 0.12677526473999023 time for calcul the mask position with numpy : 0.006048679351806641 nb_pixel_total : 6066 time to create 1 rle with old method : 0.006943464279174805 create new chi : 0.48390913009643555 time to delete rle : 0.00028586387634277344 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 2710 TO DO : save crop sub photo not yet done ! save time : 0.21299004554748535 nb_obj : 7 nb_hashtags : 5 time to prepare the origin masks : 0.33563899993896484 time for calcul the mask position with numpy : 0.053227901458740234 nb_pixel_total : 1937544 time to create 1 rle with new method : 0.0976712703704834 time for calcul the mask position with numpy : 0.005989551544189453 nb_pixel_total : 13948 time to create 1 rle with old method : 0.015360832214355469 time for calcul the mask position with numpy : 0.005700111389160156 nb_pixel_total : 3726 time to create 1 rle with old method : 0.004195213317871094 time for calcul the mask position with numpy : 0.006000995635986328 nb_pixel_total : 95537 time to create 1 rle with old method : 0.11440634727478027 time for calcul the mask position with numpy : 0.007361888885498047 nb_pixel_total : 10013 time to create 1 rle with old method : 0.01194453239440918 time for calcul the mask position with numpy : 0.007754087448120117 nb_pixel_total : 3426 time to create 1 rle with old method : 0.0040493011474609375 time for calcul the mask position with numpy : 0.007033586502075195 nb_pixel_total : 6068 time to create 1 rle with old method : 0.007185697555541992 time for calcul the mask position with numpy : 0.0069904327392578125 nb_pixel_total : 3338 time to create 1 rle with old method : 0.0040857791900634766 create new chi : 0.3696248531341553 time to delete rle : 0.0006392002105712891 batch 1 Loaded 15 chid ids of type : 3594 +++++++++Number RLEs to save : 3396 TO DO : save crop sub photo not yet done ! save time : 0.24023199081420898 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.06109619140625 time for calcul the mask position with numpy : 0.018143177032470703 nb_pixel_total : 1945701 time to create 1 rle with new method : 0.028383493423461914 time for calcul the mask position with numpy : 0.005915164947509766 nb_pixel_total : 402 time to create 1 rle with old method : 0.00048732757568359375 time for calcul the mask position with numpy : 0.005902767181396484 nb_pixel_total : 17957 time to create 1 rle with old method : 0.0224764347076416 time for calcul the mask position with numpy : 0.007970333099365234 nb_pixel_total : 104652 time to create 1 rle with old method : 0.11871719360351562 time for calcul the mask position with numpy : 0.006158590316772461 nb_pixel_total : 4888 time to create 1 rle with old method : 0.0059278011322021484 create new chi : 0.2205512523651123 time to delete rle : 0.0002846717834472656 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 2816 TO DO : save crop sub photo not yet done ! save time : 0.2094898223876953 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.033560752868652344 time for calcul the mask position with numpy : 0.023411273956298828 nb_pixel_total : 1982263 time to create 1 rle with new method : 0.03136730194091797 time for calcul the mask position with numpy : 0.006382942199707031 nb_pixel_total : 91337 time to create 1 rle with old method : 0.1022789478302002 create new chi : 0.16374707221984863 time to delete rle : 0.00029921531677246094 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2056 TO DO : save crop sub photo not yet done ! save time : 0.16399860382080078 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.04595804214477539 time for calcul the mask position with numpy : 0.01966094970703125 nb_pixel_total : 2058550 time to create 1 rle with new method : 0.31132078170776367 time for calcul the mask position with numpy : 0.005766391754150391 nb_pixel_total : 15050 time to create 1 rle with old method : 0.017189979553222656 create new chi : 0.354189395904541 time to delete rle : 0.0005087852478027344 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1398 TO DO : save crop sub photo not yet done ! save time : 0.12231802940368652 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.08722972869873047 time for calcul the mask position with numpy : 0.18120813369750977 nb_pixel_total : 2069993 time to create 1 rle with new method : 0.24571919441223145 time for calcul the mask position with numpy : 0.006144523620605469 nb_pixel_total : 3607 time to create 1 rle with old method : 0.004178524017333984 create new chi : 0.44778966903686523 time to delete rle : 0.00022554397583007812 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1194 TO DO : save crop sub photo not yet done ! save time : 0.12030339241027832 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.03924989700317383 time for calcul the mask position with numpy : 0.05435919761657715 nb_pixel_total : 1960088 time to create 1 rle with new method : 0.192734956741333 time for calcul the mask position with numpy : 0.0062372684478759766 nb_pixel_total : 113512 time to create 1 rle with old method : 0.12438631057739258 create new chi : 0.38868117332458496 time to delete rle : 0.00032019615173339844 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2180 TO DO : save crop sub photo not yet done ! save time : 0.16923141479492188 nb_obj : 5 nb_hashtags : 2 time to prepare the origin masks : 0.4056050777435303 time for calcul the mask position with numpy : 0.011150360107421875 nb_pixel_total : 666760 time to create 1 rle with new method : 0.07741498947143555 time for calcul the mask position with numpy : 0.007153749465942383 nb_pixel_total : 262188 time to create 1 rle with new method : 0.19266414642333984 time for calcul the mask position with numpy : 0.013841629028320312 nb_pixel_total : 1099222 time to create 1 rle with new method : 0.04861140251159668 time for calcul the mask position with numpy : 0.006248950958251953 nb_pixel_total : 8781 time to create 1 rle with old method : 0.009876728057861328 time for calcul the mask position with numpy : 0.006974220275878906 nb_pixel_total : 33381 time to create 1 rle with old method : 0.0373384952545166 time for calcul the mask position with numpy : 0.00591588020324707 nb_pixel_total : 3268 time to create 1 rle with old method : 0.0036194324493408203 create new chi : 0.4346027374267578 time to delete rle : 0.0009911060333251953 batch 1 Loaded 11 chid ids of type : 3594 ++++++++++Number RLEs to save : 6396 TO DO : save crop sub photo not yet done ! save time : 0.41657400131225586 map_output_result : {1382037342: (0.0, 'Should be the crop_list due to order', 0), 1382037287: (0.0, 'Should be the crop_list due to order', 0), 1382037250: (0.0, 'Should be the crop_list due to order', 0), 1382037218: (0.0, 'Should be the crop_list due to order', 0), 1382037193: (0.0, 'Should be the crop_list due to order', 0), 1382036794: (0.0, 'Should be the crop_list due to order', 0), 1382036755: (0.0, 'Should be the crop_list due to order', 0.0), 1382036729: (0.0, 'Should be the crop_list due to order', 0), 1382036694: (0.0, 'Should be the crop_list due to order', 0), 1382036666: (0.0, 'Should be the crop_list due to order', 0), 1382036636: (0.0, 'Should be the crop_list due to order', 0), 1382036438: (0.0, 'Should be the crop_list due to order', 0), 1382036416: (0.0, 'Should be the crop_list due to order', 0), 1382036390: (0.0, 'Should be the crop_list due to order', 0), 1382036357: (0.0, 'Should be the crop_list due to order', 0), 1382036330: (0.0, 'Should be the crop_list due to order', 0), 1382036305: (0.0, 'Should be the crop_list due to order', 0), 1382035956: (0.0, 'Should be the crop_list due to order', 0), 1382035932: (0.0, 'Should be the crop_list due to order', 0), 1382035900: (0.0, 'Should be the crop_list due to order', 0), 1382035862: (0.0, 'Should be the crop_list due to order', 0), 1382035831: (0.0, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1382037342, 1382037287, 1382037250, 1382037218, 1382037193, 1382036794, 1382036755, 1382036729, 1382036694, 1382036666, 1382036636, 1382036438, 1382036416, 1382036390, 1382036357, 1382036330, 1382036305, 1382035956, 1382035932, 1382035900, 1382035862, 1382035831] Looping around the photos to save general results len do output : 22 /1382037342.Didn't retrieve data . /1382037287.Didn't retrieve data . /1382037250.Didn't retrieve data . /1382037218.Didn't retrieve data . /1382037193.Didn't retrieve data . /1382036794.Didn't retrieve data . /1382036755.Didn't retrieve data . /1382036729.Didn't retrieve data . /1382036694.Didn't retrieve data . /1382036666.Didn't retrieve data . /1382036636.Didn't retrieve data . /1382036438.Didn't retrieve data . /1382036416.Didn't retrieve data . /1382036390.Didn't retrieve data . /1382036357.Didn't retrieve data . /1382036330.Didn't retrieve data . /1382036305.Didn't retrieve data . /1382035956.Didn't retrieve data . /1382035932.Didn't retrieve data . /1382035900.Didn't retrieve data . /1382035862.Didn't retrieve data . /1382035831.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037342', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037287', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037250', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037218', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037193', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036794', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036755', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036729', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036694', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036666', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036636', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036438', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036416', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036390', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036357', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036330', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036305', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035956', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035932', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035900', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035862', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035831', None, None, None, None, None, '3674434') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 66 time used for this insertion : 0.015990018844604492 save_final save missing photos in datou_result : time spend for datou_step_exec : 16.530805110931396 time spend to save output : 0.016687870025634766 total time spend for step 3 : 16.54749298095703 step4:ventilate_hashtags_in_portfolio Sat Sep 6 14:22:05 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 26607883 get user id for portfolio 26607883 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26607883 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('autre','pehd','papier','background','flou','pet_fonce','pet_clair','carton','environnement','mal_croppe','metal')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26607883 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('autre','pehd','papier','background','flou','pet_fonce','pet_clair','carton','environnement','mal_croppe','metal')) AND mptpi.`min_score`=0.5 To do To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26607883 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('autre','pehd','papier','background','flou','pet_fonce','pet_clair','carton','environnement','mal_croppe','metal')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://marlene.fotonower.com/velours/26609187,26609188,26609189,26609190,26609191,26609192,26609193,26609194,26609195,26609196,26609197?tags=autre,pehd,papier,background,flou,pet_fonce,pet_clair,carton,environnement,mal_croppe,metal Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1382037342, 1382037287, 1382037250, 1382037218, 1382037193, 1382036794, 1382036755, 1382036729, 1382036694, 1382036666, 1382036636, 1382036438, 1382036416, 1382036390, 1382036357, 1382036330, 1382036305, 1382035956, 1382035932, 1382035900, 1382035862, 1382035831] Looping around the photos to save general results len do output : 1 /26607883. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037342', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037287', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037250', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037218', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037193', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036794', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036755', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036729', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036694', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036666', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036636', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036438', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036416', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036390', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036357', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036330', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036305', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035956', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035932', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035900', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035862', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035831', None, None, None, None, None, '3674434') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 23 time used for this insertion : 0.015984535217285156 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.6963820457458496 time spend to save output : 0.01630854606628418 total time spend for step 4 : 0.7126905918121338 step5:final Sat Sep 6 14:22:06 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1382037342: ('0.08088690726711562',), 1382037287: ('0.08088690726711562',), 1382037250: ('0.08088690726711562',), 1382037218: ('0.08088690726711562',), 1382037193: ('0.08088690726711562',), 1382036794: ('0.08088690726711562',), 1382036755: ('0.08088690726711562',), 1382036729: ('0.08088690726711562',), 1382036694: ('0.08088690726711562',), 1382036666: ('0.08088690726711562',), 1382036636: ('0.08088690726711562',), 1382036438: ('0.08088690726711562',), 1382036416: ('0.08088690726711562',), 1382036390: ('0.08088690726711562',), 1382036357: ('0.08088690726711562',), 1382036330: ('0.08088690726711562',), 1382036305: ('0.08088690726711562',), 1382035956: ('0.08088690726711562',), 1382035932: ('0.08088690726711562',), 1382035900: ('0.08088690726711562',), 1382035862: ('0.08088690726711562',), 1382035831: ('0.08088690726711562',)} new output for save of step final : {1382037342: ('0.08088690726711562',), 1382037287: ('0.08088690726711562',), 1382037250: ('0.08088690726711562',), 1382037218: ('0.08088690726711562',), 1382037193: ('0.08088690726711562',), 1382036794: ('0.08088690726711562',), 1382036755: ('0.08088690726711562',), 1382036729: ('0.08088690726711562',), 1382036694: ('0.08088690726711562',), 1382036666: ('0.08088690726711562',), 1382036636: ('0.08088690726711562',), 1382036438: ('0.08088690726711562',), 1382036416: ('0.08088690726711562',), 1382036390: ('0.08088690726711562',), 1382036357: ('0.08088690726711562',), 1382036330: ('0.08088690726711562',), 1382036305: ('0.08088690726711562',), 1382035956: ('0.08088690726711562',), 1382035932: ('0.08088690726711562',), 1382035900: ('0.08088690726711562',), 1382035862: ('0.08088690726711562',), 1382035831: ('0.08088690726711562',)} [1382037342, 1382037287, 1382037250, 1382037218, 1382037193, 1382036794, 1382036755, 1382036729, 1382036694, 1382036666, 1382036636, 1382036438, 1382036416, 1382036390, 1382036357, 1382036330, 1382036305, 1382035956, 1382035932, 1382035900, 1382035862, 1382035831] Looping around the photos to save general results len do output : 22 /1382037342.Didn't retrieve data . /1382037287.Didn't retrieve data . /1382037250.Didn't retrieve data . /1382037218.Didn't retrieve data . /1382037193.Didn't retrieve data . /1382036794.Didn't retrieve data . /1382036755.Didn't retrieve data . /1382036729.Didn't retrieve data . /1382036694.Didn't retrieve data . /1382036666.Didn't retrieve data . /1382036636.Didn't retrieve data . /1382036438.Didn't retrieve data . /1382036416.Didn't retrieve data . /1382036390.Didn't retrieve data . /1382036357.Didn't retrieve data . /1382036330.Didn't retrieve data . /1382036305.Didn't retrieve data . /1382035956.Didn't retrieve data . /1382035932.Didn't retrieve data . /1382035900.Didn't retrieve data . /1382035862.Didn't retrieve data . /1382035831.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037342', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037287', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037250', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037218', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037193', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036794', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036755', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036729', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036694', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036666', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036636', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036438', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036416', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036390', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036357', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036330', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036305', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035956', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035932', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035900', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035862', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035831', None, None, None, None, None, '3674434') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 66 time used for this insertion : 0.017726421356201172 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.11993241310119629 time spend to save output : 0.018636465072631836 total time spend for step 5 : 0.13856887817382812 step6:blur_detection Sat Sep 6 14:22:06 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47.jpg resize: (1080, 1920) 1382037342 -4.1135810188401285 treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0.jpg resize: (1080, 1920) 1382037287 0.04129409836808059 treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02.jpg resize: (1080, 1920) 1382037250 -3.3354788064698346 treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f.jpg resize: (1080, 1920) 1382037218 -0.28209322470427284 treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504.jpg resize: (1080, 1920) 1382037193 1.2072178442668562 treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83.jpg resize: (1080, 1920) 1382036794 -0.0755807206003832 treat image : temp/1757161229_3775605_1382036755_bd6e593bbb80aae8eaddfc5ad4c9a8eb.jpg resize: (1080, 1920) 1382036755 1.4239599860585983 treat image : temp/1757161229_3775605_1382036729_f5734c724e66510039d7009339b63b2d.jpg resize: (1080, 1920) 1382036729 0.25552133895419904 treat image : temp/1757161229_3775605_1382036694_1b8715c21286ede60dbc36a5458c5020.jpg resize: (1080, 1920) 1382036694 -0.9279539274771483 treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c.jpg resize: (1080, 1920) 1382036666 -4.093313529528721 treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2.jpg resize: (1080, 1920) 1382036636 -3.976225583622483 treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a.jpg resize: (1080, 1920) 1382036438 -4.068598736414259 treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824.jpg resize: (1080, 1920) 1382036416 0.9122645627333621 treat image : temp/1757161229_3775605_1382036390_ee88b6eb4317b30f801601230038d150.jpg resize: (1080, 1920) 1382036390 -2.6442380471630127 treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19.jpg resize: (1080, 1920) 1382036357 -4.3061843785866865 treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a.jpg resize: (1080, 1920) 1382036330 0.25185376230483425 treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c.jpg resize: (1080, 1920) 1382036305 2.853306565447556 treat image : temp/1757161229_3775605_1382035956_d736044742c300d5bf23bf6c6755ef5b.jpg resize: (1080, 1920) 1382035956 -0.13470861502898313 treat image : temp/1757161229_3775605_1382035932_5dcec588c29869fd34f03184190f6258.jpg resize: (1080, 1920) 1382035932 -0.3824098816063131 treat image : temp/1757161229_3775605_1382035900_dbd4c95e0a7f7652c06cbd823ff0bef8.jpg resize: (1080, 1920) 1382035900 1.0160854465480342 treat image : temp/1757161229_3775605_1382035862_bb038e71b471a0c227032caf4459f956.jpg resize: (1080, 1920) 1382035862 -4.149637530614191 treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f.jpg resize: (1080, 1920) 1382035831 2.7101507174497383 treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072079_0.png resize: (128, 100) 1382068795 -2.0316553421141967 treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072080_0.png resize: (86, 98) 1382068796 0.7010729983658567 treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072081_0.png resize: (171, 190) 1382068797 -2.1042652037444927 treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072082_0.png resize: (44, 49) 1382068798 0.8256398900064252 treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072091_0.png resize: (201, 184) 1382068799 -2.213936410435515 treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02_rle_crop_3948072092_0.png resize: (121, 100) 1382068800 -1.9911893780438223 treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02_rle_crop_3948072093_0.png resize: (79, 130) 1382068801 -3.0417934006787064 treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02_rle_crop_3948072094_0.png resize: (115, 95) 1382068802 -3.5974340462377836 treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02_rle_crop_3948072095_0.png resize: (90, 86) 1382068803 -1.736123082893472 treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f_rle_crop_3948072096_0.png resize: (69, 61) 1382068804 2.1797951304127183 treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f_rle_crop_3948072097_0.png resize: (94, 97) 1382068805 -3.0610979877715323 treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f_rle_crop_3948072098_0.png resize: (84, 95) 1382068806 -1.859436772353869 treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504_rle_crop_3948072100_0.png resize: (78, 65) 1382068807 -1.266916227650277 treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504_rle_crop_3948072101_0.png resize: (147, 193) 1382068808 -2.022830860461438 treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504_rle_crop_3948072102_0.png resize: (165, 94) 1382068809 -1.9474216332659964 treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83_rle_crop_3948072104_0.png resize: (69, 52) 1382068810 3.561348256469405 treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83_rle_crop_3948072106_0.png resize: (49, 78) 1382068811 0.9742961391116005 treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83_rle_crop_3948072107_0.png resize: (244, 95) 1382068812 -1.991246096487652 treat image : temp/1757161229_3775605_1382036729_f5734c724e66510039d7009339b63b2d_rle_crop_3948072109_0.png resize: (126, 182) 1382068813 -2.594352630979745 treat image : temp/1757161229_3775605_1382036694_1b8715c21286ede60dbc36a5458c5020_rle_crop_3948072110_0.png resize: (170, 83) 1382068814 -0.593621647974844 treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072111_0.png resize: (127, 96) 1382068815 -1.9661443863365549 treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072112_0.png resize: (87, 101) 1382068816 1.1984140117927011 treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072113_0.png resize: (173, 187) 1382068817 -2.104837742980923 treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072117_0.png resize: (132, 97) 1382068818 -2.2543647538916467 treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072118_0.png resize: (88, 101) 1382068819 0.34538935511665086 treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072119_0.png resize: (41, 48) 1382068820 0.9737359461330208 treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072120_0.png resize: (170, 191) 1382068821 -2.092742475961032 treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072122_0.png resize: (124, 136) 1382068822 -1.775379276021056 treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072123_0.png resize: (44, 40) 1382068823 3.400641371917735 treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072124_0.png resize: (76, 167) 1382068824 -2.825602878346974 treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072127_0.png resize: (50, 59) 1382068825 0.24729355008067316 treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072128_0.png resize: (141, 171) 1382068826 -2.0620138650744995 treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072129_0.png resize: (66, 81) 1382068827 -1.5605541080142085 treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824_rle_crop_3948072130_0.png resize: (112, 153) 1382068828 -0.8157469942469264 treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824_rle_crop_3948072133_0.png resize: (114, 192) 1382068829 0.4816122896400871 treat image : temp/1757161229_3775605_1382036390_ee88b6eb4317b30f801601230038d150_rle_crop_3948072134_0.png resize: (124, 95) 1382068830 -1.3585252883896786 treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19_rle_crop_3948072137_0.png resize: (125, 92) 1382068831 -2.544223712343959 treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19_rle_crop_3948072139_0.png resize: (48, 51) 1382068832 5.905635744788032 treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072142_0.png resize: (123, 101) 1382068833 -1.7608084180310932 treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072143_0.png resize: (59, 88) 1382068834 -0.7051022236348883 treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072147_0.png resize: (138, 169) 1382068835 -2.645487731580006 treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c_rle_crop_3948072148_0.png resize: (127, 66) 1382068836 -1.5676755300190042 treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c_rle_crop_3948072151_0.png resize: (64, 49) 1382068837 -2.3140509263290676 treat image : temp/1757161229_3775605_1382035932_5dcec588c29869fd34f03184190f6258_rle_crop_3948072153_0.png resize: (157, 132) 1382068838 -0.7465481441494815 treat image : temp/1757161229_3775605_1382035900_dbd4c95e0a7f7652c06cbd823ff0bef8_rle_crop_3948072154_0.png resize: (56, 83) 1382068839 3.1230609511998635 treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072156_0.png resize: (70, 74) 1382068840 -1.7036562569735532 treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072084_0.png resize: (514, 349) 1382068851 0.5135081832129715 treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072086_0.png resize: (503, 290) 1382068852 -0.048665216271131295 treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072087_0.png resize: (93, 113) 1382068853 -0.9850583908377449 treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072088_0.png resize: (151, 172) 1382068854 -1.959830538244736 treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072089_0.png resize: (92, 89) 1382068855 -0.6920810886621266 treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072090_0.png resize: (908, 974) 1382068856 0.45805672947426457 treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f_rle_crop_3948072099_0.png resize: (85, 90) 1382068857 -0.8323097366412144 treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504_rle_crop_3948072103_0.png resize: (95, 94) 1382068858 0.2672034078788143 treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072116_0.png resize: (517, 360) 1382068859 -0.39090422760688975 treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072126_0.png resize: (86, 97) 1382068860 -2.2143849272439615 treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824_rle_crop_3948072131_0.png resize: (80, 115) 1382068861 -2.3517329610736812 treat image : temp/1757161229_3775605_1382036390_ee88b6eb4317b30f801601230038d150_rle_crop_3948072136_0.png resize: (82, 62) 1382068862 0.942494157603464 treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072141_0.png resize: (86, 68) 1382068863 -1.4524890467154092 treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c_rle_crop_3948072149_0.png resize: (493, 338) 1382068864 0.24264368941274486 treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c_rle_crop_3948072150_0.png resize: (176, 147) 1382068865 -1.3212718802116319 treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072146_0.png resize: (58, 83) 1382068867 -1.9690575027898656 treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072083_0.png resize: (50, 123) 1382068874 -2.010470946052937 treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072085_0.png resize: (132, 133) 1382068875 -3.1841352182322216 treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83_rle_crop_3948072105_0.png resize: (177, 117) 1382068876 -1.2933745680874487 treat image : temp/1757161229_3775605_1382036729_f5734c724e66510039d7009339b63b2d_rle_crop_3948072108_0.png resize: (54, 65) 1382068877 0.08313520546529288 treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072114_0.png resize: (490, 322) 1382068878 0.5381083098172564 treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072115_0.png resize: (57, 123) 1382068879 -2.5281266533475084 treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072121_0.png resize: (54, 119) 1382068880 -2.218045375996397 treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072125_0.png resize: (495, 331) 1382068881 0.1821305466079658 treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824_rle_crop_3948072132_0.png resize: (511, 336) 1382068882 -0.01929049864241386 treat image : temp/1757161229_3775605_1382036390_ee88b6eb4317b30f801601230038d150_rle_crop_3948072135_0.png resize: (509, 328) 1382068883 0.303684067229433 treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19_rle_crop_3948072138_0.png resize: (500, 343) 1382068884 0.20602975265791876 treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072145_0.png resize: (474, 322) 1382068885 -0.030467185826862992 treat image : temp/1757161229_3775605_1382035956_d736044742c300d5bf23bf6c6755ef5b_rle_crop_3948072152_0.png resize: (483, 328) 1382068886 -0.2703242556886488 treat image : temp/1757161229_3775605_1382035862_bb038e71b471a0c227032caf4459f956_rle_crop_3948072155_0.png resize: (538, 372) 1382068887 0.13512307238505777 treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072157_0.png resize: (246, 237) 1382068888 -0.6657876172440891 treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072158_0.png resize: (132, 104) 1382068889 -0.8513626120841875 treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072159_0.png resize: (982, 1495) 1382068890 0.5790388825993686 treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072160_0.png resize: (855, 918) 1382068891 -0.5356496424373346 treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19_rle_crop_3948072140_0.png resize: (140, 100) 1382068892 -4.2874148791158735 treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072144_0.png resize: (128, 110) 1382068893 -3.3448072640333284 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 104 time used for this insertion : 0.01758432388305664 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 104 time used for this insertion : 0.028110027313232422 save missing photos in datou_result : time spend for datou_step_exec : 18.179991722106934 time spend to save output : 0.05029606819152832 total time spend for step 6 : 18.230287790298462 step7:brightness Sat Sep 6 14:22:24 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47.jpg treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0.jpg treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02.jpg treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f.jpg treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504.jpg treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83.jpg treat image : temp/1757161229_3775605_1382036755_bd6e593bbb80aae8eaddfc5ad4c9a8eb.jpg treat image : temp/1757161229_3775605_1382036729_f5734c724e66510039d7009339b63b2d.jpg treat image : temp/1757161229_3775605_1382036694_1b8715c21286ede60dbc36a5458c5020.jpg treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c.jpg treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2.jpg treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a.jpg treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824.jpg treat image : temp/1757161229_3775605_1382036390_ee88b6eb4317b30f801601230038d150.jpg treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19.jpg treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a.jpg treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c.jpg treat image : temp/1757161229_3775605_1382035956_d736044742c300d5bf23bf6c6755ef5b.jpg treat image : temp/1757161229_3775605_1382035932_5dcec588c29869fd34f03184190f6258.jpg treat image : temp/1757161229_3775605_1382035900_dbd4c95e0a7f7652c06cbd823ff0bef8.jpg treat image : temp/1757161229_3775605_1382035862_bb038e71b471a0c227032caf4459f956.jpg treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f.jpg treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072079_0.png treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072080_0.png treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072081_0.png treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072082_0.png treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072091_0.png treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02_rle_crop_3948072092_0.png treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02_rle_crop_3948072093_0.png treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02_rle_crop_3948072094_0.png treat image : temp/1757161229_3775605_1382037250_08863206205f41af48c714ee103cfc02_rle_crop_3948072095_0.png treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f_rle_crop_3948072096_0.png treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f_rle_crop_3948072097_0.png treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f_rle_crop_3948072098_0.png treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504_rle_crop_3948072100_0.png treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504_rle_crop_3948072101_0.png treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504_rle_crop_3948072102_0.png treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83_rle_crop_3948072104_0.png treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83_rle_crop_3948072106_0.png treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83_rle_crop_3948072107_0.png treat image : temp/1757161229_3775605_1382036729_f5734c724e66510039d7009339b63b2d_rle_crop_3948072109_0.png treat image : temp/1757161229_3775605_1382036694_1b8715c21286ede60dbc36a5458c5020_rle_crop_3948072110_0.png treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072111_0.png treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072112_0.png treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072113_0.png treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072117_0.png treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072118_0.png treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072119_0.png treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072120_0.png treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072122_0.png treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072123_0.png treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072124_0.png treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072127_0.png treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072128_0.png treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072129_0.png treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824_rle_crop_3948072130_0.png treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824_rle_crop_3948072133_0.png treat image : temp/1757161229_3775605_1382036390_ee88b6eb4317b30f801601230038d150_rle_crop_3948072134_0.png treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19_rle_crop_3948072137_0.png treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19_rle_crop_3948072139_0.png treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072142_0.png treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072143_0.png treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072147_0.png treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c_rle_crop_3948072148_0.png treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c_rle_crop_3948072151_0.png treat image : temp/1757161229_3775605_1382035932_5dcec588c29869fd34f03184190f6258_rle_crop_3948072153_0.png treat image : temp/1757161229_3775605_1382035900_dbd4c95e0a7f7652c06cbd823ff0bef8_rle_crop_3948072154_0.png treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072156_0.png treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072084_0.png treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072086_0.png treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072087_0.png treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072088_0.png treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072089_0.png treat image : temp/1757161229_3775605_1382037287_d98c13cc7371cdcea9969ccdb6aca6c0_rle_crop_3948072090_0.png treat image : temp/1757161229_3775605_1382037218_afb1c6897c94cecc27c15f0e2560887f_rle_crop_3948072099_0.png treat image : temp/1757161229_3775605_1382037193_a22f081c8de215b49483c881f193b504_rle_crop_3948072103_0.png treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072116_0.png treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072126_0.png treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824_rle_crop_3948072131_0.png treat image : temp/1757161229_3775605_1382036390_ee88b6eb4317b30f801601230038d150_rle_crop_3948072136_0.png treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072141_0.png treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c_rle_crop_3948072149_0.png treat image : temp/1757161229_3775605_1382036305_be210d675162097aa8502fe5cb7ec39c_rle_crop_3948072150_0.png treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072146_0.png treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072083_0.png treat image : temp/1757161229_3775605_1382037342_daa2c57480b83fafc45cc01e09d96c47_rle_crop_3948072085_0.png treat image : temp/1757161229_3775605_1382036794_268d3d0e6297c3bf7aaefe08679eed83_rle_crop_3948072105_0.png treat image : temp/1757161229_3775605_1382036729_f5734c724e66510039d7009339b63b2d_rle_crop_3948072108_0.png treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072114_0.png treat image : temp/1757161229_3775605_1382036666_90d4f30d2c7b96270a76c48f97fe918c_rle_crop_3948072115_0.png treat image : temp/1757161229_3775605_1382036636_c895ec6da8a5a2b4bb115741f102cea2_rle_crop_3948072121_0.png treat image : temp/1757161229_3775605_1382036438_9feead0a315d33b4aa099764639c5f7a_rle_crop_3948072125_0.png treat image : temp/1757161229_3775605_1382036416_84a840d8bd18c5bad089a60d9a896824_rle_crop_3948072132_0.png treat image : temp/1757161229_3775605_1382036390_ee88b6eb4317b30f801601230038d150_rle_crop_3948072135_0.png treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19_rle_crop_3948072138_0.png treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072145_0.png treat image : temp/1757161229_3775605_1382035956_d736044742c300d5bf23bf6c6755ef5b_rle_crop_3948072152_0.png treat image : temp/1757161229_3775605_1382035862_bb038e71b471a0c227032caf4459f956_rle_crop_3948072155_0.png treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072157_0.png treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072158_0.png treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072159_0.png treat image : temp/1757161229_3775605_1382035831_658d0d111f642bdc1fee129467684f9f_rle_crop_3948072160_0.png treat image : temp/1757161229_3775605_1382036357_05e7e1b7bb9209b2c96127b1bc346b19_rle_crop_3948072140_0.png treat image : temp/1757161229_3775605_1382036330_1cf58c52641d3aba0c7c02355164888a_rle_crop_3948072144_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 104 time used for this insertion : 0.018910646438598633 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 104 time used for this insertion : 0.0276033878326416 save missing photos in datou_result : time spend for datou_step_exec : 5.930551528930664 time spend to save output : 0.05187511444091797 total time spend for step 7 : 5.982426643371582 step8:velours_tree Sat Sep 6 14:22:30 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.10094952583312988 time spend to save output : 5.3882598876953125e-05 total time spend for step 8 : 0.10100340843200684 step9:send_mail_cod Sat Sep 6 14:22:30 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P26607883_06-09-2025_14_22_30.pdf 26609187 change filename to text .change filename to text .imagette266091871757161350 26609188 imagette266091881757161350 26609189 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette266091891757161350 26609190 imagette266091901757161351 26609191 imagette266091911757161351 26609192 imagette266091921757161351 26609193 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette266091931757161351 26609194 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette266091941757161353 26609196 imagette266091961757161354 26609197 change filename to text .imagette266091971757161354 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=26607883 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://marlene.fotonower.com/velours/26609187,26609188,26609189,26609190,26609191,26609192,26609193,26609194,26609195,26609196,26609197?tags=autre,pehd,papier,background,flou,pet_fonce,pet_clair,carton,environnement,mal_croppe,metal args[1382037342] : ((1382037342, -4.1135810188401285, 492609224), (1382037342, 0.4712724138981418, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382037287] : ((1382037287, 0.04129409836808059, 492688767), (1382037287, 0.30128478071022907, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382037250] : ((1382037250, -3.3354788064698346, 492609224), (1382037250, 0.2288216516445613, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382037218] : ((1382037218, -0.28209322470427284, 492688767), (1382037218, 0.2809808478056659, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382037193] : ((1382037193, 1.2072178442668562, 492688767), (1382037193, 0.1779083956265327, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036794] : ((1382036794, -0.0755807206003832, 492688767), (1382036794, 0.6628307900867884, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036755] : ((1382036755, 1.4239599860585983, 492688767), (1382036755, 0.40747840112053274, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036729] : ((1382036729, 0.25552133895419904, 492688767), (1382036729, 0.36867202219897016, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036694] : ((1382036694, -0.9279539274771483, 492688767), (1382036694, 1.2701444911480104, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036666] : ((1382036666, -4.093313529528721, 492609224), (1382036666, 0.5185860126398477, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036636] : ((1382036636, -3.976225583622483, 492609224), (1382036636, 0.25259088782661654, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036438] : ((1382036438, -4.068598736414259, 492609224), (1382036438, 0.39264089779688577, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036416] : ((1382036416, 0.9122645627333621, 492688767), (1382036416, 0.23488107315830972, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036390] : ((1382036390, -2.6442380471630127, 492609224), (1382036390, 0.5057446450781591, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036357] : ((1382036357, -4.3061843785866865, 492609224), (1382036357, 0.3214335736392752, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036330] : ((1382036330, 0.25185376230483425, 492688767), (1382036330, 0.39366900424031953, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382036305] : ((1382036305, 2.853306565447556, 492688767), (1382036305, 0.47861958245908337, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382035956] : ((1382035956, -0.13470861502898313, 492688767), (1382035956, 0.816583753896488, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382035932] : ((1382035932, -0.3824098816063131, 492688767), (1382035932, 1.0209926026134735, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382035900] : ((1382035900, 1.0160854465480342, 492688767), (1382035900, 0.2556823037045139, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382035862] : ((1382035862, -4.149637530614191, 492609224), (1382035862, 0.3500879716666347, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com args[1382035831] : ((1382035831, 2.7101507174497383, 492688767), (1382035831, 0.6621347828717121, 2107752395), '0.08088690726711562') We are sending mail with results at report@fotonower.com refus_total : 0.08088690726711562 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=26607883 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26607883_06-09-2025_14_22_30.pdf results_Auto_P26607883_06-09-2025_14_22_30.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26607883_06-09-2025_14_22_30.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','26607883','results_Auto_P26607883_06-09-2025_14_22_30.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26607883_06-09-2025_14_22_30.pdf','pdf','','0.6','0.08088690726711562') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/26607883

https://www.fotonower.com/image?json=false&list_photos_id=1382037342
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382037287
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382037250
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382037218
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382037193
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.2072178442668562)
https://www.fotonower.com/image?json=false&list_photos_id=1382036794
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036755
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.4239599860585983)
https://www.fotonower.com/image?json=false&list_photos_id=1382036729
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036694
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036666
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036636
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036438
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036416
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036390
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036357
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036330
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382036305
La photo est trop floue, merci de reprendre une photo.(avec le score = 2.853306565447556)
https://www.fotonower.com/image?json=false&list_photos_id=1382035956
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382035932
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382035900
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.0160854465480342)
https://www.fotonower.com/image?json=false&list_photos_id=1382035862
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1382035831
La photo est trop floue, merci de reprendre une photo.(avec le score = 2.7101507174497383)

Dans ces conditions,le taux de refus est: 8.09%
Veuillez trouver les photos des contaminants.

exemples de contaminants: autre: https://www.fotonower.com/view/26609187?limit=200
exemples de contaminants: papier: https://www.fotonower.com/view/26609189?limit=200
exemples de contaminants: pet_clair: https://www.fotonower.com/view/26609193?limit=200
exemples de contaminants: carton: https://www.fotonower.com/view/26609194?limit=200
exemples de contaminants: metal: https://www.fotonower.com/view/26609197?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26607883_06-09-2025_14_22_30.pdf.

Lien vers velours :https://marlene.fotonower.com/velours/26609187,26609188,26609189,26609190,26609191,26609192,26609193,26609194,26609195,26609196,26609197?tags=autre,pehd,papier,background,flou,pet_fonce,pet_clair,carton,environnement,mal_croppe,metal.


L'équipe Fotonower 202 b'' Server: nginx Date: Sat, 06 Sep 2025 12:22:36 GMT Content-Length: 0 Connection: close X-Message-Id: XHXkU3BMTMeGubpsdyFTdA Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1382037342, 1382037287, 1382037250, 1382037218, 1382037193, 1382036794, 1382036755, 1382036729, 1382036694, 1382036666, 1382036636, 1382036438, 1382036416, 1382036390, 1382036357, 1382036330, 1382036305, 1382035956, 1382035932, 1382035900, 1382035862, 1382035831] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037342', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037287', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037250', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037218', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037193', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036794', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036755', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036729', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036694', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036666', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036636', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036438', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036416', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036390', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036357', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036330', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036305', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035956', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035932', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035900', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035862', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035831', None, None, None, None, None, '3674434') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 22 time used for this insertion : 0.018843889236450195 save_final save missing photos in datou_result : time spend for datou_step_exec : 5.732496738433838 time spend to save output : 0.01916337013244629 total time spend for step 9 : 5.751660108566284 step10:split_time_score Sat Sep 6 14:22:36 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('12', 22),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 06092025 26607883 Nombre de photos uploadées : 22 / 23040 (0%) 06092025 26607883 Nombre de photos taguées (types de déchets): 0 / 22 (0%) 06092025 26607883 Nombre de photos taguées (volume) : 0 / 22 (0%) elapsed_time : load_data_split_time_score 2.6226043701171875e-06 elapsed_time : order_list_meta_photo_and_scores 6.67572021484375e-06 ?????????????????????? elapsed_time : fill_and_build_computed_from_old_data 0.0010116100311279297 Catched exception ! Connect or reconnect ! Catched exception ! Connect or reconnect ! elapsed_time : insert_dashboard_record_day_entry 0.20222258567810059 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.21615461033950617 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26601766_06-09-2025_07_41_58.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26601766 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26601766 AND mptpi.`type`=3594 To do Qualite : 0.20595384837962966 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26604828_06-09-2025_10_51_43.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26604828 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26604828 AND mptpi.`type`=3594 To do Qualite : 0.1469003986625515 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26604829_06-09-2025_11_01_39.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26604829 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26604829 AND mptpi.`type`=3594 To do Qualite : 0.16525622477445395 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26604830_06-09-2025_10_41_53.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26604830 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26604830 AND mptpi.`type`=3594 To do Qualite : 0.05311668113425926 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26604831_06-09-2025_10_31_08.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26604831 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26604831 AND mptpi.`type`=3594 To do Qualite : 0.08088690726711562 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26607883_06-09-2025_14_22_30.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26607883 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26607883 AND mptpi.`type`=3594 To do Qualite : 0.021496002657750337 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26608746_06-09-2025_14_11_25.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26608746 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26608746 AND mptpi.`type`=3594 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'06092025': {'nb_upload': 22, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1382037342, 1382037287, 1382037250, 1382037218, 1382037193, 1382036794, 1382036755, 1382036729, 1382036694, 1382036666, 1382036636, 1382036438, 1382036416, 1382036390, 1382036357, 1382036330, 1382036305, 1382035956, 1382035932, 1382035900, 1382035862, 1382035831] Looping around the photos to save general results len do output : 1 /26607883Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037342', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037287', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037250', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037218', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382037193', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036794', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036755', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036729', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036694', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036666', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036636', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036438', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036416', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036390', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036357', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036330', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382036305', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035956', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035932', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035900', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035862', None, None, None, None, None, '3674434') ('3318', None, None, None, None, None, None, None, '3674434') ('3318', '26607883', '1382035831', None, None, None, None, None, '3674434') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 23 time used for this insertion : 0.016857385635375977 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.7638664245605469 time spend to save output : 0.017186641693115234 total time spend for step 10 : 0.7810530662536621 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 22 set_done_treatment 72.25user 31.39system 2:11.27elapsed 78%CPU (0avgtext+0avgdata 3180864maxresident)k 249056inputs+42584outputs (15major+2433074minor)pagefaults 0swaps