python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120
import MySQLdb succeeded
Import error (python version)
['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 3774618
load datou : 3318
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case.
Rather than keeping that dependence, it is better to keep an order compatible with the step ids when a step has no sons, hence a lexical order : (number_son, step_id)
All sons are already in current list ! (message repeated 9 times)
DONE and to test : checkNoCycle !
Here we check that the number of inputs/outputs matches between the ones given and the ones in the DB !
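The reordering described above (emit a step only once all its parents are placed, breaking ties with the lexical key (number_son, step_id)) can be sketched as follows. This is a minimal illustration under assumed names: `reorder_steps` and the `parents_by_step` graph shape are hypothetical, not the actual Velours code.

```python
# Hypothetical sketch of the step reordering in the log: repeatedly emit a
# step whose parents are all already emitted, breaking ties with the lexical
# key (number_of_sons, step_id). Raises on a cycle, like checkNoCycle.

def reorder_steps(parents_by_step):
    """parents_by_step: {step_id: set of parent step_ids}."""
    # count the sons of each step (how many steps depend on it)
    sons_count = {}
    for step, parents in parents_by_step.items():
        for p in parents:
            sons_count[p] = sons_count.get(p, 0) + 1
    ordered, placed = [], set()
    remaining = set(parents_by_step)
    while remaining:
        ready = [s for s in remaining if parents_by_step[s] <= placed]
        if not ready:  # no step is ready: the dependency graph has a cycle
            raise ValueError("cycle detected among steps: %r" % sorted(remaining))
        # lexical order (number_son, step_id), as described in the log
        nxt = min(ready, key=lambda s: (sons_count.get(s, 0), s))
        ordered.append(nxt)
        placed.add(nxt)
        remaining.remove(nxt)
    return ordered
```

On a diamond graph `{1: set(), 2: {1}, 3: {1}, 4: {2, 3}}` this yields `[1, 2, 3, 4]`, and a two-step cycle raises.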
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
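The count check reported above tolerates fewer used than defined (optional inputs, unused outputs) but flags more used than defined as a WARNING. A minimal sketch, assuming a hypothetical `check_nb_io` helper (not the actual checkConsistencyNbInputNbOutput code):

```python
# Hypothetical sketch of the input/output count check: returns the message
# the log would emit, or None when the counts match exactly.

def check_nb_io(step_id, step_name, nb_used, nb_defined, kind):
    """kind is 'input' or 'output'."""
    if nb_used > nb_defined:
        # more used than defined is a real inconsistency
        return ("WARNING : number of %ss for step %d %s is not consistent : "
                "%d used against %d in the step definition !"
                % (kind, step_id, step_name, nb_used, nb_defined))
    if nb_used < nb_defined:
        # fewer used than defined is tolerated, with a hint
        hint = ("maybe we manage optional inputs" if kind == "input"
                else "some outputs may not be used")
        return ("Step %d %s has fewer %ss used (%d) than in the step "
                "definition (%d) : %s !"
                % (step_id, step_name, kind, nb_used, nb_defined, hint))
    return None
```

For example, `check_nb_io(7928, "mask_detect", 3, 2, "output")` produces the WARNING seen in the log, while `(1, 2, "input")` only produces the optional-inputs hint.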
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0)
Tried to parse :
"photo path" was removed, should we ?
"(photo_id, hashtag_id, score_max)" was removed, should we ?
"[(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...]" was removed, should we ?
"photo path" was removed, should we ?
"[ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...]" was removed, should we ?
"photo path" was removed, should we ?
"photo id (can be local or global)" was removed, should we ?
"photo path" was removed, should we ?
"(x0, y0, x1, y1)" was removed, should we ?
"photo path" was removed, should we ?
"data in text form" was removed, should we ?
"[ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...]" was removed, should we ?
"None" was removed, should we ?
"data in text form" was removed, should we ?
"(photo_id, hashtag_id, score_max)" was removed, should we ?
"photo id (can be local or global)" was removed, should we ?
"data in text form" was removed, should we ?
"data in text form" was removed, should we ?
"data in text form" was removed, should we ?
"photo path" was removed, should we ?
"(photo_id, hashtag_id, score_max)" was removed, should we ?
"photo path" was removed, should we ?
"(photo_id, hashtag_id, score_max)" was removed, should we ?
"None" was removed, should we ?
"data in number form" was removed, should we ?
"(photo_id, hashtag_id, score_max)" was removed, should we ?
"(photo_id, hashtag_id, score_max)" was removed, should we ?
"(photo_id, hashtag_id, score_max)" was removed, should we ?
"(photo_id, hashtag_id, score_max)" was removed, should we ?
"(photo_id, hashtag_id, score_max)" was removed, should we ?
"data in text form" was removed, should we ?
"None" was removed, should we ?
"data in text form" was removed, should we ?
"[ptf_id0,ptf_id1...]" was removed, should we ?
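The "can't parse json string" message above comes from handing a non-JSON value to the JSON parser and falling back instead of aborting the job. A tolerant wrapper can be sketched like this; `parse_json_or_warn` is a hypothetical name, not the actual Velours helper:

```python
import json

# Hypothetical sketch of the tolerant parse behind the log message above:
# try json.loads, and on failure report the parser error plus the offending
# value, then return a default instead of raising.

def parse_json_or_warn(raw, default=None):
    try:
        return json.loads(raw)
    except (TypeError, ValueError) as exc:
        # json.loads raises ValueError (JSONDecodeError) on bad strings
        # and TypeError on non-string input such as None
        print("ERROR or WARNING : can't parse json string", exc)
        print("Tried to parse :", raw)
        return default
```

An empty string triggers exactly the "Expecting value: line 1 column 1 (char 0)" error seen in the log.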
FOUND : 1
Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39))
load thcls
load THCL from format json or kwargs
add thcl : 2847 in CacheModelConfig
load pdts
add pdt : 5275 in CacheModelConfig
Running datou job : batch_current
TODO : datou_current to load ; maybe to take outside batchDatouExec
updating current state to 1
list_input_json: []
Current got : datou_id : 3318, datou_cur_ids : ['3730515'] with mtr_portfolio_ids : ['26911018'] and first list_photo_ids : []
new path : /proc/3774618/
Inside batchDatouExec : verbose : 0
List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score
over limit max, limiting to limit_max 40
list_input_json : []
origin : We have 1
BFBFBFBFBF
we have 0 photos missing in the step downloads : photo missing : []
try to delete the photos missing in DB
length of list_filenames : 5 ; length of list_pids : 5 ; length of list_args : 5
time to download the photos : 0.9414958953857422
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
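The "over limit max, limiting to limit_max 40" and download-timing messages above can be sketched as a clamp-then-timed-loop; `download_batch` and `fetch_photo` are hypothetical stand-ins for the real downloader, shown only to illustrate the batching:

```python
import time

# Hypothetical sketch of the photo batching in the log: clamp the photo id
# list to limit_max, then time the download loop.

def download_batch(list_photo_ids, fetch_photo, limit_max=40):
    """fetch_photo(photo_id) -> local filename (assumed downloader)."""
    if len(list_photo_ids) > limit_max:
        print("over limit max, limiting to limit_max", limit_max)
        list_photo_ids = list_photo_ids[:limit_max]
    t0 = time.time()
    list_filenames = [fetch_photo(pid) for pid in list_photo_ids]
    print("time to download the photos :", time.time() - t0)
    return list_filenames
```

With 50 ids and limit_max=40, only the first 40 are fetched, matching the clamping behaviour the log reports.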
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 10
step1:mask_detect Tue Sep 16 21:20:30 2025
VR 17-11-17 : for now, only for linear exec dependency trees; some outputs go to fill the inputs of the next step
VR 22-3-18 : we now test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage the first-step case here instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 6807
max_wait_temp : 1 max_wait : 0 gpu_flag : 0
2025-09-16 21:20:33.175665: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2025-09-16 21:20:33.200706: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz
2025-09-16 21:20:33.202731: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f35a4000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2025-09-16 21:20:33.202783: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2025-09-16 21:20:33.206433: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2025-09-16 21:20:33.350967: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x33036d70 initialized for platform CUDA (this does not guarantee that XLA will be used).
Devices:
2025-09-16 21:20:33.351013: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5
2025-09-16 21:20:33.352224: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-09-16 21:20:33.352604: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-09-16 21:20:33.355294: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-09-16 21:20:33.357728: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-09-16 21:20:33.358175: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-09-16 21:20:33.361073: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-09-16 21:20:33.362125: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-09-16 21:20:33.365962: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-09-16 21:20:33.367144: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-09-16 21:20:33.367211: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-09-16 21:20:33.367822: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-09-16 21:20:33.367836: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-09-16 21:20:33.367845: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-09-16 21:20:33.368972: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6256 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead.
(device-discovery and dynamic-library messages repeated twice more for the second session)
2025-09-16 21:20:33.660822: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6256 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0,
compute capability: 7.5)
Using TensorFlow backend.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version.
Instructions for updating: box_ind is deprecated, use box_indices instead
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
Inside mask_sub_process
Inside mask_detect
About to load cache.load_thcl_param
To do loadFromThcl(), then load ParamDescType : thcl2847
thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}]
thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}
Update svm_hashtag_type_desc : 5275
FOUND : 1
Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39))
{'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0}
list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement']
Configurations:
BACKBONE                       resnet101
BACKBONE_SHAPES                [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]]
BACKBONE_STRIDES               [4, 8, 16, 32, 64]
BATCH_SIZE                     1
BBOX_STD_DEV                   [0.1 0.1 0.2 0.2]
DETECTION_MAX_INSTANCES        100
DETECTION_MIN_CONFIDENCE       0.3
DETECTION_NMS_THRESHOLD        0.3
GPU_COUNT                      1
IMAGES_PER_GPU                 1
IMAGE_MAX_DIM                  640
IMAGE_MIN_DIM                  640
IMAGE_PADDING                  True
IMAGE_SHAPE                    [640 640 3]
LEARNING_MOMENTUM              0.9
LEARNING_RATE                  0.001
LOSS_WEIGHTS                   {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0}
MASK_POOL_SIZE                 14
MASK_SHAPE                     [28, 28]
MAX_GT_INSTANCES               100
MEAN_PIXEL                     [123.7 116.8 103.9]
MINI_MASK_SHAPE                (56, 56)
NAME                           learn_RUBBIA_REFUS_AMIENS_23
NUM_CLASSES                    9
POOL_SIZE                      7
POST_NMS_ROIS_INFERENCE        1000
POST_NMS_ROIS_TRAINING         2000
ROI_POSITIVE_RATIO             0.33
RPN_ANCHOR_RATIOS              [0.5, 1, 2]
RPN_ANCHOR_SCALES              (16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE              1
RPN_BBOX_STD_DEV               [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD              0.7
RPN_TRAIN_ANCHORS_PER_IMAGE    256
STEPS_PER_EPOCH                1000
TRAIN_ROIS_PER_IMAGE           200
USE_MINI_MASK                  True
USE_RPN_ROIS                   True
VALIDATION_STEPS               50
WEIGHT_DECAY                   0.0001
model_param file didn't exist
model_name : learn_RUBBIA_REFUS_AMIENS_23
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files existing in s3 : ['mask_model.h5']
files missing in s3 : []
2025-09-16 21:20:43.517190: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-09-16 21:20:43.800915: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-09-16 21:20:45.951179: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-16 21:20:45.980692: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.60G (3865470464 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-16 21:20:45.981446: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.24G (3478923264 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-16 21:20:45.982373: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.92G (3131030784 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-16 21:20:45.983230: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.62G (2817927680 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-16 21:20:45.983931: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.36G (2536134912 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-16 21:20:45.984643: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.12G (2282521344 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-16 21:20:45.984703: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with
freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-16 21:20:45.985498: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-16 21:20:45.985525: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-16 21:20:46.006402: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 466.56MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-16 21:20:46.065020: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 243.25MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-16 21:20:46.118971: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
(same "failed to allocate 4.00G ... CUDA_ERROR_OUT_OF_MEMORY" message repeated through 21:20:46.315557, where the captured log ends)
of memory 2025-09-16 21:20:46.329897: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.330550: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.331128: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.331705: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.338121: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.338712: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.344978: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.345652: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.358593: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.359197: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.363468: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 
21:20:46.364114: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.364772: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.365454: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.366381: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.367013: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.378748: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.379346: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.379914: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.380458: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.380986: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.381511: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.382034: I 
tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.382559: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.391990: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.392619: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.398792: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.399322: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.475628: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.476230: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.485214: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.485802: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.511566: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.512314: I 
tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.513151: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.513791: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.518613: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.519185: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.519754: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.520315: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.522426: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-09-16 21:20:46.522452: W tensorflow/core/kernels/gpu_utils.cc:49] Failed to allocate memory for convolution redzone checking; skipping this check. This is benign and only means that we won't check cudnn for out-of-bounds reads and writes. This message will only be printed once. 
[... 8 more cuda_driver.cc:763 "failed to allocate 4.00G" lines between 21:20:46.531802 and 21:20:46.545810 ...]
local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23
/data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5
size_local : 256009536 size in s3 : 256009536
create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04
mask_model.h5 already exists and does not need updating
list_images length : 5
NEW PHOTO
Processing 1 images
image shape: (2160, 3840, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 17) min: 0.00000 max: 3840.00000
number of objects found : 29
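The weight-file check above (the local copy's size matches the size in s3, so the download is skipped) can be sketched as follows. `needs_update` and its signature are illustrative names, not the script's actual API, and this assumes the decision rests on size alone since the log ignores the timestamp mismatch:

```python
def needs_update(size_local, size_remote):
    """Re-download the model file only when the local copy's size
    differs from the remote one (None means the local file is missing)."""
    return size_local is None or size_local != size_remote

# As in the log above: both sizes are 256009536, so no update is needed
# even though the s3 create time (2021-08-06) predates the local one.
print(needs_update(256009536, 256009536))  # False: keep the local file
print(needs_update(None, 256009536))       # True: local file missing
```

Size-only comparison is cheap but cannot detect a same-size content change; a checksum comparison would be stricter at the cost of reading both files.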
[... the same NEW PHOTO block repeats for the 4 remaining images, with identical shapes and 21, 29, 16 and 25 objects found respectively ...]
Detection mask done !
Trying to reset tf kernel
3775270 begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 644
tf kernel not reset
sub process len(results) : 5 len(list_Values) 0
None
max_time_sub_proc : 3600
parent process len(results) : 5 len(list_Values) 0
process is alive
finished correctly or not : True
after detect
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 1837
list_Values should be empty []
To do loadFromThcl(), then load ParamDescType : thcl2847
Caught exception ! Connect or reconnect !
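The "free memory gpu now" figures above (644 MiB before the subprocess exits, 1837 MiB after) typically come from querying `nvidia-smi`. A hedged sketch of the parsing half of such a check; the script's actual query is not shown, and `parse_free_mib` is an illustrative helper assuming the `--query-gpu=memory.free --format=csv,noheader,nounits` output shape (one integer per GPU, one line each):

```python
def parse_free_mib(smi_output):
    """Parse nvidia-smi's noheader/nounits memory.free output:
    one integer (MiB of free memory) per GPU line."""
    return [int(line.strip()) for line in smi_output.splitlines() if line.strip()]

# With one GPU reporting 644 MiB free, as in the log above:
print(parse_free_mib("644\n"))  # [644]
```

In the real script this would wrap something like `subprocess.run(["nvidia-smi", ...])`; the jump from 644 to 1837 MiB free shows the detection subprocess releasing its allocation on exit, which is why the "tf kernel reset" is attempted in a child process.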
thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}]
thcl : (the single record above)
Update svm_hashtag_type_desc : 5275
['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement']
time to compute the mask position with numpy : 0.0011723041534423828 nb_pixel_total : 18127 time to create 1 rle with old method : 0.02541518211364746 length of segment : 215
[... the same "mask position / rle" timing entry repeats once per detected mask: old-method times range from about 0.009 s to 0.156 s and grow with nb_pixel_total, while new-method times stay between about 0.008 s and 0.048 s even on the largest masks (7783 to 417156 pixels) ...]
time to compute the mask position with numpy : 0.0008788108825683594 nb_pixel_total : 25112 time to create 1 rle with old method : 0.028388023376464844 length of segment : 148
time spent for convertir_results : 7.865492820739746
Inside saveOutput : final : False verbose : 0
eke 12-6-18 : saveMask needs to be cleaned for the new output !
Number saved : None
batch 1
Loaded 64 chid ids of type : 3594
Number of RLEs to save : 20063
save missing photos in datou_result :
time spent for datou_step_exec : 54.23674154281616
time spent to save output : 1.210461139678955
total time spent for step 1 : 55.44720268249512
step2:crop_condition Tue Sep 16 21:21:25 2025
VR 17-11-17 : now, only for a linear exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : For now we do not clean the datou structure correctly
Loading chi in step crop with photo_hashtag_type : 3594
Loading chi in step crop for list_pids : 5 !
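The "old method" vs "new method" RLE timings above are two ways of run-length encoding a binary mask. A minimal pure-Python sketch, under stated assumptions: the script's actual RLE layout is not shown, so this encodes a flattened 0/1 mask into (value, run_length) pairs. A per-pixel loop (like the log's old method) scales with nb_pixel_total; grouping whole runs (closer in spirit to the new method) scales with the number of segments instead:

```python
from itertools import groupby

def rle_encode(flat_mask):
    """Run-length encode a flattened binary mask into (value, count)
    pairs, one pair per maximal run of equal values."""
    return [(value, sum(1 for _ in run)) for value, run in groupby(flat_mask)]

mask = [0, 0, 1, 1, 1, 0, 1]
print(rle_encode(mask))  # [(0, 2), (1, 3), (0, 1), (1, 1)]
```

This matches the shape of the log's numbers: a 417156-pixel mask compressing to a "length of segment" of 573 means the encoding cost is dominated by run count, not pixel count, once the per-pixel loop is eliminated.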
batch 1
Loaded 64 chid ids of type : 3594
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
begin to crop the class : papier
param for this class : {'min_score': 0.7}
filter for class : papier
hashtag_id of this class : 492668766
we have both polygon and rles Next one ! [the line repeats ~38 times]
map_result returned by crop_photo_return_map_crop : length : 38
About to insert : list_path_to_insert length 38
new photo from crops !
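Each class is cropped under a per-class parameter dict ({'min_score': 0.7} above). A minimal sketch of that score-threshold filter; the detection dicts, field names and `filter_detections` itself are illustrative, since the script's real data structures are not shown:

```python
def filter_detections(detections, class_name, min_score=0.7):
    """Keep only detections of the given class scoring at least min_score."""
    return [d for d in detections
            if d["class"] == class_name and d["score"] >= min_score]

dets = [
    {"class": "papier", "score": 0.91},
    {"class": "papier", "score": 0.55},   # below min_score: dropped
    {"class": "carton", "score": 0.88},   # wrong class: dropped
]
print(len(filter_detections(dets, "papier")))  # 1
```

A per-class threshold like this lets frequent, easy classes (papier) and rare ones (metal, which yields zero crops below) share one detector while tuning precision independently.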
About to upload 38 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 38 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1758050498_3774618
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !   (repeated 38 times, once per photo)
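The repeated "strat_bulk_insert : ignore_different_from_first This is a hack !" line names a bulk-insert strategy. A hedged sketch of what that strategy could mean, under the assumption that rows whose column set differs from the first row's are dropped before a single multi-row INSERT is built; the function name and row layout are illustrative, not the pipeline's API.

```python
# Sketch (assumption) of an "ignore_different_from_first" bulk-insert
# strategy: rows whose columns differ from the first row's are ignored so
# one multi-row INSERT can be issued. Not the pipeline's real code.

def bulk_insert_ignore_different_from_first(rows):
    """Return (rows matching the first row's column set, number skipped)."""
    if not rows:
        return [], 0
    reference_cols = set(rows[0])
    kept = [r for r in rows if set(r) == reference_cols]
    return kept, len(rows) - len(kept)

rows = [
    {"path": "crop_a.jpg", "portfolio_id": 3736932},
    {"path": "crop_b.jpg", "portfolio_id": 3736932},
    {"path": "crop_c.jpg"},  # column set differs from the first row: skipped
]
kept, skipped = bulk_insert_ignore_different_from_first(rows)
print(len(kept), skipped)  # 2 1
```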
we have uploaded 38 photos in the portfolio 3736932
time to upload the photos : Elapsed time : 9.247769594192505
we have finished the crop for the class : papier
begin to crop the class : carton
param for this class : {'min_score': 0.7}
filter for class : carton
hashtag_id of this class : 492774966
we have both polygon and rles Next one !   (repeated 3 times)
map_result returned by crop_photo_return_map_crop : length : 3
About to insert : list_path_to_insert length 3
new photo from crops !
About to upload 3 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 3 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1758050509_3774618
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !   (repeated 3 times)
we have uploaded 3 photos in the portfolio 3736932
time to upload the photos : Elapsed time : 0.9988713264465332
we have finished the crop for the class : carton
begin to crop the class : metal
param for this class : {'min_score': 0.7}
filter for class : metal
hashtag_id of this class : 492628673
begin to crop the class : pet_clair
param for this class : {'min_score': 0.7}
filter for class : pet_clair
hashtag_id of this class : 2107755846
we have both polygon and rles Next one !   (repeated 22 times)
map_result returned by crop_photo_return_map_crop : length : 22
About to insert : list_path_to_insert length 22
new photo from crops !
About to upload 22 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 22 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1758050520_3774618
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !   (repeated 22 times)
we have uploaded 22 photos in the portfolio 3736932
time to upload the photos : Elapsed time : 6.861158847808838
we have finished the crop for the class : pet_clair
begin to crop the class : autre
param for this class : {'min_score': 0.7}
filter for class : autre
hashtag_id of this class : 494826614
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 1
About to insert : list_path_to_insert length 1
new photo from crops !
About to upload 1 photo
upload in portfolio : 3736932
init cache_photo without model_param
we have 1 photo to upload
uploaded to storage server : ovh
folder_temporaire : temp/1758050528_3774618
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !
we have uploaded 1 photo in the portfolio 3736932
time to upload the photos : Elapsed time : 0.5026137828826904
we have finished the crop for the class : autre
begin to crop the class : pehd
param for this class : {'min_score': 0.7}
filter for class : pehd
hashtag_id of this class : 628944319
begin to crop the class : pet_fonce
param for this class : {'min_score': 0.7}
filter for class : pet_fonce
hashtag_id of this class : 2107755900
delete rles from all chi
we have 0 chi objects containing the rles   (repeated 5 times)
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : crop_condition, we use saveGeneral
[1383919936, 1383919931, 1383919917, 1383919916, 1383919896]
Looping around the photos to save general results, len of output : 64
/1383994971 Didn't retrieve data . Didn't retrieve data . Didn't retrieve data .
(the same triple "Didn't retrieve data ." warning repeats for each of the 64 photo ids, up to /1383995044)
before output type
Here is an output not treated by saveGeneral :   (repeated 3 times)
Managing all output in save_final without adding information in the mtr_datou_result
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919936', None, None, None, None, None, '3730515')
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919931', None, None, None, None, None, '3730515')
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919917', None, None, None, None, None, '3730515')
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919916', None, None, None, None, None, '3730515')
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919896', None, None, None, None, None, '3730515')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 197
time used for this insertion : 0.024172306060791016
save_final : save missing photos in datou_result
time spent for datou_step_exec : 43.19283890724182
time spent to save output : 0.029837846755981445
total time spent for step 2 : 43.2226767539978
step3:rle_unique_nms_with_priority Tue Sep 16 21:22:09 2025
VR 17-11-17 : for now, only for linear exec dependency trees, some output goes to fill the input of the next step
VR 22-3-18 : we now test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases
VR 22-3-18 : but we use the first code path for the first step (id = -1), built in the code of datou_exec
VR 22-3-18 : we should manage here the case where we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be
correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 64 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nb_obj : 13 nb_hashtags : 2 time to prepare the origin masks : 6.53531289100647 time for calcul the mask position with numpy : 0.624384880065918 nb_pixel_total : 7363240 time to create 1 rle with new method : 0.9412410259246826 time for calcul the mask position with numpy : 0.044130563735961914 nb_pixel_total : 13021 time to create 1 rle with old method : 0.014511585235595703 time for calcul the mask position with numpy : 0.041254520416259766 nb_pixel_total : 34622 time to create 1 rle with old method : 0.03863978385925293 time for calcul the mask position with numpy : 0.036675214767456055 nb_pixel_total : 5121 time to create 1 rle with old method : 0.005896806716918945 time for calcul the mask position with numpy : 0.0365757942199707 nb_pixel_total : 59950 time to create 1 rle with old method : 0.0660696029663086 time for calcul the mask position with numpy : 0.039385318756103516 nb_pixel_total : 187804 time to create 1 rle with new method : 0.7926228046417236 time for calcul the mask position with numpy : 0.03955841064453125 nb_pixel_total 
: 23793 time to create 1 rle with old method : 0.026366472244262695 time for calcul the mask position with numpy : 0.027225255966186523 nb_pixel_total : 16018 time to create 1 rle with old method : 0.01779937744140625 time for calcul the mask position with numpy : 0.027578115463256836 nb_pixel_total : 134251 time to create 1 rle with old method : 0.1526350975036621 time for calcul the mask position with numpy : 0.027057886123657227 nb_pixel_total : 270897 time to create 1 rle with new method : 1.1013851165771484 time for calcul the mask position with numpy : 0.02547001838684082 nb_pixel_total : 19207 time to create 1 rle with old method : 0.021249771118164062 time for calcul the mask position with numpy : 0.026507139205932617 nb_pixel_total : 20786 time to create 1 rle with old method : 0.023004531860351562 time for calcul the mask position with numpy : 0.025940418243408203 nb_pixel_total : 127563 time to create 1 rle with old method : 0.1421337127685547 time for calcul the mask position with numpy : 0.0247647762298584 nb_pixel_total : 18127 time to create 1 rle with old method : 0.020230770111083984 create new chi : 4.505941152572632 time to delete rle : 0.029892444610595703 batch 1 Loaded 28 chid ids of type : 3594 +++++++++++++++++Number RLEs to save : 10551 TO DO : save crop sub photo not yet done ! 
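The batches above time "create 1 rle" per mask with an old and a new method. As an illustration only (neither of those implementations is shown in the log), a plain run-length encoding of a binary numpy mask:

```python
import numpy as np

# Illustrative RLE encoding of a binary mask, in the spirit of the
# "time to create 1 rle" log entries. NOT the pipeline's old/new method.

def rle_encode(mask):
    """Encode a 2D binary mask as (start, length) runs of ones over the
    flattened array, column-major as COCO-style RLE readers expect."""
    flat = np.asarray(mask, dtype=np.uint8).flatten(order="F")
    # positions where the value changes, plus both ends
    changes = np.flatnonzero(np.diff(flat)) + 1
    bounds = np.concatenate(([0], changes, [flat.size]))
    return [(int(s), int(e - s))
            for s, e in zip(bounds[:-1], bounds[1:]) if flat[s] == 1]

mask = np.zeros((4, 4), dtype=np.uint8)
mask[1:3, 1:3] = 1
print(rle_encode(mask))  # [(5, 2), (9, 2)]
```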
save time : 0.6309854984283447 nb_obj : 6 nb_hashtags : 3 time to prepare the origin masks : 4.091551065444946 time for calcul the mask position with numpy : 1.8756487369537354 nb_pixel_total : 8093556 time to create 1 rle with new method : 0.7996082305908203 time for calcul the mask position with numpy : 0.02755880355834961 nb_pixel_total : 13532 time to create 1 rle with old method : 0.015191793441772461 time for calcul the mask position with numpy : 0.03040170669555664 nb_pixel_total : 7783 time to create 1 rle with old method : 0.008904695510864258 time for calcul the mask position with numpy : 0.03443002700805664 nb_pixel_total : 63269 time to create 1 rle with old method : 0.07006001472473145 time for calcul the mask position with numpy : 0.043169260025024414 nb_pixel_total : 82235 time to create 1 rle with old method : 0.09279823303222656 time for calcul the mask position with numpy : 0.044768571853637695 nb_pixel_total : 7834 time to create 1 rle with old method : 0.00867152214050293 time for calcul the mask position with numpy : 0.04101872444152832 nb_pixel_total : 26191 time to create 1 rle with old method : 0.03161001205444336 create new chi : 3.173861265182495 time to delete rle : 0.0008807182312011719 batch 1 Loaded 13 chid ids of type : 3594 +++++++++++++++Number RLEs to save : 5404 TO DO : save crop sub photo not yet done ! 
save time : 0.3395717144012451 nb_obj : 21 nb_hashtags : 2 time to prepare the origin masks : 10.61009430885315 time for calcul the mask position with numpy : 0.9344713687896729 nb_pixel_total : 6237590 time to create 1 rle with new method : 1.3601117134094238 time for calcul the mask position with numpy : 0.04156970977783203 nb_pixel_total : 14021 time to create 1 rle with old method : 0.015323162078857422 time for calcul the mask position with numpy : 0.04690265655517578 nb_pixel_total : 61206 time to create 1 rle with old method : 0.06786537170410156 time for calcul the mask position with numpy : 0.04221224784851074 nb_pixel_total : 53331 time to create 1 rle with old method : 0.05853629112243652 time for calcul the mask position with numpy : 0.04101419448852539 nb_pixel_total : 37069 time to create 1 rle with old method : 0.04099154472351074 time for calcul the mask position with numpy : 0.0410308837890625 nb_pixel_total : 97224 time to create 1 rle with old method : 0.10722756385803223 time for calcul the mask position with numpy : 0.041413068771362305 nb_pixel_total : 185076 time to create 1 rle with new method : 1.0686702728271484 time for calcul the mask position with numpy : 0.025896072387695312 nb_pixel_total : 64765 time to create 1 rle with old method : 0.0717778205871582 time for calcul the mask position with numpy : 0.027860641479492188 nb_pixel_total : 22040 time to create 1 rle with old method : 0.02818584442138672 time for calcul the mask position with numpy : 0.03152012825012207 nb_pixel_total : 92348 time to create 1 rle with old method : 0.11373496055603027 time for calcul the mask position with numpy : 0.034279584884643555 nb_pixel_total : 201798 time to create 1 rle with new method : 0.9128150939941406 time for calcul the mask position with numpy : 0.027017831802368164 nb_pixel_total : 23952 time to create 1 rle with old method : 0.026555776596069336 time for calcul the mask position with numpy : 0.04346013069152832 nb_pixel_total : 417087 
time to create 1 rle with new method : 0.5543079376220703 time for calcul the mask position with numpy : 0.041455984115600586 nb_pixel_total : 21432 time to create 1 rle with old method : 0.023758888244628906 time for calcul the mask position with numpy : 0.044388771057128906 nb_pixel_total : 49679 time to create 1 rle with old method : 0.05469775199890137 time for calcul the mask position with numpy : 0.04405641555786133 nb_pixel_total : 246704 time to create 1 rle with new method : 1.0955243110656738 time for calcul the mask position with numpy : 0.042694091796875 nb_pixel_total : 23497 time to create 1 rle with old method : 0.02940821647644043 time for calcul the mask position with numpy : 0.041794776916503906 nb_pixel_total : 62533 time to create 1 rle with old method : 0.09203743934631348 time for calcul the mask position with numpy : 0.044655561447143555 nb_pixel_total : 164457 time to create 1 rle with new method : 0.9019713401794434 time for calcul the mask position with numpy : 0.044936418533325195 nb_pixel_total : 123089 time to create 1 rle with old method : 0.13989043235778809 time for calcul the mask position with numpy : 0.04246330261230469 nb_pixel_total : 41884 time to create 1 rle with old method : 0.055661916732788086 time for calcul the mask position with numpy : 0.029597997665405273 nb_pixel_total : 53618 time to create 1 rle with old method : 0.05944514274597168 create new chi : 8.819539308547974 time to delete rle : 0.0025186538696289062 batch 1 Loaded 43 chid ids of type : 3594 ++++++++++++++++++++++++++++++++Number RLEs to save : 16904 TO DO : save crop sub photo not yet done ! 
save time : 0.9760980606079102 nb_obj : 7 nb_hashtags : 2 time to prepare the origin masks : 4.4923484325408936 time for calcul the mask position with numpy : 0.8434276580810547 nb_pixel_total : 7901684 time to create 1 rle with new method : 0.9627468585968018 time for calcul the mask position with numpy : 0.028143644332885742 nb_pixel_total : 41572 time to create 1 rle with old method : 0.04710125923156738 time for calcul the mask position with numpy : 0.031252384185791016 nb_pixel_total : 56864 time to create 1 rle with old method : 0.08382034301757812 time for calcul the mask position with numpy : 0.0383756160736084 nb_pixel_total : 18755 time to create 1 rle with old method : 0.020052194595336914 time for calcul the mask position with numpy : 0.026333093643188477 nb_pixel_total : 150511 time to create 1 rle with new method : 0.945350170135498 time for calcul the mask position with numpy : 0.02629828453063965 nb_pixel_total : 63044 time to create 1 rle with old method : 0.06970739364624023 time for calcul the mask position with numpy : 0.024834394454956055 nb_pixel_total : 53702 time to create 1 rle with old method : 0.05949211120605469 time for calcul the mask position with numpy : 0.025112390518188477 nb_pixel_total : 8268 time to create 1 rle with old method : 0.009536981582641602 create new chi : 3.3214762210845947 time to delete rle : 0.000896453857421875 batch 1 Loaded 15 chid ids of type : 3594 +++++++++Number RLEs to save : 5372 TO DO : save crop sub photo not yet done ! 
save time : 0.3201274871826172 nb_obj : 16 nb_hashtags : 4 time to prepare the origin masks : 7.696313381195068 time for calcul the mask position with numpy : 0.7141528129577637 nb_pixel_total : 7326184 time to create 1 rle with new method : 0.7902157306671143 time for calcul the mask position with numpy : 0.04135870933532715 nb_pixel_total : 25112 time to create 1 rle with old method : 0.028115510940551758 time for calcul the mask position with numpy : 0.041742801666259766 nb_pixel_total : 52900 time to create 1 rle with old method : 0.062310218811035156 time for calcul the mask position with numpy : 0.04297494888305664 nb_pixel_total : 60533 time to create 1 rle with old method : 0.06955599784851074 time for calcul the mask position with numpy : 0.042495012283325195 nb_pixel_total : 50384 time to create 1 rle with old method : 0.05603218078613281 time for calcul the mask position with numpy : 0.04148149490356445 nb_pixel_total : 18419 time to create 1 rle with old method : 0.02025008201599121 time for calcul the mask position with numpy : 0.04270052909851074 nb_pixel_total : 135992 time to create 1 rle with old method : 0.1495668888092041 time for calcul the mask position with numpy : 0.04771924018859863 nb_pixel_total : 33659 time to create 1 rle with old method : 0.03838014602661133 time for calcul the mask position with numpy : 0.043242454528808594 nb_pixel_total : 100500 time to create 1 rle with old method : 0.1111304759979248 time for calcul the mask position with numpy : 0.04391908645629883 nb_pixel_total : 24680 time to create 1 rle with old method : 0.027833223342895508 time for calcul the mask position with numpy : 0.04268026351928711 nb_pixel_total : 92708 time to create 1 rle with old method : 0.11833715438842773 time for calcul the mask position with numpy : 0.04201650619506836 nb_pixel_total : 19257 time to create 1 rle with old method : 0.021735668182373047 time for calcul the mask position with numpy : 0.04131484031677246 nb_pixel_total : 15621 
time to create 1 rle with old method : 0.017441749572753906 time for calcul the mask position with numpy : 0.04128670692443848 nb_pixel_total : 46559 time to create 1 rle with old method : 0.05169963836669922 time for calcul the mask position with numpy : 0.02559661865234375 nb_pixel_total : 104651 time to create 1 rle with old method : 0.11746978759765625 time for calcul the mask position with numpy : 0.034438133239746094 nb_pixel_total : 133797 time to create 1 rle with old method : 0.16983795166015625 time for calcul the mask position with numpy : 0.026736021041870117 nb_pixel_total : 53444 time to create 1 rle with old method : 0.058763742446899414 create new chi : 3.3111557960510254 time to delete rle : 0.002287626266479492 batch 1 Loaded 33 chid ids of type : 3594 ++++++++++++++++++++Number RLEs to save : 12038 TO DO : save crop sub photo not yet done ! save time : 0.7207934856414795 map_output_result : {1383919936: (0.0, 'Should be the crop_list due to order', 0), 1383919931: (0.0, 'Should be the crop_list due to order', 0), 1383919917: (0.0, 'Should be the crop_list due to order', 0), 1383919916: (0.0, 'Should be the crop_list due to order', 0), 1383919896: (0.0, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1383919936, 1383919931, 1383919917, 1383919916, 1383919896] Looping around the photos to save general results len do output : 5 /1383919936.Didn't retrieve data . /1383919931.Didn't retrieve data . /1383919917.Didn't retrieve data . /1383919916.Didn't retrieve data . /1383919896.Didn't retrieve data . 
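The save_final phase logs tuples like ('3318', '26911018', '1383919936', …) and then "begin to insert list_values into mtr_datou_result" with one timed insertion. A hedged sketch of that pattern with `executemany`; the 9-column table layout below is a stand-in (the real mtr_datou_result schema does not appear in the log), and sqlite3 stands in for MySQLdb so the sketch runs standalone.

```python
import sqlite3
import time

# Sketch of the "begin to insert list_values into mtr_datou_result" phase:
# one executemany over all tuples, timed like "time used for this insertion".
# Column names c1..c9 are stand-ins; the real schema is not in the log.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mtr_datou_result (c1, c2, c3, c4, c5, c6, c7, c8, c9)")

list_values = [
    ("3318", None, None, None, None, None, None, None, "3730515"),
    ("3318", "26911018", "1383919936", None, None, None, None, None, "3730515"),
]

t0 = time.time()
conn.executemany(
    "INSERT INTO mtr_datou_result VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
    list_values,
)
conn.commit()
print("time used for this insertion :", time.time() - t0)
print(conn.execute("SELECT COUNT(*) FROM mtr_datou_result").fetchone()[0])  # 2
```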
before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919936', None, None, None, None, None, '3730515') ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919931', None, None, None, None, None, '3730515') ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919917', None, None, None, None, None, '3730515') ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919916', None, None, None, None, None, '3730515') ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919896', None, None, None, None, None, '3730515') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 15 time used for this insertion : 0.012947320938110352 save_final save missing photos in datou_result : time spend for datou_step_exec : 60.371809244155884 time spend to save output : 0.013442754745483398 total time spend for step 3 : 60.38525199890137 step4:ventilate_hashtags_in_portfolio Tue Sep 16 21:23:09 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the 
dependencies of the different steps are analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! VR 22-3-18 : For now we do not clean the datou structure correctly beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 26911018 get user id for portfolio 26911018 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26911018 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pehd','pet_clair','carton','pet_fonce','papier','metal','flou','mal_croppe','background','autre','environnement')) AND mptpi.`min_score`=0.5 To do To do [the same SELECT is logged a second time] To do Caught exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") [the same 1064 error and reconnect cycle is logged several more times] To do ! Use context local managing function ! [the same SELECT is logged a third time] To do link used in velours : https://marlene.fotonower.com/velours/26911873,26911874,26911875,26911876,26911877,26911878,26911879,26911880,26911881,26911882,26911883?tags=pehd,pet_clair,carton,pet_fonce,papier,metal,flou,mal_croppe,background,autre,environnement Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio, we use saveGeneral [1383919936, 1383919931, 1383919917, 1383919916, 1383919896] Looping over the photos to save general results len do output : 1 /26911018.
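The 1064 error above, raised near `')\n and cspi.crop_hashtag_id = chi.id'`, has the shape MySQL produces when an `IN (...)` list is rendered from an empty Python list, yielding invalid `IN ()` SQL. That is a plausible cause here, not a confirmed one. A minimal guard, with a hypothetical helper name, could look like:

```python
def in_clause(column, values):
    """Render a parameterized SQL IN clause, refusing empty value lists.

    An empty `values` would render `column IN ()`, which MySQL rejects
    with error 1064, the kind of syntax error repeated in the log above.
    """
    if not values:
        # Match nothing instead of emitting invalid SQL.
        return "0=1", []
    placeholders = ", ".join(["%s"] * len(values))
    return f"{column} IN ({placeholders})", list(values)

clause, params = in_clause("cspi.crop_hashtag_id", [3594, 3595])
empty_clause, empty_params = in_clause("cspi.crop_hashtag_id", [])
```

Passing the placeholders plus `params` to the driver (rather than interpolating values into the string) also avoids quoting problems.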
before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result [the same ten tuples as in step 3 are logged again] begin to insert list_values into mtr_datou_result : length of list_values in save_final : 6 time used for this insertion : 0.045886993408203125 save_final save missing photos in datou_result : time spent for datou_step_exec : 1.7843778133392334 time spent to save output : 0.04610943794250488 total time spent for step 4 : 1.8304872512817383 step5:final Tue Sep 16 21:23:11 2025 VR 17-11-17 : now, only for a linear exec dependencies tree, some output goes to fill the input of the next step VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be interpreted correctly with a default behavior Some of the work done at step execution could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean the datou structure correctly Beginning of datou step final ! Caught exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1383919936: ('0.10970645254629634',), 1383919931: ('0.10970645254629634',), 1383919917: ('0.10970645254629634',), 1383919916: ('0.10970645254629634',), 1383919896: ('0.10970645254629634',)} new output for save of step final : [identical to the original output] [1383919936, 1383919931, 1383919917, 1383919916, 1383919896] Looping over the photos to save general results len do output : 5 /1383919936.Didn't retrieve data . /1383919931.Didn't retrieve data . /1383919917.Didn't retrieve data . /1383919916.Didn't retrieve data . /1383919896.Didn't retrieve data .
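The recurring "Caught exception ! Connect or reconnect !" lines suggest a wrapper that catches a database error, reconnects, and retries. A sketch of that pattern with hypothetical callables and a bounded retry count (the log shows the same non-transient 1064 syntax error being retried many times, which a bound avoids):

```python
import time

def run_with_reconnect(execute, reconnect, attempts=3, delay=0.0):
    """Run `execute()`, reconnecting and retrying on failure.

    `execute` and `reconnect` are caller-supplied callables; in the real
    script the exception caught would be the MySQL driver's operational
    error rather than a bare Exception.
    """
    last_error = None
    for _ in range(attempts):
        try:
            return execute()
        except Exception as exc:
            last_error = exc
            print("Caught exception ! Connect or reconnect !")
            reconnect()
            if delay:
                time.sleep(delay)
    raise last_error

# Demo: a query that fails twice before succeeding.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("lost connection")
    return "ok"

result = run_with_reconnect(flaky, reconnect=lambda: None, attempts=5)
```

Distinguishing transient connection errors from programming errors (a 1064 syntax error never succeeds on retry) would avoid the repeated identical failures seen above.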
before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result [the same ten tuples as in step 3 are logged again] begin to insert list_values into mtr_datou_result : length of list_values in save_final : 15 time used for this insertion : 0.012155294418334961 save_final save missing photos in datou_result : time spent for datou_step_exec : 0.13396668434143066 time spent to save output : 0.012724161148071289 total time spent for step 5 : 0.14669084548950195 step6:blur_detection Tue Sep 16 21:23:11 2025 VR 17-11-17 : now, only for a linear exec dependencies tree, some output goes to fill the input of the next step VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be interpreted correctly with a default behavior Some of the work done at step execution could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed We should have
FATAL ERROR but same_nb_input_output==True : this should be an optional input ! VR 22-3-18 : For now we do not clean the datou structure correctly inside step blur_detection method: ratio and variance treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88.jpg resize: (2160, 3840) 1383919936 -6.814282432432434 treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a.jpg resize: (2160, 3840) 1383919931 -6.885385419320041 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4.jpg resize: (2160, 3840) 1383919917 -6.7737824236236595 treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758.jpg resize: (2160, 3840) 1383919916 -6.791287607877103 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e.jpg resize: (2160, 3840) 1383919896 -6.535633999609527 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985987_0.png resize: (215, 160) 1383994971 -2.7199810597333034 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985988_0.png resize: (573, 491) 1383994972 -1.395347367789645 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985989_0.png resize: (191, 181) 1383994973 -2.887536743879958 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985990_0.png resize: (144, 180) 1383994974 -4.250030281445652 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985991_0.png resize: (627, 670) 1383994975 -4.47406587026429 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985993_0.png resize: (186, 125) 1383994976 -3.99745912518359 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985995_0.png resize: (797, 407) 1383994977 -0.05125310080774237 treat image
: temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985996_0.png resize: (180, 195) 1383994978 -2.2436285450857327 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985997_0.png resize: (285, 298) 1383994979 -2.8533980372143564 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985998_0.png resize: (271, 362) 1383994980 -2.7123783611367647 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985999_0.png resize: (227, 248) 1383994981 -3.9098396741796773 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959986000_0.png resize: (141, 130) 1383994982 -4.557182515113395 treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986001_0.png resize: (358, 248) 1383994983 -3.215599065382454 treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986005_0.png resize: (151, 84) 1383994984 -4.197286161122661 treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986006_0.png resize: (126, 180) 1383994985 -2.545716410608826 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986007_0.png resize: (315, 271) 1383994986 -1.2872651561078612 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986008_0.png resize: (297, 225) 1383994987 -2.780077773834467 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986009_0.png resize: (486, 353) 1383994988 -4.163971764983825 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986011_0.png resize: (342, 314) 1383994989 -4.542903903740906 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986012_0.png resize: (125, 
335) 1383994990 -3.1792341203864076 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986014_0.png resize: (270, 263) 1383994991 -4.9425979047669575 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986015_0.png resize: (209, 201) 1383994992 -4.115645367727139 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986018_0.png resize: (727, 479) 1383994993 -4.649682505308297 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986022_0.png resize: (331, 639) 1383994994 -2.1569269479396675 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986026_0.png resize: (301, 244) 1383994995 -3.576434860998987 treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986028_0.png resize: (88, 110) 1383994996 -2.421729572475746 treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986029_0.png resize: (323, 231) 1383994997 0.6267269988485891 treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986032_0.png resize: (70, 355) 1383994998 -4.343201298730042 treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986033_0.png resize: (254, 328) 1383994999 -2.5653573875193145 treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986034_0.png resize: (260, 240) 1383995000 1.37058359501176 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986035_0.png resize: (414, 214) 1383995001 -2.1610460814067256 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986036_0.png resize: (435, 455) 1383995002 -3.7178014331763176 treat image : 
temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986039_0.png resize: (123, 193) 1383995003 -3.8340632614896433 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986040_0.png resize: (122, 263) 1383995004 -1.8996441198110448 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986041_0.png resize: (364, 353) 1383995006 -3.2435093173614473 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986042_0.png resize: (238, 151) 1383995007 -2.2804588125610925 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986046_0.png resize: (155, 165) 1383995008 -0.2265580947645514 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986050_0.png resize: (148, 220) 1383995009 -4.991389311985223 treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986002_0.png resize: (117, 85) 1383995010 -3.1021479980985944 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986037_0.png resize: (530, 283) 1383995011 -3.7880819985951466 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986048_0.png resize: (392, 214) 1383995012 -4.472267429455524 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985992_0.png resize: (467, 498) 1383995021 -4.04433360907433 treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985994_0.png resize: (163, 217) 1383995022 -1.2321481297962198 treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986003_0.png resize: (353, 374) 1383995023 -4.917245304294226 treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986004_0.png resize: (286, 
301) 1383995024 -3.831634475083897 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986010_0.png resize: (498, 431) 1383995025 -4.72063907111195 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986013_0.png resize: (448, 770) 1383995026 -4.2585249446036615 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986016_0.png resize: (543, 991) 1383995027 -5.316263179981912 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986017_0.png resize: (164, 388) 1383995028 -3.1660686812891763 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986019_0.png resize: (464, 221) 1383995029 -5.0586284826055845 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986020_0.png resize: (251, 116) 1383995030 -4.638660031875431 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986021_0.png resize: (274, 358) 1383995031 -5.3608056306923055 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986023_0.png resize: (322, 413) 1383995032 -4.055266691264008 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986024_0.png resize: (187, 254) 1383995033 -3.9589748569453618 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986025_0.png resize: (152, 510) 1383995034 -4.941602285350214 treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986027_0.png resize: (123, 143) 1383995035 -2.2793989275404205 treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986030_0.png resize: (167, 441) 1383995036 -2.7586505860461044 treat image : 
temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986031_0.png resize: (331, 576) 1383995037 -4.809421715289745 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986038_0.png resize: (259, 302) 1383995038 -3.5441375218065154 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986044_0.png resize: (246, 195) 1383995039 -3.8826877874971464 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986045_0.png resize: (577, 302) 1383995040 -4.672786129569232 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986047_0.png resize: (255, 319) 1383995041 -3.011525076491677 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986049_0.png resize: (220, 360) 1383995042 -1.076507845469063 treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986043_0.png resize: (274, 420) 1383995044 -4.7320037788318094 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_values in save_photo_hashtag_id_thcl_score : 69 time used for this insertion : 0.015327930450439453 begin to insert list_values into photo_hahstag_ids : length of list_values in save_photo_hashtag_id_type : 69 time used for this insertion : 0.014833211898803711 save missing photos in datou_result : time spent for datou_step_exec : 20.839803457260132 time spent to save output : 0.03595685958862305 total time spent for step 6 : 20.875760316848755 step7:brightness Tue Sep 16 21:23:32 2025 VR 17-11-17 : now, only for a linear exec dependencies tree, some output goes to fill the input of the next step VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases VR 22-3-18 : but we use the
first code path for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be interpreted correctly with a default behavior Some of the work done at step execution could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! VR 22-3-18 : For now we do not clean the datou structure correctly inside step compute brightness treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88.jpg treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a.jpg treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4.jpg treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758.jpg treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e.jpg treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985987_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985988_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985989_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985990_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985991_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985993_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985995_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985996_0.png treat image :
temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985997_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985998_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985999_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959986000_0.png treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986001_0.png treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986005_0.png treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986006_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986007_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986008_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986009_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986011_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986012_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986014_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986015_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986018_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986022_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986026_0.png treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986028_0.png treat image : 
temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986029_0.png treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986032_0.png treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986033_0.png treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986034_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986035_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986036_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986039_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986040_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986041_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986042_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986046_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986050_0.png treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986002_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986037_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986048_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985992_0.png treat image : temp/1758050429_3774618_1383919936_621331c49b62a5c8aeea820d2f179f88_rle_crop_3959985994_0.png treat image : temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986003_0.png treat image : 
temp/1758050429_3774618_1383919931_2165b841083b1030553059088e67074a_rle_crop_3959986004_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986010_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986013_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986016_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986017_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986019_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986020_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986021_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986023_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986024_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986025_0.png treat image : temp/1758050429_3774618_1383919917_c3df37c23cece5b38aa9f236a07710c4_rle_crop_3959986027_0.png treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986030_0.png treat image : temp/1758050429_3774618_1383919916_32c7d7f93e96bfc3051a813e39d2c758_rle_crop_3959986031_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986038_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986044_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986045_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986047_0.png treat image : 
temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986049_0.png treat image : temp/1758050429_3774618_1383919896_d4cf9e42ccb9e6ec1b8dcdae8f93275e_rle_crop_3959986043_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_values in save_photo_hashtag_id_thcl_score : 69 time used for this insertion : 0.013479471206665039 begin to insert list_values into photo_hahstag_ids : length of list_values in save_photo_hashtag_id_type : 69 time used for this insertion : 0.014875173568725586 save missing photos in datou_result : time spent for datou_step_exec : 5.551677227020264 time spent to save output : 0.03314614295959473 total time spent for step 7 : 5.584823369979858 step8:velours_tree Tue Sep 16 21:23:37 2025 VR 17-11-17 : now, only for a linear exec dependencies tree, some output goes to fill the input of the next step VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be interpreted correctly with a default behavior Some of the work done at step execution could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean the datou structure correctly can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 output is None No output to save, returning out of save general time spent for datou_step_exec : 1.6259078979492188 time spent to save output : 6.961822509765625e-05 total time spent for step 8 : 1.6259775161743164 step9:send_mail_cod Tue
Sep 16 21:23:39 2025 VR 17-11-17 : now, only for a linear exec dependencies tree, some output goes to fill the input of the next step VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be interpreted correctly with a default behavior Some of the work done at step execution could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of inputs and outputs: a step that parallelizes and handles errors in its input by not sending an output for that data cannot be used in the input/output dependency tree complete output_args for input 2 Inconsistent number of inputs and outputs: a step that parallelizes and handles errors in its input by not sending an output for that data cannot be used in the input/output dependency tree complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
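The recurring "should be an optional input" message, together with the checkConsistencyNbInputNbOutput warnings at the top of the log ("Step 8092 crop_condition have less inputs used (1) than in the step definition (2)"), suggests a per-step arity check that downgrades a missing input to a warning. A hypothetical sketch of that logic:

```python
def check_arity(step_name, declared_in, used_in, declared_out, used_out):
    """Compare used vs declared input/output counts for one datou step.

    Fewer inputs than declared may just be optional inputs (warning);
    any output-count mismatch is flagged as inconsistent, mirroring the
    messages in the log.
    """
    messages = []
    if used_in < declared_in:
        messages.append(
            f"Step {step_name} has fewer inputs used ({used_in}) than in the "
            f"step definition ({declared_in}): maybe optional inputs"
        )
    elif used_in > declared_in:
        messages.append(
            f"WARNING: number of inputs for step {step_name} is not consistent: "
            f"{used_in} used against {declared_in} in the step definition"
        )
    if used_out != declared_out:
        messages.append(
            f"WARNING: number of outputs for step {step_name} is not consistent: "
            f"{used_out} used against {declared_out} in the step definition"
        )
    return messages

# The crop_condition case from the log: 2 inputs declared / 1 used,
# 3 outputs declared / 4 used.
msgs = check_arity("crop_condition", 2, 1, 3, 4)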
VR 22-3-18 : For now we do not clean the datou structure correctly inside the step send_mail_cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please enter the license of selector results_Auto_P26911018_16-09-2025_21_23_39.pdf 26911873 imagette269118731758050619 26911874 change filename to text . [the same message is logged once per file] imagette269118741758050619 26911875 change filename to text . [repeated] imagette269118751758050621 26911876 imagette269118761758050621 26911877 change filename to text . [repeated] imagette269118771758050621 26911878 imagette269118781758050623 26911879 imagette269118791758050623 26911880 imagette269118801758050623 26911881 imagette269118811758050623 26911882 change filename to text . imagette269118821758050623 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=26911018 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link :
https://marlene.fotonower.com/velours/26911873,26911874,26911875,26911876,26911877,26911878,26911879,26911880,26911881,26911882,26911883?tags=pehd,pet_clair,carton,pet_fonce,papier,metal,flou,mal_croppe,background,autre,environnement args[1383919936] : ((1383919936, -6.814282432432434, 492609224), (1383919936, 1.1201577306228208, 2107752395), '0.10970645254629634') We are sending mail with results to report@fotonower.com args[1383919931] : ((1383919931, -6.885385419320041, 492609224), (1383919931, 1.0143772479618363, 2107752395), '0.10970645254629634') We are sending mail with results to report@fotonower.com args[1383919917] : ((1383919917, -6.7737824236236595, 492609224), (1383919917, 0.9843245182657578, 2107752395), '0.10970645254629634') We are sending mail with results to report@fotonower.com args[1383919916] : ((1383919916, -6.791287607877103, 492609224), (1383919916, 1.1314847122900955, 2107752395), '0.10970645254629634') We are sending mail with results to report@fotonower.com args[1383919896] : ((1383919896, -6.535633999609527, 492609224), (1383919896, 1.0978074340442463, 2107752395), '0.10970645254629634') We are sending mail with results to report@fotonower.com refus_total : 0.10970645254629634 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=26911018 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26911018_16-09-2025_21_23_39.pdf results_Auto_P26911018_16-09-2025_21_23_39.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26911018_16-09-2025_21_23_39.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values
('3318','26911018','results_Auto_P26911018_16-09-2025_21_23_39.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26911018_16-09-2025_21_23_39.pdf','pdf','','0.6','0.10970645254629634') message_in_mail: Hello,
Please find below the results of the carac on demand service for the portfolio: https://www.fotonower.com/view/26911018

https://www.fotonower.com/image?json=false&list_photos_id=1383919936
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1383919931
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1383919917
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1383919916
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1383919896
Well done, the photo is well taken.

Under these conditions, the rejection rate is: 10.97%
Please find the photos of the contaminants.

examples of contaminants: pet_clair: https://www.fotonower.com/view/26911874?limit=200
examples of contaminants: carton: https://www.fotonower.com/view/26911875?limit=200
examples of contaminants: papier: https://www.fotonower.com/view/26911877?limit=200
examples of contaminants: autre: https://www.fotonower.com/view/26911882?limit=200
Please find the pdf report: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26911018_16-09-2025_21_23_39.pdf.

Link to velours: https://marlene.fotonower.com/velours/26911873,26911874,26911875,26911876,26911877,26911878,26911879,26911880,26911881,26911882,26911883?tags=pehd,pet_clair,carton,pet_fonce,papier,metal,flou,mal_croppe,background,autre,environnement.


The Fotonower team 202 b'' Server: nginx Date: Tue, 16 Sep 2025 19:23:45 GMT Content-Length: 0 Connection: close X-Message-Id: bTVBqQ-OSiqB_g0Rj_oFuA Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod, we use saveGeneral [1383919936, 1383919931, 1383919917, 1383919916, 1383919896] Looping around the photos to save general results len of output : 0 before output type Used above Managing all outputs in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919936', None, None, None, None, None, '3730515') ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919931', None, None, None, None, None, '3730515') ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919917', None, None, None, None, None, '3730515') ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919916', None, None, None, None, None, '3730515') ('3318', None, None, None, None, None, None, None, '3730515') ('3318', '26911018', '1383919896', None, None, None, None, None, '3730515') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 5 time used for this insertion : 0.012262821197509766 save_final save missing photos in datou_result : time spent for datou_step_exec : 6.396907091140747 time spent to save output :
0.01247715950012207 total time spent for step 9 : 6.409384250640869 step10:split_time_score Tue Sep 16 21:23:45 2025 VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step VR 22-3-18 : we now test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean and works in both cases VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case where we are at the first step, instead of building this step before datou_exec Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Caught exception ! Connect or reconnect !
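The save_final passage above reports inserting all 5 `list_values` rows into mtr_datou_result in a single call taking ~0.012 s. A minimal sketch of such a batched insert follows, using `executemany`; `sqlite3` stands in for the MySQLdb connection the pipeline actually uses (MySQLdb takes `%s` placeholders instead of `?`), and the table columns are reduced for illustration.

```python
import sqlite3

# sqlite3 stands in for MySQLdb here; the real pipeline talks to MySQL.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE mtr_datou_result ("
    "  datou_id TEXT, mtr_portfolio_id TEXT, photo_id TEXT, datou_exec_id TEXT)"
)

# One tuple per photo, mirroring the (datou_id, portfolio, photo, ..., exec_id)
# rows printed in the log (most NULL columns omitted for the sketch).
list_values = [
    ("3318", "26911018", "1383919936", "3730515"),
    ("3318", "26911018", "1383919931", "3730515"),
    ("3318", "26911018", "1383919917", "3730515"),
]
# executemany sends the whole batch in one call instead of one INSERT per
# row, which is why the logged insertion time stays small.
conn.executemany("INSERT INTO mtr_datou_result VALUES (?, ?, ?, ?)", list_values)
conn.commit()
count = conn.execute("SELECT COUNT(*) FROM mtr_datou_result").fetchone()[0]
```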
TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('14', 5),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 16092025 26911018 Number of photos uploaded : 5 / 23040 (0%) 16092025 26911018 Number of photos tagged (waste types): 0 / 5 (0%) 16092025 26911018 Number of photos tagged (volume) : 0 / 5 (0%) elapsed_time : load_data_split_time_score 1.6689300537109375e-06 elapsed_time : order_list_meta_photo_and_scores 5.4836273193359375e-06 ????? elapsed_time : fill_and_build_computed_from_old_data 0.0003371238708496094 Caught exception ! Connect or reconnect ! Caught exception ! Connect or reconnect !
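The repeated "Connect or reconnect !" lines suggest a catch-reconnect-retry pattern around database calls. Below is a hedged sketch of that pattern; `with_reconnect`, `run_query`, and `reconnect` are illustrative names, not the actual Velours implementation.

```python
import time

def with_reconnect(run_query, reconnect, retries=2, delay=0.0):
    """Run a DB operation, reconnecting and retrying on failure.

    run_query and reconnect are caller-supplied callables. This mirrors
    the "Caught exception ! Connect or reconnect !" pattern in the log;
    the real code would catch MySQLdb.OperationalError specifically.
    """
    last_err = None
    for _attempt in range(retries + 1):
        try:
            return run_query()
        except Exception as err:      # narrow this in real code
            last_err = err
            reconnect()               # re-open the connection before retrying
            time.sleep(delay)         # optional back-off between attempts
    raise last_err
```

Catching a broad `Exception` as the log seems to do is fragile; limiting the retry to driver-level connection errors avoids masking genuine query bugs.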
elapsed_time : insert_dashboard_record_day_entry 0.20164990425109863 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.023909218230979815 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26876770_16-09-2025_07_22_12.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26876770 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree built and cycles checked, now we need to re-order the steps ! We currently have an error because there is no dependence between the last steps for the case tile - detect - glue We could keep that dependency as it is, but it is better to keep an order compatible with the step ids when a step has no sons, hence a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of the number of inputs/outputs between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Step 11452 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used ! Step 11453 merge_mask_thcl_custom has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs ! Step 11455 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used ! Step 11458 send_mail_cod has fewer inputs used (3) than in the step definition (5) : maybe we manage optional inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of output/input types across step connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 11449 doesn't seem to be defined in the database ! WARNING : type of input 2 of step 11452 doesn't seem to be defined in the database ! WARNING : output 1 of step 11449 has datatype=2 whereas input 1 of step 11453 has datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be defined in the database ! WARNING : type of input 1 of step 11454 doesn't seem to be defined in the database ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be defined in the database ! WARNING : type of input 1 of step 11456 doesn't seem to be defined in the database ! WARNING : type of output 1 of step 11456 doesn't seem to be defined in the database ! WARNING : type of input 3 of step 11455 doesn't seem to be defined in the database ! We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 has datatype=10 whereas input 3 of step 11458 has datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be defined in the database ! WARNING : output 0 of step 11477 has datatype=11 whereas input 5 of step 11458 has datatype=None WARNING : output 0 of step 11456 has datatype=10 whereas input 0 of step 11477 has datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be defined in the database ! WARNING : output 1 of step 11454 has datatype=7 whereas input 2 of step 11478 has datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be defined in the database ! WARNING : type of input 2 of step 11456 doesn't seem to be defined in the database ! WARNING : output 0 of step 11453 has datatype=1 whereas input 0 of step 11454 has datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ?
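The checkConsistencyNbInputNbOutput warnings above compare, for each step, the number of inputs/outputs actually wired in the graph against the step definition: strictly more than defined is a WARNING, fewer inputs suggests optional inputs, fewer outputs suggests unused outputs. A minimal sketch of that rule; `check_nb_io` and its parameters are hypothetical names, not the real function.

```python
def check_nb_io(step_id, name, used_in, used_out, def_in, def_out):
    """Return warning strings in the style of checkConsistencyNbInputNbOutput.

    used_in/used_out: counts actually wired in the datou graph
    def_in/def_out:   counts declared in the step definition
    (Sketch only; the real check lives in the Velours datou code.)
    """
    msgs = []
    if used_in < def_in:
        msgs.append(
            f"Step {step_id} {name} has fewer inputs used ({used_in}) than in "
            f"the step definition ({def_in}) : maybe we manage optional inputs"
        )
    elif used_in > def_in:
        msgs.append(
            f"WARNING : number of inputs for step {step_id} {name} is not "
            f"consistent : {used_in} used against {def_in} in the step definition"
        )
    if used_out > def_out:
        msgs.append(
            f"WARNING : number of outputs for step {step_id} {name} is not "
            f"consistent : {used_out} used against {def_out} in the step definition"
        )
    elif used_out < def_out:
        msgs.append(
            f"Step {step_id} {name} has fewer outputs used ({used_out}) than in "
            f"the step definition ({def_out}) : some outputs may not be used"
        )
    return msgs
```

For example, step 11449 mask_detect with 3 outputs used against 2 defined yields exactly one "not consistent" warning, matching the log.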
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26876770 AND mptpi.`type`=3726 To do Qualite : 0.03517855010745843 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26884485_16-09-2025_10_17_10.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26884485 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26884485 AND mptpi.`type`=3726 To do Qualite : 0.026004526996149944 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26884487_16-09-2025_10_08_10.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26884487 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26884487 AND mptpi.`type`=3726 To do Qualite : 0.05801695521476336 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26885907_16-09-2025_10_43_09.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26885907 order by id desc limit 1
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database ! WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database ! WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database ! WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database ! WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database ! WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database ! WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database ! WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database ! WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6 WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database ! WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database ! DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26885907 AND mptpi.`type`=3594 To do Qualite : 0.03359804880153723 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26889598_16-09-2025_12_17_17.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26889598 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! 
Step 11452 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used !
Step 11453 merge_mask_thcl_custom has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition !
WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 11478 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 11455 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 11455 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 11458 send_mail_cod has fewer inputs used (3) than in the step definition (5) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
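The count-consistency messages above follow one pattern: fewer inputs used than defined is tolerated as possibly-optional inputs, while any other mismatch is a WARNING. A small sketch of a check shaped like checkConsistencyNbInputNbOutput (hypothetical; the real function is in the pipeline, not shown here):

```python
def check_nb_io(step_id, name, used_in, used_out, def_in, def_out):
    """Compare used vs. defined input/output counts for one step and return
    the messages such a check would log. Hypothetical sketch only."""
    msgs = []
    if used_in < def_in:
        # fewer inputs than defined: tolerated, may be optional inputs
        msgs.append(f"Step {step_id} {name} has fewer inputs used ({used_in}) "
                    f"than in the step definition ({def_in}) : maybe we manage optional inputs !")
    elif used_in > def_in:
        msgs.append(f"WARNING : number of inputs for step {step_id} {name} is not consistent : "
                    f"{used_in} used against {def_in} in the step definition !")
    if used_out < def_out:
        # fewer outputs than defined: some outputs may simply be unused
        msgs.append(f"Step {step_id} {name} has fewer outputs used ({used_out}) "
                    f"than in the step definition ({def_out}) : some outputs may not be used !")
    elif used_out > def_out:
        msgs.append(f"WARNING : number of outputs for step {step_id} {name} is not consistent : "
                    f"{used_out} used against {def_out} in the step definition !")
    return msgs
```

For example, the mask_detect case above (3 outputs used against 2 defined) yields exactly one WARNING about the output count.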
WARNING : type of output 2 of step 11449 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11452 doesn't seem to be defined in the database
WARNING : output 1 of step 11449 has datatype=2 whereas input 1 of step 11453 has datatype=7
WARNING : type of output 2 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11454 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 3 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of output 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of input 3 of step 11455 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 11456 has datatype=10 whereas input 3 of step 11458 has datatype=6
WARNING : type of input 5 of step 11458 doesn't seem to be defined in the database
WARNING : output 0 of step 11477 has datatype=11 whereas input 5 of step 11458 has datatype=None
WARNING : output 0 of step 11456 has datatype=10 whereas input 0 of step 11477 has datatype=18
WARNING : type of input 2 of step 11478 doesn't seem to be defined in the database
WARNING : output 1 of step 11454 has datatype=7 whereas input 2 of step 11478 has datatype=None
WARNING : type of output 3 of step 11478 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11456 doesn't seem to be defined in the database
WARNING : output 0 of step 11453 has datatype=1 whereas input 0 of step 11454 has datatype=2
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
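The datatype warnings above come in three flavours: a connected output/input whose type is not declared in the DB, a declared-but-mismatched pair, and connections into the final step, which are skipped. A sketch of a check shaped like checkConsistencyTypeOutputInput (hypothetical names and data layout; the real implementation is not shown in this log):

```python
def check_type_output_input(connections, datatypes, final_steps=frozenset()):
    """For each connection (out_step, out_idx, in_step, in_idx), compare the
    datatype declared for the producing output with the one declared for the
    consuming input. datatypes maps (step_id, 'in'|'out', index) -> datatype
    id, or is missing/None when the DB has no definition. Sketch only."""
    msgs = []
    for out_step, out_idx, in_step, in_idx in connections:
        if in_step in final_steps:
            # connections into the terminal step are not type-checked
            msgs.append("We ignore checkConsistencyTypeOutputInput for datou_step final !")
            continue
        t_out = datatypes.get((out_step, "out", out_idx))
        t_in = datatypes.get((in_step, "in", in_idx))
        if t_out is None:
            msgs.append(f"WARNING : type of output {out_idx} of step {out_step} "
                        f"doesn't seem to be defined in the database")
        if t_in is None:
            msgs.append(f"WARNING : type of input {in_idx} of step {in_step} "
                        f"doesn't seem to be defined in the database")
        if t_out is not None and t_out != t_in:
            msgs.append(f"WARNING : output {out_idx} of step {out_step} has datatype={t_out} "
                        f"whereas input {in_idx} of step {in_step} has datatype={t_in}")
    return msgs
```

Note how an undefined input type produces both warnings at once (undefined, then a mismatch against datatype=None), which is exactly the paired pattern visible in the log.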
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26889598 AND mptpi.`type`=3726
To do
Qualite : 0.044813846757225065
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26889611_16-09-2025_12_08_10.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26889611 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case
We could keep the dependence as-is, but it is better to keep an order compatible with the step ids when a step has no sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 11452 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
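The mtr_port_to_port_ids lookup logged above interpolates the portfolio id and type directly into the SQL text. Since the script imports MySQLdb (per the startup log), the same join could be run with driver-side placeholders instead; a sketch under that assumption (the helper name is illustrative, and the query is rewritten with an explicit JOIN but selects the same columns):

```python
# Parameterized form of the mtr_port_to_port_ids / hashtags join logged above.
# MySQLdb uses the DB-API "format" paramstyle, i.e. %s placeholders plus a
# params tuple; the driver handles escaping.
QUERY = """
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type,
       mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at,
       mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc,
       h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi
JOIN MTRBack.hashtags h ON h.hashtag_id = mptpi.hashtag_id
WHERE mptpi.`mtr_portfolio_id_1` = %s AND mptpi.`type` = %s
"""

def fetch_port_to_port(conn, portfolio_id, type_id):
    """Run the lookup through a MySQLdb-style DB-API connection.
    Sketch only; not executed in this log."""
    cur = conn.cursor()
    cur.execute(QUERY, (portfolio_id, type_id))
    return cur.fetchall()
```

The explicit JOIN ... ON form is equivalent to the comma-join with the h.hashtag_id=mptpi.hashtag_id predicate used in the logged query.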
Step 11452 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used !
Step 11453 merge_mask_thcl_custom has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition !
WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 11478 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 11455 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 11455 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 11458 send_mail_cod has fewer inputs used (3) than in the step definition (5) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 11449 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11452 doesn't seem to be defined in the database
WARNING : output 1 of step 11449 has datatype=2 whereas input 1 of step 11453 has datatype=7
WARNING : type of output 2 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11454 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 3 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of output 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of input 3 of step 11455 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 11456 has datatype=10 whereas input 3 of step 11458 has datatype=6
WARNING : type of input 5 of step 11458 doesn't seem to be defined in the database
WARNING : output 0 of step 11477 has datatype=11 whereas input 5 of step 11458 has datatype=None
WARNING : output 0 of step 11456 has datatype=10 whereas input 0 of step 11477 has datatype=18
WARNING : type of input 2 of step 11478 doesn't seem to be defined in the database
WARNING : output 1 of step 11454 has datatype=7 whereas input 2 of step 11478 has datatype=None
WARNING : type of output 3 of step 11478 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11456 doesn't seem to be defined in the database
WARNING : output 0 of step 11453 has datatype=1 whereas input 0 of step 11454 has datatype=2
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26889611 AND mptpi.`type`=3726
To do
Qualite : 0.0808746920156426
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26890986_16-09-2025_13_22_14.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26890986 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case
We could keep the dependence as-is, but it is better to keep an order compatible with the step ids when a step has no sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26890986 AND mptpi.`type`=3594
To do
Qualite : 0.028585023675735784
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26894919_16-09-2025_15_13_01.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26894919 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case
We could keep the dependence as-is, but it is better to keep an order compatible with the step ids when a step has no sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 11452 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Step 11452 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used !
Step 11453 merge_mask_thcl_custom has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition !
WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 11478 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 11455 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 11455 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 11458 send_mail_cod has fewer inputs used (3) than in the step definition (5) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 11449 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11452 doesn't seem to be defined in the database
WARNING : output 1 of step 11449 has datatype=2 whereas input 1 of step 11453 has datatype=7
WARNING : type of output 2 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11454 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 3 of step 11453 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of output 1 of step 11456 doesn't seem to be defined in the database
WARNING : type of input 3 of step 11455 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 11456 has datatype=10 whereas input 3 of step 11458 has datatype=6
WARNING : type of input 5 of step 11458 doesn't seem to be defined in the database
WARNING : output 0 of step 11477 has datatype=11 whereas input 5 of step 11458 has datatype=None
WARNING : output 0 of step 11456 has datatype=10 whereas input 0 of step 11477 has datatype=18
WARNING : type of input 2 of step 11478 doesn't seem to be defined in the database
WARNING : output 1 of step 11454 has datatype=7 whereas input 2 of step 11478 has datatype=None
WARNING : type of output 3 of step 11478 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11456 doesn't seem to be defined in the database
WARNING : output 0 of step 11453 has datatype=1 whereas input 0 of step 11454 has datatype=2
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26894919 AND mptpi.`type`=3726
To do
Qualite : 0.10970645254629634
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26911018_16-09-2025_21_23_39.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26911018 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case
We could keep the dependence as-is, but it is better to keep an order compatible with the step ids when a step has no sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26911018 AND mptpi.`type`=3594
To do
Qualite : 0.08206244300645348
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26905388_16-09-2025_19_22_16.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26905388 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case
We could keep the dependence as-is, but it is better to keep an order compatible with the step ids when a step has no sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26905388 AND mptpi.`type`=3594
To do
Qualite : 0.008911552800446566
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26902485_16-09-2025_18_16_53.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26902485 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case
We could keep the dependence as-is, but it is better to keep an order compatible with the step ids when a step has no sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 11452 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Step 11452 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used !
Step 11453 merge_mask_thcl_custom has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition !
WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 11478 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 11455 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 11455 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 11458 send_mail_cod has fewer inputs used (3) than in the step definition (5) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 11449 doesn't seem to be defined in the database !
WARNING : type of input 2 of step 11452 doesn't seem to be defined in the database !
WARNING : output 1 of step 11449 has datatype=2 whereas input 1 of step 11453 has datatype=7
WARNING : type of output 2 of step 11453 doesn't seem to be defined in the database !
WARNING : type of input 1 of step 11454 doesn't seem to be defined in the database !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 3 of step 11453 doesn't seem to be defined in the database !
WARNING : type of input 1 of step 11456 doesn't seem to be defined in the database !
WARNING : type of output 1 of step 11456 doesn't seem to be defined in the database !
WARNING : type of input 3 of step 11455 doesn't seem to be defined in the database !
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 11456 has datatype=10 whereas input 3 of step 11458 has datatype=6
WARNING : type of input 5 of step 11458 doesn't seem to be defined in the database !
WARNING : output 0 of step 11477 has datatype=11 whereas input 5 of step 11458 has datatype=None
WARNING : output 0 of step 11456 has datatype=10 whereas input 0 of step 11477 has datatype=18
WARNING : type of input 2 of step 11478 doesn't seem to be defined in the database !
WARNING : output 1 of step 11454 has datatype=7 whereas input 2 of step 11478 has datatype=None
WARNING : type of output 3 of step 11478 doesn't seem to be defined in the database !
WARNING : type of input 2 of step 11456 doesn't seem to be defined in the database !
WARNING : output 0 of step 11453 has datatype=1 whereas input 0 of step 11454 has datatype=2
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
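The step reordering described in the run above (topological order over the datou graph, with the lexical tie-break (number_son, step_id), followed by a cycle check) can be sketched roughly as below. Function and variable names are illustrative assumptions, not the actual Velours pipeline code:

```python
from collections import defaultdict

def reorder_steps(step_ids, edges):
    """Order steps so every parent precedes its sons; break ties with
    the lexical key (number_of_sons, step_id) mentioned in the log."""
    sons = defaultdict(list)
    indegree = {s: 0 for s in step_ids}
    for parent, child in edges:
        sons[parent].append(child)
        indegree[child] += 1
    key = lambda s: (len(sons[s]), s)  # assumed tie-break rule
    ready = sorted((s for s in step_ids if indegree[s] == 0), key=key)
    ordered = []
    while ready:
        step = ready.pop(0)
        ordered.append(step)
        for child in sons[step]:
            indegree[child] -= 1
            if indegree[child] == 0:
                ready.append(child)
        ready.sort(key=key)
    # leftover steps imply a cycle, which checkNoCycle would reject
    if len(ordered) != len(step_ids):
        raise ValueError("cycle detected in datou graph")
    return ordered
```

A step with no remaining sons thus sorts purely by its step id, matching the "order compatible with the id of steps if we do not have sons" comment in the log.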
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26902485 AND mptpi.`type`=3726
To do
Quality : 0.03508466555748456
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26909866_16-09-2025_20_54_31.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26909866 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case.
We could keep that dependence, but it is better to keep an order compatible with the step ids when a step has no sons, hence the lexical order (number_son, step_id).
All sons are already in current list ! (message repeated)
DONE and to test : checkNoCycle !
Here we check that the number of inputs/outputs is consistent between the given ones and the DB !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database !
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database !
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database !
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database !
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database !
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database !
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database !
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database !
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database !
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database !
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database !
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
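The input/output count messages in this log suggest a simple rule: fewer used than defined is only a notice (optional inputs or unused outputs), while more used than defined is a real WARNING. A hypothetical reconstruction of that rule, with assumed names rather than the real checkConsistencyNbInputNbOutput code:

```python
def check_nb_io(step_id, step_name, n_used, n_defined, kind):
    """Compare the number of inputs/outputs actually wired for a step
    against the count in the step definition; `kind` is "inputs" or
    "outputs". Returns a message string, or None when consistent."""
    if n_used > n_defined:
        # using more than defined is a hard inconsistency
        return (f"WARNING : number of {kind} for step {step_id} {step_name} "
                f"is not consistent : {n_used} used against {n_defined} "
                "in the step definition !")
    if n_used < n_defined:
        # using fewer than defined is tolerated, as in the log
        hint = ("maybe we manage optional inputs" if kind == "inputs"
                else "some outputs may not be used")
        return (f"Step {step_id} {step_name} has fewer {kind} used "
                f"({n_used}) than in the step definition ({n_defined}) : "
                f"{hint} !")
    return None
```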
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26909866 AND mptpi.`type`=3594
To do
Quality : 0.043743910069078795
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26911053_16-09-2025_21_07_16.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26911053 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case.
We could keep that dependence, but it is better to keep an order compatible with the step ids when a step has no sons, hence the lexical order (number_son, step_id).
All sons are already in current list ! (message repeated)
DONE and to test : checkNoCycle !
Here we check that the number of inputs/outputs is consistent between the given ones and the DB !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 11452 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Step 11452 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used !
Step 11453 merge_mask_thcl_custom has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition !
WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 11478 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 11455 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 11455 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 11458 send_mail_cod has fewer inputs used (3) than in the step definition (5) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 11449 doesn't seem to be defined in the database !
WARNING : type of input 2 of step 11452 doesn't seem to be defined in the database !
WARNING : output 1 of step 11449 has datatype=2 whereas input 1 of step 11453 has datatype=7
WARNING : type of output 2 of step 11453 doesn't seem to be defined in the database !
WARNING : type of input 1 of step 11454 doesn't seem to be defined in the database !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 3 of step 11453 doesn't seem to be defined in the database !
WARNING : type of input 1 of step 11456 doesn't seem to be defined in the database !
WARNING : type of output 1 of step 11456 doesn't seem to be defined in the database !
WARNING : type of input 3 of step 11455 doesn't seem to be defined in the database !
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 11456 has datatype=10 whereas input 3 of step 11458 has datatype=6
WARNING : type of input 5 of step 11458 doesn't seem to be defined in the database !
WARNING : output 0 of step 11477 has datatype=11 whereas input 5 of step 11458 has datatype=None
WARNING : output 0 of step 11456 has datatype=10 whereas input 0 of step 11477 has datatype=18
WARNING : type of input 2 of step 11478 doesn't seem to be defined in the database !
WARNING : output 1 of step 11454 has datatype=7 whereas input 2 of step 11478 has datatype=None
WARNING : type of output 3 of step 11478 doesn't seem to be defined in the database !
WARNING : type of input 2 of step 11456 doesn't seem to be defined in the database !
WARNING : output 0 of step 11453 has datatype=1 whereas input 0 of step 11454 has datatype=2
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
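The type warnings in this log imply a per-connection check: a None datatype means the type is not defined in the database, and a producing output's datatype should match the consuming input's. A hedged sketch of what checkConsistencyTypeOutputInput might do (assumed names and signature, not the actual implementation):

```python
def check_type_connection(out_step, out_idx, out_type,
                          in_step, in_idx, in_type):
    """Check one output -> input connection between two steps.
    Datatypes are ints from the database, or None when undefined.
    Returns the list of warning messages for this connection."""
    warnings = []
    if out_type is None:
        warnings.append(f"type of output {out_idx} of step {out_step} "
                        "doesn't seem to be defined in the database")
    if in_type is None:
        warnings.append(f"type of input {in_idx} of step {in_step} "
                        "doesn't seem to be defined in the database")
    # mismatches are reported even against a None input type,
    # as seen in the log (e.g. datatype=11 whereas ... datatype=None)
    if out_type is not None and out_type != in_type:
        warnings.append(f"output {out_idx} of step {out_step} has "
                        f"datatype={out_type} whereas input {in_idx} of "
                        f"step {in_step} has datatype={in_type}")
    return warnings
```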
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26911053 AND mptpi.`type`=3726
To do
NUMBER BATCH : 0
# DISPLAY ALL COLLECTED DATA : {'16092025': {'nb_upload': 5, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}}
Inside saveOutput : final : True verbose : 0
saveOutput not yet implemented for datou_step.type : split_time_score, we use saveGeneral
[1383919936, 1383919931, 1383919917, 1383919916, 1383919896]
Looping over the photos to save general results
len of output : 1
/26911018
Didn't retrieve data .
before output type
Here is an output not treated by saveGeneral :
Managing all outputs in save final without adding information in mtr_datou_result
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919936', None, None, None, None, None, '3730515')
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919931', None, None, None, None, None, '3730515')
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919917', None, None, None, None, None, '3730515')
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919916', None, None, None, None, None, '3730515')
('3318', None, None, None, None, None, None, None, '3730515')
('3318', '26911018', '1383919896', None, None, None, None, None, '3730515')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 6
time used for this insertion : 0.013257026672363281
save_final : save missing photos in datou_result :
time spent for datou_step_exec : 1.2198240756988525
time spent to save output : 0.01346731185913086
total time spent for step 10 :
1.2332913875579834
caffe_path_current :
About to save ! 2
After save, about to update current ! ret : 2
len(input) + len(total_photo_id_missing) : 5
set_done_treatment
85.67user 85.98system 3:22.64elapsed 84%CPU (0avgtext+0avgdata 2809696maxresident)k
1006104inputs+93368outputs (22505major+8009437minor)pagefaults 0swaps
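The save phase above accumulates one tuple per photo into list_values before a single batched insert into mtr_datou_result (which is why the insertion itself takes only ~13 ms). A minimal sketch of how such rows could be assembled, with the column layout inferred from the tuples printed in the log and a hypothetical helper name:

```python
def build_result_rows(datou_id, portfolio_id, photo_ids, run_id):
    """Assemble one mtr_datou_result row per photo; the middle columns
    are left as None, matching the tuples shown in the log. The result
    would typically be passed to cursor.executemany for one batched
    INSERT instead of one round-trip per photo."""
    rows = []
    for photo_id in photo_ids:
        rows.append((str(datou_id), str(portfolio_id), str(photo_id),
                     None, None, None, None, None, str(run_id)))
    return rows
```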