python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120
import MySQLdb succeeded
Import error (python version)
['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 3754840
load datou : 3318
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case.
We could keep that dependence as-is, but it is better to keep an order compatible with the step ids when steps have no sons, i.e. a lexical order on (number_son, step_id).
All sons are already in current list ! (repeated 9 times)
DONE, and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
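The re-ordering described above (cycle check, then steps emitted in lexical order on (number_son, step_id)) can be sketched as follows. This is an illustrative reconstruction, not the actual datou code: `reorder_steps` and its argument shapes are assumptions.

```python
# Sketch of the step re-ordering: a topological sort over the step dependency
# graph that breaks ties lexically on (number_of_sons, step_id), and raises
# on a cycle (the checkNoCycle behaviour mentioned in the log).
import heapq

def reorder_steps(steps, edges):
    """steps: iterable of step ids; edges: dict parent_id -> list of son ids.
    Returns a topological order; raises ValueError if the graph has a cycle."""
    sons = {s: list(edges.get(s, [])) for s in steps}
    indeg = {s: 0 for s in steps}
    for parent, kids in sons.items():
        for k in kids:
            indeg[k] += 1
    # ready steps, ordered lexically on (number_of_sons, step_id)
    heap = [(len(sons[s]), s) for s in steps if indeg[s] == 0]
    heapq.heapify(heap)
    order = []
    while heap:
        _, s = heapq.heappop(heap)
        order.append(s)
        for k in sons[s]:
            indeg[k] -= 1
            if indeg[k] == 0:
                heapq.heappush(heap, (len(sons[k]), k))
    if len(order) != len(indeg):
        raise ValueError("cycle detected in datou step graph")
    return order
```

The tie-break on (number_of_sons, step_id) gives the deterministic ordering the note asks for when two independent steps (e.g. the tile / detect / glue tails) have no dependence between them.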
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
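The count check above (the log's checkConsistencyNbInputNbOutput) distinguishes two severities: used > defined is a hard WARNING, while used < defined may just mean optional or unused slots. A minimal sketch of that logic, with a hypothetical function name and flat arguments instead of the real step objects:

```python
# Sketch of the input/output count consistency check: compare the counts
# actually wired in the datou graph against the step definition in the DB.
def check_nb_input_output(step_id, name, used_in, def_in, used_out, def_out):
    """Return the list of log messages for one step (empty if consistent)."""
    msgs = []
    if used_in < def_in:
        msgs.append(f"Step {step_id} {name} has fewer inputs used ({used_in}) "
                    f"than in the step definition ({def_in}) : maybe we manage optional inputs !")
    elif used_in > def_in:
        msgs.append(f"WARNING : number of inputs for step {step_id} {name} is not consistent : "
                    f"{used_in} used against {def_in} in the step definition !")
    if used_out > def_out:
        msgs.append(f"WARNING : number of outputs for step {step_id} {name} is not consistent : "
                    f"{used_out} used against {def_out} in the step definition !")
    elif used_out < def_out:
        msgs.append(f"Step {step_id} {name} has fewer outputs used ({used_out}) "
                    f"than in the step definition ({def_out}) : some outputs may not be used !")
    return msgs
```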
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse :
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
[(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed, should we ?
photo path was removed, should we ?
[ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed, should we ?
photo path was removed, should we ?
photo id (may be local or global) was removed, should we ?
photo path was removed, should we ?
(x0, y0, x1, y1) was removed, should we ?
photo path was removed, should we ?
data as text was removed, should we ?
[ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed, should we ?
None was removed, should we ?
data as text was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo id (may be local or global) was removed, should we ?
data as text was removed, should we ?
data as text was removed, should we ?
data as text was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
None was removed, should we ?
data as a number was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
data as text was removed, should we ?
None was removed, should we ?
data as text was removed, should we ?
[ptf_id0,ptf_id1...] was removed, should we ?
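The "can't parse json string Expecting value: line 1 column 1 (char 0)" error above is what `json.loads` raises when fed an empty string. A defensive parse that treats an empty `list_input_json` as an empty list would avoid logging it as an error; this is a sketch with a hypothetical function name, not the pipeline's actual parser:

```python
# Defensive parse for list_input_json: empty / blank input becomes [],
# malformed input is logged and also falls back to [].
import json

def parse_list_input_json(raw):
    if not raw or not raw.strip():
        return []  # empty current: nothing to parse
    try:
        return json.loads(raw)
    except json.JSONDecodeError as exc:
        print(f"ERROR or WARNING : can't parse json string {exc} Tried to parse : {raw!r}")
        return []
```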
FOUND : 1
Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39))
load thcls
load THCL from format json or kwargs
add thcl : 2847 in CacheModelConfig
load pdts
add pdt : 5275 in CacheModelConfig
Running datou job : batch_current
TODO datou_current to load; maybe to take outside batchDatouExec
updating current state to 1
list_input_json: []
Current got : datou_id : 3318, datou_cur_ids : ['3674576'] with mtr_portfolio_ids : ['26608746'] and first list_photo_ids : []
new path : /proc/3754840/
Inside batchDatouExec : verbose : 0
(the graph re-ordering and input/output consistency checks above are re-run here inside batchDatouExec, with identical output)
List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score
over limit max, limiting to limit_max 40
list_input_json : []
origin We have 1 , BFBFBFBFBFBFBFBFBF
we have 0 photos missing in the step downloads : photos missing : []
try to delete the photos missing in DB
length of list_filenames : 9 ; length of list_pids : 9 ; length of list_args : 9
time to download the photos : 1.1120648384094238
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 10
step 1 : mask_detect Sat Sep 6 14:10:28 2025
VR 17-11-17 : now, only for linear-exec dependency trees, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step, id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case where we are at the first step, instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 10599
max_wait_temp : 1 max_wait : 0 gpu_flag : 0
2025-09-06 14:10:31.466347: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2025-09-06 14:10:31.474281: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz
2025-09-06 14:10:31.475803: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f980c000b60 initialized for platform Host (this does not guarantee that XLA will be used).
2025-09-06 14:10:31.475849: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2025-09-06 14:10:31.478528: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2025-09-06 14:10:31.612513: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x3f087f20 initialized for platform CUDA (this does not guarantee that XLA will be used).
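The "check gpu memory" step above ("free memory gpu now : 10599") polls free GPU memory before loading the model. One common way to obtain that number is nvidia-smi's CSV query output; the sketch below takes an injectable string so it can be tested without a GPU. Function names and the threshold are illustrative, not the pipeline's actual code:

```python
# Query free GPU memory (MiB) via nvidia-smi and gate model loading on it.
import subprocess

def free_gpu_memory_mib(smi_output=None):
    """Free memory of GPU 0 in MiB, from nvidia-smi or a pre-captured string."""
    if smi_output is None:
        smi_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.free",
             "--format=csv,noheader,nounits"], text=True)
    return int(smi_output.splitlines()[0].strip())

def gpu_ready(needed_mib=4096, smi_output=None):
    """True when at least needed_mib MiB are free on GPU 0."""
    return free_gpu_memory_mib(smi_output) >= needed_mib
```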
Devices:
2025-09-06 14:10:31.612556: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5
2025-09-06 14:10:31.613991: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-09-06 14:10:31.614395 - 14:10:31.629999: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic libraries libcudart.so.10.1, libcublas.so.10, libcufft.so.10, libcurand.so.10, libcusolver.so.10, libcusparse.so.10, libcudnn.so.7
2025-09-06 14:10:31.631637: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-09-06 14:10:31.631707: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-09-06 14:10:31.632452: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-09-06 14:10:31.632469: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-09-06 14:10:31.632477: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-09-06 14:10:31.637392: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9674 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead.
2025-09-06 14:10:31.920898: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-09-06 14:10:31.920977 - 14:10:31.921068: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic libraries libcudart.so.10.1, libcublas.so.10, libcufft.so.10, libcurand.so.10, libcusolver.so.10, libcusparse.so.10, libcudnn.so.7
2025-09-06 14:10:31.922296: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-09-06 14:10:31.923372: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-09-06 14:10:31.923403 - 14:10:31.923500: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic libraries libcudart.so.10.1, libcublas.so.10, libcufft.so.10, libcurand.so.10, libcusolver.so.10, libcusparse.so.10, libcudnn.so.7
2025-09-06 14:10:31.924682: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-09-06 14:10:31.924707: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-09-06 14:10:31.924715: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-09-06 14:10:31.924722: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-09-06 14:10:31.925932: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9674 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0,
compute capability: 7.5)
Using TensorFlow backend.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead.
Inside mask_sub_process
Inside mask_detect
About to load cache.load_thcl_param
To do loadFromThcl(), then load ParamDescType : thcl2847
thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}]
thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}
Update svm_hashtag_type_desc : 5275
FOUND : 1
Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1,
datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39))
{'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0}
list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement']
Configurations:
BACKBONE                  resnet101
BACKBONE_SHAPES           [[160 160] [80 80] [40 40] [20 20] [10 10]]
BACKBONE_STRIDES          [4, 8, 16, 32, 64]
BATCH_SIZE                1
BBOX_STD_DEV              [0.1 0.1 0.2 0.2]
DETECTION_MAX_INSTANCES   100
DETECTION_MIN_CONFIDENCE  0.3
DETECTION_NMS_THRESHOLD   0.3
GPU_COUNT                 1
IMAGES_PER_GPU            1
IMAGE_MAX_DIM             640
IMAGE_MIN_DIM             640
IMAGE_PADDING             True
IMAGE_SHAPE               [640 640 3]
LEARNING_MOMENTUM         0.9
LEARNING_RATE             0.001
LOSS_WEIGHTS              {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0}
MASK_POOL_SIZE            14
MASK_SHAPE                [28, 28]
MAX_GT_INSTANCES          100
MEAN_PIXEL                [123.7 116.8 103.9]
MINI_MASK_SHAPE           (56, 56)
NAME                      learn_RUBBIA_REFUS_AMIENS_23
NUM_CLASSES               9
POOL_SIZE                 7
POST_NMS_ROIS_INFERENCE   1000
POST_NMS_ROIS_TRAINING    2000
ROI_POSITIVE_RATIO        0.33
RPN_ANCHOR_RATIOS         [0.5, 1, 2]
RPN_ANCHOR_SCALES         (16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE         1
RPN_BBOX_STD_DEV          [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD         0.7
RPN_TRAIN_ANCHORS_PER_IMAGE 256
STEPS_PER_EPOCH           1000
TRAIN_ROIS_PER_IMAGE      200
USE_MINI_MASK             True
USE_RPN_ROIS              True
VALIDATION_STEPS          50
WEIGHT_DECAY              0.0001
model_param file didn't exist
model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files existing in s3 : ['mask_model.h5']
files missing in s3 : []
2025-09-06 14:10:40.686974: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-09-06 14:10:40.883171: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-09-06 14:10:42.290286: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:42.290961: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.60G (3865470464 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:42.291569: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.24G (3478923264 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:42.292152: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.92G (3131030784 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:42.292795: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.62G (2817927680 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.146908: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.147003: W tensorflow/core/common_runtime/bfc_allocator.cc:311] Garbage collection: deallocate free memory regions (i.e., allocations) so that we can re-allocate a larger region to avoid OOM due to memory fragmentation.
If you see this message frequently, you are running near the threshold of the available device memory and re-allocation may incur great performance overhead. You may try smaller batch sizes to observe the performance impact. Set TF_ENABLE_GPU_GARBAGE_COLLECTION=false if you'd like to disable this feature.
2025-09-06 14:10:43.207405: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.207488: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.67GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.208948: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.208988: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.67GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.218478: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.218520: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 3.29GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.219130: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.219169: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 3.29GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.226616: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.226642: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.78GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.227235: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.227252: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.78GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.258897: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.258975: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 19.91MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.258994: W tensorflow/core/kernels/gpu_utils.cc:49] Failed to allocate memory for convolution redzone checking; skipping this check. This is benign and only means that we won't check cudnn for out-of-bounds reads and writes. This message will only be printed once.
2025-09-06 14:10:43.260101: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.260134: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 16.00MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.261342: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.261388: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 16.00MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.269349: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-09-06 14:10:43.269412: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 63.85MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-09-06 14:10:43.270223: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
(previous message repeated 21 more times between 14:10:43.271047 and 14:10:43.342761)
local folder :
/data/models_weight/learn_RUBBIA_REFUS_AMIENS_23
/data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5
size_local : 256009536 size in s3 : 256009536
create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04
mask_model.h5 already exists and didn't need updating
list_images length : 9
NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 number of objects found : 4
NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 21.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 number of objects found : 4
NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 number of objects found : 1
NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 32.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 number of objects found : 6
NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 35.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 number of objects found : 12
NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 number of objects found : 9
NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 number of objects found : 4
NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 36.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 number of objects found : 8
NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 number of objects found : 6
Detection mask done !
Trying to reset tf kernel 3755401
begin to check gpu status inside check gpu memory l 3610
free memory gpu now : 4828
tf kernel not reset
sub process len(results) : 9 len(list_Values) 0 None
max_time_sub_proc : 3600
parent process len(results) : 9 len(list_Values) 0
process is alive
finished correctly or not : True
after detect begin to check gpu status inside check gpu memory l 3610
free memory gpu now : 6021
list_Values should be empty []
To do loadFromThcl(), then load ParamDescType : thcl2847
Caught exception ! Connect or reconnect !
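The detection above runs in a child process that the parent joins with a timeout (max_time_sub_proc : 3600) and then checks for liveness ("process is alive", "finished correctly or not : True"). A minimal sketch of that pattern, assuming results come back through a queue; the worker body is a stand-in for the real Mask R-CNN call:

```python
import multiprocessing as mp

def detect_worker(image_ids, queue):
    # Stand-in for the real detection call: one result per image.
    queue.put([{"image_id": i, "n_objects": 0} for i in image_ids])

def run_detection_with_timeout(image_ids, max_time_sub_proc=3600):
    try:
        ctx = mp.get_context("fork")  # fork keeps locally defined workers usable
    except ValueError:
        ctx = mp.get_context()
    queue = ctx.Queue()
    proc = ctx.Process(target=detect_worker, args=(image_ids, queue))
    proc.start()
    proc.join(timeout=max_time_sub_proc)      # wait at most max_time_sub_proc seconds
    finished_correctly = not proc.is_alive()  # still alive means it timed out
    if not finished_correctly:
        proc.terminate()                      # kill the stuck child
        proc.join()
        return None, False
    return queue.get(timeout=10), True
```

Terminating rather than re-joining a timed-out child matches the log's intent of reclaiming a wedged GPU worker; the real code additionally re-checks free GPU memory afterwards.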
thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}]
thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}
Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement']
time to compute the mask position with numpy : 0.0002982616424560547 nb_pixel_total : 9786 time to create 1 rle with old method : 0.011203289031982422 length of segment : 172
time to compute the mask position with numpy : 8.606910705566406e-05 nb_pixel_total : 3937 time to create 1 rle with old method : 0.004486799240112305 length of segment : 81
time to compute the mask position with numpy : 0.00011587142944335938 nb_pixel_total : 6252 time to create 1 rle with old method : 0.007188558578491211 length of segment : 73
time to compute the mask position with numpy : 0.00042939186096191406 nb_pixel_total : 24104 time to create 1 rle with old method : 0.027129411697387695 length of segment : 226
time to compute the mask position with numpy : 5.888938903808594e-05 nb_pixel_total : 1976 time to create 1 rle with old method : 0.0023720264434814453 length of segment : 49
time to compute the mask position with numpy : 0.00025653839111328125 nb_pixel_total : 8298 time to create 1 rle with old method : 0.009318351745605469 length of segment : 178
time to compute the mask position with numpy : 0.0020608901977539062 nb_pixel_total : 127479 time to create 1 rle with old method : 0.14165759086608887 length of segment : 342
time to compute the mask position with numpy : 0.0011775493621826172 nb_pixel_total : 80094 time to create 1 rle with old method : 0.08766531944274902 length of segment : 392
time to compute the mask position with numpy : 0.0001266002655029297 nb_pixel_total : 5889 time to create 1 rle with old method : 0.0067555904388427734 length of segment : 88
time to compute the mask position with numpy : 0.00011849403381347656 nb_pixel_total : 3762 time to create 1 rle with old method : 0.0047075748443603516 length of segment : 90
time to compute the mask position with numpy : 0.000530242919921875 nb_pixel_total : 41070 time to create 1 rle with old method : 0.04621267318725586 length of segment : 205
time to compute the mask position with numpy : 0.00011730194091796875 nb_pixel_total : 6634 time to create 1 rle with old method : 0.007637500762939453 length of segment : 110
time to compute the mask position with numpy : 0.00014281272888183594 nb_pixel_total : 7832 time to create 1 rle with old method : 0.008946895599365234 length of segment : 150
time to compute the mask position with numpy : 0.0001049041748046875 nb_pixel_total : 4753 time to create 1 rle with old method : 0.005591869354248047 length of segment : 78
time to compute the mask position with numpy : 0.00026988983154296875 nb_pixel_total : 15916 time to create 1 rle with old method : 0.01862049102783203 length of segment : 160
time to compute the mask position with numpy : 0.00017404556274414062 nb_pixel_total : 8673 time to create 1 rle with old method : 0.010166645050048828 length of segment : 145
time to compute the mask position with numpy : 0.00017118453979492188 nb_pixel_total : 12343 time to create 1 rle with old method : 0.01392221450805664 length of segment : 123
time to compute the mask position with numpy : 0.0002772808074951172 nb_pixel_total : 20870 time to create 1 rle with old method : 0.02443552017211914 length of segment : 165
time to compute the mask position with numpy : 0.000476837158203125 nb_pixel_total : 34556 time to create 1 rle with old method : 0.03957366943359375 length of segment : 217
time to compute the mask position with numpy : 0.00010013580322265625 nb_pixel_total : 5036 time to create 1 rle with old method : 0.006104946136474609 length of segment : 72
time to compute the mask position with numpy : 0.00016021728515625 nb_pixel_total : 7835 time to create 1 rle with old method : 0.009018898010253906 length of segment : 114
time to compute the mask position with numpy : 0.0001327991485595703 nb_pixel_total : 6481 time to create 1 rle with old method : 0.0075647830963134766 length of segment : 122
time spent for convertir_results : 1.2838466167449951
Inside saveOutput : final : False verbose : 0
eke 12-6-18 : saveMask needs to be cleaned for the new output !
Number saved : None
batch 1 Loaded 22 chid ids of type : 3594
Number RLEs to save : 3352
save missing photos in datou_result :
time spent for datou_step_exec : 25.64249849319458
time spent to save output : 0.2628664970397949
total time spent for step 1 : 25.905364990234375
step2:crop_condition Sat Sep 6 14:10:54 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the
dependencies of the different steps are analysed
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! (×2)
VR 22-3-18 : For now we do not clean correctly the datou structure
Loading chi in step crop with photo_hashtag_type : 3594
Loading chi in step crop for list_pids : 9 !
batch 1 Loaded 22 chid ids of type : 3594
+++++++++++++++++++++++++++++++
begin to crop the class : papier
param for this class : {'min_score': 0.7}
filter for class : papier
hashtag_id of this class : 492668766
we have both polygon and rles Next one ! (×13)
map_result returned by crop_photo_return_map_crop : length : 13
About to insert : list_path_to_insert length 13
new photo from crops !
About to upload 13 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 13 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1757160656_3754840
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! (×13)
we have uploaded 13 photos in the portfolio 3736932
time to upload the photos Elapsed time : 4.402739524841309
we have finished the crop for the class : papier
begin to crop the class : carton
param for this class : {'min_score': 0.7}
filter for class : carton
hashtag_id of this class : 492774966
begin to crop the class : metal
param for this class : {'min_score': 0.7}
filter for class : metal
hashtag_id of this class : 492628673
begin to crop the class : pet_clair
param for this class : {'min_score': 0.7}
filter for class : pet_clair
hashtag_id of this class : 2107755846
we have both polygon and rles Next one ! (×7)
map_result returned by crop_photo_return_map_crop : length : 7
About to insert : list_path_to_insert length 7
new photo from crops !
About to upload 7 photos
upload in portfolio : 3736932
init cache_photo without model_param
we have 7 photos to upload
uploaded to storage server : ovh
folder_temporaire : temp/1757160661_3754840
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! (×7)
we have uploaded 7 photos in the portfolio 3736932
time to upload the photos Elapsed time : 2.5664477348327637
we have finished the crop for the class : pet_clair
begin to crop the class : autre
param for this class : {'min_score': 0.7}
filter for class : autre
hashtag_id of this class : 494826614
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 1
About to insert : list_path_to_insert length 1
new photo from crops !
About to upload 1 photo
upload in portfolio : 3736932
init cache_photo without model_param
we have 1 photo to upload
uploaded to storage server : ovh
folder_temporaire : temp/1757160664_3754840
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !
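Each class is cropped under a per-class {'min_score': 0.7} filter, and objects carrying both a polygon and RLEs are skipped ("we have both polygon and rles Next one !"). A minimal sketch of that selection logic; the detection dict layout here is an assumption, not the pipeline's real schema:

```python
def select_crops(detections, class_name, min_score=0.7):
    """Keep detections of one class scoring at least min_score; skip
    ambiguous objects that carry both a polygon and RLEs, as the log does.
    Returns (kept detections, number skipped as ambiguous)."""
    kept, skipped = [], 0
    for det in detections:
        if det.get("class") != class_name or det.get("score", 0.0) < min_score:
            continue  # wrong class or below the per-class threshold
        if det.get("polygon") and det.get("rle"):
            skipped += 1  # "we have both polygon and rles Next one !"
            continue
        kept.append(det)
    return kept, skipped
```

Classes like carton and metal above produce no crops at all, which in this sketch simply corresponds to `kept` coming back empty.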
we have uploaded 1 photo in the portfolio 3736932
time to upload the photos Elapsed time : 0.5360798835754395
we have finished the crop for the class : autre
begin to crop the class : pehd
param for this class : {'min_score': 0.7}
filter for class : pehd
hashtag_id of this class : 628944319
begin to crop the class : pet_fonce
param for this class : {'min_score': 0.7}
filter for class : pet_fonce
hashtag_id of this class : 2107755900
we have both polygon and rles Next one !
map_result returned by crop_photo_return_map_crop : length : 1
About to insert : list_path_to_insert length 1
new photo from crops !
About to upload 1 photo
upload in portfolio : 3736932
init cache_photo without model_param
we have 1 photo to upload
uploaded to storage server : ovh
folder_temporaire : temp/1757160665_3754840
batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack !
we have uploaded 1 photo in the portfolio 3736932
time to upload the photos Elapsed time : 0.5577182769775391
we have finished the crop for the class : pet_fonce
delete rles from all chi
we have 0 chi objects containing the rles (×9)
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral
[1382057109, 1382057107, 1382057096, 1382057064, 1382057036, 1382057012, 1382056935, 1382056933, 1382056930]
Looping around the photos to save general results
len do output : 22
Didn't retrieve data . (×3) for each of: /1382066014, /1382066016, /1382066017, /1382066019, /1382066020, /1382066022, /1382066023, /1382066024, /1382066026, /1382066027, /1382066029, /1382066030, /1382066031, /1382066043, /1382066044, /1382066045, /1382066046, /1382066047, /1382066048, /1382066049, /1382066060, /1382066071
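"Didn't retrieve data ." prints exactly three times per photo id before the loop moves on, which looks like a fixed retry budget. A sketch of that retry pattern under that assumption, with `fetch` standing in for the real data-retrieval call:

```python
def retrieve_with_retries(fetch, photo_id, attempts=3):
    """Try fetch(photo_id) up to `attempts` times, logging a
    "Didn't retrieve data ." per failed attempt, as the log shows.
    Returns the data, or None if every attempt fails."""
    for _ in range(attempts):
        data = fetch(photo_id)
        if data is not None:
            return data
        print("Didn't retrieve data .", end="")
    return None
```

In the step-2 output above every photo exhausts all three attempts, so each id is followed by three failure messages and no data.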
before output type
Here is an output not treated by saveGeneral : (×3)
Managing all output in save final without adding information in the mtr_datou_result
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057109', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057107', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057096', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057064', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057036', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057012', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382056935', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382056933', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382056930', None, None, None, None, None, '3674576')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 75
time used for this insertion : 0.01854395866394043
save_final
save missing photos in datou_result :
time spent for datou_step_exec : 11.093380451202393
time spent to save output : 0.01956629753112793
total time spent for step 2 : 11.11294674873352
step3:rle_unique_nms_with_priority Sat Sep 6 14:11:05 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
complete output_args for input 0
We expect there is only one output and this part is used while all outputs are not tuples or arrays (×9)
VR 22-3-18 : For now we do not clean correctly the datou structure
Begin step rle-unique-nms
batch 1 Loaded 22 chid ids of type : 3594
+++++++++++++++++++++++++++++++
nb_obj : 1 nb_hashtags : 1
time to prepare the origin masks : 0.5146088600158691
time to compute the mask position with numpy : 0.18474411964416504 nb_pixel_total : 2063814 time to create 1 rle with new method : 0.5223970413208008
time to compute the mask
position with numpy : 0.006660938262939453 nb_pixel_total : 9786 time to create 1 rle with old method : 0.011033058166503906
create new chi : 0.7351598739624023
time to delete rle : 0.025089502334594727
batch 1 Loaded 3 chid ids of type : 3594
+Number RLEs to save : 1424
TO DO : save crop sub photo not yet done !
save time : 0.12357568740844727
nb_obj : 2 nb_hashtags : 2
time to prepare the origin masks : 0.3106980323791504
time to compute the mask position with numpy : 0.06653857231140137 nb_pixel_total : 2063411 time to create 1 rle with new method : 0.07049798965454102
time to compute the mask position with numpy : 0.00613713264465332 nb_pixel_total : 6252 time to create 1 rle with old method : 0.0070459842681884766
time to compute the mask position with numpy : 0.0058629512786865234 nb_pixel_total : 3937 time to create 1 rle with old method : 0.004283428192138672
create new chi : 0.1698460578918457
time to delete rle : 0.00022649765014648438
batch 1 Loaded 5 chid ids of type : 3594
++Number RLEs to save : 1388
TO DO : save crop sub photo not yet done !
save time : 0.10987305641174316
nb_obj : 1 nb_hashtags : 1
time to prepare the origin masks : 0.036960601806640625
time to compute the mask position with numpy : 0.03498029708862305 nb_pixel_total : 2049496 time to create 1 rle with new method : 0.07904601097106934
time to compute the mask position with numpy : 0.006119489669799805 nb_pixel_total : 24104 time to create 1 rle with old method : 0.025919675827026367
create new chi : 0.15440678596496582
time to delete rle : 0.00020551681518554688
batch 1 Loaded 3 chid ids of type : 3594
+Number RLEs to save : 1532
TO DO : save crop sub photo not yet done !
save time : 0.12418222427368164
nb_obj : 2 nb_hashtags : 2
time to prepare the origin masks : 0.21480655670166016
time to compute the mask position with numpy : 0.05488467216491699 nb_pixel_total : 2063326 time to create 1 rle with new method : 0.19550585746765137
time to compute the mask position with numpy : 0.0062885284423828125 nb_pixel_total : 8298 time to create 1 rle with old method : 0.009203195571899414
time to compute the mask position with numpy : 0.006605386734008789 nb_pixel_total : 1976 time to create 1 rle with old method : 0.002256155014038086
create new chi : 0.2847151756286621
time to delete rle : 0.00023436546325683594
batch 1 Loaded 5 chid ids of type : 3594
++Number RLEs to save : 1534
TO DO : save crop sub photo not yet done !
save time : 0.14766836166381836
nb_obj : 5 nb_hashtags : 2
time to prepare the origin masks : 0.14921283721923828
time to compute the mask position with numpy : 0.07011556625366211 nb_pixel_total : 1851081 time to create 1 rle with new method : 0.5172629356384277
time to compute the mask position with numpy : 0.011127948760986328 nb_pixel_total : 6634 time to create 1 rle with old method : 0.007238864898681641
time to compute the mask position with numpy : 0.006617546081542969 nb_pixel_total : 2423 time to create 1 rle with old method : 0.002897977828979492
time to compute the mask position with numpy : 0.0069904327392578125 nb_pixel_total : 5889 time to create 1 rle with old method : 0.006285667419433594
time to compute the mask position with numpy : 0.006857872009277344 nb_pixel_total : 80094 time to create 1 rle with old method : 0.0843191146850586
time to compute the mask position with numpy : 0.008752107620239258 nb_pixel_total : 127479 time to create 1 rle with old method : 0.1399087905883789
create new chi : 0.8785271644592285
time to delete rle : 0.00037479400634765625
batch 1 Loaded 12 chid ids of type : 3594
+++++++++Number RLEs to save : 3014
TO DO : save crop sub photo not yet done !
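The timings above compare an "old" and a "new" method for turning mask pixels into an RLE after locating them with numpy. The pipeline's exact RLE format isn't shown in the log, so here is a minimal COCO-style sketch (hypothetical `mask_to_rle` helper): run-length encode a binary mask over the column-major flattened array, with counts alternating between 0-runs and 1-runs:

```python
import numpy as np

def mask_to_rle(mask):
    """Run-length encode a binary mask, COCO-style: flatten in
    column-major (Fortran) order, then emit counts of alternating
    0-runs and 1-runs, starting with a 0-run (possibly of length 0)."""
    pixels = np.asarray(mask, dtype=np.uint8).flatten(order="F")
    # Indices where the value changes, plus both array ends.
    change = np.flatnonzero(pixels[1:] != pixels[:-1]) + 1
    boundaries = np.concatenate(([0], change, [pixels.size]))
    counts = np.diff(boundaries).tolist()
    if pixels.size and pixels[0] == 1:  # convention: first count is a 0-run
        counts = [0] + counts
    return counts
```

The sum of the odd-indexed counts equals `nb_pixel_total` from the log (the number of foreground pixels), and the vectorised change-point computation is the "mask position with numpy" part.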
save time : 0.24050331115722656
nb_obj : 2 nb_hashtags : 1
time to prepare the origin masks : 0.0801854133605957
time to compute the mask position with numpy : 0.02321481704711914 nb_pixel_total : 2061015 time to create 1 rle with new method : 0.06307792663574219
time to compute the mask position with numpy : 0.005655765533447266 nb_pixel_total : 4753 time to create 1 rle with old method : 0.004982948303222656
time to compute the mask position with numpy : 0.005681276321411133 nb_pixel_total : 7832 time to create 1 rle with old method : 0.0081329345703125
create new chi : 0.11098670959472656
time to delete rle : 0.0002224445343017578
batch 1 Loaded 5 chid ids of type : 3594
++Number RLEs to save : 1536
TO DO : save crop sub photo not yet done !
save time : 0.15528273582458496
nb_obj : 1 nb_hashtags : 1
time to prepare the origin masks : 0.032512664794921875
time to compute the mask position with numpy : 0.020111083984375 nb_pixel_total : 2057684 time to create 1 rle with new method : 0.03514289855957031
time to compute the mask position with numpy : 0.005782604217529297 nb_pixel_total : 15916 time to create 1 rle with old method : 0.017313241958618164
create new chi : 0.07858419418334961
time to delete rle : 0.0002143383026123047
batch 1 Loaded 3 chid ids of type : 3594
+++Number RLEs to save : 1400
TO DO : save crop sub photo not yet done !
save time : 0.131211519241333
nb_obj : 5 nb_hashtags : 2
time to prepare the origin masks : 1.5473952293395996
time to compute the mask position with numpy : 0.40361738204956055 nb_pixel_total : 1992122 time to create 1 rle with new method : 0.07690215110778809
time to compute the mask position with numpy : 0.006372690200805664 nb_pixel_total : 5036 time to create 1 rle with old method : 0.005777597427368164
time to compute the mask position with numpy : 0.01063990592956543 nb_pixel_total : 34556 time to create 1 rle with old method : 0.03874611854553223
time to compute the mask position with numpy : 0.009406328201293945 nb_pixel_total : 20870 time to create 1 rle with old method : 0.023436307907104492
time to compute the mask position with numpy : 0.010200738906860352 nb_pixel_total : 12343 time to create 1 rle with old method : 0.014055490493774414
time to compute the mask position with numpy : 0.006152153015136719 nb_pixel_total : 8673 time to create 1 rle with old method : 0.009869813919067383
create new chi : 0.6261699199676514
time to delete rle : 0.0004138946533203125
batch 1 Loaded 11 chid ids of type : 3594
+++++++++Number RLEs to save : 2524
TO DO : save crop sub photo not yet done !
save time : 0.19719457626342773
nb_obj : 2 nb_hashtags : 1
time to prepare the origin masks : 0.049846649169921875
time to compute the mask position with numpy : 0.023145675659179688 nb_pixel_total : 2059284 time to create 1 rle with new method : 0.34868478775024414
time to compute the mask position with numpy : 0.007656574249267578 nb_pixel_total : 6481 time to create 1 rle with old method : 0.008411645889282227
time to compute the mask position with numpy : 0.007357358932495117 nb_pixel_total : 7835 time to create 1 rle with old method : 0.008905410766601562
create new chi : 0.4044361114501953
time to delete rle : 0.00030303001403808594
batch 1 Loaded 5 chid ids of type : 3594
++Number RLEs to save : 1552
TO DO : save crop sub photo not yet done !
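Step 3 (rle_unique_nms_with_priority) appears to reduce overlapping instance masks to unique, disjoint ones. The log doesn't show the actual priority rule, so this sketch assumes the highest-scoring mask wins each contested pixel (hypothetical helper name):

```python
import numpy as np

def unique_masks_by_priority(masks, scores):
    """Make overlapping boolean instance masks disjoint: each pixel is
    assigned to the highest-scoring mask covering it (an assumption
    about what rle_unique_nms_with_priority enforces)."""
    order = np.argsort(scores)  # low to high: higher scores painted last, so they win
    owner = np.full(masks[0].shape, -1, dtype=int)
    for idx in order:
        owner[np.asarray(masks[idx], dtype=bool)] = idx
    return [owner == i for i in range(len(masks))]
```

Painting masks in ascending score order onto a single "owner" map is a simple way to get per-pixel uniqueness; the resulting disjoint masks can then each be re-encoded as one RLE per object, matching the "Number RLEs to save" lines above.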
save time : 0.1486356258392334 map_output_result : {1382057109: (0.0, 'Should be the crop_list due to order', 0), 1382057107: (0.0, 'Should be the crop_list due to order', 0), 1382057096: (0.0, 'Should be the crop_list due to order', 0), 1382057064: (0.0, 'Should be the crop_list due to order', 0), 1382057036: (0.0, 'Should be the crop_list due to order', 0), 1382057012: (0.0, 'Should be the crop_list due to order', 0), 1382056935: (0.0, 'Should be the crop_list due to order', 0), 1382056933: (0.0, 'Should be the crop_list due to order', 0), 1382056930: (0.0, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1382057109, 1382057107, 1382057096, 1382057064, 1382057036, 1382057012, 1382056935, 1382056933, 1382056930] Looping around the photos to save general results len do output : 9 /1382057109.Didn't retrieve data . /1382057107.Didn't retrieve data . /1382057096.Didn't retrieve data . /1382057064.Didn't retrieve data . /1382057036.Didn't retrieve data . /1382057012.Didn't retrieve data . /1382056935.Didn't retrieve data . /1382056933.Didn't retrieve data . /1382056930.Didn't retrieve data . 
before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057109', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057107', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057096', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057064', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057036', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057012', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382056935', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382056933', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382056930', None, None, None, None, None, '3674576') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 27 time used for this insertion : 0.013845682144165039 save_final save missing photos in datou_result : time spent for datou_step_exec : 8.09540581703186 time spent to save output : 0.014261007308959961 total time spent for step 3 : 8.10966682434082 step4:ventilate_hashtags_in_portfolio Sat Sep 6 14:11:13 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for
datou_prepare_output_input until the code is correctly tested, cleaned and works in both cases VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with default behavior Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 26608746 get user id for portfolio 26608746 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26608746 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('mal_croppe','pet_clair','background','autre','pehd','environnement','pet_fonce','papier','metal','flou','carton')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26608746 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where
hashtag in ('mal_croppe','pet_clair','background','autre','pehd','environnement','pet_fonce','papier','metal','flou','carton')) AND mptpi.`min_score`=0.5 To do Caught exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") [message repeated 11 times] To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26608746 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('mal_croppe','pet_clair','background','autre','pehd','environnement','pet_fonce','papier','metal','flou','carton')) AND mptpi.`min_score`=0.5 To do link used in velours : https://marlene.fotonower.com/velours/26608910,26608911,26608912,26608913,26608914,26608915,26608916,26608917,26608918,26608919,26608920?tags=mal_croppe,pet_clair,background,autre,pehd,environnement,pet_fonce,papier,metal,flou,carton Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral
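The repeated 1064 errors above all point "near ')\n and cspi.crop_hashtag_id = chi.id'", which is the typical signature of an `IN (...)` clause rendered from an empty list: MySQL rejects `IN ()` as a syntax error. A minimal guard can be sketched as follows, assuming integer id lists (the helper name `in_clause` is hypothetical, not the pipeline's actual code).

```python
def in_clause(ids):
    """Build a SQL 'IN (...)' fragment from a list of integer ids
    (hypothetical helper). An empty list would render 'IN ()', which
    MySQL rejects with error 1064 exactly as in the log above, so we
    short-circuit and let the caller skip the query entirely."""
    ids = [int(i) for i in ids]  # coercing to int also defuses injection here
    if not ids:
        return None
    return "IN (%s)" % ",".join(str(i) for i in ids)

assert in_clause([]) is None
assert in_clause([3948070647, 3948070648]) == "IN (3948070647,3948070648)"
```

Checking the list before formatting the query would replace eleven failed round-trips to the server with one early return.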
[1382057109, 1382057107, 1382057096, 1382057064, 1382057036, 1382057012, 1382056935, 1382056933, 1382056930] Looping around the photos to save general results len do output : 1 /26608746. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057109', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057107', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057096', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057064', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057036', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057012', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382056935', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382056933', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382056930', None, None, None, None, None, '3674576') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 10 time used for this insertion : 0.019840717315673828 save_final save missing photos in datou_result : time spent for datou_step_exec : 1.7147703170776367 time spent to save output : 0.02025771141052246 total time spent for step 4 : 1.7350280284881592 step5:final Sat Sep 6 14:11:15 2025 VR 17-11-17 : now, only for linear exec
dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned and works in both cases VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with default behavior Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step final ! Caught exception ! Connect or reconnect !
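"Caught exception ! Connect or reconnect !" recurs throughout this run. The pipeline's actual reconnect helper is not shown in the log, so the following is only a sketch of the pattern, with every name hypothetical: retry on transient connection failures, but let non-transient errors (such as the 1064 syntax errors above, which reconnecting cannot fix) propagate.

```python
import time

def execute_with_reconnect(connect, run, transient_errors, retries=3, delay=0.0):
    """Retry pattern behind 'Caught exception ! Connect or reconnect !'
    (hypothetical sketch). `connect` builds a connection, `run(conn)`
    executes the query, and only exception types listed in
    `transient_errors` trigger a reconnect; anything else propagates."""
    conn = connect()
    for attempt in range(retries):
        try:
            return run(conn)
        except transient_errors:
            if attempt == retries - 1:
                raise
            time.sleep(delay)
            conn = connect()  # connect or reconnect

# Toy usage: the first call drops the "connection", the second succeeds.
class FlakyConn:
    calls = 0
    def query(self):
        FlakyConn.calls += 1
        if FlakyConn.calls == 1:
            raise ConnectionError("server has gone away")
        return "ok"

result = execute_with_reconnect(FlakyConn, lambda c: c.query(), (ConnectionError,))
print(result)  # → ok
```

With MySQLdb one would typically treat `OperationalError` as transient and `ProgrammingError` (which carries the 1064 codes) as fatal.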
Inside saveOutput : final : False verbose : 0 original output for save of step final : {1382057109: ('0.021496002657750337',), 1382057107: ('0.021496002657750337',), 1382057096: ('0.021496002657750337',), 1382057064: ('0.021496002657750337',), 1382057036: ('0.021496002657750337',), 1382057012: ('0.021496002657750337',), 1382056935: ('0.021496002657750337',), 1382056933: ('0.021496002657750337',), 1382056930: ('0.021496002657750337',)} new output for save of step final : {1382057109: ('0.021496002657750337',), 1382057107: ('0.021496002657750337',), 1382057096: ('0.021496002657750337',), 1382057064: ('0.021496002657750337',), 1382057036: ('0.021496002657750337',), 1382057012: ('0.021496002657750337',), 1382056935: ('0.021496002657750337',), 1382056933: ('0.021496002657750337',), 1382056930: ('0.021496002657750337',)} [1382057109, 1382057107, 1382057096, 1382057064, 1382057036, 1382057012, 1382056935, 1382056933, 1382056930] Looping around the photos to save general results len do output : 9 /1382057109.Didn't retrieve data . /1382057107.Didn't retrieve data . /1382057096.Didn't retrieve data . /1382057064.Didn't retrieve data . /1382057036.Didn't retrieve data . /1382057012.Didn't retrieve data . /1382056935.Didn't retrieve data . /1382056933.Didn't retrieve data . /1382056930.Didn't retrieve data . 
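In the step-final output above, every photo id maps to the same 1-tuple: a single portfolio-level refus score fanned out per photo. Assuming that is all this particular output does (a simplification of the step), the fan-out and the 2.15% figure that later appears in the mail body can be sketched as:

```python
# Photo ids and score taken verbatim from the log above.
photo_ids = [1382057109, 1382057107, 1382057096, 1382057064, 1382057036,
             1382057012, 1382056935, 1382056933, 1382056930]
refus_total = '0.021496002657750337'

# Broadcast the single portfolio-level score to every photo, which is why
# each id maps to an identical 1-tuple in the dict logged above.
output = {pid: (refus_total,) for pid in photo_ids}

# The mail body reports this score as a percentage with two decimals.
print(f"{float(refus_total):.2%}")  # → 2.15%
```

The score is carried as a string through the pipeline and only converted for display, which matches the quoted `'0.021496002657750337'` tuples in the log.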
before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057109', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057107', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057096', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057064', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057036', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382057012', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382056935', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382056933', None, None, None, None, None, '3674576') ('3318', None, None, None, None, None, None, None, '3674576') ('3318', '26608746', '1382056930', None, None, None, None, None, '3674576') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 27 time used for this insertion : 0.013361215591430664 save_final save missing photos in datou_result : time spent for datou_step_exec : 0.12863421440124512 time spent to save output : 0.013862848281860352 total time spent for step 5 : 0.14249706268310547 step6:blur_detection Sat Sep 6 14:11:15 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly
tested, cleaned and works in both cases VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with default behavior Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection method: ratio and variance treat image : temp/1757160627_3754840_1382057109_9583940748b3fbf674bebb1d8f93d14a.jpg resize: (1080, 1920) 1382057109 -1.1817501912665263 treat image : temp/1757160627_3754840_1382057107_824c38fd28be25c7c64f36fd2ae18a9d.jpg resize: (1080, 1920) 1382057107 -1.05731577710641 treat image : temp/1757160627_3754840_1382057096_46428eeeff971a1345c7fc89d24bd62e.jpg resize: (1080, 1920) 1382057096 -0.44475606193827083 treat image : temp/1757160627_3754840_1382057064_c74d00e09f253e5413ff2dabde9633a0.jpg resize: (1080, 1920) 1382057064 0.48188617071792694 treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437.jpg resize: (1080, 1920) 1382057036 -0.348249057704559 treat image : temp/1757160627_3754840_1382057012_0e6d7e7532e5a068c780d60df731f377.jpg resize: (1080, 1920) 1382057012 -0.20230023188979243 treat image : temp/1757160627_3754840_1382056935_942b8476df67fb0b39197bf2ac596f2d.jpg resize: (1080, 1920) 1382056935 -0.5732393094476806 treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a.jpg resize: (1080, 1920) 1382056933 -0.6957511558557204 treat image : temp/1757160627_3754840_1382056930_800b5bed4dbebab533a2f899c7ff1697.jpg resize: (1080, 1920) 1382056930 -0.3334613851096006 treat
image : temp/1757160627_3754840_1382057107_824c38fd28be25c7c64f36fd2ae18a9d_rle_crop_3948070647_0.png resize: (81, 57) 1382066014 0.4735082315374256 treat image : temp/1757160627_3754840_1382057064_c74d00e09f253e5413ff2dabde9633a0_rle_crop_3948070651_0.png resize: (178, 105) 1382066016 -2.0663911247332765 treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070657_0.png resize: (107, 79) 1382066017 -1.1244615109229208 treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070656_0.png resize: (169, 343) 1382066019 -2.5075566225066823 treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070652_0.png resize: (338, 602) 1382066020 -2.543912006939461 treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070655_0.png resize: (65, 127) 1382066022 -1.2050380127328184 treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070654_0.png resize: (87, 96) 1382066023 0.6894289069016442 treat image : temp/1757160627_3754840_1382057012_0e6d7e7532e5a068c780d60df731f377_rle_crop_3948070658_0.png resize: (150, 80) 1382066024 -1.0556533382764899 treat image : temp/1757160627_3754840_1382057012_0e6d7e7532e5a068c780d60df731f377_rle_crop_3948070659_0.png resize: (78, 96) 1382066026 -1.2834536148924771 treat image : temp/1757160627_3754840_1382056935_942b8476df67fb0b39197bf2ac596f2d_rle_crop_3948070660_0.png resize: (135, 198) 1382066027 -1.9933931205764586 treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070662_0.png resize: (123, 136) 1382066029 -0.13430249858784446 treat image : temp/1757160627_3754840_1382056930_800b5bed4dbebab533a2f899c7ff1697_rle_crop_3948070666_0.png resize: (114, 122) 1382066030 -2.4262244244673896 treat image : temp/1757160627_3754840_1382056930_800b5bed4dbebab533a2f899c7ff1697_rle_crop_3948070667_0.png resize: 
(122, 83) 1382066031 -0.7197769664669934 treat image : temp/1757160627_3754840_1382057096_46428eeeff971a1345c7fc89d24bd62e_rle_crop_3948070649_0.png resize: (197, 204) 1382066043 -1.5941423525396154 treat image : temp/1757160627_3754840_1382057064_c74d00e09f253e5413ff2dabde9633a0_rle_crop_3948070650_0.png resize: (49, 50) 1382066044 4.130080808903635 treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070653_0.png resize: (388, 330) 1382066045 -0.23148163861295448 treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070664_0.png resize: (213, 244) 1382066046 -1.5080279221845037 treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070665_0.png resize: (66, 98) 1382066047 -0.03871291204078842 treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070661_0.png resize: (114, 123) 1382066048 -0.8909475002949396 treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070663_0.png resize: (144, 207) 1382066049 -1.4099481184617226 treat image : temp/1757160627_3754840_1382057107_824c38fd28be25c7c64f36fd2ae18a9d_rle_crop_3948070648_0.png resize: (70, 104) 1382066060 3.6090967559162053 treat image : temp/1757160627_3754840_1382057109_9583940748b3fbf674bebb1d8f93d14a_rle_crop_3948070646_0.png resize: (172, 125) 1382066071 -1.7841549251883906 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_values in save_photo_hashtag_id_thcl_score : 31 time used for this insertion : 0.013839006423950195 begin to insert list_values into photo_hahstag_ids : length of list_values in save_photo_hashtag_id_type : 31 time used for this insertion : 0.013196229934692383 save missing photos in datou_result : time spent for datou_step_exec : 7.245824575424194 time spent to save output : 0.0320894718170166 total time spent for step 6 :
7.277914047241211 step7:brightness Sat Sep 6 14:11:22 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned and works in both cases VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with default behavior Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step compute brightness treat image : temp/1757160627_3754840_1382057109_9583940748b3fbf674bebb1d8f93d14a.jpg treat image : temp/1757160627_3754840_1382057107_824c38fd28be25c7c64f36fd2ae18a9d.jpg treat image : temp/1757160627_3754840_1382057096_46428eeeff971a1345c7fc89d24bd62e.jpg treat image : temp/1757160627_3754840_1382057064_c74d00e09f253e5413ff2dabde9633a0.jpg treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437.jpg treat image : temp/1757160627_3754840_1382057012_0e6d7e7532e5a068c780d60df731f377.jpg treat image : temp/1757160627_3754840_1382056935_942b8476df67fb0b39197bf2ac596f2d.jpg treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a.jpg treat image : temp/1757160627_3754840_1382056930_800b5bed4dbebab533a2f899c7ff1697.jpg treat image : temp/1757160627_3754840_1382057107_824c38fd28be25c7c64f36fd2ae18a9d_rle_crop_3948070647_0.png treat image :
temp/1757160627_3754840_1382057064_c74d00e09f253e5413ff2dabde9633a0_rle_crop_3948070651_0.png treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070657_0.png treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070656_0.png treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070652_0.png treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070655_0.png treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070654_0.png treat image : temp/1757160627_3754840_1382057012_0e6d7e7532e5a068c780d60df731f377_rle_crop_3948070658_0.png treat image : temp/1757160627_3754840_1382057012_0e6d7e7532e5a068c780d60df731f377_rle_crop_3948070659_0.png treat image : temp/1757160627_3754840_1382056935_942b8476df67fb0b39197bf2ac596f2d_rle_crop_3948070660_0.png treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070662_0.png treat image : temp/1757160627_3754840_1382056930_800b5bed4dbebab533a2f899c7ff1697_rle_crop_3948070666_0.png treat image : temp/1757160627_3754840_1382056930_800b5bed4dbebab533a2f899c7ff1697_rle_crop_3948070667_0.png treat image : temp/1757160627_3754840_1382057096_46428eeeff971a1345c7fc89d24bd62e_rle_crop_3948070649_0.png treat image : temp/1757160627_3754840_1382057064_c74d00e09f253e5413ff2dabde9633a0_rle_crop_3948070650_0.png treat image : temp/1757160627_3754840_1382057036_5c00726081678b4d9bf563cc240b1437_rle_crop_3948070653_0.png treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070664_0.png treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070665_0.png treat image : temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070661_0.png treat image : 
temp/1757160627_3754840_1382056933_cd2308b0b489711d18cde82e40ca289a_rle_crop_3948070663_0.png treat image : temp/1757160627_3754840_1382057107_824c38fd28be25c7c64f36fd2ae18a9d_rle_crop_3948070648_0.png treat image : temp/1757160627_3754840_1382057109_9583940748b3fbf674bebb1d8f93d14a_rle_crop_3948070646_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_values in save_photo_hashtag_id_thcl_score : 31 time used for this insertion : 0.013985633850097656 begin to insert list_values into photo_hahstag_ids : length of list_values in save_photo_hashtag_id_type : 31 time used for this insertion : 0.01793813705444336 save missing photos in datou_result : time spent for datou_step_exec : 2.306337833404541 time spent to save output : 0.036573171615600586 total time spent for step 7 : 2.3429110050201416 step8:velours_tree Sat Sep 6 14:11:25 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned and works in both cases VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with default behavior Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 output is None No output to save, returning out of save general time spent for datou_step_exec : 0.22803497314453125 time spent to
save output : 4.601478576660156e-05 total time spent for step 8 : 0.22808098793029785 step9:send_mail_cod Sat Sep 6 14:11:25 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned and works in both cases VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with default behavior Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of inputs and outputs : a step which parallelizes and manages errors in input by not sending an output for that data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of inputs and outputs : a step which parallelizes and manages errors in input by not sending an output for that data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
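The blur_detection step earlier logs "methode: ratio et variance" and one signed score per image, and those scores reappear in the per-photo args below. The pipeline's exact formula is not shown in the log; a common variance-based focus measure, given here purely as an illustrative stand-in, is the variance of the Laplacian. The negative values in the log suggest some normalization (e.g. standardization or a log scale) is applied on top of a raw measure like this one.

```python
import numpy as np

def laplacian_variance(gray: np.ndarray) -> float:
    """Variance-of-Laplacian focus measure (illustrative stand-in, not
    the pipeline's actual 'ratio et variance' code). Low variance means
    few sharp edges, i.e. a likely blurry image."""
    g = gray.astype(np.float64)
    # 4-neighbour Laplacian on the interior only (no padding, for brevity).
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return float(lap.var())

rng = np.random.default_rng(0)
sharp = rng.integers(0, 256, (64, 64))   # high-frequency "texture"
blurry = np.full((64, 64), 128)          # flat image, no edges at all
assert laplacian_variance(sharp) > laplacian_variance(blurry) == 0.0
```

The `resize: (1080, 1920)` lines above hint that images are normalized to a common size before scoring, which keeps such a measure comparable across photos.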
VR 22-3-18 : For now we do not clean correctly the datou structure inside the send mail cod step work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please enter the license of selector results_Auto_P26608746_06-09-2025_14_11_25.pdf 26608910 imagette266089101757160685 26608911 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette266089111757160685 26608912 imagette266089121757160685 26608913 change filename to text .imagette266089131757160685 26608914 imagette266089141757160686 26608916 change filename to text .imagette266089161757160686 26608917 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette266089171757160686 26608918 imagette266089181757160686 26608919 imagette266089191757160686 26608920 imagette266089201757160687 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=26608746 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://marlene.fotonower.com/velours/26608910,26608911,26608912,26608913,26608914,26608915,26608916,26608917,26608918,26608919,26608920?tags=mal_croppe,pet_clair,background,autre,pehd,environnement,pet_fonce,papier,metal,flou,carton args[1382057109] : ((1382057109, -1.1817501912665263, 492688767), (1382057109, 0.8926829530208302, 2107752395), '0.021496002657750337') We are sending mail with results at report@fotonower.com args[1382057107] : ((1382057107, -1.05731577710641, 492688767), (1382057107, 0.45160368569156684, 2107752395), '0.021496002657750337') We are sending mail with results at report@fotonower.com args[1382057096]
: ((1382057096, -0.44475606193827083, 492688767), (1382057096, 0.8408548121701204, 2107752395), '0.021496002657750337') We are sending mail with results at report@fotonower.com args[1382057064] : ((1382057064, 0.48188617071792694, 492688767), (1382057064, 0.6073292886578757, 2107752395), '0.021496002657750337') We are sending mail with results at report@fotonower.com args[1382057036] : ((1382057036, -0.348249057704559, 492688767), (1382057036, 0.5471902798304226, 2107752395), '0.021496002657750337') We are sending mail with results at report@fotonower.com args[1382057012] : ((1382057012, -0.20230023188979243, 492688767), (1382057012, 0.6632206476222129, 2107752395), '0.021496002657750337') We are sending mail with results at report@fotonower.com args[1382056935] : ((1382056935, -0.5732393094476806, 492688767), (1382056935, 1.0044050746942952, 2107752395), '0.021496002657750337') We are sending mail with results at report@fotonower.com args[1382056933] : ((1382056933, -0.6957511558557204, 492688767), (1382056933, 0.6823156988612903, 2107752395), '0.021496002657750337') We are sending mail with results at report@fotonower.com args[1382056930] : ((1382056930, -0.3334613851096006, 492688767), (1382056930, 0.8501670596099444, 2107752395), '0.021496002657750337') We are sending mail with results at report@fotonower.com refus_total : 0.021496002657750337 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=26608746 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26608746_06-09-2025_14_11_25.pdf results_Auto_P26608746_06-09-2025_14_11_25.pdf uploaded to url 
https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26608746_06-09-2025_14_11_25.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','26608746','results_Auto_P26608746_06-09-2025_14_11_25.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26608746_06-09-2025_14_11_25.pdf','pdf','','0.15','0.021496002657750337') message_in_mail: Hello,
Please find below the results of the carac on demand service for the portfolio: https://www.fotonower.com/view/26608746

https://www.fotonower.com/image?json=false&list_photos_id=1382057109
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1382057107
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1382057096
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1382057064
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1382057036
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1382057012
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1382056935
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1382056933
Well done, the photo is well taken.
https://www.fotonower.com/image?json=false&list_photos_id=1382056930
Well done, the photo is well taken.

Under these conditions, the rejection rate is 2.15%.
Please find the photos of the contaminants.

examples of contaminants: pet_clair: https://www.fotonower.com/view/26608911?limit=200
examples of contaminants: autre: https://www.fotonower.com/view/26608913?limit=200
examples of contaminants: pet_fonce: https://www.fotonower.com/view/26608916?limit=200
examples of contaminants: papier: https://www.fotonower.com/view/26608917?limit=200
Please find the PDF report: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26608746_06-09-2025_14_11_25.pdf

Link to velours: https://marlene.fotonower.com/velours/26608910,26608911,26608912,26608913,26608914,26608915,26608916,26608917,26608918,26608919,26608920?tags=mal_croppe,pet_clair,background,autre,pehd,environnement,pet_fonce,papier,metal,flou,carton


The Fotonower team
202 b''
Server: nginx
Date: Sat, 06 Sep 2025 12:11:28 GMT
Content-Length: 0
Connection: close
X-Message-Id: V16E5ZHVTPG8fLcHQ22Kqg
Access-Control-Allow-Origin: https://sendgrid.api-docs.io
Access-Control-Allow-Methods: POST
Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl
Access-Control-Max-Age: 600
X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html
Strict-Transport-Security: max-age=31536000; includeSubDomains
Content-Security-Policy: frame-ancestors 'none'
Cache-Control: no-cache
X-Content-Type-Options: no-sniff
Referrer-Policy: strict-origin-when-cross-origin
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : send_mail_cod, we use saveGeneral
[1382057109, 1382057107, 1382057096, 1382057064, 1382057036, 1382057012, 1382056935, 1382056933, 1382056930]
Looping over the photos to save general results
len of output : 0 before output type
Used above
Managing all outputs in save_final without adding information to mtr_datou_result
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057109', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057107', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057096', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057064', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057036', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057012', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382056935', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382056933', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382056930', None, None, None, None, None, '3674576')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 9
time used for this insertion : 0.018064498901367188
save_final: save missing photos in datou_result
time spent for datou_step_exec : 2.992018461227417
time spent to save output : 0.018238544464111328
total time spent for step 9 : 3.0102570056915283
step10:split_time_score Sat Sep 6 14:11:28 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the input of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR, but same_nb_input_output == True : this should be an optional input !
complete output_args for input 1
VR 22-3-18 : for now we do not clean the datou structure correctly
begin split time score
Caught exception! Connect or reconnect!
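The `202 b''` line above is the raw SendGrid v3 API response: 202 Accepted with an empty body means the mail was queued. A minimal helper for interpreting that response (the function name and shape are ours, not from the script):

```python
def mail_accepted(status_code: int, body: bytes = b"") -> bool:
    """Return True when SendGrid queued the message.

    SendGrid's v3 mail/send endpoint answers 202 Accepted with an empty
    body on success; any other status should be logged and investigated.
    """
    return status_code == 202 and body == b""

# The run above returned "202 b''", so the mail was accepted for delivery.
```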
TODO : Insert select and so on
Begin split_port_in_batch_balle
thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}]
thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('13', 9),)
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223
{}
06092025 26608746 Number of photos uploaded : 9 / 23040 (0%)
06092025 26608746 Number of photos tagged (waste types) : 0 / 9 (0%)
06092025 26608746 Number of photos tagged (volume) : 0 / 9 (0%)
elapsed_time : load_data_split_time_score 2.6226043701171875e-06
elapsed_time : order_list_meta_photo_and_scores 6.9141387939453125e-06
elapsed_time : fill_and_build_computed_from_old_data 0.0005738735198974609
Caught exception! Connect or reconnect!
Caught exception! Connect or reconnect!
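The dashboard counters above follow a fixed `done / total (pct%)` shape. A small formatter reproducing it (a sketch under our own assumptions: the real script may round differently, and the zero-total guard is ours):

```python
def progress_line(label: str, done: int, total: int) -> str:
    """Format a dashboard counter like 'Number of photos uploaded : 9 / 23040 (0%)'.

    The percentage is truncated to an integer, which is why 9/23040
    prints as 0%. Guard against an empty portfolio (total == 0).
    """
    pct = 0 if total == 0 else int(100 * done / total)
    return f"{label} : {done} / {total} ({pct}%)"
```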
elapsed_time : insert_dashboard_record_day_entry 0.20519566535949707
We will return after consolidate, but for now we need the day; how to get it? For now it depends on the previous heavy steps
Qualite : 0.21615461033950617
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26601766_06-09-2025_07_41_58.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26601766 order by id desc limit 1
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
TODO : Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
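The step reordering described earlier in this log (a lexical order on `(number_son, step_id)`, plus the `checkNoCycle` check) can be sketched as a Kahn-style topological sort with a heap tie-break. This is our reconstruction, not the script's code: the `deps` mapping (step → list of parent steps) and the function name are assumptions about the datou structure.

```python
import heapq

def reorder_steps(steps, deps):
    """Topologically order steps; among ready steps, pick the smallest
    (number_of_sons, step_id) pair, mirroring the lexical order the log
    mentions. deps[s] lists the parent steps s waits on (our assumption).
    Raises ValueError on a cycle, playing the role of checkNoCycle."""
    sons = {s: 0 for s in steps}   # how many steps depend on s
    indeg = {s: 0 for s in steps}  # unmet dependencies of s
    for s, parents in deps.items():
        indeg[s] = len(parents)
        for p in parents:
            sons[p] += 1
    ready = [(sons[s], s) for s in steps if indeg[s] == 0]
    heapq.heapify(ready)
    order = []
    while ready:
        _, s = heapq.heappop(ready)
        order.append(s)
        for t in steps:  # release steps whose last parent just ran
            if s in deps.get(t, ()):
                indeg[t] -= 1
                if indeg[t] == 0:
                    heapq.heappush(ready, (sons[t], t))
    if len(order) != len(steps):
        raise ValueError("cycle detected in datou graph")
    return order
```

With no dependencies at all, the heap falls back to plain step-id order, which matches the log's remark that the order should stay compatible with the step ids when there are no sons.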
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26601766 AND mptpi.`type`=3594
To do
Qualite : 0.20595384837962966
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26604828_06-09-2025_10_51_43.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26604828 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26604828 AND mptpi.`type`=3594
To do
Qualite : 0.1469003986625515
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26604829_06-09-2025_11_01_39.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26604829 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26604829 AND mptpi.`type`=3594
To do
Qualite : 0.16525622477445395
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26604830_06-09-2025_10_41_53.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26604830 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26604830 AND mptpi.`type`=3594
To do
Qualite : 0.05311668113425926
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26604831_06-09-2025_10_31_08.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26604831 order by id desc limit 1
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26604831 AND mptpi.`type`=3594 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26607883 order by id desc limit 1 Qualite : 0.021496002657750337 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P26608746_06-09-2025_14_11_25.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 26608746 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! 
Step 8092 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 7934 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 7934 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition !
Step 9283 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 7935 doesn't seem to be defined in the database
WARNING : type of input 3 of step 7934 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
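The count check above appears to distinguish two cases: fewer inputs/outputs used than defined (tolerated, possibly optional) versus more used than defined (a hard warning). A minimal sketch of that logic, with the function name and signature being assumptions:

```python
def check_nb_inputs_outputs(step_id, name, n_inputs_used, n_outputs_used,
                            n_inputs_def, n_outputs_def):
    """Compare the numbers of inputs/outputs actually wired to a step
    against its definition, mimicking the messages in the log above."""
    msgs = []
    if n_inputs_used < n_inputs_def:
        msgs.append(f"Step {step_id} {name} has fewer inputs used ({n_inputs_used}) "
                    f"than in the step definition ({n_inputs_def}) : maybe we manage optional inputs !")
    elif n_inputs_used != n_inputs_def:
        msgs.append(f"WARNING : number of inputs for step {step_id} {name} is not consistent : "
                    f"{n_inputs_used} used against {n_inputs_def} in the step definition !")
    if n_outputs_used < n_outputs_def:
        msgs.append(f"Step {step_id} {name} has fewer outputs used ({n_outputs_used}) "
                    f"than in the step definition ({n_outputs_def}) : some outputs may not be used !")
    elif n_outputs_used != n_outputs_def:
        msgs.append(f"WARNING : number of outputs for step {step_id} {name} is not consistent : "
                    f"{n_outputs_used} used against {n_outputs_def} in the step definition !")
    return msgs
```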
WARNING : type of input 1 of step 7935 doesn't seem to be defined in the database
WARNING : output 1 of step 7933 has datatype=7 whereas input 1 of step 7935 has datatype=None
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 2 of step 8092 doesn't seem to be defined in the database
WARNING : type of output 3 of step 8092 doesn't seem to be defined in the database
WARNING : type of input 1 of step 7933 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10917 doesn't seem to be defined in the database
WARNING : type of output 2 of step 7928 doesn't seem to be defined in the database
WARNING : type of input 1 of step 10918 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 7935 has datatype=10 whereas input 3 of step 10916 has datatype=6
WARNING : output 0 of step 7935 has datatype=10 whereas input 0 of step 13649 has datatype=18
WARNING : type of output 1 of step 13649 doesn't seem to be defined in the database
WARNING : type of input 5 of step 10916 doesn't seem to be defined in the database
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
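The type check walks each output-to-input connection and compares datatypes, warning when either side is undefined (None) in the database or when the two sides disagree, and skipping the `final` step entirely. A hedged sketch of that per-connection check; the function name, parameters, and the `step_type` convention are assumptions:

```python
def check_type_output_input(out_step, out_idx, out_dtype, in_step, in_idx, in_dtype,
                            step_type="normal"):
    """Check that a connected output/input pair carries the same datatype,
    mirroring the checkConsistencyTypeOutputInput messages in the log.
    Returns a list of warning strings."""
    if step_type == "final":
        # the log shows final steps are skipped outright
        return ["We ignore checkConsistencyTypeOutputInput for datou_step final !"]
    msgs = []
    if out_dtype is None:
        msgs.append(f"WARNING : type of output {out_idx} of step {out_step} "
                    "doesn't seem to be defined in the database")
    if in_dtype is None:
        msgs.append(f"WARNING : type of input {in_idx} of step {in_step} "
                    "doesn't seem to be defined in the database")
    if out_dtype is not None and in_dtype != out_dtype:
        msgs.append(f"WARNING : output {out_idx} of step {out_step} has datatype={out_dtype} "
                    f"whereas input {in_idx} of step {in_step} has datatype={in_dtype}")
    return msgs
```

Note that, as in the log, an undefined input type produces both an "undefined" warning and a mismatch warning against the defined output type.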
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26608746 AND mptpi.`type`=3594
To do
NUMBER BATCH : 0
# DISPLAY ALL COLLECTED DATA : {'06092025': {'nb_upload': 9, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}}
Inside saveOutput : final : True verbose : 0
saveOutput not yet implemented for datou_step.type : split_time_score, we use saveGeneral
[1382057109, 1382057107, 1382057096, 1382057064, 1382057036, 1382057012, 1382056935, 1382056933, 1382056930]
Looping over the photos to save general results
len of output : 1
/26608746 Didn't retrieve data . before output type
Here is an output not treated by saveGeneral :
Managing all outputs in save final without adding information in the mtr_datou_result
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057109', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057107', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057096', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057064', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057036', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382057012', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382056935', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382056933', None, None, None, None, None, '3674576')
('3318', None, None, None, None, None, None, None, '3674576')
('3318', '26608746', '1382056930', None, None, None, None, None, '3674576')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 10
time used for this insertion : 0.016315221786499023
save_final save missing photos in datou_result :
time spent for datou_step_exec : 1.7321016788482666
time spent to save output : 0.01658177375793457
total time spent for step 10 : 1.7486834526062012
caffe_path_current :
About to save ! 2
After save, about to update current ! ret : 2
len(input) + len(total_photo_id_missing) : 9
set_done_treatment
29.29user 20.71system 1:05.34elapsed 76%CPU (0avgtext+0avgdata 2586272maxresident)k
549128inputs+11000outputs (262major+1396234minor)pagefaults 0swaps
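The SQL above interpolates portfolio and type ids directly into the query string. As a hedged sketch, the same SELECT and the `list_values` batch insert into mtr_datou_result could be parameterized through a DB-API driver such as MySQLdb (which the head of the log shows importing successfully). The trimmed column list and the mtr_datou_result column names here are assumptions, inferred only from the 9-field tuples printed above:

```python
def fetch_port_to_port(conn, portfolio_id, link_type):
    """Parameterized version of the mtr_port_to_port_ids SELECT above;
    the driver escapes the values, avoiding manual string interpolation."""
    cur = conn.cursor()
    cur.execute(
        "SELECT mptpi.id, h.hashtag "
        "FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h "
        "WHERE h.hashtag_id = mptpi.hashtag_id "
        "AND mptpi.`mtr_portfolio_id_1` = %s AND mptpi.`type` = %s",
        (portfolio_id, link_type),
    )
    return cur.fetchall()

def insert_datou_results(conn, list_values):
    """Batch insert of list_values as in save_final; the column names are
    hypothetical, since the log only shows the tuples themselves."""
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO MTRPhoto.mtr_datou_result "
        "(datou_id, mtr_portfolio_id, photo_id, col4, col5, col6, col7, col8, run_id) "
        "VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)",
        list_values,
    )
    conn.commit()
```

With `conn = MySQLdb.connect(...)`, `executemany` sends the whole batch in one call, consistent with the single short insertion time (0.016 s for 10 rows) reported above.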