python /home/admin/mtr/script_for_cron.py -j default -m 20 -a 'python3 ~/workarea/git/Velours/python/prod/datou.py -j batch_current -C 2502430' -s traitement_3459 -M 0 -S 0 -U 100,80,95 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/home/admin/workarea/git/apy', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 3672928 load datou : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : step 0 init_dummy_multi_datou is not linked in the step_by_step architecture ! WARNING : step 1294 init_dummy_multi_datou is not linked in the step_by_step architecture ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : (photo_id, hashtag_id, score_max) was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? load thcls load pdts Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3459, datou_cur_ids : ['2502430'] with mtr_portfolio_ids : ['19815787'] and first list_photo_ids : [] new path : /proc/3672928/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, thcl, merge_mask_thcl_custom, rle_unique_nms_with_priority, crop_condition, ventilate_hashtags_in_portfolio, final, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 20 list_input_json : [] origin We have 1 , BFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 19 ; length of list_pids : 19 ; length of list_args : 19 time to download the photos : 3.441582441329956 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 11 step1:mask_detect Tue Feb 4 11:34:08 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 10553 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-02-04 11:34:10.999545: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-02-04 11:34:11.023052: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493065000 Hz 2025-02-04 11:34:11.024391: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f85a4000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-02-04 11:34:11.024411: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-02-04 11:34:11.026973: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-02-04 11:34:11.280232: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x2d09afd0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-02-04 11:34:11.280297: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-02-04 11:34:11.281608: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-02-04 11:34:11.285641: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-02-04 11:34:11.290317: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-02-04 11:34:11.293460: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-02-04 11:34:11.294976: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-02-04 11:34:11.299010: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-02-04 11:34:11.300585: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-02-04 11:34:11.307284: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-02-04 11:34:11.308834: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-02-04 11:34:11.308910: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-02-04 11:34:11.309653: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-02-04 11:34:11.309668: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-02-04 11:34:11.309677: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-02-04 11:34:11.310995: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9777 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-02-04 11:34:11.598918: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-02-04 11:34:11.599022: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-02-04 11:34:11.599046: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-02-04 11:34:11.599067: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-02-04 11:34:11.599087: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-02-04 11:34:11.599107: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-02-04 11:34:11.599126: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-02-04 11:34:11.599146: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-02-04 11:34:11.600810: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-02-04 11:34:11.602116: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-02-04 11:34:11.602159: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-02-04 11:34:11.602181: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-02-04 11:34:11.602202: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-02-04 11:34:11.602223: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-02-04 11:34:11.602243: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-02-04 11:34:11.602263: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-02-04 11:34:11.602284: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-02-04 11:34:11.604425: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-02-04 11:34:11.604487: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-02-04 11:34:11.604504: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-02-04 11:34:11.604519: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-02-04 11:34:11.606752: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9777 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2896 thcls : [{'id': 2896, 'mtr_user_id': 31, 'name': 'learn_convoyeur_qualipapia_nantes_poly_100521_1', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,carton_brun,carton_gris,cartonnette,kraft,autre_refus,metal,plastique,teint_dans_la_masse,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3663, 'photo_desc_type': 5309, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0'}] thcl {'id': 2896, 'mtr_user_id': 31, 'name': 'learn_convoyeur_qualipapia_nantes_poly_100521_1', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,carton_brun,carton_gris,cartonnette,kraft,autre_refus,metal,plastique,teint_dans_la_masse,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3663, 'photo_desc_type': 5309, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5309 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5309, 'learn_convoyeur_qualipapia_nantes_poly_100521_1', 16384, 25088, 'learn_convoyeur_qualipapia_nantes_poly_100521_1', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 5, 10, 19, 20, 46), datetime.datetime(2021, 5, 10, 19, 20, 46)) {'thcl': {'id': 2896, 'mtr_user_id': 31, 'name': 'learn_convoyeur_qualipapia_nantes_poly_100521_1', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,carton_brun,carton_gris,cartonnette,kraft,autre_refus,metal,plastique,teint_dans_la_masse,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3663, 'photo_desc_type': 5309, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'carton_brun', 'carton_gris', 'cartonnette', 'kraft', 'autre_refus', 'metal', 'plastique', 'teint_dans_la_masse', 'environnement'], 'list_hashtags_csv': 'background,carton_brun,carton_gris,cartonnette,kraft,autre_refus,metal,plastique,teint_dans_la_masse,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3663, 'svm_hashtag_type_desc': 5309, 'photo_desc_type': 5309, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'carton_brun', 'carton_gris', 'cartonnette', 'kraft', 'autre_refus', 'metal', 'plastique', 'teint_dans_la_masse', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_convoyeur_qualipapia_nantes_poly_100521_1 NUM_CLASSES 10 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_convoyeur_qualipapia_nantes_poly_100521_1 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-02-04 11:34:18.631814: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-02-04 11:34:18.790289: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/learn_convoyeur_qualipapia_nantes_poly_100521_1 /data/models_weight/learn_convoyeur_qualipapia_nantes_poly_100521_1/mask_model.h5 size_local : 256031040 size in s3 : 256031040 create time local : 2021-08-09 05:45:48 create time in s3 : 2021-08-06 18:59:51 mask_model.h5 already exist and didn't need to update list_images length : 19 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 8 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 18) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 Detection mask done ! Trying to reset tf kernel 3673065 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 5264 tf kernel not reseted sub process len(results) : 19 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 19 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 10553 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2896 Catched exception ! Connect or reconnect ! thcls : [{'id': 2896, 'mtr_user_id': 31, 'name': 'learn_convoyeur_qualipapia_nantes_poly_100521_1', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,carton_brun,carton_gris,cartonnette,kraft,autre_refus,metal,plastique,teint_dans_la_masse,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3663, 'photo_desc_type': 5309, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0'}] thcl {'id': 2896, 'mtr_user_id': 31, 'name': 'learn_convoyeur_qualipapia_nantes_poly_100521_1', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,carton_brun,carton_gris,cartonnette,kraft,autre_refus,metal,plastique,teint_dans_la_masse,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3663, 'photo_desc_type': 5309, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5309 ['background', 'carton_brun', 'carton_gris', 'cartonnette', 'kraft', 'autre_refus', 'metal', 'plastique', 'teint_dans_la_masse', 'environnement'] time for calcul the mask position with numpy : 0.0006582736968994141 nb_pixel_total : 28412 time to create 1 rle with old method : 0.032004356384277344 length of segment : 258 time for calcul the mask position with numpy : 0.0012645721435546875 nb_pixel_total : 49459 time to create 1 rle with old method : 0.05490756034851074 length of segment : 429 time for calcul the mask position with numpy : 0.014380216598510742 nb_pixel_total : 661783 time to create 1 rle with new method : 0.03815031051635742 length of segment : 1586 time for calcul the mask position with numpy : 0.0055658817291259766 nb_pixel_total : 305599 time to create 1 rle with new method : 0.011878728866577148 length of segment : 1076 time for calcul the mask position with numpy : 0.00011444091796875 nb_pixel_total : 3635 time to create 1 rle with old method : 0.004222393035888672 length of segment : 89 time for calcul the mask position with numpy : 0.00039315223693847656 nb_pixel_total : 12286 time to create 1 rle with old method : 0.014123201370239258 length of segment : 213 time for calcul the mask position with numpy : 0.008645057678222656 nb_pixel_total : 521619 time to create 1 rle with new method : 0.027811050415039062 length of segment : 1333 time for calcul the mask position with numpy : 0.0014073848724365234 nb_pixel_total : 79426 time to create 1 rle with old method : 0.09600234031677246 length of segment : 373 time for calcul the mask position with numpy : 0.0008802413940429688 nb_pixel_total : 52275 time to create 1 rle with old method : 0.060487985610961914 length of segment : 439 time for calcul the mask position with numpy : 0.0006477832794189453 nb_pixel_total : 30930 time to create 1 rle with old method : 0.035951852798461914 length of segment : 334 time for calcul the mask position with numpy : 0.013203144073486328 nb_pixel_total : 642559 time to create 1 rle with new method : 0.035332679748535156 length of segment : 1252 time for calcul the mask position with numpy : 0.0006229877471923828 nb_pixel_total : 44626 time to create 1 rle with old method : 0.05338001251220703 length of segment : 175 time for calcul the mask position with numpy : 0.0009293556213378906 nb_pixel_total : 47036 time to create 1 rle with old method : 0.0554356575012207 length of segment : 273 time for calcul the mask position with numpy : 0.003721952438354492 nb_pixel_total : 139163 time to create 1 rle with old method : 0.15059256553649902 length of segment : 580 time for calcul the mask position with numpy : 0.006887197494506836 nb_pixel_total : 284944 time to create 1 rle with new method : 0.014769554138183594 length of segment : 748 time for calcul the mask position with numpy : 0.003496885299682617 nb_pixel_total : 157121 time to create 1 rle with new method : 0.008795976638793945 length of segment : 758 time for calcul the mask position with numpy : 0.0008993148803710938 nb_pixel_total : 30498 time to create 1 rle with old method : 0.03500652313232422 length of segment : 297 time for calcul the mask position with numpy : 0.0039975643157958984 nb_pixel_total : 303747 time to create 1 rle with new method : 0.01087641716003418 length of segment : 936 time for calcul the mask position with numpy : 0.0017600059509277344 nb_pixel_total : 124622 time to create 1 rle with old method : 0.14145922660827637 length of segment : 388 time for calcul the mask position with numpy : 0.005503654479980469 nb_pixel_total : 344511 time to create 1 rle with new method : 0.015651464462280273 length of segment : 2221 time for calcul the mask position with numpy : 0.0007224082946777344 nb_pixel_total : 26396 time to create 1 rle with old method : 0.03190922737121582 length of segment : 154 time for calcul the mask position with numpy : 0.0027234554290771484 nb_pixel_total : 126195 time to create 1 rle with old method : 0.15389204025268555 length of segment : 600 time for calcul the mask position with numpy : 0.00013637542724609375 nb_pixel_total : 4876 time to create 1 rle with old method : 0.006241559982299805 length of segment : 56 time for calcul the mask position with numpy : 0.009311199188232422 nb_pixel_total : 353632 time to create 1 rle with new method : 0.028837203979492188 length of segment : 1021 time for calcul the mask position with numpy : 0.004445552825927734 nb_pixel_total : 242660 time to create 1 rle with new method : 0.009592056274414062 length of segment : 861 time for calcul the mask position with numpy : 0.014029264450073242 nb_pixel_total : 972138 time to create 1 rle with new method : 0.03886008262634277 length of segment : 1716 time for calcul the mask position with numpy : 0.006348848342895508 nb_pixel_total : 306385 time to create 1 rle with new method : 0.010717153549194336 length of segment : 1005 time for calcul the mask position with numpy : 0.0015249252319335938 nb_pixel_total : 59325 time to create 1 rle with old method : 0.08847332000732422 length of segment : 332 time for calcul the mask position with numpy : 0.002817869186401367 nb_pixel_total : 174211 time to create 1 rle with new method : 0.005051374435424805 length of segment : 850 time for calcul the mask position with numpy : 0.013419866561889648 nb_pixel_total : 839580 time to create 1 rle with new method : 0.03274798393249512 length of segment : 1544 time for calcul the mask position with numpy : 0.0007193088531494141 nb_pixel_total : 52054 time to create 1 rle with old method : 0.06020069122314453 length of segment : 173 time for calcul the mask position with numpy : 0.005480766296386719 nb_pixel_total : 418519 time to create 1 rle with new method : 0.014172792434692383 length of segment : 1450 time for calcul the mask position with numpy : 0.003624439239501953 nb_pixel_total : 197057 time to create 1 rle with new method : 0.0064318180084228516 length of segment : 663 time for calcul the mask position with numpy : 0.015592336654663086 nb_pixel_total : 880146 time to create 1 rle with new method : 0.03970170021057129 length of segment : 1928 time for calcul the mask position with numpy : 0.0011489391326904297 nb_pixel_total : 84951 time to create 1 rle with old method : 0.10585570335388184 length of segment : 268 time for calcul the mask position with numpy : 0.0003604888916015625 nb_pixel_total : 10871 time to create 1 rle with old method : 0.013118743896484375 length of segment : 99 time for calcul the mask position with numpy : 0.005838632583618164 nb_pixel_total : 234850 time to create 1 rle with new method : 0.011063098907470703 length of segment : 995 time for calcul the mask position with numpy : 0.007519721984863281 nb_pixel_total : 132043 time to create 1 rle with old method : 0.15109920501708984 length of segment : 439 time for calcul the mask position with numpy : 0.0008819103240966797 nb_pixel_total : 54195 time to create 1 rle with old method : 0.06438589096069336 length of segment : 294 time for calcul the mask position with numpy : 0.0015978813171386719 nb_pixel_total : 107327 time to create 1 rle with old method : 0.12179064750671387 length of segment : 430 time for calcul the mask position with numpy : 0.0018765926361083984 nb_pixel_total : 128457 time to create 1 rle with old method : 0.14965558052062988 length of segment : 320 time for calcul the mask position with numpy : 0.0036995410919189453 nb_pixel_total : 288857 time to create 1 rle with new method : 0.009226799011230469 length of segment : 1418 time for calcul the mask position with numpy : 0.0054624080657958984 nb_pixel_total : 379321 time to create 1 rle with new method : 0.010871410369873047 length of segment : 1115 time for calcul the mask position with numpy : 0.0013246536254882812 nb_pixel_total : 83067 time to create 1 rle with old method : 0.09462404251098633 length of segment : 318 time for calcul the mask position with numpy : 0.017738819122314453 nb_pixel_total : 1229185 time to create 1 rle with new method : 0.037529706954956055 length of segment : 1818 time for calcul the mask position with numpy : 0.00045108795166015625 nb_pixel_total : 15106 time to create 1 rle with old method : 0.0175015926361084 length of segment : 172 time for calcul the mask position with numpy : 0.0010175704956054688 nb_pixel_total : 88846 time to create 1 rle with old method : 0.10630059242248535 length of segment : 298 time for calcul the mask position with numpy : 0.0011811256408691406 nb_pixel_total : 24640 time to create 1 rle with old method : 0.029036998748779297 length of segment : 610 time for calcul the mask position with numpy : 0.0005319118499755859 nb_pixel_total : 32725 time to create 1 rle with old method : 0.03852534294128418 length of segment : 177 time for calcul the mask position with numpy : 0.000982522964477539 nb_pixel_total : 41933 time to create 1 rle with old method : 0.049788713455200195 length of segment : 276 time for calcul the mask position with numpy : 0.01513528823852539 nb_pixel_total : 893708 time to create 1 rle with new method : 0.031017303466796875 length of segment : 1262 time for calcul the mask position with numpy : 0.000949859619140625 nb_pixel_total : 61657 time to create 1 rle with old method : 0.08960962295532227 length of segment : 307 time for calcul the mask position with numpy : 0.0017879009246826172 nb_pixel_total : 116002 time to create 1 rle with old method : 0.16672372817993164 length of segment : 538 time for calcul the mask position with numpy : 0.0009539127349853516 nb_pixel_total : 57868 time to create 1 rle with old method : 0.0687246322631836 length of segment : 258 time for calcul the mask position with numpy : 0.0034780502319335938 nb_pixel_total : 298421 time to create 1 rle with new method : 0.008529901504516602 length of segment : 1210 time for calcul the mask position with numpy : 0.0002696514129638672 nb_pixel_total : 12785 time to create 1 rle with old method : 0.015254974365234375 length of segment : 397 time for calcul the mask position with numpy : 0.002258777618408203 nb_pixel_total : 130469 time to create 1 rle with old method : 0.14901041984558105 length of segment : 575 time for calcul the mask position with numpy : 0.024198293685913086 nb_pixel_total : 1623749 time to create 1 rle with new method : 0.03894400596618652 length of segment : 1320 time for calcul the mask position with numpy : 0.004086017608642578 nb_pixel_total : 323695 time to create 1 rle with new method : 0.009690284729003906 length of segment : 1043 time for calcul the mask position with numpy : 0.0015611648559570312 nb_pixel_total : 85690 time to create 1 rle with old method : 0.11610937118530273 length of segment : 297 time for calcul the mask position with numpy : 0.004228830337524414 nb_pixel_total : 187114 time to create 1 rle with new method : 0.010161161422729492 length of segment : 709 time for calcul the mask position with numpy : 0.013477802276611328 nb_pixel_total : 890000 time to create 1 rle with new method : 0.0367436408996582 length of segment : 1845 time for calcul the mask position with numpy : 0.0035583972930908203 nb_pixel_total : 294972 time to create 1 rle with new method : 0.008669614791870117 length of segment : 1086 time for calcul the mask position with numpy : 0.014804601669311523 nb_pixel_total : 833355 time to create 1 rle with new method : 0.03140616416931152 length of segment : 1288 time for calcul the mask position with numpy : 0.0001933574676513672 nb_pixel_total : 5111 time to create 1 rle with old method : 0.008785486221313477 length of segment : 67 time for calcul the mask position with numpy : 0.0007221698760986328 nb_pixel_total : 16964 time to create 1 rle with old method : 0.029239177703857422 length of segment : 181 time for calcul the mask position with numpy : 0.015778064727783203 nb_pixel_total : 790876 time to create 1 rle with new method : 0.03923916816711426 length of segment : 2104 time for calcul the mask position with numpy : 0.00019216537475585938 nb_pixel_total : 7086 time to create 1 rle with old method : 0.009382247924804688 length of segment : 110 time for calcul the mask position with numpy : 0.0037152767181396484 nb_pixel_total : 235377 time to create 1 rle with new method : 0.01072239875793457 length of segment : 1661 time for calcul the mask position with numpy : 0.0001964569091796875 nb_pixel_total : 7212 time to create 1 rle with old method : 0.008697032928466797 length of segment : 100 time for calcul the mask position with numpy : 0.006129741668701172 nb_pixel_total : 332773 time to create 1 rle with new method : 0.010438919067382812 length of segment : 1279 time for calcul the mask position with numpy : 0.019402265548706055 nb_pixel_total : 1401823 time to create 1 rle with new method : 0.04201173782348633 length of segment : 2045 time for calcul the mask position with numpy : 0.0015263557434082031 nb_pixel_total : 106965 time to create 1 rle with old method : 0.12134575843811035 length of segment : 516 time for calcul the mask position with numpy : 0.005187034606933594 nb_pixel_total : 327103 time to create 1 rle with new method : 0.01278829574584961 length of segment : 1565 time for calcul the mask position with numpy : 0.00036334991455078125 nb_pixel_total : 6451 time to create 1 rle with old method : 0.007933616638183594 length of segment : 108 time for calcul the mask position with numpy : 0.0015325546264648438 nb_pixel_total : 60062 time to create 1 rle with old method : 0.0695657730102539 length of segment : 536 time for calcul the mask position with numpy : 0.01533055305480957 nb_pixel_total : 855177 time to create 1 rle with new method : 0.036180734634399414 length of segment : 1452 time for calcul the mask position with numpy : 0.0019583702087402344 nb_pixel_total : 66398 time to create 1 rle with old method : 0.09377312660217285 length of segment : 478 time for calcul the mask position with numpy : 0.0013453960418701172 nb_pixel_total : 50881 time to create 1 rle with old method : 0.05956125259399414 length of segment : 446 time for calcul the mask position with numpy : 0.007291316986083984 nb_pixel_total : 319300 time to create 1 rle with new method : 0.01819610595703125 length of segment : 1525 time for calcul the mask position with numpy : 0.00191497802734375 nb_pixel_total : 124498 time to create 1 rle with old method : 0.15329790115356445 length of segment : 292 time for calcul the mask position with numpy : 0.00014281272888183594 nb_pixel_total : 2857 time to create 1 rle with old method : 0.003665924072265625 length of segment : 66 time for calcul the mask position with numpy : 0.01102900505065918 nb_pixel_total : 416666 time to create 1 rle with new method : 0.023713350296020508 length of segment : 1243 time for calcul the mask position with numpy : 0.0008985996246337891 nb_pixel_total : 41897 time to create 1 rle with old method : 0.04786539077758789 length of segment : 352 time for calcul the mask position with numpy : 0.005459308624267578 nb_pixel_total : 405003 time to create 1 rle with new method : 0.010703325271606445 length of segment : 1325 time spent for convertir_results : 14.035248756408691 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 85 chid ids of type : 3663 Number RLEs to save : 64744 save missing photos in datou_result : time spend for datou_step_exec : 34.70655870437622 time spend to save output : 3.5764644145965576 total time spend for step 1 : 38.28302311897278 step2:crop_condition Tue Feb 4 11:34:46 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3663 Loading chi in step crop for list_pids : 19 ! batch 1 Loaded 85 chid ids of type : 3663 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : teint_dans_la_masse param for this class : {'min_score': 0.7} filtre for class : teint_dans_la_masse hashtag_id of this class : 2107752385 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 8 About to insert : list_path_to_insert length 8 new photo from crops ! we have finished the crop for the class : teint_dans_la_masse begin to crop the class : autre_refus param for this class : {'min_score': 0.5} filtre for class : autre_refus hashtag_id of this class : 2107752406 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 4 About to insert : list_path_to_insert length 4 new photo from crops ! we have finished the crop for the class : autre_refus begin to crop the class : carton_gris param for this class : {'min_score': 0.5} filtre for class : carton_gris hashtag_id of this class : 2107753020 begin to crop the class : cartonnette param for this class : {'min_score': 0.5} filtre for class : cartonnette hashtag_id of this class : 702398920 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 8 About to insert : list_path_to_insert length 8 new photo from crops ! we have finished the crop for the class : cartonnette begin to crop the class : carton_brun param for this class : {'min_score': 0.7} filtre for class : carton_brun hashtag_id of this class : 2107753024 begin to crop the class : plastique param for this class : {'min_score': 0.5} filtre for class : plastique hashtag_id of this class : 492725882 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 5 About to insert : list_path_to_insert length 5 new photo from crops ! we have finished the crop for the class : plastique begin to crop the class : kraft param for this class : {'min_score': 0.5} filtre for class : kraft hashtag_id of this class : 493202403 begin to crop the class : metal param for this class : {'min_score': 0.5} filtre for class : metal hashtag_id of this class : 492628673 delete rles for these photos Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1330534413, 1330534408, 1330534401, 1330534397, 1330534392, 1330534252, 1330534000, 1330533997, 1330533904, 1330533901, 1330533896, 1330533856, 1330533813, 1330533811, 1330533808, 1330533804, 1330532846, 1330532841, 1330532793] Looping around the photos to save general results len do output : 25 /-3653389954Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389960Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389976Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389992Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390008Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390014Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390016Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390033Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390002Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390005Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390004Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390021Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389970Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389983Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389984Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389995Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390000Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389998Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390013Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390032Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389961Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389979Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389988Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653389987Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /-3653390018Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534413', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534408', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534401', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534397', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534392', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534252', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534000', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533997', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533904', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533901', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533896', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533856', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533813', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533811', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533808', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533804', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532846', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532841', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532793', None, None, None, None, None, '2502430') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 94 time used for this insertion : 0.17087173461914062 save_final save missing photos in datou_result : time spend for datou_step_exec : 9.768730163574219 time spend to save output : 0.17209243774414062 total time spend for step 2 : 9.94082260131836 step3:thcl Tue Feb 4 11:34:56 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step Thcl ! nombre de thcls : 2 we are using the classfication for multi_thcl [2456, 2868] time to import caffe and check if the image exist : 0.004983425140380859 time to convert the images to numpy array : 2.384185791015625e-06 time to import caffe and check if the image exist : 0.0066204071044921875 time to convert the images to numpy array : 0.0034723281860351562 time to import caffe and check if the image exist : 0.004660606384277344 time to convert the images to numpy array : 0.020705461502075195 time to import caffe and check if the image exist : 0.005213499069213867 time to convert the images to numpy array : 0.026190996170043945 time to import caffe and check if the image exist : 0.009851455688476562 time to convert the images to numpy array : 0.02081608772277832 time to import caffe and check if the image exist : 0.005065202713012695 time to convert the images to numpy array : 0.030487537384033203 time to import caffe and check if the image exist : 0.008869647979736328 time to convert the images to numpy array : 0.02731633186340332 time to import caffe and check if the image exist : 0.005688905715942383 time to convert the images to numpy array : 0.03211045265197754 time to import caffe and check if the image exist : 0.0072863101959228516 time to convert the images to numpy array : 0.03702855110168457 time to import caffe and check if the image exist : 0.006300687789916992 time to convert the images to numpy array : 0.0459139347076416 total time to convert the images to numpy array : 0.35007786750793457 list photo_ids error: [] list photo_ids correct : [-3653390018, -3653389954, -3653389960, -3653389976, -3653389992, -3653390008, -3653390014, -3653390013, -3653390032, -3653389961, -3653389970, -3653389983, -3653389984, -3653389995, -3653390000, -3653389998, -3653389979, -3653389988, -3653389987, -3653390005, -3653390004, -3653390021, -3653390016, -3653390033, -3653390002] number of photos to traite : 25 try to delete the photos incorrect in DB tagging for thcl : 2456 To do loadFromThcl(), then load ParamDescType : thcl2456 thcls : [{'id': 2456, 'mtr_user_id': 31, 'name': 'learn_qualipapia_papier_refus_from_vlg_data_aug', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'papier,refus', 'svm_portfolios_learning': '3028087,3028251', 'photo_hashtag_type': 3049, 'photo_desc_type': 4999, 'type_classification': 'caffe', 'hashtag_id_list': '492668766,538914404'}] thcl {'id': 2456, 'mtr_user_id': 31, 'name': 'learn_qualipapia_papier_refus_from_vlg_data_aug', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'papier,refus', 'svm_portfolios_learning': '3028087,3028251', 'photo_hashtag_type': 3049, 'photo_desc_type': 4999, 'type_classification': 'caffe', 'hashtag_id_list': '492668766,538914404'} Update svm_hashtag_type_desc : 4999 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (4999, 'learn_qualipapia_papier_refus_from_vlg_data_aug', 16384, 25088, 'learn_qualipapia_papier_refus_from_vlg_data_aug', 'res5b', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2020, 10, 23, 14, 27, 22), datetime.datetime(2020, 10, 23, 14, 27, 22)) To loadFromThcl() : net_4999 begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 6997 max_wait_temp : 1 max_wait : 0 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (4999, 'learn_qualipapia_papier_refus_from_vlg_data_aug', 16384, 25088, 'learn_qualipapia_papier_refus_from_vlg_data_aug', 'res5b', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2020, 10, 23, 14, 27, 22), datetime.datetime(2020, 10, 23, 14, 27, 22)) None mean_file_type : mean_file_path : prototxt_file_path : model : learn_qualipapia_papier_refus_from_vlg_data_aug Inside get_net Inside get_net before cache_data_model model_param file didn't exist Inside get_net before CDM.load_model_par_type model_name : learn_qualipapia_papier_refus_from_vlg_data_aug model_type : caffe list file need : ['caffemodel', 'deploy_conv_normal.prototxt', 'deploy_fc.prototxt', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file exist in s3 : ['caffemodel', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file manque in s3 : ['deploy_conv_normal.prototxt', 'deploy_fc.prototxt'] local folder : /data/models_weight/learn_qualipapia_papier_refus_from_vlg_data_aug /data/models_weight/learn_qualipapia_papier_refus_from_vlg_data_aug/caffemodel size_local : 44972172 size in s3 : 44972172 create time local : 2021-08-09 05:55:48 create time in s3 : 2021-08-06 19:28:49 caffemodel already exist and didn't need to update /data/models_weight/learn_qualipapia_papier_refus_from_vlg_data_aug/deploy.prototxt size_local : 17311 size in s3 : 17311 create time local : 2021-08-09 05:55:48 create time in s3 : 2021-08-06 19:28:49 deploy.prototxt already exist and didn't need to update /data/models_weight/learn_qualipapia_papier_refus_from_vlg_data_aug/mean.npy size_local : 1572992 size in s3 : 1572992 create time local : 2021-08-09 05:55:48 create time in s3 : 2021-08-06 19:28:51 mean.npy already exist and didn't need to update /data/models_weight/learn_qualipapia_papier_refus_from_vlg_data_aug/synset_words.txt size_local : 57 size in s3 : 57 create time local : 2021-08-09 05:55:48 create time in s3 : 2021-08-06 19:28:49 synset_words.txt already exist and didn't need to update Inside get_net after CDM.load_model_par_type After if not only_with_local_cache: /home/admin/workarea/install/darknet/:/home/admin/workarea/git/Velours/python:/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python:/home/admin/mtr/.credentials:/home/admin/workarea/install/caffe/python:/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools/:/home/admin/workarea/git/fotonowerpip/:/home/admin/workarea/install/segment-anything:/home/admin//workarea/git/pyfvs/:/home/admin/workarea/git/apy/ Here before set mode gpu Doing nothing but we could set mode gpu after set mode gpu prototxt_filename : /data/models_weight/learn_qualipapia_papier_refus_from_vlg_data_aug/deploy.prototxt caffemodel_filename : /data/models_weight/learn_qualipapia_papier_refus_from_vlg_data_aug/caffemodel now we set caffe to gpu mode before predict begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 6778 max_wait_temp : 1 max_wait : 0 dict_keys(['prob']) time used to do the prepocess of the images : 0.253645658493042 time used to do the prediction : 0.09723043441772461 we don't save the descriptors for this thcl 2456 tagging for thcl : 2868 To do loadFromThcl(), then load ParamDescType : thcl2868 thcls : [{'id': 2868, 'mtr_user_id': 31, 'name': 'learn_papier_nantes_300421', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Carton_brun,Carton_gris,Teint_Dans_La_Masse,autre_refus,cartonnette,environnement,kraft,metal,papier,plastique', 'svm_portfolios_learning': '3752117,3752118,3752123,3752106,3752116,3752124,3752119,3581575,3486029,3752122', 'photo_hashtag_type': 3632, 'photo_desc_type': 5288, 'type_classification': 'caffe', 'hashtag_id_list': '2107753024,2107753020,2107752385,2107752406,702398920,493012381,493202403,492628673,492668766,492725882'}] thcl {'id': 2868, 'mtr_user_id': 31, 'name': 'learn_papier_nantes_300421', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Carton_brun,Carton_gris,Teint_Dans_La_Masse,autre_refus,cartonnette,environnement,kraft,metal,papier,plastique', 'svm_portfolios_learning': '3752117,3752118,3752123,3752106,3752116,3752124,3752119,3581575,3486029,3752122', 'photo_hashtag_type': 3632, 'photo_desc_type': 5288, 'type_classification': 'caffe', 'hashtag_id_list': '2107753024,2107753020,2107752385,2107752406,702398920,493012381,493202403,492628673,492668766,492725882'} Update svm_hashtag_type_desc : 5288 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5288, 'learn_papier_nantes_300421', 512, 512, 'learn_papier_nantes_300421', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 30, 17, 9, 41), datetime.datetime(2021, 4, 30, 17, 9, 41)) To loadFromThcl() : net_5288 begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 6776 max_wait_temp : 1 max_wait : 0 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5288, 'learn_papier_nantes_300421', 512, 512, 'learn_papier_nantes_300421', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 30, 17, 9, 41), datetime.datetime(2021, 4, 30, 17, 9, 41)) None mean_file_type : mean_file_path : prototxt_file_path : model : learn_papier_nantes_300421 Inside get_net Inside get_net before cache_data_model model_param file didn't exist Inside get_net before CDM.load_model_par_type model_name : learn_papier_nantes_300421 model_type : caffe list file need : ['caffemodel', 'deploy_conv_normal.prototxt', 'deploy_fc.prototxt', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file exist in s3 : ['caffemodel', 'deploy.prototxt', 'mean.npy', 'synset_words.txt'] file manque in s3 : ['deploy_conv_normal.prototxt', 'deploy_fc.prototxt'] local folder : /data/models_weight/learn_papier_nantes_300421 /data/models_weight/learn_papier_nantes_300421/caffemodel size_local : 44791983 size in s3 : 44791983 create time local : 2021-08-09 05:55:59 create time in s3 : 2021-08-06 19:22:13 caffemodel already exist and didn't need to update /data/models_weight/learn_papier_nantes_300421/deploy.prototxt size_local : 17255 size in s3 : 17255 create time local : 2021-08-09 05:55:59 create time in s3 : 2021-08-06 19:22:12 deploy.prototxt already exist and didn't need to update /data/models_weight/learn_papier_nantes_300421/mean.npy size_local : 1572992 size in s3 : 1572992 create time local : 2021-08-09 05:55:59 create time in s3 : 2021-08-06 19:22:14 mean.npy already exist and didn't need to update /data/models_weight/learn_papier_nantes_300421/synset_words.txt size_local : 331 size in s3 : 331 create time local : 2021-08-09 05:56:00 create time in s3 : 2021-08-06 19:22:12 synset_words.txt already exist and didn't need to update Inside get_net after CDM.load_model_par_type After if not only_with_local_cache: /home/admin/workarea/install/darknet/:/home/admin/workarea/git/Velours/python:/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python:/home/admin/mtr/.credentials:/home/admin/workarea/install/caffe/python:/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools/:/home/admin/workarea/git/fotonowerpip/:/home/admin/workarea/install/segment-anything:/home/admin//workarea/git/pyfvs/:/home/admin/workarea/git/apy/ Here before set mode gpu Doing nothing but we could set mode gpu after set mode gpu prototxt_filename : /data/models_weight/learn_papier_nantes_300421/deploy.prototxt caffemodel_filename : /data/models_weight/learn_papier_nantes_300421/caffemodel now we set caffe to gpu mode before predict begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 6778 max_wait_temp : 1 max_wait : 0 dict_keys(['prob']) time used to do the prepocess of the images : 0.2880897521972656 time used to do the prediction : 0.07587122917175293 we don't save the descriptors for this thcl 2868 Inside saveOutput : final : False verbose : 0 time used to find the portfolios of the photos test new format of the output of the step_thcl begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 0 time used for this insertion : 9.059906005859375e-06 save missing photos in datou_result : time spend for datou_step_exec : 7.651365280151367 time spend to save output : 0.0002560615539550781 total time spend for step 3 : 7.651621341705322 step4:merge_mask_thcl_custom Tue Feb 4 11:35:04 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Begin step merge_mask_thcl_custom batch 1 Loaded 85 chid ids of type : 3663 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++As expected we have just one thcl present End of step merge_mask_thcl_custom Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : merge_mask_thcl_custom we use saveGeneral [1330534413, 1330534408, 1330534401, 1330534397, 1330534392, 1330534252, 1330534000, 1330533997, 1330533904, 1330533901, 1330533896, 1330533856, 1330533813, 1330533811, 1330533808, 1330533804, 1330532846, 1330532841, 1330532793] Looping around the photos to save general results len do output : 19 /1330534413Didn't retrieve data .Didn't retrieve data . /1330534408Didn't retrieve data .Didn't retrieve data . /1330534401Didn't retrieve data .Didn't retrieve data . /1330534397Didn't retrieve data .Didn't retrieve data . /1330534392Didn't retrieve data .Didn't retrieve data . /1330534252Didn't retrieve data .Didn't retrieve data . /1330534000Didn't retrieve data .Didn't retrieve data . /1330533997Didn't retrieve data .Didn't retrieve data . /1330533904Didn't retrieve data .Didn't retrieve data . /1330533901Didn't retrieve data .Didn't retrieve data . /1330533896Didn't retrieve data .Didn't retrieve data . /1330533856Didn't retrieve data .Didn't retrieve data . /1330533813Didn't retrieve data .Didn't retrieve data . /1330533811Didn't retrieve data .Didn't retrieve data . /1330533808Didn't retrieve data .Didn't retrieve data . /1330533804Didn't retrieve data .Didn't retrieve data . /1330532846Didn't retrieve data .Didn't retrieve data . /1330532841Didn't retrieve data .Didn't retrieve data . /1330532793Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534413', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534408', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534401', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534397', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534392', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534252', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534000', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533997', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533904', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533901', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533896', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533856', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533813', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533811', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533808', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533804', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532846', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532841', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532793', None, None, None, None, None, '2502430') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 57 time used for this insertion : 0.020578861236572266 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.06010603904724121 time spend to save output : 0.021374225616455078 total time spend for step 4 : 0.08148026466369629 step5:rle_unique_nms_with_priority Tue Feb 4 11:35:04 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms nb_obj : 4 nb_hashtags : 1 time to prepare the origin masks : 0.505185604095459 time for calcul the mask position with numpy : 0.01794600486755371 nb_pixel_total : 1243150 time to create 1 rle with new method : 0.04425525665283203 time for calcul the mask position with numpy : 0.007306814193725586 nb_pixel_total : 89731 time to create 1 rle with old method : 0.09940934181213379 time for calcul the mask position with numpy : 0.010900020599365234 nb_pixel_total : 663091 time to create 1 rle with new method : 0.03980112075805664 time for calcul the mask position with numpy : 0.0065038204193115234 nb_pixel_total : 49244 time to create 1 rle with old method : 0.05300712585449219 time for calcul the mask position with numpy : 0.006090402603149414 nb_pixel_total : 28384 time to create 1 rle with old method : 0.032819509506225586 create new chi : 0.3274703025817871 time to delete rle : 0.017885684967041016 batch 1 Loaded 5 chid ids of type : 3726 Number RLEs to save : 5393 TO DO : save crop sub photo not yet done ! save time : 0.3299834728240967 nb_obj : 5 nb_hashtags : 2 time to prepare the origin masks : 0.43229007720947266 time for calcul the mask position with numpy : 0.020787715911865234 nb_pixel_total : 1361248 time to create 1 rle with new method : 0.047112464904785156 time for calcul the mask position with numpy : 0.007161855697631836 nb_pixel_total : 52239 time to create 1 rle with old method : 0.05847787857055664 time for calcul the mask position with numpy : 0.007787227630615234 nb_pixel_total : 79207 time to create 1 rle with old method : 0.08785271644592285 time for calcul the mask position with numpy : 0.010260581970214844 nb_pixel_total : 565000 time to create 1 rle with new method : 0.037872314453125 time for calcul the mask position with numpy : 0.006379604339599609 nb_pixel_total : 12279 time to create 1 rle with old method : 0.01428675651550293 time for calcul the mask position with numpy : 0.006427288055419922 nb_pixel_total : 3627 time to create 1 rle with old method : 0.0043179988861083984 create new chi : 0.31352806091308594 time to delete rle : 0.0004951953887939453 batch 1 Loaded 6 chid ids of type : 3726 Number RLEs to save : 5299 TO DO : save crop sub photo not yet done ! save time : 0.39504146575927734 nb_obj : 5 nb_hashtags : 3 time to prepare the origin masks : 0.4710845947265625 time for calcul the mask position with numpy : 0.018658161163330078 nb_pixel_total : 1297079 time to create 1 rle with new method : 0.050867319107055664 time for calcul the mask position with numpy : 0.007328987121582031 nb_pixel_total : 44531 time to create 1 rle with old method : 0.05006003379821777 time for calcul the mask position with numpy : 0.006600856781005859 nb_pixel_total : 46800 time to create 1 rle with old method : 0.05461478233337402 time for calcul the mask position with numpy : 0.0063686370849609375 nb_pixel_total : 12246 time to create 1 rle with old method : 0.013939619064331055 time for calcul the mask position with numpy : 0.011498212814331055 nb_pixel_total : 642127 time to create 1 rle with new method : 0.03650331497192383 time for calcul the mask position with numpy : 0.006700038909912109 nb_pixel_total : 30817 time to create 1 rle with old method : 0.038364410400390625 create new chi : 0.3059251308441162 time to delete rle : 0.0005488395690917969 batch 1 Loaded 6 chid ids of type : 3726 Number RLEs to save : 5555 TO DO : save crop sub photo not yet done ! save time : 0.37464261054992676 nb_obj : 4 nb_hashtags : 1 time to prepare the origin masks : 0.26952624320983887 time for calcul the mask position with numpy : 0.01804351806640625 nb_pixel_total : 1297795 time to create 1 rle with new method : 0.03589296340942383 time for calcul the mask position with numpy : 0.009483575820922852 nb_pixel_total : 303773 time to create 1 rle with new method : 0.031949520111083984 time for calcul the mask position with numpy : 0.006784915924072266 nb_pixel_total : 30435 time to create 1 rle with old method : 0.03475475311279297 time for calcul the mask position with numpy : 0.00753331184387207 nb_pixel_total : 157196 time to create 1 rle with new method : 0.03038811683654785 time for calcul the mask position with numpy : 0.008645772933959961 nb_pixel_total : 284401 time to create 1 rle with new method : 0.02937483787536621 create new chi : 0.2177586555480957 time to delete rle : 0.0005893707275390625 batch 1 Loaded 5 chid ids of type : 3726 Number RLEs to save : 6560 TO DO : save crop sub photo not yet done ! save time : 0.39573168754577637 nb_obj : 6 nb_hashtags : 2 time to prepare the origin masks : 0.553717851638794 time for calcul the mask position with numpy : 0.021986722946166992 nb_pixel_total : 1372714 time to create 1 rle with new method : 0.0533909797668457 time for calcul the mask position with numpy : 0.0072400569915771484 nb_pixel_total : 83303 time to create 1 rle with old method : 0.09800863265991211 time for calcul the mask position with numpy : 0.006763458251953125 nb_pixel_total : 214 time to create 1 rle with old method : 0.0003299713134765625 time for calcul the mask position with numpy : 0.007033586502075195 nb_pixel_total : 126379 time to create 1 rle with old method : 0.14459967613220215 time for calcul the mask position with numpy : 0.006666660308837891 nb_pixel_total : 26116 time to create 1 rle with old method : 0.03064417839050293 time for calcul the mask position with numpy : 0.009056568145751953 nb_pixel_total : 340956 time to create 1 rle with new method : 0.04213523864746094 time for calcul the mask position with numpy : 0.007003307342529297 nb_pixel_total : 123918 time to create 1 rle with old method : 0.14030218124389648 create new chi : 0.5813136100769043 time to delete rle : 0.0008056163787841797 batch 1 Loaded 7 chid ids of type : 3726 Number RLEs to save : 9062 TO DO : save crop sub photo not yet done ! save time : 0.6754550933837891 nb_obj : 3 nb_hashtags : 1 time to prepare the origin masks : 0.29410338401794434 time for calcul the mask position with numpy : 0.012256860733032227 nb_pixel_total : 836590 time to create 1 rle with new method : 0.0315861701965332 time for calcul the mask position with numpy : 0.007287025451660156 nb_pixel_total : 143095 time to create 1 rle with old method : 0.1613624095916748 time for calcul the mask position with numpy : 0.012052059173583984 nb_pixel_total : 851806 time to create 1 rle with new method : 0.029015541076660156 time for calcul the mask position with numpy : 0.00831294059753418 nb_pixel_total : 242109 time to create 1 rle with new method : 0.029441356658935547 create new chi : 0.29481077194213867 time to delete rle : 0.0006339550018310547 batch 1 Loaded 4 chid ids of type : 3726 Number RLEs to save : 6301 TO DO : save crop sub photo not yet done ! save time : 0.43784356117248535 nb_obj : 5 nb_hashtags : 3 time to prepare the origin masks : 0.5546977519989014 time for calcul the mask position with numpy : 0.017156362533569336 nb_pixel_total : 848761 time to create 1 rle with new method : 0.046645164489746094 time for calcul the mask position with numpy : 0.006308317184448242 nb_pixel_total : 3460 time to create 1 rle with old method : 0.00423741340637207 time for calcul the mask position with numpy : 0.006213188171386719 nb_pixel_total : 20400 time to create 1 rle with old method : 0.022650718688964844 time for calcul the mask position with numpy : 0.00738525390625 nb_pixel_total : 272703 time to create 1 rle with new method : 0.03515172004699707 time for calcul the mask position with numpy : 0.011006832122802734 nb_pixel_total : 754434 time to create 1 rle with new method : 0.030219316482543945 time for calcul the mask position with numpy : 0.008085250854492188 nb_pixel_total : 173842 time to create 1 rle with new method : 0.035770416259765625 create new chi : 0.23866486549377441 time to delete rle : 0.0006632804870605469 batch 1 Loaded 6 chid ids of type : 3726 Number RLEs to save : 7020 TO DO : save crop sub photo not yet done ! save time : 0.4265148639678955 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.2667686939239502 time for calcul the mask position with numpy : 0.016328096389770508 nb_pixel_total : 1094960 time to create 1 rle with new method : 0.032498836517333984 time for calcul the mask position with numpy : 0.007313251495361328 nb_pixel_total : 10818 time to create 1 rle with old method : 0.012197732925415039 time for calcul the mask position with numpy : 0.007214784622192383 nb_pixel_total : 39775 time to create 1 rle with old method : 0.044588327407836914 time for calcul the mask position with numpy : 0.011525154113769531 nb_pixel_total : 731304 time to create 1 rle with new method : 0.03241777420043945 time for calcul the mask position with numpy : 0.007210731506347656 nb_pixel_total : 196743 time to create 1 rle with new method : 0.027346134185791016 create new chi : 0.20268583297729492 time to delete rle : 0.0006206035614013672 batch 1 Loaded 5 chid ids of type : 3726 Number RLEs to save : 6044 TO DO : save crop sub photo not yet done ! save time : 0.45789051055908203 nb_obj : 6 nb_hashtags : 3 time to prepare the origin masks : 0.7319626808166504 time for calcul the mask position with numpy : 0.01816248893737793 nb_pixel_total : 1246076 time to create 1 rle with new method : 0.04288887977600098 time for calcul the mask position with numpy : 0.0068285465240478516 nb_pixel_total : 106753 time to create 1 rle with old method : 0.12141752243041992 time for calcul the mask position with numpy : 0.008136272430419922 nb_pixel_total : 288006 time to create 1 rle with new method : 0.03251004219055176 time for calcul the mask position with numpy : 0.006814479827880859 nb_pixel_total : 12081 time to create 1 rle with old method : 0.014469146728515625 time for calcul the mask position with numpy : 0.0063855648040771484 nb_pixel_total : 54070 time to create 1 rle with old method : 0.06155252456665039 time for calcul the mask position with numpy : 0.0072629451751708984 nb_pixel_total : 131589 time to create 1 rle with old method : 0.14846515655517578 time for calcul the mask position with numpy : 0.008550405502319336 nb_pixel_total : 235025 time to create 1 rle with new method : 0.03799128532409668 create new chi : 0.5252645015716553 time to delete rle : 0.0010175704956054688 batch 1 Loaded 7 chid ids of type : 3726 Number RLEs to save : 8513 TO DO : save crop sub photo not yet done ! save time : 0.6098048686981201 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.34717535972595215 time for calcul the mask position with numpy : 0.016039133071899414 nb_pixel_total : 753189 time to create 1 rle with new method : 0.04124569892883301 time for calcul the mask position with numpy : 0.006982088088989258 nb_pixel_total : 1844 time to create 1 rle with old method : 0.0025048255920410156 time for calcul the mask position with numpy : 0.006954669952392578 nb_pixel_total : 15071 time to create 1 rle with old method : 0.016953468322753906 time for calcul the mask position with numpy : 0.007951736450195312 nb_pixel_total : 72403 time to create 1 rle with old method : 0.08678054809570312 time for calcul the mask position with numpy : 0.016998767852783203 nb_pixel_total : 1231093 time to create 1 rle with new method : 0.029002904891967773 create new chi : 0.2323436737060547 time to delete rle : 0.0004892349243164062 batch 1 Loaded 5 chid ids of type : 3726 Number RLEs to save : 5300 TO DO : save crop sub photo not yet done ! save time : 0.33850765228271484 nb_obj : 7 nb_hashtags : 4 time to prepare the origin masks : 0.6642992496490479 time for calcul the mask position with numpy : 0.013804912567138672 nb_pixel_total : 890695 time to create 1 rle with new method : 0.04639244079589844 time for calcul the mask position with numpy : 0.007587432861328125 nb_pixel_total : 130071 time to create 1 rle with old method : 0.14092659950256348 time for calcul the mask position with numpy : 0.0065839290618896484 nb_pixel_total : 41125 time to create 1 rle with old method : 0.04492592811584473 time for calcul the mask position with numpy : 0.006082296371459961 nb_pixel_total : 1733 time to create 1 rle with old method : 0.0022721290588378906 time for calcul the mask position with numpy : 0.0063018798828125 nb_pixel_total : 27177 time to create 1 rle with old method : 0.03094029426574707 time for calcul the mask position with numpy : 0.00624394416809082 nb_pixel_total : 12761 time to create 1 rle with old method : 0.013867855072021484 time for calcul the mask position with numpy : 0.006386756896972656 nb_pixel_total : 61185 time to create 1 rle with old method : 0.06840276718139648 time for calcul the mask position with numpy : 0.017924070358276367 nb_pixel_total : 908853 time to create 1 rle with new method : 0.039624691009521484 create new chi : 0.46307969093322754 time to delete rle : 0.000659942626953125 batch 1 Loaded 8 chid ids of type : 3726 Number RLEs to save : 7107 TO DO : save crop sub photo not yet done ! save time : 0.4153158664703369 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.25060606002807617 time for calcul the mask position with numpy : 0.015263795852661133 nb_pixel_total : 326385 time to create 1 rle with new method : 0.029130935668945312 time for calcul the mask position with numpy : 0.006586790084838867 nb_pixel_total : 120585 time to create 1 rle with old method : 0.13028240203857422 time for calcul the mask position with numpy : 0.017344236373901367 nb_pixel_total : 1626630 time to create 1 rle with new method : 0.028050661087036133 create new chi : 0.23041105270385742 time to delete rle : 0.0004215240478515625 batch 1 Loaded 3 chid ids of type : 3726 Number RLEs to save : 4641 TO DO : save crop sub photo not yet done ! save time : 0.2890050411224365 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.4355049133300781 time for calcul the mask position with numpy : 0.01663970947265625 nb_pixel_total : 1013663 time to create 1 rle with new method : 0.04673933982849121 time for calcul the mask position with numpy : 0.008480072021484375 nb_pixel_total : 17131 time to create 1 rle with old method : 0.020406007766723633 time for calcul the mask position with numpy : 0.007376432418823242 nb_pixel_total : 85010 time to create 1 rle with old method : 0.09429192543029785 time for calcul the mask position with numpy : 0.013645410537719727 nb_pixel_total : 771619 time to create 1 rle with new method : 0.04334139823913574 time for calcul the mask position with numpy : 0.007750749588012695 nb_pixel_total : 186177 time to create 1 rle with new method : 0.03528881072998047 create new chi : 0.30578017234802246 time to delete rle : 0.000614166259765625 batch 1 Loaded 5 chid ids of type : 3726 Number RLEs to save : 6109 TO DO : save crop sub photo not yet done ! save time : 0.46839380264282227 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.22532033920288086 time for calcul the mask position with numpy : 0.015899181365966797 nb_pixel_total : 1239543 time to create 1 rle with new method : 0.03673696517944336 time for calcul the mask position with numpy : 0.00975489616394043 nb_pixel_total : 3617 time to create 1 rle with old method : 0.005568504333496094 time for calcul the mask position with numpy : 0.01385354995727539 nb_pixel_total : 830440 time to create 1 rle with new method : 0.03544449806213379 create new chi : 0.12129497528076172 time to delete rle : 0.0003459453582763672 batch 1 Loaded 3 chid ids of type : 3726 Number RLEs to save : 3681 TO DO : save crop sub photo not yet done ! save time : 0.2563199996948242 nb_obj : 5 nb_hashtags : 2 time to prepare the origin masks : 0.4565544128417969 time for calcul the mask position with numpy : 0.01686382293701172 nb_pixel_total : 1166835 time to create 1 rle with new method : 0.047255754470825195 time for calcul the mask position with numpy : 0.0071773529052734375 nb_pixel_total : 96867 time to create 1 rle with old method : 0.11423516273498535 time for calcul the mask position with numpy : 0.006222248077392578 nb_pixel_total : 1561 time to create 1 rle with old method : 0.0018856525421142578 time for calcul the mask position with numpy : 0.01163029670715332 nb_pixel_total : 784328 time to create 1 rle with new method : 0.04353642463684082 time for calcul the mask position with numpy : 0.0064089298248291016 nb_pixel_total : 16949 time to create 1 rle with old method : 0.0186767578125 time for calcul the mask position with numpy : 0.006032228469848633 nb_pixel_total : 7060 time to create 1 rle with old method : 0.00814962387084961 create new chi : 0.29750800132751465 time to delete rle : 0.0009119510650634766 batch 1 Loaded 6 chid ids of type : 3726 Number RLEs to save : 8109 TO DO : save crop sub photo not yet done ! save time : 0.4981727600097656 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.8333714008331299 time for calcul the mask position with numpy : 0.010320425033569336 nb_pixel_total : 376267 time to create 1 rle with new method : 0.03782343864440918 time for calcul the mask position with numpy : 0.006704568862915039 nb_pixel_total : 1217 time to create 1 rle with old method : 0.001379251480102539 time for calcul the mask position with numpy : 0.006512880325317383 nb_pixel_total : 49873 time to create 1 rle with old method : 0.0567011833190918 time for calcul the mask position with numpy : 0.018565654754638672 nb_pixel_total : 1313625 time to create 1 rle with new method : 0.03824734687805176 time for calcul the mask position with numpy : 0.010088682174682617 nb_pixel_total : 332618 time to create 1 rle with new method : 0.03580021858215332 create new chi : 0.22827935218811035 time to delete rle : 0.0005521774291992188 batch 1 Loaded 5 chid ids of type : 3726 Number RLEs to save : 6569 TO DO : save crop sub photo not yet done ! save time : 0.8937869071960449 nb_obj : 3 nb_hashtags : 1 time to prepare the origin masks : 0.3813159465789795 time for calcul the mask position with numpy : 0.018588542938232422 nb_pixel_total : 1142196 time to create 1 rle with new method : 0.04191136360168457 time for calcul the mask position with numpy : 0.012202262878417969 nb_pixel_total : 865175 time to create 1 rle with new method : 0.03659319877624512 time for calcul the mask position with numpy : 0.006123542785644531 nb_pixel_total : 59823 time to create 1 rle with old method : 0.06485962867736816 time for calcul the mask position with numpy : 0.006124258041381836 nb_pixel_total : 6406 time to create 1 rle with old method : 0.007307767868041992 create new chi : 0.19702601432800293 time to delete rle : 0.0005359649658203125 batch 1 Loaded 4 chid ids of type : 3726 Number RLEs to save : 4970 TO DO : save crop sub photo not yet done ! save time : 0.32472801208496094 nb_obj : 4 nb_hashtags : 1 time to prepare the origin masks : 0.47870612144470215 time for calcul the mask position with numpy : 0.025072813034057617 nb_pixel_total : 1634579 time to create 1 rle with new method : 0.04348015785217285 time for calcul the mask position with numpy : 0.006734371185302734 nb_pixel_total : 2844 time to create 1 rle with old method : 0.0032470226287841797 time for calcul the mask position with numpy : 0.008044004440307617 nb_pixel_total : 319173 time to create 1 rle with new method : 0.039673566818237305 time for calcul the mask position with numpy : 0.007227420806884766 nb_pixel_total : 50735 time to create 1 rle with old method : 0.07541823387145996 time for calcul the mask position with numpy : 0.00741124153137207 nb_pixel_total : 66269 time to create 1 rle with old method : 0.08377933502197266 create new chi : 0.3052206039428711 time to delete rle : 0.0007014274597167969 batch 1 Loaded 5 chid ids of type : 3726 Number RLEs to save : 5950 TO DO : save crop sub photo not yet done ! save time : 0.37221598625183105 nb_obj : 3 nb_hashtags : 3 time to prepare the origin masks : 0.548529863357544 time for calcul the mask position with numpy : 0.018274545669555664 nb_pixel_total : 1522299 time to create 1 rle with new method : 0.04033684730529785 time for calcul the mask position with numpy : 0.006913185119628906 nb_pixel_total : 86634 time to create 1 rle with old method : 0.10194993019104004 time for calcul the mask position with numpy : 0.006940603256225586 nb_pixel_total : 41853 time to create 1 rle with old method : 0.05623054504394531 time for calcul the mask position with numpy : 0.008780956268310547 nb_pixel_total : 422814 time to create 1 rle with new method : 0.03226327896118164 create new chi : 0.27406978607177734 time to delete rle : 0.0005433559417724609 batch 1 Loaded 4 chid ids of type : 3726 Number RLEs to save : 4937 TO DO : save crop sub photo not yet done ! save time : 0.7113351821899414 map_output_result : {1330534413: (0.0, 'Should be the crop_list due to order', 0.0), 1330534408: (0.0, 'Should be the crop_list due to order', 0.0), 1330534401: (0.0, 'Should be the crop_list due to order', 0.0), 1330534397: (0.0, 'Should be the crop_list due to order', 0.0), 1330534392: (0.0, 'Should be the crop_list due to order', 0.0), 1330534252: (0.0, 'Should be the crop_list due to order', 0.0), 1330534000: (0.0, 'Should be the crop_list due to order', 0.0), 1330533997: (0.0, 'Should be the crop_list due to order', 0.0), 1330533904: (0.0, 'Should be the crop_list due to order', 0.0), 1330533901: (0.0, 'Should be the crop_list due to order', 0.0), 1330533896: (0.0, 'Should be the crop_list due to order', 0.0), 1330533856: (0.0, 'Should be the crop_list due to order', 0.0), 1330533813: (0.0, 'Should be the crop_list due to order', 0.0), 1330533811: (0.0, 'Should be the crop_list due to order', 0.0), 1330533808: (0.0, 'Should be the crop_list due to order', 0.0), 1330533804: (0.0, 'Should be the crop_list due to order', 0.0), 1330532846: (0.0, 'Should be the crop_list due to order', 0.0), 1330532841: (0.0, 'Should be the crop_list due to order', 0.0), 1330532793: (0.0, 'Should be the crop_list due to order', 0.0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1330534413, 1330534408, 1330534401, 1330534397, 1330534392, 1330534252, 1330534000, 1330533997, 1330533904, 1330533901, 1330533896, 1330533856, 1330533813, 1330533811, 1330533808, 1330533804, 1330532846, 1330532841, 1330532793] Looping around the photos to save general results len do output : 19 /1330534413.Didn't retrieve data . /1330534408.Didn't retrieve data . /1330534401.Didn't retrieve data . /1330534397.Didn't retrieve data . /1330534392.Didn't retrieve data . /1330534252.Didn't retrieve data . /1330534000.Didn't retrieve data . /1330533997.Didn't retrieve data . /1330533904.Didn't retrieve data . /1330533901.Didn't retrieve data . /1330533896.Didn't retrieve data . /1330533856.Didn't retrieve data . /1330533813.Didn't retrieve data . /1330533811.Didn't retrieve data . /1330533808.Didn't retrieve data . /1330533804.Didn't retrieve data . /1330532846.Didn't retrieve data . /1330532841.Didn't retrieve data . /1330532793.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534413', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534408', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534401', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534397', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534392', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534252', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534000', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533997', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533904', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533901', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533896', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533856', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533813', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533811', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533808', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533804', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532846', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532841', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532793', None, None, None, None, None, '2502430') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 57 time used for this insertion : 0.013939857482910156 save_final save missing photos in datou_result : time spend for datou_step_exec : 24.214709281921387 time spend to save output : 0.014616966247558594 total time spend for step 5 : 24.229326248168945 step6:crop_condition Tue Feb 4 11:35:28 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure some photos are not treated, begin crop_condition Loading chi in step crop with photo_hashtag_type : 3726 Loading chi in step crop for list_pids : 19 ! batch 1 Loaded 99 chid ids of type : 3726 begin to crop the class : teint_dans_la_masse param for this class : {'min_score': 0.7} filtre for class : teint_dans_la_masse hashtag_id of this class : 2107752385 Next one ! Next one ! Next one ! Next one ! Next one ! Next one ! map_result returned by crop_photo_return_map_crop : length : 6 About to insert : list_path_to_insert length 6 new photo from crops ! About to upload 6 photos upload in portfolio : 4869462 init cache_photo without model_param we have 6 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1738665330_3672928 we have uploaded 6 photos in the portfolio 4869462 time of upload the photos Elapsed time : 1.4588806629180908 we have finished the crop for the class : teint_dans_la_masse begin to crop the class : autre_refus param for this class : {'min_score': 0.5} filtre for class : autre_refus hashtag_id of this class : 2107752406 Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 4869462 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1738665332_3672928 we have uploaded 1 photos in the portfolio 4869462 time of upload the photos Elapsed time : 0.6116552352905273 we have finished the crop for the class : autre_refus begin to crop the class : carton_gris param for this class : {'min_score': 0.5} filtre for class : carton_gris hashtag_id of this class : 2107753020 Next one ! Next one ! Next one ! map_result returned by crop_photo_return_map_crop : length : 3 About to insert : list_path_to_insert length 3 new photo from crops ! About to upload 3 photos upload in portfolio : 4869462 init cache_photo without model_param we have 3 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1738665334_3672928 we have uploaded 3 photos in the portfolio 4869462 time of upload the photos Elapsed time : 0.8118200302124023 we have finished the crop for the class : carton_gris begin to crop the class : cartonnette param for this class : {'min_score': 0.5} filtre for class : cartonnette hashtag_id of this class : 702398920 Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 4869462 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1738665335_3672928 we have uploaded 1 photos in the portfolio 4869462 time of upload the photos Elapsed time : 0.7220194339752197 we have finished the crop for the class : cartonnette begin to crop the class : carton_brun param for this class : {'min_score': 0.7} filtre for class : carton_brun hashtag_id of this class : 2107753024 begin to crop the class : plastique param for this class : {'min_score': 0.5} filtre for class : plastique hashtag_id of this class : 492725882 begin to crop the class : kraft param for this class : {'min_score': 0.5} filtre for class : kraft hashtag_id of this class : 493202403 begin to crop the class : metal param for this class : {'min_score': 0.5} filtre for class : metal hashtag_id of this class : 492628673 delete rles for these photos Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1330534413, 1330534408, 1330534401, 1330534397, 1330534392, 1330534252, 1330534000, 1330533997, 1330533904, 1330533901, 1330533896, 1330533856, 1330533813, 1330533811, 1330533808, 1330533804, 1330532846, 1330532841, 1330532793] Looping around the photos to save general results len do output : 11 /1334528942Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528943Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528945Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528946Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528947Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528948Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528949Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528952Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528953Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528954Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1334528956Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534413', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534408', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534401', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534397', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534392', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534252', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534000', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533997', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533904', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533901', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533896', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533856', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533813', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533811', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533808', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533804', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532846', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532841', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532793', None, None, None, None, None, '2502430') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 52 time used for this insertion : 0.03884768486022949 save_final save missing photos in datou_result : time spend for datou_step_exec : 8.640655755996704 time spend to save output : 0.039475202560424805 total time spend for step 6 : 8.680130958557129 step7:ventilate_hashtags_in_portfolio Tue Feb 4 11:35:37 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! To do loadFromThcl(), then load ParamDescType : thcl2725 thcls : [{'id': 2725, 'mtr_user_id': 31, 'name': 'learn_qualipapia_rle_210302_2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Carton_brun,Carton_gris,Teint_Dans_La_Masse,autre_refus,cartonnette,environnement,kraft,metal,papier,plastique', 'svm_portfolios_learning': '3460440,3460441,3460446,3460434,3460439,3467416,3460442,3460443,3486028,3460445', 'photo_hashtag_type': 3410, 'photo_desc_type': 5186, 'type_classification': 'caffe', 'hashtag_id_list': '2107753024,2107753020,2107752385,2107752406,702398920,493012381,493202403,492628673,492668766,492725882'}] thcl {'id': 2725, 'mtr_user_id': 31, 'name': 'learn_qualipapia_rle_210302_2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Carton_brun,Carton_gris,Teint_Dans_La_Masse,autre_refus,cartonnette,environnement,kraft,metal,papier,plastique', 'svm_portfolios_learning': '3460440,3460441,3460446,3460434,3460439,3467416,3460442,3460443,3486028,3460445', 'photo_hashtag_type': 3410, 'photo_desc_type': 5186, 'type_classification': 'caffe', 'hashtag_id_list': '2107753024,2107753020,2107752385,2107752406,702398920,493012381,493202403,492628673,492668766,492725882'} Update svm_hashtag_type_desc : 5186 Iterating over portfolio : 19815787 get user id for portfolio 19815787 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19815787 AND mptpi.`type`=3726 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('autre_refus','cartonnette','Teint_Dans_La_Masse','environnement','papier','plastique','flou','kraft','mal_croppe','metal','Carton_brun','Carton_gris')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19815787 AND mptpi.`type`=3726 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('autre_refus','cartonnette','Teint_Dans_La_Masse','environnement','papier','plastique','flou','kraft','mal_croppe','metal','Carton_brun','Carton_gris')) AND mptpi.`min_score`=0.5 To do To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19815787 AND mptpi.`type`=3726 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('autre_refus','cartonnette','Teint_Dans_La_Masse','environnement','papier','plastique','flou','kraft','mal_croppe','metal','Carton_brun','Carton_gris')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://www.fotonower.com/velours/20207260,20207261,20207262,20207263,20207264,20207265,20207266,20207267,20207268,20207269,20207270,20207271?tags=autre_refus,cartonnette,Teint_Dans_La_Masse,environnement,papier,plastique,flou,kraft,mal_croppe,metal,Carton_brun,Carton_gris Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1330534413, 1330534408, 1330534401, 1330534397, 1330534392, 1330534252, 1330534000, 1330533997, 1330533904, 1330533901, 1330533896, 1330533856, 1330533813, 1330533811, 1330533808, 1330533804, 1330532846, 1330532841, 1330532793] Looping around the photos to save general results len do output : 1 /19815787. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534413', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534408', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534401', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534397', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534392', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534252', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534000', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533997', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533904', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533901', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533896', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533856', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533813', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533811', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533808', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533804', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532846', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532841', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532793', None, None, None, None, None, '2502430') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 20 time used for this insertion : 0.022515296936035156 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.6868736743927002 time spend to save output : 0.022822141647338867 total time spend for step 7 : 0.7096958160400391 step8:final Tue Feb 4 11:35:37 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1330534413: ('0.038452217442877974',), 1330534408: ('0.038452217442877974',), 1330534401: ('0.038452217442877974',), 1330534397: ('0.038452217442877974',), 1330534392: ('0.038452217442877974',), 1330534252: ('0.038452217442877974',), 1330534000: ('0.038452217442877974',), 1330533997: ('0.038452217442877974',), 1330533904: ('0.038452217442877974',), 1330533901: ('0.038452217442877974',), 1330533896: ('0.038452217442877974',), 1330533856: ('0.038452217442877974',), 1330533813: ('0.038452217442877974',), 1330533811: ('0.038452217442877974',), 1330533808: ('0.038452217442877974',), 1330533804: ('0.038452217442877974',), 1330532846: ('0.038452217442877974',), 1330532841: ('0.038452217442877974',), 1330532793: ('0.038452217442877974',)} new output for save of step final : {1330534413: ('0.038452217442877974',), 1330534408: ('0.038452217442877974',), 1330534401: ('0.038452217442877974',), 1330534397: ('0.038452217442877974',), 1330534392: ('0.038452217442877974',), 1330534252: ('0.038452217442877974',), 1330534000: ('0.038452217442877974',), 1330533997: ('0.038452217442877974',), 1330533904: ('0.038452217442877974',), 1330533901: ('0.038452217442877974',), 1330533896: ('0.038452217442877974',), 1330533856: ('0.038452217442877974',), 1330533813: ('0.038452217442877974',), 1330533811: ('0.038452217442877974',), 1330533808: ('0.038452217442877974',), 1330533804: ('0.038452217442877974',), 1330532846: ('0.038452217442877974',), 1330532841: ('0.038452217442877974',), 1330532793: ('0.038452217442877974',)} [1330534413, 1330534408, 1330534401, 1330534397, 1330534392, 1330534252, 1330534000, 1330533997, 1330533904, 1330533901, 1330533896, 1330533856, 1330533813, 1330533811, 1330533808, 1330533804, 1330532846, 1330532841, 1330532793] Looping around the photos to save general results len do output : 19 /1330534413.Didn't retrieve data . /1330534408.Didn't retrieve data . /1330534401.Didn't retrieve data . /1330534397.Didn't retrieve data . /1330534392.Didn't retrieve data . /1330534252.Didn't retrieve data . /1330534000.Didn't retrieve data . /1330533997.Didn't retrieve data . /1330533904.Didn't retrieve data . /1330533901.Didn't retrieve data . /1330533896.Didn't retrieve data . /1330533856.Didn't retrieve data . /1330533813.Didn't retrieve data . /1330533811.Didn't retrieve data . /1330533808.Didn't retrieve data . /1330533804.Didn't retrieve data . /1330532846.Didn't retrieve data . /1330532841.Didn't retrieve data . /1330532793.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534413', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534408', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534401', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534397', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534392', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534252', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534000', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533997', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533904', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533901', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533896', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533856', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533813', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533811', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533808', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533804', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532846', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532841', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532793', None, None, None, None, None, '2502430') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 57 time used for this insertion : 0.013145923614501953 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.10750913619995117 time spend to save output : 0.013913393020629883 total time spend for step 8 : 0.12142252922058105 step9:velours_tree Tue Feb 4 11:35:37 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.03475499153137207 time spend to save output : 5.412101745605469e-05 total time spend for step 9 : 0.034809112548828125 step10:send_mail_cod Tue Feb 4 11:35:37 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin in order to get the selector url, please entre the license of selector results_Auto_P19815787_04-02-2025_11_35_37.pdf 20207260 change filename to text .imagette202072601738665337 20207261 change filename to text .imagette202072611738665338 20207265 imagette202072651738665338 20207266 imagette202072661738665338 20207267 imagette202072671738665338 20207268 imagette202072681738665338 20207269 imagette202072691738665338 20207262 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette202072621738665338 20207270 imagette202072701738665338 20207271 change filename to text .change filename to text .change filename to text .imagette202072711738665338 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=19815787 and hashtag_type = 3726 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/20207260,20207261,20207262,20207263,20207264,20207265,20207266,20207267,20207268,20207269,20207270,20207271?tags=autre_refus,cartonnette,Teint_Dans_La_Masse,environnement,papier,plastique,flou,kraft,mal_croppe,metal,Carton_brun,Carton_gris your option no_mail is active, we will not send the real mail to your client args[1330534413] : ('0.038452217442877974',) no score found for photo 1330534413 We are sending mail with results at report@fotonower.com args[1330534408] : ('0.038452217442877974',) no score found for photo 1330534408 We are sending mail with results at report@fotonower.com args[1330534401] : ('0.038452217442877974',) no score found for photo 1330534401 We are sending mail with results at report@fotonower.com args[1330534397] : ('0.038452217442877974',) no score found for photo 1330534397 We are sending mail with results at report@fotonower.com args[1330534392] : ('0.038452217442877974',) no score found for photo 1330534392 We are sending mail with results at report@fotonower.com args[1330534252] : ('0.038452217442877974',) no score found for photo 1330534252 We are sending mail with results at report@fotonower.com args[1330534000] : ('0.038452217442877974',) no score found for photo 1330534000 We are sending mail with results at report@fotonower.com args[1330533997] : ('0.038452217442877974',) no score found for photo 1330533997 We are sending mail with results at report@fotonower.com args[1330533904] : ('0.038452217442877974',) no score found for photo 1330533904 We are sending mail with results at report@fotonower.com args[1330533901] : ('0.038452217442877974',) no score found for photo 1330533901 We are sending mail with results at report@fotonower.com args[1330533896] : ('0.038452217442877974',) no score found for photo 1330533896 We are sending mail with results at report@fotonower.com args[1330533856] : ('0.038452217442877974',) no score found for photo 1330533856 We are sending mail with results at report@fotonower.com args[1330533813] : ('0.038452217442877974',) no score found for photo 1330533813 We are sending mail with results at report@fotonower.com args[1330533811] : ('0.038452217442877974',) no score found for photo 1330533811 We are sending mail with results at report@fotonower.com args[1330533808] : ('0.038452217442877974',) no score found for photo 1330533808 We are sending mail with results at report@fotonower.com args[1330533804] : ('0.038452217442877974',) no score found for photo 1330533804 We are sending mail with results at report@fotonower.com args[1330532846] : ('0.038452217442877974',) no score found for photo 1330532846 We are sending mail with results at report@fotonower.com args[1330532841] : ('0.038452217442877974',) no score found for photo 1330532841 We are sending mail with results at report@fotonower.com args[1330532793] : ('0.038452217442877974',) no score found for photo 1330532793 We are sending mail with results at report@fotonower.com refus_total : 0.038452217442877974 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=19815787 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 SELECT photo_id, url FROM MTRBack.photos ph WHERE photo_id IN (1330534413,1330533901,1330534408,1330534401,1330534397,1330534392,1330534252,1330534000,1330533997,1330533904,1330532793,1330533896,1330533856,1330533813,1330533811,1330533808,1330533804,1330532846,1330532841) Found this number of photos: 19 begin to download photo : 1330534413 begin to download photo : 1330534397 begin to download photo : 1330533997 begin to download photo : 1330533856 begin to download photo : 1330533804 download finish for photo 1330533804 begin to download photo : 1330532846 download finish for photo 1330533856 begin to download photo : 1330533813 download finish for photo 1330534397 begin to download photo : 1330534392 download finish for photo 1330534413 begin to download photo : 1330533901 download finish for photo 1330533997 begin to download photo : 1330533904 download finish for photo 1330534392 begin to download photo : 1330534252 download finish for photo 1330533813 begin to download photo : 1330533811 download finish for photo 1330532846 begin to download photo : 1330532841 download finish for photo 1330533901 begin to download photo : 1330534408 download finish for photo 1330532841 download finish for photo 1330533904 begin to download photo : 1330532793 download finish for photo 1330534252 begin to download photo : 1330534000 download finish for photo 1330534408 begin to download photo : 1330534401 download finish for photo 1330534000 download finish for photo 1330533811 begin to download photo : 1330533808 download finish for photo 1330532793 begin to download photo : 1330533896 download finish for photo 1330533808 download finish for photo 1330534401 download finish for photo 1330533896 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19815787_04-02-2025_11_35_37.pdf results_Auto_P19815787_04-02-2025_11_35_37.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19815787_04-02-2025_11_35_37.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3459','19815787','results_Auto_P19815787_04-02-2025_11_35_37.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19815787_04-02-2025_11_35_37.pdf','pdf','','0.21','0.038452217442877974') Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1330534413, 1330534408, 1330534401, 1330534397, 1330534392, 1330534252, 1330534000, 1330533997, 1330533904, 1330533901, 1330533896, 1330533856, 1330533813, 1330533811, 1330533808, 1330533804, 1330532846, 1330532841, 1330532793] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534413', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534408', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534401', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534397', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534392', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534252', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534000', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533997', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533904', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533901', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533896', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533856', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533813', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533811', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533808', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533804', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532846', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532841', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532793', None, None, None, None, None, '2502430') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 19 time used for this insertion : 0.02017807960510254 save_final save missing photos in datou_result : time spend for datou_step_exec : 2.628744602203369 time spend to save output : 0.020422697067260742 total time spend for step 10 : 2.64916729927063 step11:split_time_score Tue Feb 4 11:35:40 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('15', 19),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 21012025 19815787 Nombre de photos uploadées : 19 / 23040 (0%) 21012025 19815787 Nombre de photos taguées (types de déchets): 0 / 19 (0%) 21012025 19815787 Nombre de photos taguées (volume) : 0 / 19 (0%) elapsed_time : load_data_split_time_score 2.1457672119140625e-06 elapsed_time : order_list_meta_photo_and_scores 7.3909759521484375e-06 ??????????????????? elapsed_time : fill_and_build_computed_from_old_data 0.0009512901306152344 elapsed_time : insert_dashboard_record_day_entry 0.021887540817260742 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.04809637498190257 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19791644_21-01-2025_09_02_16.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19791644 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19791644 AND mptpi.`type`=3726 To do Qualite : 0.0407800529489664 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19791645_21-01-2025_09_22_15.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19791645 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19791645 AND mptpi.`type`=3726 To do Qualite : 0.05806526819103342 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19792450_21-01-2025_08_51_18.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19792450 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19792450 AND mptpi.`type`=3726 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19795840 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19795842 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19795857 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19797270 order by id desc limit 1 Qualite : 0.0489462853096413 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19797271_21-01-2025_11_21_41.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19797271 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19797271 AND mptpi.`type`=3726 To do Qualite : 0.024754002648240084 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19797272_21-01-2025_11_33_51.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19797272 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19797272 AND mptpi.`type`=3726 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19798289 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19799584 order by id desc limit 1 Qualite : 0.034673920167750547 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19800775_21-01-2025_14_04_38.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19800775 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19800775 AND mptpi.`type`=3726 To do Qualite : 0.05428035426423572 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19801172_21-01-2025_13_33_11.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19801172 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19801172 AND mptpi.`type`=3726 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19804578 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19804579 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19804580 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19804581 order by id desc limit 1 Qualite : 0.03321823590790911 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19804718_21-01-2025_15_23_32.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19804718 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19804718 AND mptpi.`type`=3726 To do Qualite : 0.027324753035856044 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19806030_22-01-2025_08_33_27.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19806030 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19806030 AND mptpi.`type`=3726 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19806031 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19815786 order by id desc limit 1 Qualite : 0.038452217442877974 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19815787_04-02-2025_11_35_37.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19815787 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19815787 AND mptpi.`type`=3726 To do Qualite : 0.012731219344216792 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19815788_30-01-2025_12_41_32.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19815788 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19815788 AND mptpi.`type`=3726 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19815789 order by id desc limit 1 Qualite : 0.03828743062344873 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P19815790_22-01-2025_12_21_28.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19815790 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=19815790 AND mptpi.`type`=3726 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19815791 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 19815792 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007294 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007296 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007297 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007298 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007299 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007300 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007301 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007302 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007303 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007304 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007305 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007306 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007307 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007308 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007309 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007334 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007359 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007360 order by id desc limit 1 Qualite : 0.021048126452339815 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P20007361_28-01-2025_10_02_22.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 20007361 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=20007361 AND mptpi.`type`=3726 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'21012025': {'nb_upload': 19, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1330534413, 1330534408, 1330534401, 1330534397, 1330534392, 1330534252, 1330534000, 1330533997, 1330533904, 1330533901, 1330533896, 1330533856, 1330533813, 1330533811, 1330533808, 1330533804, 1330532846, 1330532841, 1330532793] Looping around the photos to save general results len do output : 1 /19815787Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534413', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534408', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534401', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534397', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534392', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534252', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330534000', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533997', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533904', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533901', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533896', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533856', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533813', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533811', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533808', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330533804', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532846', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532841', None, None, None, None, None, '2502430') ('3459', None, None, None, None, None, None, None, '2502430') ('3459', '19815787', '1330532793', None, None, None, None, None, '2502430') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 20 time used for this insertion : 0.018524169921875 save_final save missing photos in datou_result : time spend for datou_step_exec : 27.17305278778076 time spend to save output : 0.019070148468017578 total time spend for step 11 : 27.19212293624878 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 19 set_done_treatment 62.14user 23.63system 2:05.94elapsed 68%CPU (0avgtext+0avgdata 2930148maxresident)k 130928inputs+38144outputs (418major+1855723minor)pagefaults 0swaps