python /home/admin/mtr/script_for_cron.py -j default -m 20 -a 'python3 ~/workarea/git/Velours/python/prod/datou.py -j batch_current -C 3623671' -s traitement_sts -M 0 -S 0 -U 100,80,95
import MySQLdb succeeded
Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/home/admin/workarea/git/apy', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 3448121
load datou : 0
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case.
We can either keep the dependence or, better, keep an order compatible with the step ids when there are no sons, i.e. a lexical order : (number_son, step_id)
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : step 0 init_dummy_multi_datou is not linked in the step_by_step architecture !
WARNING : step 1294 init_dummy_multi_datou is not linked in the step_by_step architecture !
Number of inputs / outputs for each step checked !
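The reordering and cycle check announced above can be sketched roughly as follows. This is an illustrative reconstruction read off the log messages, not the actual datou.py code: `reorder_steps`, its arguments, and the use of `(number_of_sons, step_id)` as a deterministic tie-break are all assumptions.

```python
# Hypothetical sketch of the step reordering hinted at in the log:
# a topological sort over the dependency tree, with ties between
# ready steps broken by the lexical key (number_of_sons, step_id),
# plus a cycle check in the spirit of checkNoCycle.
from collections import defaultdict

def reorder_steps(steps, edges):
    """steps: list of step ids; edges: (parent, son) dependency pairs.
    Returns steps in dependency order, ties broken by (n_sons, step_id)."""
    sons = defaultdict(list)
    indegree = {s: 0 for s in steps}
    for parent, son in edges:
        sons[parent].append(son)
        indegree[son] += 1
    ordered = []
    # steps whose dependencies are all satisfied, deterministically sorted
    ready = sorted((s for s in steps if indegree[s] == 0),
                   key=lambda s: (len(sons[s]), s))
    while ready:
        step = ready.pop(0)
        ordered.append(step)
        for son in sons[step]:
            indegree[son] -= 1
            if indegree[son] == 0:
                ready.append(son)
        ready.sort(key=lambda s: (len(sons[s]), s))
    if len(ordered) != len(steps):
        raise ValueError("cycle detected")  # checkNoCycle equivalent
    return ordered
```

With no dependence between two final steps (the tile - detect - glue situation mentioned above), the sort key keeps the output order stable and compatible with the step ids.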
Here we check the consistency of output/input types during step connections !
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
DataTypes for each output/input checked !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0)
Tried to parse :
(photo_id, hashtag_id, score_max) was removed, should we ?
(x0, y0, x1, y1) was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(x0, y0, x1, y1) was removed, should we ?
photo path was removed, should we ?
load thcls
load pdts
Running datou job : batch_current
TODO : datou_current to load, maybe to take outside batchDatouExec
no input labels
no input values
updating current state to 1
list_input_json: {}
Current got : datou_id : 4746, datou_cur_ids : ['3623671'] with mtr_portfolio_ids : ['26290613'] and first list_photo_ids : []
new path : /proc/3448121/
Inside batchDatouExec : verbose : 0
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case.
We can either keep the dependence or, better, keep an order compatible with the step ids when there are no sons, i.e. a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 14102 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 13182 split_time_score_with_photo has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 13182 split_time_score_with_photo is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 14090 launch_next_datou_same_portfolio is not consistent : 1 used against 0 in the step definition !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types during step connections !
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 1 of step 13182 doesn't seem to be defined in the database !
WARNING : type of input 0 of step 14090 doesn't seem to be defined in the database !
WARNING : type of output 2 of step 14102 doesn't seem to be defined in the database !
WARNING : type of input 2 of step 13182 doesn't seem to be defined in the database !
DataTypes for each output/input checked !
List Step Type Loaded in datou : mask_detect, split_time_score_with_photo, launch_next_datou_same_portfolio
over limit max, limiting to limit_max 150
list_input_json : {}
origin
We have 1
we have missing 0 photos in the step downloads : photo missing : []
try to delete the photos missing in DB
length of list_filenames : 39 ; length of list_pids : 39 ; length of list_args : 39
time to download the photos : 8.896324157714844
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
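The input/output count warnings above suggest a check of the following shape: compare the inputs and outputs actually wired to each step against the counts in the step definition from the db, warning rather than failing, since some steps take optional inputs. This is an illustrative sketch, not the actual checkConsistencyNbInputNbOutput; the function name and argument layout are assumptions.

```python
# Hypothetical reconstruction of the consistency check whose warnings
# appear in the log. `definition` stands in for the step definition row
# loaded from the db; only the counts matter here.
def check_io_counts(step_id, step_name, used_inputs, used_outputs, definition):
    """definition: dict with 'n_inputs' and 'n_outputs' from the db.
    Returns a list of warning strings (empty if everything is consistent)."""
    warnings = []
    if used_inputs < definition["n_inputs"]:
        # fewer inputs than declared is tolerated: optional inputs
        warnings.append(
            f"Step {step_id} {step_name} has fewer inputs used ({used_inputs}) "
            f"than in the step definition ({definition['n_inputs']}) : "
            "maybe optional inputs are managed")
    elif used_inputs > definition["n_inputs"]:
        warnings.append(
            f"WARNING : number of inputs for step {step_id} {step_name} is not "
            f"consistent : {used_inputs} used against {definition['n_inputs']} "
            "in the step definition")
    if used_outputs != definition["n_outputs"]:
        warnings.append(
            f"WARNING : number of outputs for step {step_id} {step_name} is not "
            f"consistent : {used_outputs} used against {definition['n_outputs']} "
            "in the step definition")
    return warnings
```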
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 3
step1:mask_detect Thu Aug 28 12:08:13 2025
VR 17-11-17 : now, only for a linear exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : False
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 6823
max_wait_temp : 1 max_wait : 0 gpu_flag : 0
2025-08-28 12:08:16.439102: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2025-08-28 12:08:16.464486: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz
2025-08-28 12:08:16.466034: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f9e34000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2025-08-28 12:08:16.466083: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2025-08-28 12:08:16.468524: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2025-08-28 12:08:16.578023: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0xd8478e0 initialized for platform CUDA (this does not guarantee that XLA will be used).
Devices: 2025-08-28 12:08:16.578059: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-08-28 12:08:16.578838: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-08-28 12:08:16.579120: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-28 12:08:16.581039: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-28 12:08:16.583039: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-08-28 12:08:16.583337: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-08-28 12:08:16.585515: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-08-28 12:08:16.586489: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-08-28 12:08:16.590686: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-28 12:08:16.591902: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-08-28 12:08:16.591985: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-28 12:08:16.592658: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-08-28 12:08:16.592676: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-08-28 12:08:16.592684: I 
tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-08-28 12:08:16.593765: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6271 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-08-28 12:08:16.822302: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-08-28 12:08:16.822415: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-28 12:08:16.822431: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-28 12:08:16.822446: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-08-28 12:08:16.822461: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-08-28 12:08:16.822474: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-08-28 12:08:16.822488: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-08-28 12:08:16.822502: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-28 12:08:16.823469: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-08-28 
12:08:16.824519: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-08-28 12:08:16.824551: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-08-28 12:08:16.824567: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-28 12:08:16.824582: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-08-28 12:08:16.824597: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-08-28 12:08:16.824611: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-08-28 12:08:16.824626: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-08-28 12:08:16.824655: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-08-28 12:08:16.825610: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-08-28 12:08:16.825642: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-08-28 12:08:16.825650: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-08-28 12:08:16.825658: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-08-28 12:08:16.826668: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6271 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, 
compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl3916 thcls : [{'id': 3916, 'mtr_user_id': 31, 'name': 'learn_mask_pancarte_110625_2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,badge,pancarte', 'svm_portfolios_learning': '0,0,0', 'photo_hashtag_type': 5014, 'photo_desc_type': 6109, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0'}] thcl {'id': 3916, 'mtr_user_id': 31, 'name': 'learn_mask_pancarte_110625_2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,badge,pancarte', 'svm_portfolios_learning': '0,0,0', 'photo_hashtag_type': 5014, 'photo_desc_type': 6109, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0'} Update svm_hashtag_type_desc : 6109 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (6109, 'learn_mask_pancarte_110625_2', 16384, 25088, 'learn_mask_pancarte_110625_2', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 3, datetime.datetime(2025, 6, 11, 13, 23, 2), datetime.datetime(2025, 6, 11, 13, 23, 2)) {'thcl': {'id': 3916, 'mtr_user_id': 31, 'name': 'learn_mask_pancarte_110625_2', 
'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,badge,pancarte', 'svm_portfolios_learning': '0,0,0', 'photo_hashtag_type': 5014, 'photo_desc_type': 6109, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0'}, 'list_hashtags': ['background', 'badge', 'pancarte'], 'list_hashtags_csv': 'background,badge,pancarte', 'svm_portfolios_learning': '0,0,0', 'photo_hashtag_type': 5014, 'svm_hashtag_type_desc': 6109, 'photo_desc_type': 6109, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'badge', 'pancarte'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_mask_pancarte_110625_2 NUM_CLASSES 3 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_mask_pancarte_110625_2 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-08-28 12:08:37.065646: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-08-28 
12:08:37.341746: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/learn_mask_pancarte_110625_2 /data/models_weight/learn_mask_pancarte_110625_2/mask_model.h5 size_local : 255878432 size in s3 : 255878432 create time local : 2025-06-11 13:03:21 create time in s3 : 2025-06-11 11:03:27 mask_model.h5 already exist and didn't need to update list_images length : 39 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 
3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -120.26250 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 
1 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -120.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -122.03594 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 9.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -107.56719 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 150.91250 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: 
(1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 18.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 0 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb 
d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (4160, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 4160.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (3120, 3120, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 11) min: 0.00000 max: 3120.00000 nb d'objets trouves : 0 Detection mask done ! 
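Each "NEW PHOTO" block above corresponds to one pass of the Mask R-CNN inference loop: the image is molded to the configured IMAGE_SHAPE [640 640 3] with MEAN_PIXEL subtracted (hence the molded min of about -123.7), detection runs on the single-image batch, and the number of objects found is reported. The sketch below is an illustrative reconstruction under those assumptions; `mold_image`, `detect_batch`, and the `model.detect` call shape are hypothetical, not the actual Mask_RCNN or mask_detection.py code.

```python
# Illustrative molding + detection loop matching the shapes in the log:
# image (H, W, 3) -> molded (1, 640, 640, 3), mean-subtracted.
import numpy as np

MEAN_PIXEL = np.array([123.7, 116.8, 103.9])  # from the config dump above

def mold_image(image, max_dim=640):
    """Resize the longest side to max_dim, pad to square, subtract the mean."""
    h, w = image.shape[:2]
    scale = max_dim / max(h, w)
    # nearest-neighbour resize via index lookup, to keep the sketch
    # dependency-free (the real pipeline uses a proper image resize)
    ys = (np.arange(int(h * scale)) / scale).astype(int)
    xs = (np.arange(int(w * scale)) / scale).astype(int)
    resized = image[ys][:, xs].astype(np.float64)
    molded = np.zeros((max_dim, max_dim, 3))
    molded[: len(ys), : len(xs)] = resized
    return molded - MEAN_PIXEL  # padded pixels give the ~ -123.7 minimum

def detect_batch(model, images):
    """Run one-image batches through a detector exposing a detect() method."""
    for image in images:
        molded = mold_image(image)[np.newaxis]  # shape (1, 640, 640, 3)
        results = model.detect(molded)
        print("nb of objects found :", len(results[0]["class_ids"]))
```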
Trying to reset tf kernel 3448385
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 5311
tf kernel not reset
sub process len(results) : 39 len(list_Values) 0 None
max_time_sub_proc : 3600
parent process len(results) : 39 len(list_Values) 0
process is alive
finished correctly or not : True
after detect
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 10603
list_Values should be empty []
To do loadFromThcl(), then load ParamDescType : thcl3916
Caught exception ! Connect or reconnect !
thcls : [{'id': 3916, 'mtr_user_id': 31, 'name': 'learn_mask_pancarte_110625_2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,badge,pancarte', 'svm_portfolios_learning': '0,0,0', 'photo_hashtag_type': 5014, 'photo_desc_type': 6109, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0'}]
thcl {'id': 3916, 'mtr_user_id': 31, 'name': 'learn_mask_pancarte_110625_2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,badge,pancarte', 'svm_portfolios_learning': '0,0,0', 'photo_hashtag_type': 5014, 'photo_desc_type': 6109, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0'}
Update svm_hashtag_type_desc : 6109
['background', 'badge', 'pancarte']
time for calcul the mask position with numpy : 0.0038607120513916016 nb_pixel_total : 246603 time to create 1 rle with new method : 0.009473800659179688 length of segment : 544
time for calcul the mask position with numpy : 0.006592988967895508 nb_pixel_total : 491306 time to create 1 rle with new method : 0.03335976600646973 length of segment : 1203
time for calcul the mask position with numpy : 0.0056972503662109375 nb_pixel_total : 438520 time to create 1 rle with new method : 0.013943195343017578 length of segment : 721
time for calcul the mask position with numpy : 0.0044939517974853516 nb_pixel_total : 339772 time to create 1 rle with new method : 0.012325525283813477 length of segment : 919
time for calcul the mask position with 
numpy : 0.011896133422851562 nb_pixel_total : 766042 time to create 1 rle with new method : 1.1150062084197998 length of segment : 368
time to compute the mask position with numpy : 0.0008845329284667969 nb_pixel_total : 71798 time to create 1 rle with old method : 0.08214402198791504 length of segment : 392
time to compute the mask position with numpy : 0.0016410350799560547 nb_pixel_total : 152496 time to create 1 rle with new method : 0.003391265869140625 length of segment : 546
time to compute the mask position with numpy : 0.0019834041595458984 nb_pixel_total : 140705 time to create 1 rle with old method : 0.15932297706604004 length of segment : 589
time to compute the mask position with numpy : 0.004889011383056641 nb_pixel_total : 399518 time to create 1 rle with new method : 0.02053093910217285 length of segment : 852
time to compute the mask position with numpy : 0.0013325214385986328 nb_pixel_total : 91966 time to create 1 rle with old method : 0.10854554176330566 length of segment : 443
time to compute the mask position with numpy : 0.0021414756774902344 nb_pixel_total : 131996 time to create 1 rle with old method : 0.14478778839111328 length of segment : 371
time to compute the mask position with numpy : 0.002045869827270508 nb_pixel_total : 119761 time to create 1 rle with old method : 0.12978196144104004 length of segment : 451
time to compute the mask position with numpy : 0.0022649765014648438 nb_pixel_total : 123892 time to create 1 rle with old method : 0.13403916358947754 length of segment : 569
time to compute the mask position with numpy : 0.0005078315734863281 nb_pixel_total : 21620 time to create 1 rle with old method : 0.024198055267333984 length of segment : 185
time to compute the mask position with numpy : 0.007940053939819336 nb_pixel_total : 415549 time to create 1 rle with new method : 0.016294002532958984 length of segment : 731
time to compute the mask position with numpy : 0.0010540485382080078 nb_pixel_total : 61970 time to create 1 rle with old method : 0.06767439842224121 length of segment : 363
time to compute the mask position with numpy : 0.0004858970642089844 nb_pixel_total : 33947 time to create 1 rle with old method : 0.03762245178222656 length of segment : 176
time to compute the mask position with numpy : 0.013130903244018555 nb_pixel_total : 816311 time to create 1 rle with new method : 0.04505777359008789 length of segment : 621
time to compute the mask position with numpy : 0.012470483779907227 nb_pixel_total : 820729 time to create 1 rle with new method : 0.13869976997375488 length of segment : 615
time to compute the mask position with numpy : 0.0022416114807128906 nb_pixel_total : 213352 time to create 1 rle with new method : 0.004721879959106445 length of segment : 508
time to compute the mask position with numpy : 0.005161762237548828 nb_pixel_total : 409739 time to create 1 rle with new method : 0.012514591217041016 length of segment : 715
time to compute the mask position with numpy : 0.0012619495391845703 nb_pixel_total : 106660 time to create 1 rle with old method : 0.11691808700561523 length of segment : 420
time to compute the mask position with numpy : 0.0013909339904785156 nb_pixel_total : 104467 time to create 1 rle with old method : 0.11508917808532715 length of segment : 419
time to compute the mask position with numpy : 0.0009801387786865234 nb_pixel_total : 84742 time to create 1 rle with old method : 0.09687089920043945 length of segment : 300
time to compute the mask position with numpy : 0.0035674571990966797 nb_pixel_total : 298063 time to create 1 rle with new method : 0.010028839111328125 length of segment : 907
time to compute the mask position with numpy : 0.0016355514526367188 nb_pixel_total : 118073 time to create 1 rle with old method : 0.12727808952331543 length of segment : 621
time to compute the mask position with numpy : 0.00883340835571289 nb_pixel_total : 582081 time to create 1 rle with new method : 0.020750761032104492 length of segment : 644
time to compute the mask position with numpy : 0.00854945182800293 nb_pixel_total : 594439 time to create 1 rle with new method : 0.017211437225341797 length of segment : 957
time to compute the mask position with numpy : 0.0030410289764404297 nb_pixel_total : 251633 time to create 1 rle with new method : 0.008560657501220703 length of segment : 640
time to compute the mask position with numpy : 0.0016551017761230469 nb_pixel_total : 125556 time to create 1 rle with old method : 0.13674044609069824 length of segment : 477
time to compute the mask position with numpy : 0.0006012916564941406 nb_pixel_total : 58249 time to create 1 rle with old method : 0.06504487991333008 length of segment : 280
time to compute the mask position with numpy : 0.0012629032135009766 nb_pixel_total : 93396 time to create 1 rle with old method : 0.10175848007202148 length of segment : 455
time to compute the mask position with numpy : 0.0008871555328369141 nb_pixel_total : 61629 time to create 1 rle with old method : 0.06802701950073242 length of segment : 337
time to compute the mask position with numpy : 0.0036890506744384766 nb_pixel_total : 271124 time to create 1 rle with new method : 0.008018016815185547 length of segment : 755
time to compute the mask position with numpy : 0.26894617080688477 nb_pixel_total : 4374677 time to create 1 rle with new method : 0.42929577827453613 length of segment : 1839
time to compute the mask position with numpy : 0.21339988708496094 nb_pixel_total : 3895376 time to create 1 rle with new method : 0.5071961879730225 length of segment : 1972
time to compute the mask position with numpy : 0.0014872550964355469 nb_pixel_total : 63535 time to create 1 rle with old method : 0.07115793228149414 length of segment : 267
time to compute the mask position with numpy : 0.009955167770385742 nb_pixel_total : 576833 time to create 1 rle with new method : 0.016431808471679688 length of segment : 928
time to compute the mask position with numpy : 0.0009415149688720703 nb_pixel_total : 46368 time to create 1 rle with old method : 0.05167078971862793 length of segment : 244
time to compute the mask position with numpy : 0.0016045570373535156 nb_pixel_total : 76210 time to create 1 rle with old method : 0.08577346801757812 length of segment : 347
time to compute the mask position with numpy : 0.001241445541381836 nb_pixel_total : 75921 time to create 1 rle with old method : 0.0850989818572998 length of segment : 291
time to compute the mask position with numpy : 0.004396915435791016 nb_pixel_total : 277799 time to create 1 rle with new method : 0.008849859237670898 length of segment : 606
time to compute the mask position with numpy : 0.006979465484619141 nb_pixel_total : 424513 time to create 1 rle with new method : 0.012868404388427734 length of segment : 688
time to compute the mask position with numpy : 0.0009307861328125 nb_pixel_total : 44177 time to create 1 rle with old method : 0.04971122741699219 length of segment : 220
time to compute the mask position with numpy : 0.023041725158691406 nb_pixel_total : 1317209 time to create 1 rle with new method : 0.23153233528137207 length of segment : 1009
time to compute the mask position with numpy : 0.0828239917755127 nb_pixel_total : 1833761 time to create 1 rle with new method : 0.3364238739013672 length of segment : 1475
time to compute the mask position with numpy : 0.015987873077392578 nb_pixel_total : 848772 time to create 1 rle with new method : 0.126725435256958 length of segment : 417
time to compute the mask position with numpy : 0.013748884201049805 nb_pixel_total : 1066695 time to create 1 rle with new method : 0.05862283706665039 length of segment : 600
time to compute the mask position with numpy : 0.0005354881286621094 nb_pixel_total : 31447 time to create 1 rle with old method : 0.03600716590881348 length of segment : 184
time to compute the mask position with numpy : 0.00197601318359375 nb_pixel_total : 193647 time to create 1 rle with new
method : 0.004172325134277344 length of segment : 592
time to compute the mask position with numpy : 0.0022492408752441406 nb_pixel_total : 195059 time to create 1 rle with new method : 0.0036706924438476562 length of segment : 432
time spent for convertir_results : 7.5233330726623535
Inside saveOutput : final : False verbose : 0
eke 12-6-18 : saveMask needs to be cleaned for the new output !
Number saved : None
batch 1
Loaded 51 chid ids of type : 5014
Number RLEs to save : 0
save missing photos in datou_result :
time spent for datou_step_exec : 126.6344256401062
time spent to save output : 1.079937219619751
total time spent for step 1 : 127.71436285972595
step2:split_time_score_with_photo Thu Aug 28 12:10:21 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the input of the next step
VR 22-3-18 : we now test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case where we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior
Some of the work done at step execution could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
Inconsistent list_input size and nb_input information : 0 vs 2
VR 22-3-18 : for now we do not clean the datou structure correctly
----- Begin copy-paste of the parameters needed for the STS main function -----
begin split time score 2022-04-13 10:29:59 0
Caught exception ! Connect or reconnect !
TODO : Insert select and so on
Caught exception ! Connect or reconnect !
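The timings above compare an "old method" and a "new method" for building one RLE per mask. As a hedged illustration only (the function names, the column-major flattening, and the exact run convention are assumptions, not the Velours code), the difference is typically a per-pixel Python loop versus a vectorised numpy pass, which is consistent with the roughly one-to-two orders of magnitude gap logged above:

```python
import numpy as np

def rle_old(mask):
    # Assumed reconstruction of the "old method": walk the flattened mask
    # pixel by pixel in pure Python and count runs of equal values.
    flat = mask.ravel(order="F")  # column-major, as in COCO-style RLE
    runs, prev, count = [], 0, 0
    for v in flat:
        if v == prev:
            count += 1
        else:
            runs.append(count)
            prev, count = v, 1
    runs.append(count)
    return runs

def rle_new(mask):
    # Assumed "new method": locate the indices where the value changes
    # with numpy and difference them, avoiding the per-pixel Python loop.
    flat = mask.ravel(order="F")
    change = np.flatnonzero(flat[1:] != flat[:-1]) + 1
    bounds = np.concatenate(([0], change, [flat.size]))
    runs = np.diff(bounds)
    if flat[0] == 1:
        runs = np.concatenate(([0], runs))  # RLE starts with a zero-run
    return runs.tolist()
```

Both functions return the same run lengths; only the implementation strategy differs.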
Begin split_port_in_batch_balle
thcls : [{'id': 3379, 'mtr_user_id': 31, 'name': 'learn_classif_flux_maj_generique_effnet_v2_s_02062022', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'aluminium,ela,film_pedb,flux_dev,jrm,pcm,pcnc,pehd_pp,pet_clair,refus,tapis_vide', 'svm_portfolios_learning': '5515864,5515840,5515844,5515850,6244400,6237996,6237998,5515847,5515841,5515868,5515866', 'photo_hashtag_type': 4374, 'photo_desc_type': 5680, 'type_classification': 'tf_classification2', 'hashtag_id_list': '493546845,492741797,2107760237,2107760238,495916461,560181804,1284539308,2107760239,2107755846,538914404,2107748999'}]
thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('51', 1), ('52', 3), ('32', 3), ('33', 9), ('42', 2), ('43', 4), ('59', 5), ('39', 1), ('40', 4), ('20', 2), ('21', 3), ('22', 2))
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {}
26082025 26290613 Number of photos uploaded : 39 / 23040 (0%)
26082025 26290613 Number of photos tagged (waste types) : 0 / 39 (0%)
26082025 26290613 Number of photos tagged (volume) : 0 / 39 (0%)
elapsed_time : load_data_split_time_score 4.0531158447265625e-06
elapsed_time : order_list_meta_photo_and_scores 1.0728836059570312e-05
elapsed_time : fill_and_build_computed_from_old_data 0.0028929710388183594
Caught exception ! Connect or reconnect !
Caught exception ! Connect or reconnect !
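The recurring "Caught exception ! Connect or reconnect !" lines suggest a retry-with-fresh-connection wrapper around MySQL queries. A minimal sketch of that pattern, assuming nothing about the actual Velours helper (the name `run_with_reconnect` and the `connect` callable are illustrative; in practice the caught exception would be `MySQLdb.OperationalError`):

```python
def run_with_reconnect(connect, query, retries=3):
    """Run `query`, reconnecting and retrying when the connection drops.

    `connect` is any zero-argument callable returning a DB-API connection,
    e.g. lambda: MySQLdb.connect(**params). Illustrative sketch only.
    """
    conn = connect()
    last_err = None
    for _ in range(retries):
        try:
            cur = conn.cursor()
            cur.execute(query)
            return cur.fetchall()
        except Exception as err:  # MySQLdb.OperationalError in practice
            print("Caught exception ! Connect or reconnect !")
            last_err = err
            conn = connect()  # drop the dead connection, open a new one
    raise last_err
```

Injecting the `connect` callable keeps the retry logic independent of the driver and easy to test with a fake connection.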
elapsed_time : insert_dashboard_record_day_entry 0.21006178855895996
***** BEGIN SPLIT BY DARK *****
To DO 08/10/21
elapsed_time : SPLIT_BY_DARK 0.005489349365234375
***** END SPLIT BY DARK *****
***** BEGIN SPLIT TIME *****
list printed: [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11, 12, 13], [14, 15, 16, 17, 18], [19, 20, 21, 22, 23, 24, 25, 26], [27, 28, 29, 30, 31], [32, 33, 34, 35, 36, 37, 38]]
forced_hashtag: entrant
force hashtag to entrant
elapsed_time : SPLIT_TIME 0.005988597869873047
***** END SPLIT TIME *****
NUMBER BATCH : 7
list_ponderation used : [0.001, 0.001, 0.001, 0.001, 0.001] , list_hashtag_class_create_as_list : ['entrant']
ERROR missing amount info (×4)
result_one_balle_Type_entrant:{'day': '26082025', 'map_nb_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'map_time_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'duration': 18.0, 'nb_balles_papier': 0, 'begin_time_port': 'IMG_20250826_085151.jpg'}
Production hashtag (incorrect ponderation at 20-10-18) : 0
ERROR missing amount info (×4)
result_one_balle_Type_entrant:{'day': '26082025', 'map_nb_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'map_time_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'duration': 20.0, 'nb_balles_papier': 0, 'begin_time_port': 'IMG_20250826_103253.jpg'}
Production hashtag (incorrect ponderation at 20-10-18) : 0
ERROR missing amount info (×6)
result_one_balle_Type_entrant:{'day': '26082025', 'map_nb_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'map_time_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'duration': 42.0, 'nb_balles_papier': 0, 'begin_time_port': 'IMG_20250826_104239.jpg'}
Production hashtag (incorrect ponderation at 20-10-18) : 0
ERROR missing amount info (×5)
result_one_balle_Type_entrant:{'day': '26082025', 'map_nb_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'map_time_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'duration': 28.0, 'nb_balles_papier': 0, 'begin_time_port': 'IMG_20250826_105900.jpg'}
Production hashtag (incorrect ponderation at 20-10-18) : 0
ERROR missing amount info (×8)
result_one_balle_Type_entrant:{'day': '26082025', 'map_nb_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'map_time_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'duration': 64.0, 'nb_balles_papier': 0, 'begin_time_port': 'IMG_20250826_113237.jpg'}
Production hashtag (incorrect ponderation at 20-10-18) : 0
ERROR missing amount info (×5)
result_one_balle_Type_entrant:{'day': '26082025', 'map_nb_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'map_time_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'duration': 35.0, 'nb_balles_papier': 0, 'begin_time_port': 'IMG_20250826_113953.jpg'}
Production hashtag (incorrect ponderation at 20-10-18) : 0
ERROR missing amount info (×7)
result_one_balle_Type_entrant:{'day': '26082025', 'map_nb_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'map_time_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'duration': 131.0, 'nb_balles_papier': 0, 'begin_time_port': 'IMG_20250826_122035.jpg'}
Production hashtag (incorrect ponderation at 20-10-18) : 0
We have rejected 0 photos because of the batch_size condition !
NUMBER BATCH list_of_portfolios_to_create : 7
Caught exception ! Connect or reconnect !
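The SPLIT TIME step above turns a time-ordered photo list into index batches such as `[[0, 1, 2, 3], [4, 5, 6, 7], ...]`, with `begin_time_port` holding the first photo name of each batch. A hedged sketch of that grouping, assuming `IMG_YYYYMMDD_HHMMSS.jpg` file names as seen in the log and an illustrative gap threshold (the function name and threshold are not from the Velours code):

```python
from datetime import datetime

def split_time(photo_names, max_gap_s=600):
    # Start a new batch whenever two consecutive photos are more than
    # max_gap_s seconds apart. Name format and threshold are assumptions.
    if not photo_names:
        return []
    def ts(name):
        return datetime.strptime(name, "IMG_%Y%m%d_%H%M%S.jpg")
    batches, current = [], [0]
    for i in range(1, len(photo_names)):
        gap = (ts(photo_names[i]) - ts(photo_names[i - 1])).total_seconds()
        if gap > max_gap_s:
            batches.append(current)
            current = []
        current.append(i)
    batches.append(current)
    return batches
```

Returning index batches rather than names mirrors the `list printed:` output above, where each inner list is one "balle" of consecutive photos.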
list_same_port_ids : [] (×7)
batch 1
Loaded 1 chid ids of type : 5014
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case
We could keep that dependence; it is better to keep an order compatible with the step ids when there are no sons, i.e. a lexical order : (number_son, step_id)
All sons are already in current list ! (×10)
DONE and to test : checkNoCycle !
Here we check the consistency of the inputs/outputs number between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 13156 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 13160 mask_detect has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 13160 mask_detect is not consistent : 3 used against 2 in the step definition !
WARNING : number of inputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 13157 crop_condition is not consistent : 4 used against 2 in the step definition !
WARNING : number of outputs for step 13157 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of outputs for step 13161 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13159 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 13159 final has fewer outputs used (1) than in the step definition (2) : some outputs may be not used !
Number of inputs / outputs for each step checked !
Here we check the consistency of the outputs/inputs types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 13156 doesn't seem to be defined in the database
WARNING : type of input 2 of step 13157 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13161 doesn't seem to be defined in the database
WARNING : type of input 3 of step 13159 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 2 of step 13157 doesn't seem to be defined in the database
WARNING : output 0 of step 13163 has datatype=6 whereas input 2 of step 13157 has datatype=None
WARNING : type of input 2 of step 13157 doesn't seem to be defined in the database
WARNING : output 0 of step 13162 has datatype=6 whereas input 2 of step 13157 has datatype=None
We ignore checkConsistencyTypeOutputInput for datou_step final !
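The graph check above builds the step tree, verifies it is acyclic (checkNoCycle), and falls back to a lexical (number_son, step_id) order when steps have no dependence between them. A hedged sketch of both operations, using an assumed `sons` dict-of-lists representation rather than the Velours internals:

```python
def check_no_cycle(sons):
    # Depth-first search over the step graph; sons maps step_id -> [son ids].
    state = {}  # step_id -> 0 while visiting, 1 when done

    def visit(node):
        if state.get(node) == 0:
            raise ValueError("cycle detected at step %s" % node)
        if state.get(node) == 1:
            return
        state[node] = 0
        for son in sons.get(node, []):
            visit(son)
        state[node] = 1

    for node in sons:
        visit(node)

def reorder_steps(sons):
    # Order steps by the lexical key (number_son, step_id) mentioned in
    # the log: steps without sons first, ties broken by step id.
    check_no_cycle(sons)
    return sorted(sons, key=lambda sid: (len(sons.get(sid, [])), sid))
```

The lexical key gives a deterministic order even when the dependency information alone does not constrain it, which matches the fallback described in the log.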
WARNING : output 0 of step 13161 have datatype=10 whereas input 3 of step 13164 have datatype=6 WARNING : output 0 of step 13161 have datatype=10 whereas input 0 of step 13166 have datatype=18 WARNING : type of input 5 of step 13164 doesn't seem to be define in the database( WARNING : output 0 of step 13166 have datatype=11 whereas input 5 of step 13164 have datatype=None WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 1 of step 13158 have datatype=7 whereas input 2 of step 13157 have datatype=None WARNING : type of output 3 of step 13157 doesn't seem to be define in the database( WARNING : type of input 3 of step 13161 doesn't seem to be define in the database( WARNING : type of output 2 of step 13160 doesn't seem to be define in the database( WARNING : type of input 1 of step 13163 doesn't seem to be define in the database( WARNING : type of output 2 of step 13160 doesn't seem to be define in the database( WARNING : type of input 1 of step 13162 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26291504 AND mptpi.`type`=4855 To do batch 1 Loaded 2 chid ids of type : 5014 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! 
We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13156 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 13160 mask_detect have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 13160 mask_detect is not consistent : 3 used against 2 in the step definition ! WARNING : number of inputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13157 crop_condition is not consistent : 4 used against 2 in the step definition ! WARNING : number of outputs for step 13157 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of outputs for step 13161 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13159 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! 
Step 13159 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 13156 doesn't seem to be define in the database( WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13161 doesn't seem to be define in the database( WARNING : type of input 3 of step 13159 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 0 of step 13163 have datatype=6 whereas input 2 of step 13157 have datatype=None WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 0 of step 13162 have datatype=6 whereas input 2 of step 13157 have datatype=None We ignore checkConsistencyTypeOutputInput for datou_step final ! 
WARNING : output 0 of step 13161 have datatype=10 whereas input 3 of step 13164 have datatype=6 WARNING : output 0 of step 13161 have datatype=10 whereas input 0 of step 13166 have datatype=18 WARNING : type of input 5 of step 13164 doesn't seem to be define in the database( WARNING : output 0 of step 13166 have datatype=11 whereas input 5 of step 13164 have datatype=None WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 1 of step 13158 have datatype=7 whereas input 2 of step 13157 have datatype=None WARNING : type of output 3 of step 13157 doesn't seem to be define in the database( WARNING : type of input 3 of step 13161 doesn't seem to be define in the database( WARNING : type of output 2 of step 13160 doesn't seem to be define in the database( WARNING : type of input 1 of step 13163 doesn't seem to be define in the database( WARNING : type of output 2 of step 13160 doesn't seem to be define in the database( WARNING : type of input 1 of step 13162 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26291505 AND mptpi.`type`=4855 To do batch 1 Loaded 2 chid ids of type : 5014 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! 
We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13156 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 13160 mask_detect have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 13160 mask_detect is not consistent : 3 used against 2 in the step definition ! WARNING : number of inputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13157 crop_condition is not consistent : 4 used against 2 in the step definition ! WARNING : number of outputs for step 13157 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of outputs for step 13161 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13159 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! 
Step 13159 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 13156 doesn't seem to be define in the database( WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13161 doesn't seem to be define in the database( WARNING : type of input 3 of step 13159 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 0 of step 13163 have datatype=6 whereas input 2 of step 13157 have datatype=None WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 0 of step 13162 have datatype=6 whereas input 2 of step 13157 have datatype=None We ignore checkConsistencyTypeOutputInput for datou_step final ! 
WARNING : output 0 of step 13161 have datatype=10 whereas input 3 of step 13164 have datatype=6 WARNING : output 0 of step 13161 have datatype=10 whereas input 0 of step 13166 have datatype=18 WARNING : type of input 5 of step 13164 doesn't seem to be define in the database( WARNING : output 0 of step 13166 have datatype=11 whereas input 5 of step 13164 have datatype=None WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 1 of step 13158 have datatype=7 whereas input 2 of step 13157 have datatype=None WARNING : type of output 3 of step 13157 doesn't seem to be define in the database( WARNING : type of input 3 of step 13161 doesn't seem to be define in the database( WARNING : type of output 2 of step 13160 doesn't seem to be define in the database( WARNING : type of input 1 of step 13163 doesn't seem to be define in the database( WARNING : type of output 2 of step 13160 doesn't seem to be define in the database( WARNING : type of input 1 of step 13162 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26291506 AND mptpi.`type`=4855 To do batch 1 Loaded 3 chid ids of type : 5014 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! 
We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13156 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 13160 mask_detect have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 13160 mask_detect is not consistent : 3 used against 2 in the step definition ! WARNING : number of inputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13157 crop_condition is not consistent : 4 used against 2 in the step definition ! WARNING : number of outputs for step 13157 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of outputs for step 13161 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13159 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! 
Step 13159 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 13156 doesn't seem to be define in the database( WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13161 doesn't seem to be define in the database( WARNING : type of input 3 of step 13159 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 0 of step 13163 have datatype=6 whereas input 2 of step 13157 have datatype=None WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 0 of step 13162 have datatype=6 whereas input 2 of step 13157 have datatype=None We ignore checkConsistencyTypeOutputInput for datou_step final ! 
WARNING : output 0 of step 13161 have datatype=10 whereas input 3 of step 13164 have datatype=6 WARNING : output 0 of step 13161 have datatype=10 whereas input 0 of step 13166 have datatype=18 WARNING : type of input 5 of step 13164 doesn't seem to be define in the database( WARNING : output 0 of step 13166 have datatype=11 whereas input 5 of step 13164 have datatype=None WARNING : type of input 2 of step 13157 doesn't seem to be define in the database( WARNING : output 1 of step 13158 have datatype=7 whereas input 2 of step 13157 have datatype=None WARNING : type of output 3 of step 13157 doesn't seem to be define in the database( WARNING : type of input 3 of step 13161 doesn't seem to be define in the database( WARNING : type of output 2 of step 13160 doesn't seem to be define in the database( WARNING : type of input 1 of step 13163 doesn't seem to be define in the database( WARNING : type of output 2 of step 13160 doesn't seem to be define in the database( WARNING : type of input 1 of step 13162 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26291507 AND mptpi.`type`=4855 To do batch 1 Loaded 3 chid ids of type : 5014 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! 
We currently have an error because there is no dependency between the last steps for the tile - detect - glue case. We can either keep the dependency, but it is better to keep an order compatible with the step ids if we do not have sons, so a lexical order : (number_son, step_id)
DONE and to test : checkNoCycle !
Here we check the consistency of the inputs/outputs number between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : step 0 init_dummy_multi_datou is not linked in the step_by_step architecture !
WARNING : step 1294 init_dummy_multi_datou is not linked in the step_by_step architecture !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
DataTypes for each output/input checked !
TODO
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26291508
To do batch 1
Loaded 3 chid ids of type : 5014
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependency between the last steps for the tile - detect - glue case. We can either keep the dependency, but it is better to keep an order compatible with the step ids if we do not have sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the inputs/outputs number between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 13156 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 13160 mask_detect has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 13160 mask_detect is not consistent : 3 used against 2 in the step definition !
WARNING : number of inputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 13157 crop_condition is not consistent : 4 used against 2 in the step definition !
WARNING : number of outputs for step 13157 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of outputs for step 13161 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13159 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 13159 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
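The reordering the log describes — parents before their sons, ties among ready steps broken lexically by (number_son, step_id), and a cycle check (checkNoCycle) — can be sketched roughly as below. This is a minimal sketch assuming a simple `step_id -> sons` adjacency dict; the real datou structures are not shown in the log.

```python
import heapq

def reorder_steps(steps, sons):
    """Order steps so every parent precedes its sons.

    steps : iterable of step ids
    sons  : dict mapping step_id -> list of child step ids
    Ties among ready steps are broken lexically by
    (number_of_sons, step_id), as the log messages describe.
    """
    indegree = {s: 0 for s in steps}
    for parent, children in sons.items():
        for child in children:
            indegree[child] += 1

    # steps with no pending parents, keyed by (number_of_sons, step_id)
    heap = [(len(sons.get(s, [])), s) for s in steps if indegree[s] == 0]
    heapq.heapify(heap)

    order = []
    while heap:
        _, step = heapq.heappop(heap)
        order.append(step)
        for child in sons.get(step, []):
            indegree[child] -= 1
            if indegree[child] == 0:
                heapq.heappush(heap, (len(sons.get(child, [])), child))

    # a leftover step means a cycle, i.e. checkNoCycle would fail
    if len(order) != len(indegree):
        raise ValueError("cycle detected in the step graph")
    return order
```

With a linear chain such as 13156 -> 13157 -> 13158, the function returns the steps in dependency order regardless of their order in the input list.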
WARNING : type of output 2 of step 13156 does not seem to be defined in the database
WARNING : type of input 2 of step 13157 does not seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13161 does not seem to be defined in the database
WARNING : type of input 3 of step 13159 does not seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 2 of step 13157 does not seem to be defined in the database
WARNING : output 0 of step 13163 has datatype=6 whereas input 2 of step 13157 has datatype=None
WARNING : type of input 2 of step 13157 does not seem to be defined in the database
WARNING : output 0 of step 13162 has datatype=6 whereas input 2 of step 13157 has datatype=None
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 13161 has datatype=10 whereas input 3 of step 13164 has datatype=6
WARNING : output 0 of step 13161 has datatype=10 whereas input 0 of step 13166 has datatype=18
WARNING : type of input 5 of step 13164 does not seem to be defined in the database
WARNING : output 0 of step 13166 has datatype=11 whereas input 5 of step 13164 has datatype=None
WARNING : type of input 2 of step 13157 does not seem to be defined in the database
WARNING : output 1 of step 13158 has datatype=7 whereas input 2 of step 13157 has datatype=None
WARNING : type of output 3 of step 13157 does not seem to be defined in the database
WARNING : type of input 3 of step 13161 does not seem to be defined in the database
WARNING : type of output 2 of step 13160 does not seem to be defined in the database
WARNING : type of input 1 of step 13163 does not seem to be defined in the database
WARNING : type of output 2 of step 13160 does not seem to be defined in the database
WARNING : type of input 1 of step 13162 does not seem to be defined in the database
DataTypes for each output/input checked !
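A check in the spirit of checkConsistencyTypeOutputInput, as suggested by the warnings above, might look like the sketch below. The connection and datatype structures are assumptions made for illustration; the actual datou code is not shown in the log.

```python
def check_type_output_input(connections, datatypes, verbose=True):
    """Compare the datatype of each connected output/input pair.

    connections : list of (src_step, out_idx, dst_step, in_idx)
    datatypes   : dict (step_id, 'output'|'input', idx) -> datatype id,
                  or None / missing when undefined in the database
    Returns the list of mismatched connections.
    """
    mismatches = []
    for src, out_idx, dst, in_idx in connections:
        out_type = datatypes.get((src, 'output', out_idx))
        in_type = datatypes.get((dst, 'input', in_idx))
        if out_type is None and verbose:
            print(f"WARNING : type of output {out_idx} of step {src} "
                  f"does not seem to be defined in the database")
        if in_type is None and verbose:
            print(f"WARNING : type of input {in_idx} of step {dst} "
                  f"does not seem to be defined in the database")
        if out_type != in_type:
            mismatches.append((src, out_idx, dst, in_idx, out_type, in_type))
    return mismatches
```

For example, output 0 of step 13163 carrying datatype 6 into input 2 of step 13157 whose datatype is undefined would be reported as a mismatch (6 vs None), matching the shape of the warnings above.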
TODO
Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26291509 AND mptpi.`type`=4855
To do batch 1
Loaded 1 chid ids of type : 5014
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependency between the last steps for the tile - detect - glue case. We can either keep the dependency, but it is better to keep an order compatible with the step ids if we do not have sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of the inputs/outputs number between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 13156 mask_detect is not consistent : 3 used against 2 in the step definition !
Step 13160 mask_detect has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 13160 mask_detect is not consistent : 3 used against 2 in the step definition !
WARNING : number of inputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13158 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 13157 crop_condition is not consistent : 4 used against 2 in the step definition !
WARNING : number of outputs for step 13157 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of outputs for step 13161 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13159 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 13159 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 13156 does not seem to be defined in the database
WARNING : type of input 2 of step 13157 does not seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13161 does not seem to be defined in the database
WARNING : type of input 3 of step 13159 does not seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of input 2 of step 13157 does not seem to be defined in the database
WARNING : output 0 of step 13163 has datatype=6 whereas input 2 of step 13157 has datatype=None
WARNING : type of input 2 of step 13157 does not seem to be defined in the database
WARNING : output 0 of step 13162 has datatype=6 whereas input 2 of step 13157 has datatype=None
We ignore checkConsistencyTypeOutputInput for datou_step final !
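The count check the log keeps running (checkConsistencyNbInputNbOutput) distinguishes two situations: more connections used than the step definition declares is a WARNING, while fewer used is only a note (optional inputs, or unused outputs). A hypothetical per-step version of that rule, with names invented for illustration, could be:

```python
def check_nb_input_nb_output(step_id, name, used_in, used_out, def_in, def_out):
    """Return the messages for one step, mirroring the log's convention:
    used > defined is a WARNING, used < defined is only an informative note."""
    messages = []
    for kind, used, defined in (("inputs", used_in, def_in),
                                ("outputs", used_out, def_out)):
        if used > defined:
            messages.append(
                f"WARNING : number of {kind} for step {step_id} {name} is not "
                f"consistent : {used} used against {defined} in the step definition")
        elif used < defined:
            reason = ("maybe we manage optional inputs" if kind == "inputs"
                      else "some outputs may not be used")
            messages.append(
                f"Step {step_id} {name} has fewer {kind} used ({used}) than in "
                f"the step definition ({defined}) : {reason}")
    return messages
```

Fed with the numbers from the log (e.g. step 13157 crop_condition using 4 inputs against 2 defined), it reproduces the WARNING lines; step 13159 final, which uses fewer connections than defined, only gets the softer notes.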
WARNING : output 0 of step 13161 has datatype=10 whereas input 3 of step 13164 has datatype=6
WARNING : output 0 of step 13161 has datatype=10 whereas input 0 of step 13166 has datatype=18
WARNING : type of input 5 of step 13164 does not seem to be defined in the database
WARNING : output 0 of step 13166 has datatype=11 whereas input 5 of step 13164 has datatype=None
WARNING : type of input 2 of step 13157 does not seem to be defined in the database
WARNING : output 1 of step 13158 has datatype=7 whereas input 2 of step 13157 has datatype=None
WARNING : type of output 3 of step 13157 does not seem to be defined in the database
WARNING : type of input 3 of step 13161 does not seem to be defined in the database
WARNING : type of output 2 of step 13160 does not seem to be defined in the database
WARNING : type of input 1 of step 13163 does not seem to be defined in the database
WARNING : type of output 2 of step 13160 does not seem to be defined in the database
WARNING : type of input 1 of step 13162 does not seem to be defined in the database
DataTypes for each output/input checked !
TODO
Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
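The SELECT on MTRPhoto.mtr_port_to_port_ids that the log repeats above interpolates the portfolio id and type directly into the SQL string, and one of the queries omits the type clause. A parameterized variant would be safer; the sketch below only builds the (sql, params) pair so it stays self-contained, with the table and column names taken from the log and everything else assumed.

```python
# Column and table names come from the queries in the log above;
# the connection handling itself is not shown there and is assumed.
PORT_TO_PORT_QUERY = """\
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2,
       mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id,
       mptpi.created_at, mptpi.updated_at,
       mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id = mptpi.hashtag_id
  AND mptpi.mtr_portfolio_id_1 = %s"""

def build_query(portfolio_id, link_type=None):
    """Return (sql, params); the type clause is optional, matching the
    log where one query has no `type`=4855 filter."""
    sql, params = PORT_TO_PORT_QUERY, [portfolio_id]
    if link_type is not None:
        sql += "\n  AND mptpi.type = %s"
        params.append(link_type)
    return sql, tuple(params)

# With MySQLdb (imported as the log shows) this would run as:
#   cursor.execute(*build_query(26291507, 4855))
```

Letting the driver bind `%s` placeholders avoids quoting issues and keeps a single query text for both the filtered and unfiltered cases.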
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=26291510 AND mptpi.`type`=4855
To do
elapsed_time : count_nb_balles_and_create_portfolio 7.487907409667969
# DISPLAY ALL COLLECTED DATA : {'26082025': {'nb_upload': 39, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}}
------ End of copy-paste ------
---------- ONE RESULT ---------
([[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11, 12, 13], [14, 15, 16, 17, 18], [19, 20, 21, 22, 23, 24, 25, 26], [27, 28, 29, 30, 31], [32, 33, 34, 35, 36, 37, 38]], {'Rungis_entrant': [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)]}, {26291504: {'list_of_photos': [1379844813, 1379844831, 1379844865, 1379844884], 'hashtag': 'entrant'}, 26291505: {'list_of_photos': [1379844911, 1379844924, 1379845112, 1379845121], 'hashtag': 'entrant'}, 26291506: {'list_of_photos': [1379845123, 1379845128, 1379845131, 1379845132, 1379845296, 1379845298], 'hashtag': 'entrant'}, 26291507: {'list_of_photos': [1379845302, 1379845304, 1379845305, 1379845307, 1379845325], 'hashtag': 'entrant'}, 26291508: {'list_of_photos': [1379845328, 1379845354, 1379845366, 1379845377, 1379845378, 1379845424, 1379845427, 1379845430], 'hashtag': 'entrant'}, 26291509: {'list_of_photos': [1379845431, 1379845432, 1379845433, 1379845453, 1379845456], 'hashtag': 'entrant'}, 26291510: {'list_of_photos': [1379845459, 1379845462, 1379845464, 1379845465, 1379845508, 1379845512, 1379845514], 'hashtag': 'entrant'}}, {2107760258: 39}, {'amount_uploaded_and_tagged': {'26082025': {'nb_upload': 39, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}}, 'map_amount_per_hashtag': {'Rungis_entrant': [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)]}, 'count': {'Rungis_entrant':
[(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)]}})
---------- END of ONE RESULT ----------
Deleting the downloaded photos
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : split_time_score_with_photo, we use saveGeneral
[1379845514, 1379845512, 1379845508, 1379845465, 1379845464, 1379845462, 1379845459, 1379845456, 1379845453, 1379845433, 1379845432, 1379845431, 1379845430, 1379845427, 1379845424, 1379845378, 1379845377, 1379845366, 1379845354, 1379845328, 1379845325, 1379845307, 1379845305, 1379845304, 1379845302, 1379845298, 1379845296, 1379845132, 1379845131, 1379845128, 1379845123, 1379845121, 1379845112, 1379844924, 1379844911, 1379844884, 1379844865, 1379844831, 1379844813]
Looping around the photos to save general results
len of output : 5
/[[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11, 12, 13], [14, 15, 16, 17, 18], [19, 20, 21, 22, 23, 24, 25, 26], [27, 28, 29, 30, 31], [32, 33, 34, 35, 36, 37, 38]]
/{'Rungis_entrant': [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)]}
/{26291504: {'list_of_photos': [1379844813, 1379844831, 1379844865, 1379844884], 'hashtag': 'entrant'}, 26291505: {'list_of_photos': [1379844911, 1379844924, 1379845112, 1379845121], 'hashtag': 'entrant'}, 26291506: {'list_of_photos': [1379845123, 1379845128, 1379845131, 1379845132, 1379845296, 1379845298], 'hashtag': 'entrant'}, 26291507: {'list_of_photos': [1379845302, 1379845304, 1379845305, 1379845307, 1379845325], 'hashtag': 'entrant'}, 26291508: {'list_of_photos': [1379845328, 1379845354, 1379845366, 1379845377, 1379845378, 1379845424, 1379845427, 1379845430], 'hashtag': 'entrant'}, 26291509: {'list_of_photos': [1379845431, 1379845432, 1379845433, 1379845453, 1379845456], 'hashtag': 'entrant'}, 26291510: {'list_of_photos': [1379845459, 1379845462, 1379845464, 1379845465, 1379845508, 1379845512, 1379845514], 'hashtag': 'entrant'}}
/{2107760258: 39}
/{'amount_uploaded_and_tagged': {'26082025': {'nb_upload': 39,
'nb_taggue_class': 0, 'nb_taggue_densite': 0}}, 'map_amount_per_hashtag': {'Rungis_entrant': [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)]}, 'count': {'Rungis_entrant': [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)]}} before output type Managing all output in save final without adding information in the mtr_datou_result ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845514', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845512', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845508', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845465', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845464', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845462', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845459', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845456', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845453', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845433', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845432', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845431', None, None, None, None, None, '3623671') 
('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845430', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845427', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845424', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845378', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845377', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845366', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845354', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845328', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845325', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845307', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845305', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845304', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845302', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845298', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') 
('4746', '26290613', '1379845296', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845132', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845131', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845128', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845123', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845121', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845112', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844924', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844911', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844884', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844865', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844831', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844813', None, None, None, None, None, '3623671') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 39 time used for this insertion : 0.14490103721618652 save_final save missing photos in datou_result : time spend for datou_step_exec : 11.02710747718811 time spend to save 
output : 0.1452953815460205
total time spent for step 2 : 11.17240285873413
step3:launch_next_datou_same_portfolio Thu Aug 28 12:10:32 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but we keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
VR 22-3-18 : for now we do not clean the datou structure correctly
Number of photos for current instance 39 (0 missing) for 39 photos in portfolios
Datou 4746 on portfolios [26290613] is finished : launching datou 4148 on portfolios [26290613]
new current id 3623819 on portfolio 26290613 for datou 4148
Inside saveOutput : final : True verbose : 0
saveOutput not yet implemented for datou_step.type : launch_next_datou_same_portfolio, we use saveGeneral
[1379845514, 1379845512, 1379845508, 1379845465, 1379845464, 1379845462, 1379845459, 1379845456, 1379845453, 1379845433, 1379845432, 1379845431, 1379845430, 1379845427, 1379845424, 1379845378, 1379845377, 1379845366, 1379845354, 1379845328, 1379845325, 1379845307, 1379845305, 1379845304, 1379845302, 1379845298, 1379845296, 1379845132, 1379845131, 1379845128, 1379845123, 1379845121, 1379845112, 1379844924, 1379844911, 1379844884, 1379844865, 1379844831, 1379844813]
Looping around the photos to save general results
len of output : 0
before output type
Managing all output in save final without adding information in the
mtr_datou_result ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845514', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845512', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845508', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845465', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845464', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845462', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845459', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845456', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845453', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845433', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845432', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845431', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845430', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845427', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, 
'3623671') ('4746', '26290613', '1379845424', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845378', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845377', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845366', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845354', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845328', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845325', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845307', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845305', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845304', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845302', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845298', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845296', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845132', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845131', None, None, None, 
None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845128', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845123', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845121', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379845112', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844924', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844911', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844884', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844865', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844831', None, None, None, None, None, '3623671') ('4746', None, None, None, None, None, None, None, '3623671') ('4746', '26290613', '1379844813', None, None, None, None, None, '3623671') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 39 time used for this insertion : 0.16783595085144043 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.02736186981201172 time spend to save output : 0.1681663990020752 total time spend for step 3 : 0.19552826881408691 caffe_path_current : About to save ! 2 After save, about to update current ! 
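The log reports inserting list_values (39 rows) into mtr_datou_result in about 0.15 s. Batching the rows through a single executemany plus one commit is one plausible way to get that behavior; the sketch below uses sqlite3 so it is self-contained, the column names are illustrative (the real mtr_datou_result schema is not shown in the log), and with MySQLdb the placeholder style would be %s instead of ?.

```python
import sqlite3
import time

def insert_datou_results(conn, list_values):
    """Insert all rows in one batch: one executemany + one commit is far
    cheaper than a round-trip per row.  Column names are illustrative."""
    t0 = time.time()
    conn.executemany(
        "INSERT INTO mtr_datou_result "
        "(datou_id, mtr_portfolio_id, photo_id, datou_cur_id) "
        "VALUES (?, ?, ?, ?)",
        list_values)
    conn.commit()
    return time.time() - t0  # "time used for this insertion"

# demo on an in-memory database with two of the rows from the log
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mtr_datou_result "
             "(datou_id, mtr_portfolio_id, photo_id, datou_cur_id)")
elapsed = insert_datou_results(
    conn, [("4746", "26290613", "1379845514", "3623671"),
           ("4746", "26290613", "1379845512", "3623671")])
```

Keeping the whole batch in one transaction is also what makes the "length of list_values ... : 39" insert cheap: autocommitting row by row would pay the transaction overhead 39 times.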
ret : 2
len(input) + len(total_photo_id_missing) : 39
set_done_treatment
37.77user 100.96system 2:32.28elapsed 91%CPU (0avgtext+0avgdata 5692824maxresident)k
1017584inputs+257296outputs (10542major+9609838minor)pagefaults 0swaps
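The per-step timing lines above ("time spend for datou_step_exec", "time spend to save output", "total time spend for step N") suggest each phase is timed separately and summed. A small context manager, entirely hypothetical but in the same spirit, can accumulate such phase timings:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label, totals):
    """Accumulate wall-clock time per phase into `totals`, mirroring the
    per-step 'time spent for ...' lines in the log (names assumed)."""
    t0 = time.perf_counter()
    try:
        yield
    finally:
        totals[label] = totals.get(label, 0.0) + time.perf_counter() - t0

totals = {}
with timed("datou_step_exec", totals):
    time.sleep(0.01)   # stand-in for the step execution
with timed("save_output", totals):
    time.sleep(0.01)   # stand-in for saving the output
total_step = sum(totals.values())
```

The total for a step is then just the sum over its phases, which matches how the logged "total time" slightly exceeds the two phase timings it follows.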