python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 491570 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['2712765'] with mtr_portfolio_ids : ['21943741'] and first list_photo_ids : [] new path : /proc/491570/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , BFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 8 ; length of list_pids : 8 ; length of list_args : 8 time to download the photos : 1.474721908569336 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Tue Apr 1 12:00:31 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 10372 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-04-01 12:00:34.555862: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-04-01 12:00:34.583178: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493065000 Hz 2025-04-01 12:00:34.585361: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f0598000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-04-01 12:00:34.585423: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-04-01 12:00:34.589648: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-04-01 12:00:34.751488: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1563f790 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-04-01 12:00:34.751549: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-04-01 12:00:34.753105: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-04-01 12:00:34.753946: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-04-01 12:00:34.759059: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-04-01 12:00:34.762652: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-04-01 12:00:34.763416: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-04-01 12:00:34.767188: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-04-01 12:00:34.768758: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-04-01 12:00:34.775647: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-04-01 12:00:34.777732: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-04-01 12:00:34.777978: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-04-01 12:00:34.779235: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-04-01 12:00:34.779317: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-04-01 12:00:34.779331: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-04-01 12:00:34.781862: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 8750 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-04-01 12:00:35.156467: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-04-01 12:00:35.156583: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-04-01 12:00:35.156613: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-04-01 12:00:35.156638: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-04-01 12:00:35.156661: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-04-01 12:00:35.156683: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-04-01 12:00:35.156706: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-04-01 12:00:35.156729: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-04-01 12:00:35.158826: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-04-01 12:00:35.160530: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-04-01 12:00:35.160617: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-04-01 12:00:35.160643: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-04-01 12:00:35.160665: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-04-01 12:00:35.160688: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-04-01 12:00:35.160710: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-04-01 12:00:35.160733: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-04-01 12:00:35.160756: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-04-01 12:00:35.162581: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-04-01 12:00:35.162632: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-04-01 12:00:35.162648: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-04-01 12:00:35.162659: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-04-01 12:00:35.164316: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 8750 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-04-01 12:00:45.123665: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-04-01 12:00:45.357090: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-04-01 12:00:49.493556: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:49.494496: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.60G (3865470464 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:49.495165: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.24G (3478923264 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:49.495846: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.92G (3131030784 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:49.496459: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.62G (2817927680 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.613616: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.614559: W tensorflow/core/common_runtime/bfc_allocator.cc:311] Garbage collection: deallocate free memory regions (i.e., allocations) so that we can re-allocate a larger region to avoid OOM due to memory fragmentation. If you see this message frequently, you are running near the threshold of the available device memory and re-allocation may incur great performance overhead. You may try smaller batch sizes to observe the performance impact. Set TF_ENABLE_GPU_GARBAGE_COLLECTION=false if you'd like to disable this feature. 2025-04-01 12:00:50.675353: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.675473: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.67GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.676720: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.676750: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.67GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.704525: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.704582: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 3.29GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.705197: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.705249: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 3.29GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.725662: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.725695: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.78GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.726305: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.726321: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.78GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.790976: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.791046: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 19.91MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.791064: W tensorflow/core/kernels/gpu_utils.cc:49] Failed to allocate memory for convolution redzone checking; skipping this check. This is benign and only means that we won't check cudnn for out-of-bounds reads and writes. This message will only be printed once. 2025-04-01 12:00:50.792140: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.792171: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 16.00MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.793319: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.793351: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 16.00MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.800993: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.801043: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 63.85MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-04-01 12:00:50.802071: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.803144: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.803811: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.811858: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.812494: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.841050: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.841732: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.842371: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.843017: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.847445: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.848131: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.848756: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.849368: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.852035: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.860981: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.861626: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.871627: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.872329: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.872971: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.873559: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.874150: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-04-01 12:00:50.874769: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 8 NEW PHOTO Processing 1 images image shape: (2160, 3264, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3264.00000 nb d'objets trouves : 27 NEW PHOTO Processing 1 images image shape: (2160, 3264, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3264.00000 nb d'objets trouves : 41 NEW PHOTO Processing 1 images image shape: (2160, 3264, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3264.00000 nb d'objets trouves : 50 NEW PHOTO Processing 1 images image shape: (2160, 3264, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3264.00000 nb d'objets trouves : 48 NEW PHOTO Processing 1 images image shape: (2160, 3264, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3264.00000 nb d'objets trouves : 17 NEW PHOTO Processing 1 images image shape: (2160, 3264, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3264.00000 nb d'objets trouves : 32 NEW PHOTO Processing 1 images image shape: (2160, 3264, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3264.00000 nb d'objets trouves : 36 NEW PHOTO Processing 1 images image shape: (2160, 3264, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 3264.00000 nb d'objets trouves : 34 Detection mask done ! Trying to reset tf kernel 494384 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 1831 tf kernel not reseted sub process len(results) : 8 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 8 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 5728 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.0012402534484863281 nb_pixel_total : 36084 time to create 1 rle with old method : 0.0419924259185791 length of segment : 221 time for calcul the mask position with numpy : 0.0017197132110595703 nb_pixel_total : 58170 time to create 1 rle with old method : 0.0664680004119873 length of segment : 277 time for calcul the mask position with numpy : 0.003086566925048828 nb_pixel_total : 94396 time to create 1 rle with old method : 0.11244845390319824 length of segment : 420 time for calcul the mask position with numpy : 0.00042510032653808594 nb_pixel_total : 14528 time to create 1 rle with old method : 0.017441272735595703 length of segment : 154 time for calcul the mask position with numpy : 0.0011134147644042969 nb_pixel_total : 38823 time to create 1 rle with old method : 0.04562258720397949 length of segment : 488 time for calcul the mask position with numpy : 0.00046062469482421875 nb_pixel_total : 10998 time to create 1 rle with old method : 0.0128326416015625 length of segment : 125 time for calcul the mask position with numpy : 0.001817941665649414 nb_pixel_total : 79725 time to create 1 rle with old method : 0.10647821426391602 length of segment : 528 time for calcul the mask position with numpy : 0.0006761550903320312 nb_pixel_total : 23608 time to create 1 rle with old method : 0.02721118927001953 length of segment : 411 time for calcul the mask position with numpy : 0.00037360191345214844 nb_pixel_total : 5472 time to create 1 rle with old method : 0.006516456604003906 length of segment : 226 time for calcul the mask position with numpy : 0.0004177093505859375 nb_pixel_total : 14734 time to create 1 rle with old method : 0.017437458038330078 length of segment : 155 time for calcul the mask position with numpy : 0.0024154186248779297 nb_pixel_total : 96693 time to create 1 rle with old method : 0.11173844337463379 length of segment : 259 time for calcul the mask position with numpy : 0.0028197765350341797 nb_pixel_total : 94533 time to create 1 rle with old method : 0.10968947410583496 length of segment : 305 time for calcul the mask position with numpy : 0.0004391670227050781 nb_pixel_total : 16927 time to create 1 rle with old method : 0.01961660385131836 length of segment : 148 time for calcul the mask position with numpy : 0.005286216735839844 nb_pixel_total : 132001 time to create 1 rle with old method : 0.16134333610534668 length of segment : 669 time for calcul the mask position with numpy : 0.0019412040710449219 nb_pixel_total : 76769 time to create 1 rle with old method : 0.09592080116271973 length of segment : 288 time for calcul the mask position with numpy : 0.0019419193267822266 nb_pixel_total : 75040 time to create 1 rle with old method : 0.08625507354736328 length of segment : 177 time for calcul the mask position with numpy : 0.0015447139739990234 nb_pixel_total : 86978 time to create 1 rle with old method : 0.1246786117553711 length of segment : 223 time for calcul the mask position with numpy : 0.0006716251373291016 nb_pixel_total : 18778 time to create 1 rle with old method : 0.02134227752685547 length of segment : 315 time for calcul the mask position with numpy : 0.0005450248718261719 nb_pixel_total : 14161 time to create 1 rle with old method : 0.01639533042907715 length of segment : 214 time for calcul the mask position with numpy : 0.000982522964477539 nb_pixel_total : 39889 time to create 1 rle with old method : 0.04677319526672363 length of segment : 145 time for calcul the mask position with numpy : 0.0008327960968017578 nb_pixel_total : 46499 time to create 1 rle with old method : 0.05398130416870117 length of segment : 176 time for calcul the mask position with numpy : 0.0004107952117919922 nb_pixel_total : 24167 time to create 1 rle with old method : 0.028959989547729492 length of segment : 159 time for calcul the mask position with numpy : 0.000301361083984375 nb_pixel_total : 6711 time to create 1 rle with old method : 0.007915019989013672 length of segment : 168 time for calcul the mask position with numpy : 0.0005996227264404297 nb_pixel_total : 26621 time to create 1 rle with old method : 0.0306704044342041 length of segment : 282 time for calcul the mask position with numpy : 0.0004291534423828125 nb_pixel_total : 12560 time to create 1 rle with old method : 0.014934062957763672 length of segment : 95 time for calcul the mask position with numpy : 0.0010042190551757812 nb_pixel_total : 47909 time to create 1 rle with old method : 0.056075334548950195 length of segment : 234 time for calcul the mask position with numpy : 0.0023589134216308594 nb_pixel_total : 96446 time to create 1 rle with old method : 0.138624906539917 length of segment : 306 time for calcul the mask position with numpy : 0.0007698535919189453 nb_pixel_total : 22063 time to create 1 rle with old method : 0.02584075927734375 length of segment : 629 time for calcul the mask position with numpy : 0.004532814025878906 nb_pixel_total : 71612 time to create 1 rle with old method : 0.0866079330444336 length of segment : 478 time for calcul the mask position with numpy : 0.0040318965911865234 nb_pixel_total : 52750 time to create 1 rle with old method : 0.063812255859375 length of segment : 528 time for calcul the mask position with numpy : 0.0013358592987060547 nb_pixel_total : 20154 time to create 1 rle with old method : 0.023662805557250977 length of segment : 326 time for calcul the mask position with numpy : 0.002637624740600586 nb_pixel_total : 48957 time to create 1 rle with old method : 0.056128501892089844 length of segment : 277 time for calcul the mask position with numpy : 0.0016357898712158203 nb_pixel_total : 33255 time to create 1 rle with old method : 0.03837275505065918 length of segment : 210 time for calcul the mask position with numpy : 0.0007421970367431641 nb_pixel_total : 12775 time to create 1 rle with old method : 0.01508951187133789 length of segment : 160 time for calcul the mask position with numpy : 0.0040435791015625 nb_pixel_total : 73181 time to create 1 rle with old method : 0.08744931221008301 length of segment : 440 time for calcul the mask position with numpy : 0.003239870071411133 nb_pixel_total : 114556 time to create 1 rle with old method : 0.1318073272705078 length of segment : 478 time for calcul the mask position with numpy : 0.0023207664489746094 nb_pixel_total : 39246 time to create 1 rle with old method : 0.04589271545410156 length of segment : 261 time for calcul the mask position with numpy : 0.0011355876922607422 nb_pixel_total : 29666 time to create 1 rle with old method : 0.03585219383239746 length of segment : 211 time for calcul the mask position with numpy : 0.0029773712158203125 nb_pixel_total : 40774 time to create 1 rle with old method : 0.04760479927062988 length of segment : 416 time for calcul the mask position with numpy : 0.011409997940063477 nb_pixel_total : 187309 time to create 1 rle with new method : 0.011855602264404297 length of segment : 477 time for calcul the mask position with numpy : 0.009209632873535156 nb_pixel_total : 145625 time to create 1 rle with old method : 0.16521000862121582 length of segment : 632 time for calcul the mask position with numpy : 0.001413583755493164 nb_pixel_total : 27246 time to create 1 rle with old method : 0.031170129776000977 length of segment : 372 time for calcul the mask position with numpy : 0.00027561187744140625 nb_pixel_total : 10970 time to create 1 rle with old method : 0.015126228332519531 length of segment : 81 time for calcul the mask position with numpy : 0.001470327377319336 nb_pixel_total : 30901 time to create 1 rle with old method : 0.03758406639099121 length of segment : 271 time for calcul the mask position with numpy : 0.0006773471832275391 nb_pixel_total : 21845 time to create 1 rle with old method : 0.025559663772583008 length of segment : 212 time for calcul the mask position with numpy : 0.00504302978515625 nb_pixel_total : 97613 time to create 1 rle with old method : 0.11008501052856445 length of segment : 361 time for calcul the mask position with numpy : 0.003321409225463867 nb_pixel_total : 39885 time to create 1 rle with old method : 0.04587960243225098 length of segment : 369 time for calcul the mask position with numpy : 0.003162384033203125 nb_pixel_total : 79268 time to create 1 rle with old method : 0.09467434883117676 length of segment : 240 time for calcul the mask position with numpy : 0.0002474784851074219 nb_pixel_total : 2593 time to create 1 rle with old method : 0.0031156539916992188 length of segment : 62 time for calcul the mask position with numpy : 0.00018596649169921875 nb_pixel_total : 3003 time to create 1 rle with old method : 0.0035948753356933594 length of segment : 74 time for calcul the mask position with numpy : 0.001638650894165039 nb_pixel_total : 76848 time to create 1 rle with old method : 0.11366128921508789 length of segment : 238 time for calcul the mask position with numpy : 0.0008502006530761719 nb_pixel_total : 20420 time to create 1 rle with old method : 0.023370742797851562 length of segment : 249 time for calcul the mask position with numpy : 0.0005404949188232422 nb_pixel_total : 24233 time to create 1 rle with old method : 0.0285947322845459 length of segment : 132 time for calcul the mask position with numpy : 0.0003237724304199219 nb_pixel_total : 12179 time to create 1 rle with old method : 0.014265298843383789 length of segment : 118 time for calcul the mask position with numpy : 0.0017778873443603516 nb_pixel_total : 18699 time to create 1 rle with old method : 0.023044586181640625 length of segment : 254 time for calcul the mask position with numpy : 0.001359701156616211 nb_pixel_total : 20030 time to create 1 rle with old method : 0.023095369338989258 length of segment : 238 time for calcul the mask position with numpy : 0.004775047302246094 nb_pixel_total : 95935 time to create 1 rle with old method : 0.1121823787689209 length of segment : 364 time for calcul the mask position with numpy : 0.001519918441772461 nb_pixel_total : 37973 time to create 1 rle with old method : 0.043770790100097656 length of segment : 265 time for calcul the mask position with numpy : 0.0018460750579833984 nb_pixel_total : 33058 time to create 1 rle with old method : 0.03783869743347168 length of segment : 273 time for calcul the mask position with numpy : 0.0006124973297119141 nb_pixel_total : 9517 time to create 1 rle with old method : 0.011185407638549805 length of segment : 131 time for calcul the mask position with numpy : 0.0002899169921875 nb_pixel_total : 12621 time to create 1 rle with old method : 0.014720678329467773 length of segment : 193 time for calcul the mask position with numpy : 0.0004870891571044922 nb_pixel_total : 9832 time to create 1 rle with old method : 0.011652469635009766 length of segment : 92 time for calcul the mask position with numpy : 0.0025322437286376953 nb_pixel_total : 44144 time to create 1 rle with old method : 0.05026602745056152 length of segment : 429 time for calcul the mask position with numpy : 0.006573915481567383 nb_pixel_total : 179232 time to create 1 rle with new method : 0.010655403137207031 length of segment : 432 time for calcul the mask position with numpy : 0.0012400150299072266 nb_pixel_total : 29560 time to create 1 rle with old method : 0.03443408012390137 length of segment : 270 time for calcul the mask position with numpy : 0.00684809684753418 nb_pixel_total : 100026 time to create 1 rle with old method : 0.11996340751647949 length of segment : 334 time for calcul the mask position with numpy : 0.0015511512756347656 nb_pixel_total : 29393 time to create 1 rle with old method : 0.03481435775756836 length of segment : 382 time for calcul the mask position with numpy : 0.0032401084899902344 nb_pixel_total : 80492 time to create 1 rle with old method : 0.09332394599914551 length of segment : 465 time for calcul the mask position with numpy : 0.0002319812774658203 nb_pixel_total : 2796 time to create 1 rle with old method : 0.003331899642944336 length of segment : 78 time for calcul the mask position with numpy : 0.003863096237182617 nb_pixel_total : 96664 time to create 1 rle with old method : 0.12821650505065918 length of segment : 345 time for calcul the mask position with numpy : 0.0014944076538085938 nb_pixel_total : 33664 time to create 1 rle with old method : 0.041380882263183594 length of segment : 258 time for calcul the mask position with numpy : 0.0041654109954833984 nb_pixel_total : 99247 time to create 1 rle with old method : 0.11783981323242188 length of segment : 361 time for calcul the mask position with numpy : 0.0003943443298339844 nb_pixel_total : 14625 time to create 1 rle with old method : 0.01688838005065918 length of segment : 200 time for calcul the mask position with numpy : 0.00031375885009765625 nb_pixel_total : 9986 time to create 1 rle with old method : 0.012391805648803711 length of segment : 88 time for calcul the mask position with numpy : 0.001360177993774414 nb_pixel_total : 33214 time to create 1 rle with old method : 0.041198015213012695 length of segment : 302 time for calcul the mask position with numpy : 0.0010619163513183594 nb_pixel_total : 37232 time to create 1 rle with old method : 0.04628562927246094 length of segment : 182 time for calcul the mask position with numpy : 0.0035982131958007812 nb_pixel_total : 81160 time to create 1 rle with old method : 0.09855151176452637 length of segment : 245 time for calcul the mask position with numpy : 0.0010972023010253906 nb_pixel_total : 15464 time to create 1 rle with old method : 0.01833343505859375 length of segment : 240 time for calcul the mask position with numpy : 0.0016269683837890625 nb_pixel_total : 37433 time to create 1 rle with old method : 0.0440521240234375 length of segment : 256 time for calcul the mask position with numpy : 0.00038623809814453125 nb_pixel_total : 7555 time to create 1 rle with old method : 0.011321067810058594 length of segment : 101 time for calcul the mask position with numpy : 0.0006301403045654297 nb_pixel_total : 16507 time to create 1 rle with old method : 0.024260520935058594 length of segment : 125 time for calcul the mask position with numpy : 0.0014755725860595703 nb_pixel_total : 33729 time to create 1 rle with old method : 0.04048275947570801 length of segment : 312 time for calcul the mask position with numpy : 0.0009152889251708984 nb_pixel_total : 33716 time to create 1 rle with old method : 0.0393068790435791 length of segment : 189 time for calcul the mask position with numpy : 0.0005919933319091797 nb_pixel_total : 20274 time to create 1 rle with old method : 0.025717735290527344 length of segment : 160 time for calcul the mask position with numpy : 0.001834869384765625 nb_pixel_total : 62818 time to create 1 rle with old method : 0.07767915725708008 length of segment : 395 time for calcul the mask position with numpy : 0.0023336410522460938 nb_pixel_total : 87211 time to create 1 rle with old method : 0.12934207916259766 length of segment : 256 time for calcul the mask position with numpy : 0.00043129920959472656 nb_pixel_total : 13386 time to create 1 rle with old method : 0.01580214500427246 length of segment : 300 time for calcul the mask position with numpy : 0.0018279552459716797 nb_pixel_total : 54878 time to create 1 rle with old method : 0.06421589851379395 length of segment : 306 time for calcul the mask position with numpy : 0.00038552284240722656 nb_pixel_total : 8328 time to create 1 rle with old method : 0.009841680526733398 length of segment : 125 time for calcul the mask position with numpy : 0.0006549358367919922 nb_pixel_total : 19590 time to create 1 rle with old method : 0.03224062919616699 length of segment : 151 time for calcul the mask position with numpy : 0.0004286766052246094 nb_pixel_total : 13265 time to create 1 rle with old method : 0.01558828353881836 length of segment : 147 time for calcul the mask position with numpy : 0.0010111331939697266 nb_pixel_total : 36585 time to create 1 rle with old method : 0.042230844497680664 length of segment : 183 time for calcul the mask position with numpy : 0.0004019737243652344 nb_pixel_total : 9900 time to create 1 rle with old method : 0.011484384536743164 length of segment : 127 time for calcul the mask position with numpy : 0.0010704994201660156 nb_pixel_total : 36710 time to create 1 rle with old method : 0.04223370552062988 length of segment : 188 time for calcul the mask position with numpy : 0.003038644790649414 nb_pixel_total : 87350 time to create 1 rle with old method : 0.10258293151855469 length of segment : 304 time for calcul the mask position with numpy : 0.0013222694396972656 nb_pixel_total : 23751 time to create 1 rle with old method : 0.0273282527923584 length of segment : 311 time for calcul the mask position with numpy : 0.0010423660278320312 nb_pixel_total : 22899 time to create 1 rle with old method : 0.03172922134399414 length of segment : 187 time for calcul the mask position with numpy : 0.0055768489837646484 nb_pixel_total : 109614 time to create 1 rle with old method : 0.13532662391662598 length of segment : 398 time for calcul the mask position with numpy : 0.0007810592651367188 nb_pixel_total : 14362 time to create 1 rle with old method : 0.01731252670288086 length of segment : 154 time for calcul the mask position with numpy : 0.00247955322265625 nb_pixel_total : 40641 time to create 1 rle with old method : 0.0717155933380127 length of segment : 315 time for calcul the mask position with numpy : 0.0007102489471435547 nb_pixel_total : 14270 time to create 1 rle with old method : 0.01751112937927246 length of segment : 131 time for calcul the mask position with numpy : 0.00040912628173828125 nb_pixel_total : 10313 time to create 1 rle with old method : 0.012314081192016602 length of segment : 113 time for calcul the mask position with numpy : 0.0028314590454101562 nb_pixel_total : 86167 time to create 1 rle with old method : 0.10403704643249512 length of segment : 333 time for calcul the mask position with numpy : 0.0010707378387451172 nb_pixel_total : 19641 time to create 1 rle with old method : 0.022783994674682617 length of segment : 206 time for calcul the mask position with numpy : 0.0011301040649414062 nb_pixel_total : 30126 time to create 1 rle with old method : 0.035039424896240234 length of segment : 217 time for calcul the mask position with numpy : 0.000545501708984375 nb_pixel_total : 11721 time to create 1 rle with old method : 0.013716697692871094 length of segment : 131 time for calcul the mask position with numpy : 0.0018393993377685547 nb_pixel_total : 14928 time to create 1 rle with old method : 0.017770767211914062 length of segment : 165 time for calcul the mask position with numpy : 0.001039266586303711 nb_pixel_total : 19457 time to create 1 rle with old method : 0.022989988327026367 length of segment : 179 time for calcul the mask position with numpy : 0.010009288787841797 nb_pixel_total : 228540 time to create 1 rle with new method : 0.018683433532714844 length of segment : 773 time for calcul the mask position with numpy : 0.00043272972106933594 nb_pixel_total : 11459 time to create 1 rle with old method : 0.01552271842956543 length of segment : 103 time for calcul the mask position with numpy : 0.0015325546264648438 nb_pixel_total : 25917 time to create 1 rle with old method : 0.03284788131713867 length of segment : 193 time for calcul the mask position with numpy : 0.0006852149963378906 nb_pixel_total : 18822 time to create 1 rle with old method : 0.0224306583404541 length of segment : 174 time for calcul the mask position with numpy : 0.0011262893676757812 nb_pixel_total : 18742 time to create 1 rle with old method : 0.021648883819580078 length of segment : 284 time for calcul the mask position with numpy : 0.0004558563232421875 nb_pixel_total : 14287 time to create 1 rle with old method : 0.017636537551879883 length of segment : 136 time for calcul the mask position with numpy : 0.0006015300750732422 nb_pixel_total : 8213 time to create 1 rle with old method : 0.01164102554321289 length of segment : 112 time for calcul the mask position with numpy : 0.004858255386352539 nb_pixel_total : 193365 time to create 1 rle with new method : 0.01928091049194336 length of segment : 522 time for calcul the mask position with numpy : 0.0007054805755615234 nb_pixel_total : 12678 time to create 1 rle with old method : 0.015782594680786133 length of segment : 171 time for calcul the mask position with numpy : 0.0014569759368896484 nb_pixel_total : 30832 time to create 1 rle with old method : 0.03800320625305176 length of segment : 220 time for calcul the mask position with numpy : 0.0006928443908691406 nb_pixel_total : 10346 time to create 1 rle with old method : 0.014138221740722656 length of segment : 138 time for calcul the mask position with numpy : 0.0012087821960449219 nb_pixel_total : 31204 time to create 1 rle with old method : 0.039354562759399414 length of segment : 228 time for calcul the mask position with numpy : 0.0012710094451904297 nb_pixel_total : 33412 time to create 1 rle with old method : 0.043313026428222656 length of segment : 271 time for calcul the mask position with numpy : 0.001302957534790039 nb_pixel_total : 26691 time to create 1 rle with old method : 0.03188490867614746 length of segment : 164 time for calcul the mask position with numpy : 0.001976490020751953 nb_pixel_total : 41522 time to create 1 rle with old method : 0.05024218559265137 length of segment : 261 time for calcul the mask position with numpy : 0.001852273941040039 nb_pixel_total : 46228 time to create 1 rle with old method : 0.06923532485961914 length of segment : 311 time for calcul the mask position with numpy : 0.0004050731658935547 nb_pixel_total : 9569 time to create 1 rle with old method : 0.011360645294189453 length of segment : 128 time for calcul the mask position with numpy : 0.003008604049682617 nb_pixel_total : 76737 time to create 1 rle with old method : 0.08781147003173828 length of segment : 324 time for calcul the mask position with numpy : 0.006066322326660156 nb_pixel_total : 168677 time to create 1 rle with new method : 0.012454748153686523 length of segment : 508 time for calcul the mask position with numpy : 0.0029892921447753906 nb_pixel_total : 71086 time to create 1 rle with old method : 0.08784103393554688 length of segment : 451 time for calcul the mask position with numpy : 0.0012993812561035156 nb_pixel_total : 35852 time to create 1 rle with old method : 0.04291224479675293 length of segment : 312 time for calcul the mask position with numpy : 0.0010995864868164062 nb_pixel_total : 28838 time to create 1 rle with old method : 0.03526163101196289 length of segment : 197 time for calcul the mask position with numpy : 0.0006608963012695312 nb_pixel_total : 19469 time to create 1 rle with old method : 0.023264408111572266 length of segment : 175 time for calcul the mask position with numpy : 0.0010006427764892578 nb_pixel_total : 25655 time to create 1 rle with old method : 0.03047657012939453 length of segment : 181 time for calcul the mask position with numpy : 0.0015244483947753906 nb_pixel_total : 26319 time to create 1 rle with old method : 0.03526139259338379 length of segment : 321 time for calcul the mask position with numpy : 0.0016057491302490234 nb_pixel_total : 40322 time to create 1 rle with old method : 0.05434846878051758 length of segment : 144 time for calcul the mask position with numpy : 0.0007958412170410156 nb_pixel_total : 17164 time to create 1 rle with old method : 0.02019667625427246 length of segment : 140 time for calcul the mask position with numpy : 0.002902984619140625 nb_pixel_total : 80560 time to create 1 rle with old method : 0.09290909767150879 length of segment : 294 time for calcul the mask position with numpy : 0.0005285739898681641 nb_pixel_total : 9547 time to create 1 rle with old method : 0.011213541030883789 length of segment : 103 time for calcul the mask position with numpy : 0.0008981227874755859 nb_pixel_total : 20584 time to create 1 rle with old method : 0.024152278900146484 length of segment : 192 time for calcul the mask position with numpy : 0.00033783912658691406 nb_pixel_total : 7202 time to create 1 rle with old method : 0.008617401123046875 length of segment : 74 time for calcul the mask position with numpy : 0.0010156631469726562 nb_pixel_total : 18645 time to create 1 rle with old method : 0.021892786026000977 length of segment : 182 time for calcul the mask position with numpy : 0.0006096363067626953 nb_pixel_total : 13919 time to create 1 rle with old method : 0.019672393798828125 length of segment : 104 time for calcul the mask position with numpy : 0.0006535053253173828 nb_pixel_total : 13525 time to create 1 rle with old method : 0.018070459365844727 length of segment : 117 time for calcul the mask position with numpy : 0.0008351802825927734 nb_pixel_total : 19570 time to create 1 rle with old method : 0.023891925811767578 length of segment : 122 time for calcul the mask position with numpy : 0.0016055107116699219 nb_pixel_total : 35567 time to create 1 rle with old method : 0.04163670539855957 length of segment : 306 time for calcul the mask position with numpy : 0.0005950927734375 nb_pixel_total : 9501 time to create 1 rle with old method : 0.011327981948852539 length of segment : 108 time for calcul the mask position with numpy : 0.0018706321716308594 nb_pixel_total : 39942 time to create 1 rle with old method : 0.0491023063659668 length of segment : 252 time for calcul the mask position with numpy : 0.0004851818084716797 nb_pixel_total : 15844 time to create 1 rle with old method : 0.019686460494995117 length of segment : 155 time for calcul the mask position with numpy : 0.0006177425384521484 nb_pixel_total : 17204 time to create 1 rle with old method : 0.02628350257873535 length of segment : 133 time for calcul the mask position with numpy : 0.00043964385986328125 nb_pixel_total : 8639 time to create 1 rle with old method : 0.010278701782226562 length of segment : 119 time for calcul the mask position with numpy : 0.00179290771484375 nb_pixel_total : 35553 time to create 1 rle with old method : 0.04876208305358887 length of segment : 249 time for calcul the mask position with numpy : 0.00035953521728515625 nb_pixel_total : 4451 time to create 1 rle with old method : 0.005311250686645508 length of segment : 96 time for calcul the mask position with numpy : 0.00026226043701171875 nb_pixel_total : 9428 time to create 1 rle with old method : 0.012642145156860352 length of segment : 147 time for calcul the mask position with numpy : 0.000667572021484375 nb_pixel_total : 9610 time to create 1 rle with old method : 0.014404296875 length of segment : 96 time for calcul the mask position with numpy : 0.0009627342224121094 nb_pixel_total : 12037 time to create 1 rle with old method : 0.017583847045898438 length of segment : 202 time for calcul the mask position with numpy : 0.0013997554779052734 nb_pixel_total : 26722 time to create 1 rle with old method : 0.039572715759277344 length of segment : 195 time for calcul the mask position with numpy : 0.00042700767517089844 nb_pixel_total : 5662 time to create 1 rle with old method : 0.006664752960205078 length of segment : 126 time for calcul the mask position with numpy : 0.0002961158752441406 nb_pixel_total : 2550 time to create 1 rle with old method : 0.0030901432037353516 length of segment : 65 time for calcul the mask position with numpy : 0.0003407001495361328 nb_pixel_total : 6600 time to create 1 rle with old method : 0.008672714233398438 length of segment : 98 time for calcul the mask position with numpy : 0.0006129741668701172 nb_pixel_total : 10918 time to create 1 rle with old method : 0.013753414154052734 length of segment : 129 time for calcul the mask position with numpy : 0.0002033710479736328 nb_pixel_total : 5018 time to create 1 rle with old method : 0.006071805953979492 length of segment : 104 time for calcul the mask position with numpy : 0.0004658699035644531 nb_pixel_total : 10200 time to create 1 rle with old method : 0.012522220611572266 length of segment : 137 time spent for convertir_results : 14.017953872680664 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 161 chid ids of type : 3594 Number RLEs to save : 39203 save missing photos in datou_result : time spend for datou_step_exec : 88.80651807785034 time spend to save output : 5.024878978729248 total time spend for step 1 : 93.83139705657959 step2:crop_condition Tue Apr 1 12:02:05 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 8 ! batch 1 Loaded 161 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 116 About to insert : list_path_to_insert length 116 new photo from crops ! About to upload 116 photos upload in portfolio : 3736932 init cache_photo without model_param we have 116 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1743501747_491570 we have uploaded 116 photos in the portfolio 3736932 time of upload the photos Elapsed time : 35.39285731315613 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 16 About to insert : list_path_to_insert length 16 new photo from crops ! About to upload 16 photos upload in portfolio : 3736932 init cache_photo without model_param we have 16 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1743501785_491570 we have uploaded 16 photos in the portfolio 3736932 time of upload the photos Elapsed time : 4.4865734577178955 we have finished the crop for the class : carton begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 27 About to insert : list_path_to_insert length 27 new photo from crops ! About to upload 27 photos upload in portfolio : 3736932 init cache_photo without model_param we have 27 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1743501799_491570 we have uploaded 27 photos in the portfolio 3736932 time of upload the photos Elapsed time : 14.041931867599487 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 3736932 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1743501814_491570 we have uploaded 1 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.6691806316375732 we have finished the crop for the class : autre begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 3736932 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1743501816_491570 we have uploaded 1 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.6928613185882568 we have finished the crop for the class : pet_fonce delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1349221010, 1349221002, 1349220967, 1349220944, 1349220354, 1349220350, 1349220338, 1349220330] Looping around the photos to save general results len do output : 161 /1349262433Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262434Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262436Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262438Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262439Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262440Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262443Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262444Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262445Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262448Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262449Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262450Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262453Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262454Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262455Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262458Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262459Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262460Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262461Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262464Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262465Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262466Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262469Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262470Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262471Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262474Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262475Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262476Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262478Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262480Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262481Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262482Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262485Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262486Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262487Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262490Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262491Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262492Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262495Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262496Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262497Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262498Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262501Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262502Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262504Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262507Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262508Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262509Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262510Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262513Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262514Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262515Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262518Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262519Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262520Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262521Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262524Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262525Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262526Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262529Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262530Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262531Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262533Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262535Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262536Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262538Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262541Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262542Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262543Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262547Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262548Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262549Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262552Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262553Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262554Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262556Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262558Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262559Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262561Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262563Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262564Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262565Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262567Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262569Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262570Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262572Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262574Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262575Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262576Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262578Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262580Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262581Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262584Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262585Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262586Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262588Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262590Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262591Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262593Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262595Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262596Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262598Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262600Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262601Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262603Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262604Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262606Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262608Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262609Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262611Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262612Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262614Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262615Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262618Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262619Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262620Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262694Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262695Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262698Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262699Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262701Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262702Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262704Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262706Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262707Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262709Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262710Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262712Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262713Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262716Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262717Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349262718Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263009Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263011Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263017Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263023Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263025Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263027Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263031Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263032Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263033Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263036Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263037Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263038Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263041Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263042Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263044Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263045Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263048Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263049Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263051Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263053Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263054Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263055Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263057Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263059Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263062Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263065Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263067Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263089Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1349263109Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221010', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221002', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220967', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220944', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220354', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220350', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220338', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220330', None, None, None, None, None, '2712765') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 491 time used for this insertion : 0.16373491287231445 save_final save missing photos in datou_result : time spend for datou_step_exec : 91.11897349357605 time spend to save output : 0.16838765144348145 total time spend for step 2 : 91.28736114501953 step3:rle_unique_nms_with_priority Tue Apr 1 12:03:36 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 161 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nb_obj : 16 nb_hashtags : 3 time to prepare the origin masks : 6.111540794372559 time for calcul the mask position with numpy : 0.5314836502075195 nb_pixel_total : 6252501 time to create 1 rle with new method : 0.7146263122558594 time for calcul the mask position with numpy : 0.02329421043395996 nb_pixel_total : 4278 time to create 1 rle with old method : 0.0072689056396484375 time for calcul the mask position with numpy : 0.02756524085998535 nb_pixel_total : 76769 time to create 1 rle with old method : 0.09075689315795898 time for calcul the mask position with numpy : 0.02333378791809082 nb_pixel_total : 132001 time to create 1 rle with old method : 0.15000033378601074 time for calcul the mask position with numpy : 0.026131153106689453 nb_pixel_total : 16927 time to create 1 rle with old method : 0.020169496536254883 time for calcul the mask position with numpy : 0.025862932205200195 nb_pixel_total : 94533 time to create 1 rle with old method : 0.11018681526184082 time for calcul the mask position with numpy : 0.023029327392578125 nb_pixel_total : 96693 time to create 1 rle with old method : 0.11033844947814941 time for calcul the mask position with numpy : 0.022584199905395508 nb_pixel_total : 14734 time to create 1 rle with old method : 0.01671600341796875 time for calcul the mask position with numpy : 0.0209352970123291 nb_pixel_total : 5472 time to create 1 rle with old method : 0.0063495635986328125 time for calcul the mask position with numpy : 0.02169179916381836 nb_pixel_total : 23608 time to create 1 rle with old method : 0.02671074867248535 time for calcul the mask position with numpy : 0.022534608840942383 nb_pixel_total : 79725 time to create 1 rle with old method : 0.10047340393066406 time for calcul the mask position with numpy : 0.023114681243896484 nb_pixel_total : 10998 time to create 1 rle with old method : 0.012447595596313477 time for calcul the mask position with numpy : 0.021878957748413086 nb_pixel_total : 38823 time to create 1 rle with old method : 0.043642520904541016 time for calcul the mask position with numpy : 0.02199411392211914 nb_pixel_total : 14528 time to create 1 rle with old method : 0.016567707061767578 time for calcul the mask position with numpy : 0.02355170249938965 nb_pixel_total : 94396 time to create 1 rle with old method : 0.11513805389404297 time for calcul the mask position with numpy : 0.028369665145874023 nb_pixel_total : 58170 time to create 1 rle with old method : 0.06779265403747559 time for calcul the mask position with numpy : 0.030780315399169922 nb_pixel_total : 36084 time to create 1 rle with old method : 0.03984570503234863 create new chi : 2.6023993492126465 time to delete rle : 0.02514052391052246 batch 1 Loaded 33 chid ids of type : 3594 ++++++++++++++++++Number RLEs to save : 11737 TO DO : save crop sub photo not yet done ! save time : 2.687356948852539 nb_obj : 22 nb_hashtags : 3 time to prepare the origin masks : 7.058765172958374 time for calcul the mask position with numpy : 0.2522132396697998 nb_pixel_total : 6117999 time to create 1 rle with new method : 0.6675257682800293 time for calcul the mask position with numpy : 0.03672528266906738 nb_pixel_total : 29653 time to create 1 rle with old method : 0.03789329528808594 time for calcul the mask position with numpy : 0.04033470153808594 nb_pixel_total : 39246 time to create 1 rle with old method : 0.04634237289428711 time for calcul the mask position with numpy : 0.04438424110412598 nb_pixel_total : 114119 time to create 1 rle with old method : 0.14250826835632324 time for calcul the mask position with numpy : 0.04248166084289551 nb_pixel_total : 73181 time to create 1 rle with old method : 0.1031796932220459 time for calcul the mask position with numpy : 0.03870844841003418 nb_pixel_total : 12775 time to create 1 rle with old method : 0.014421701431274414 time for calcul the mask position with numpy : 0.037058115005493164 nb_pixel_total : 33255 time to create 1 rle with old method : 0.03745698928833008 time for calcul the mask position with numpy : 0.03484964370727539 nb_pixel_total : 48957 time to create 1 rle with old method : 0.05456233024597168 time for calcul the mask position with numpy : 0.034804582595825195 nb_pixel_total : 20154 time to create 1 rle with old method : 0.0277864933013916 time for calcul the mask position with numpy : 0.03436684608459473 nb_pixel_total : 52750 time to create 1 rle with old method : 0.05939006805419922 time for calcul the mask position with numpy : 0.03307080268859863 nb_pixel_total : 71612 time to create 1 rle with old method : 0.08083152770996094 time for calcul the mask position with numpy : 0.03748607635498047 nb_pixel_total : 15820 time to create 1 rle with old method : 0.018192529678344727 time for calcul the mask position with numpy : 0.03672337532043457 nb_pixel_total : 96446 time to create 1 rle with old method : 0.11894869804382324 time for calcul the mask position with numpy : 0.0344851016998291 nb_pixel_total : 47909 time to create 1 rle with old method : 0.05423617362976074 time for calcul the mask position with numpy : 0.024229764938354492 nb_pixel_total : 12560 time to create 1 rle with old method : 0.013998746871948242 time for calcul the mask position with numpy : 0.02203202247619629 nb_pixel_total : 26621 time to create 1 rle with old method : 0.029873132705688477 time for calcul the mask position with numpy : 0.021100282669067383 nb_pixel_total : 6711 time to create 1 rle with old method : 0.007501363754272461 time for calcul the mask position with numpy : 0.02142024040222168 nb_pixel_total : 24167 time to create 1 rle with old method : 0.02709484100341797 time for calcul the mask position with numpy : 0.03367424011230469 nb_pixel_total : 46499 time to create 1 rle with old method : 0.052384138107299805 time for calcul the mask position with numpy : 0.03979134559631348 nb_pixel_total : 39889 time to create 1 rle with old method : 0.04492521286010742 time for calcul the mask position with numpy : 0.0342862606048584 nb_pixel_total : 14161 time to create 1 rle with old method : 0.015515804290771484 time for calcul the mask position with numpy : 0.034654855728149414 nb_pixel_total : 18778 time to create 1 rle with old method : 0.020644187927246094 time for calcul the mask position with numpy : 0.035448551177978516 nb_pixel_total : 86978 time to create 1 rle with old method : 0.09510374069213867 create new chi : 2.813275098800659 time to delete rle : 0.001973390579223633 batch 1 Loaded 45 chid ids of type : 3594 ++++++++++++++++++++++++++++++Number RLEs to save : 14584 TO DO : save crop sub photo not yet done ! save time : 0.9759948253631592 nb_obj : 24 nb_hashtags : 4 time to prepare the origin masks : 8.518556356430054 time for calcul the mask position with numpy : 0.5233767032623291 nb_pixel_total : 6087055 time to create 1 rle with new method : 0.7087101936340332 time for calcul the mask position with numpy : 0.04374122619628906 nb_pixel_total : 9832 time to create 1 rle with old method : 0.015941619873046875 time for calcul the mask position with numpy : 0.044405460357666016 nb_pixel_total : 512 time to create 1 rle with old method : 0.0009357929229736328 time for calcul the mask position with numpy : 0.045178890228271484 nb_pixel_total : 9517 time to create 1 rle with old method : 0.0157930850982666 time for calcul the mask position with numpy : 0.04668426513671875 nb_pixel_total : 33058 time to create 1 rle with old method : 0.05116987228393555 time for calcul the mask position with numpy : 0.03637290000915527 nb_pixel_total : 37973 time to create 1 rle with old method : 0.04850459098815918 time for calcul the mask position with numpy : 0.03988337516784668 nb_pixel_total : 95935 time to create 1 rle with old method : 0.11430954933166504 time for calcul the mask position with numpy : 0.03888106346130371 nb_pixel_total : 20030 time to create 1 rle with old method : 0.028337478637695312 time for calcul the mask position with numpy : 0.03840947151184082 nb_pixel_total : 18699 time to create 1 rle with old method : 0.02623295783996582 time for calcul the mask position with numpy : 0.04213571548461914 nb_pixel_total : 10513 time to create 1 rle with old method : 0.01246786117553711 time for calcul the mask position with numpy : 0.03931784629821777 nb_pixel_total : 19941 time to create 1 rle with old method : 0.0280454158782959 time for calcul the mask position with numpy : 0.03888416290283203 nb_pixel_total : 20420 time to create 1 rle with old method : 0.032685041427612305 time for calcul the mask position with numpy : 0.03399062156677246 nb_pixel_total : 1042 time to create 1 rle with old method : 0.002307891845703125 time for calcul the mask position with numpy : 0.03788471221923828 nb_pixel_total : 2734 time to create 1 rle with old method : 0.0031621456146240234 time for calcul the mask position with numpy : 0.031223535537719727 nb_pixel_total : 2593 time to create 1 rle with old method : 0.0030460357666015625 time for calcul the mask position with numpy : 0.0362095832824707 nb_pixel_total : 79268 time to create 1 rle with old method : 0.1015157699584961 time for calcul the mask position with numpy : 0.04329276084899902 nb_pixel_total : 39885 time to create 1 rle with old method : 0.04952645301818848 time for calcul the mask position with numpy : 0.03549671173095703 nb_pixel_total : 97613 time to create 1 rle with old method : 0.1289968490600586 time for calcul the mask position with numpy : 0.05051922798156738 nb_pixel_total : 20795 time to create 1 rle with old method : 0.03126096725463867 time for calcul the mask position with numpy : 0.023269176483154297 nb_pixel_total : 30901 time to create 1 rle with old method : 0.03548789024353027 time for calcul the mask position with numpy : 0.02304673194885254 nb_pixel_total : 10970 time to create 1 rle with old method : 0.012559175491333008 time for calcul the mask position with numpy : 0.02410435676574707 nb_pixel_total : 27246 time to create 1 rle with old method : 0.030594587326049805 time for calcul the mask position with numpy : 0.02457451820373535 nb_pixel_total : 145625 time to create 1 rle with old method : 0.1744225025177002 time for calcul the mask position with numpy : 0.02356576919555664 nb_pixel_total : 187309 time to create 1 rle with new method : 0.5413551330566406 time for calcul the mask position with numpy : 0.027573823928833008 nb_pixel_total : 40774 time to create 1 rle with old method : 0.04716324806213379 create new chi : 3.7052526473999023 time to delete rle : 0.0037832260131835938 batch 1 Loaded 49 chid ids of type : 3594 +++++++++++++++++++++++++++++++++Number RLEs to save : 13511 TO DO : save crop sub photo not yet done ! save time : 1.6330647468566895 nb_obj : 17 nb_hashtags : 3 time to prepare the origin masks : 6.97144627571106 time for calcul the mask position with numpy : 0.4853360652923584 nb_pixel_total : 6129376 time to create 1 rle with new method : 0.6596405506134033 time for calcul the mask position with numpy : 0.034105777740478516 nb_pixel_total : 37429 time to create 1 rle with old method : 0.041022300720214844 time for calcul the mask position with numpy : 0.02114701271057129 nb_pixel_total : 15464 time to create 1 rle with old method : 0.017614364624023438 time for calcul the mask position with numpy : 0.022548913955688477 nb_pixel_total : 81160 time to create 1 rle with old method : 0.09144091606140137 time for calcul the mask position with numpy : 0.02179694175720215 nb_pixel_total : 37232 time to create 1 rle with old method : 0.04203629493713379 time for calcul the mask position with numpy : 0.023205280303955078 nb_pixel_total : 33214 time to create 1 rle with old method : 0.036960601806640625 time for calcul the mask position with numpy : 0.020676851272583008 nb_pixel_total : 9986 time to create 1 rle with old method : 0.011486530303955078 time for calcul the mask position with numpy : 0.0210263729095459 nb_pixel_total : 11161 time to create 1 rle with old method : 0.012480974197387695 time for calcul the mask position with numpy : 0.022224903106689453 nb_pixel_total : 99247 time to create 1 rle with old method : 0.11531186103820801 time for calcul the mask position with numpy : 0.030157089233398438 nb_pixel_total : 33664 time to create 1 rle with old method : 0.03714799880981445 time for calcul the mask position with numpy : 0.03510427474975586 nb_pixel_total : 96664 time to create 1 rle with old method : 0.1072530746459961 time for calcul the mask position with numpy : 0.03409886360168457 nb_pixel_total : 2796 time to create 1 rle with old method : 0.0032837390899658203 time for calcul the mask position with numpy : 0.036530494689941406 nb_pixel_total : 80492 time to create 1 rle with old method : 0.09035134315490723 time for calcul the mask position with numpy : 0.03912806510925293 nb_pixel_total : 29393 time to create 1 rle with old method : 0.032717227935791016 time for calcul the mask position with numpy : 0.03961658477783203 nb_pixel_total : 100026 time to create 1 rle with old method : 0.1248331069946289 time for calcul the mask position with numpy : 0.03464031219482422 nb_pixel_total : 29560 time to create 1 rle with old method : 0.03342461585998535 time for calcul the mask position with numpy : 0.035887718200683594 nb_pixel_total : 179232 time to create 1 rle with new method : 0.5143001079559326 time for calcul the mask position with numpy : 0.031786203384399414 nb_pixel_total : 44144 time to create 1 rle with old method : 0.051300048828125 create new chi : 3.0710132122039795 time to delete rle : 0.0017244815826416016 batch 1 Loaded 35 chid ids of type : 3594 +++++++++++++++++++++Number RLEs to save : 11701 TO DO : save crop sub photo not yet done ! save time : 0.8281919956207275 nb_obj : 15 nb_hashtags : 2 time to prepare the origin masks : 6.189480781555176 time for calcul the mask position with numpy : 0.474545955657959 nb_pixel_total : 6605185 time to create 1 rle with new method : 0.527019739151001 time for calcul the mask position with numpy : 0.03545188903808594 nb_pixel_total : 36710 time to create 1 rle with old method : 0.040529489517211914 time for calcul the mask position with numpy : 0.020337581634521484 nb_pixel_total : 9900 time to create 1 rle with old method : 0.010977506637573242 time for calcul the mask position with numpy : 0.020925521850585938 nb_pixel_total : 36585 time to create 1 rle with old method : 0.04002737998962402 time for calcul the mask position with numpy : 0.02058553695678711 nb_pixel_total : 13265 time to create 1 rle with old method : 0.014792203903198242 time for calcul the mask position with numpy : 0.02145862579345703 nb_pixel_total : 19590 time to create 1 rle with old method : 0.021739721298217773 time for calcul the mask position with numpy : 0.02049541473388672 nb_pixel_total : 8328 time to create 1 rle with old method : 0.009017229080200195 time for calcul the mask position with numpy : 0.020072460174560547 nb_pixel_total : 54878 time to create 1 rle with old method : 0.06312942504882812 time for calcul the mask position with numpy : 0.034139394760131836 nb_pixel_total : 3989 time to create 1 rle with old method : 0.004410982131958008 time for calcul the mask position with numpy : 0.03455519676208496 nb_pixel_total : 87211 time to create 1 rle with old method : 0.09853196144104004 time for calcul the mask position with numpy : 0.035236358642578125 nb_pixel_total : 62818 time to create 1 rle with old method : 0.09485459327697754 time for calcul the mask position with numpy : 0.03919863700866699 nb_pixel_total : 20274 time to create 1 rle with old method : 0.0228574275970459 time for calcul the mask position with numpy : 0.03515815734863281 nb_pixel_total : 33716 time to create 1 rle with old method : 0.040793418884277344 time for calcul the mask position with numpy : 0.03342080116271973 nb_pixel_total : 33729 time to create 1 rle with old method : 0.04071378707885742 time for calcul the mask position with numpy : 0.026962757110595703 nb_pixel_total : 16507 time to create 1 rle with old method : 0.02619004249572754 time for calcul the mask position with numpy : 0.02088165283203125 nb_pixel_total : 7555 time to create 1 rle with old method : 0.008635997772216797 create new chi : 1.9987308979034424 time to delete rle : 0.0012505054473876953 batch 1 Loaded 31 chid ids of type : 3594 ++++++++++++++++Number RLEs to save : 7861 TO DO : save crop sub photo not yet done ! save time : 0.5000035762786865 nb_obj : 22 nb_hashtags : 3 time to prepare the origin masks : 9.785211563110352 time for calcul the mask position with numpy : 0.3438069820404053 nb_pixel_total : 6218600 time to create 1 rle with new method : 0.7723333835601807 time for calcul the mask position with numpy : 0.022523164749145508 nb_pixel_total : 420 time to create 1 rle with old method : 0.0014472007751464844 time for calcul the mask position with numpy : 0.038781166076660156 nb_pixel_total : 8213 time to create 1 rle with old method : 0.011853694915771484 time for calcul the mask position with numpy : 0.03488302230834961 nb_pixel_total : 14287 time to create 1 rle with old method : 0.015938520431518555 time for calcul the mask position with numpy : 0.03457307815551758 nb_pixel_total : 18742 time to create 1 rle with old method : 0.021265745162963867 time for calcul the mask position with numpy : 0.034065961837768555 nb_pixel_total : 18822 time to create 1 rle with old method : 0.021198034286499023 time for calcul the mask position with numpy : 0.03452873229980469 nb_pixel_total : 25917 time to create 1 rle with old method : 0.029538393020629883 time for calcul the mask position with numpy : 0.03508400917053223 nb_pixel_total : 11459 time to create 1 rle with old method : 0.013104438781738281 time for calcul the mask position with numpy : 0.036717891693115234 nb_pixel_total : 228540 time to create 1 rle with new method : 0.5492010116577148 time for calcul the mask position with numpy : 0.03472638130187988 nb_pixel_total : 19457 time to create 1 rle with old method : 0.022735118865966797 time for calcul the mask position with numpy : 0.025814056396484375 nb_pixel_total : 14928 time to create 1 rle with old method : 0.01694488525390625 time for calcul the mask position with numpy : 0.02225327491760254 nb_pixel_total : 11721 time to create 1 rle with old method : 0.016034841537475586 time for calcul the mask position with numpy : 0.02296137809753418 nb_pixel_total : 30126 time to create 1 rle with old method : 0.034609079360961914 time for calcul the mask position with numpy : 0.026148080825805664 nb_pixel_total : 19641 time to create 1 rle with old method : 0.02417588233947754 time for calcul the mask position with numpy : 0.022368431091308594 nb_pixel_total : 86167 time to create 1 rle with old method : 0.09596753120422363 time for calcul the mask position with numpy : 0.022319555282592773 nb_pixel_total : 10313 time to create 1 rle with old method : 0.0130615234375 time for calcul the mask position with numpy : 0.02497386932373047 nb_pixel_total : 14270 time to create 1 rle with old method : 0.019479990005493164 time for calcul the mask position with numpy : 0.025373458862304688 nb_pixel_total : 40641 time to create 1 rle with old method : 0.0459437370300293 time for calcul the mask position with numpy : 0.02227473258972168 nb_pixel_total : 14362 time to create 1 rle with old method : 0.016112804412841797 time for calcul the mask position with numpy : 0.02243971824645996 nb_pixel_total : 109614 time to create 1 rle with old method : 0.12328553199768066 time for calcul the mask position with numpy : 0.02369093894958496 nb_pixel_total : 22899 time to create 1 rle with old method : 0.02690720558166504 time for calcul the mask position with numpy : 0.02471470832824707 nb_pixel_total : 23751 time to create 1 rle with old method : 0.026586294174194336 time for calcul the mask position with numpy : 0.022561311721801758 nb_pixel_total : 87350 time to create 1 rle with old method : 0.09705805778503418 create new chi : 3.0388808250427246 time to delete rle : 0.0019054412841796875 batch 1 Loaded 45 chid ids of type : 3594 +++++++++++++++++++++++++++Number RLEs to save : 12113 TO DO : save crop sub photo not yet done ! save time : 0.8071401119232178 nb_obj : 20 nb_hashtags : 3 time to prepare the origin masks : 7.78820538520813 time for calcul the mask position with numpy : 0.6038639545440674 nb_pixel_total : 6217110 time to create 1 rle with new method : 0.47019124031066895 time for calcul the mask position with numpy : 0.020463943481445312 nb_pixel_total : 80560 time to create 1 rle with old method : 0.08622503280639648 time for calcul the mask position with numpy : 0.020117998123168945 nb_pixel_total : 17164 time to create 1 rle with old method : 0.018404722213745117 time for calcul the mask position with numpy : 0.02127814292907715 nb_pixel_total : 40322 time to create 1 rle with old method : 0.044248342514038086 time for calcul the mask position with numpy : 0.021474838256835938 nb_pixel_total : 26319 time to create 1 rle with old method : 0.03206062316894531 time for calcul the mask position with numpy : 0.023569345474243164 nb_pixel_total : 25655 time to create 1 rle with old method : 0.027741193771362305 time for calcul the mask position with numpy : 0.021877050399780273 nb_pixel_total : 19469 time to create 1 rle with old method : 0.02164626121520996 time for calcul the mask position with numpy : 0.02161383628845215 nb_pixel_total : 28838 time to create 1 rle with old method : 0.030803203582763672 time for calcul the mask position with numpy : 0.022209644317626953 nb_pixel_total : 35852 time to create 1 rle with old method : 0.03972339630126953 time for calcul the mask position with numpy : 0.022568941116333008 nb_pixel_total : 71086 time to create 1 rle with old method : 0.07719945907592773 time for calcul the mask position with numpy : 0.02292776107788086 nb_pixel_total : 168677 time to create 1 rle with new method : 0.5761072635650635 time for calcul the mask position with numpy : 0.02227306365966797 nb_pixel_total : 76706 time to create 1 rle with old method : 0.08359909057617188 time for calcul the mask position with numpy : 0.02202153205871582 nb_pixel_total : 9569 time to create 1 rle with old method : 0.010833024978637695 time for calcul the mask position with numpy : 0.0226287841796875 nb_pixel_total : 46228 time to create 1 rle with old method : 0.05175638198852539 time for calcul the mask position with numpy : 0.0232393741607666 nb_pixel_total : 41522 time to create 1 rle with old method : 0.045430898666381836 time for calcul the mask position with numpy : 0.02231574058532715 nb_pixel_total : 26691 time to create 1 rle with old method : 0.032282352447509766 time for calcul the mask position with numpy : 0.020806550979614258 nb_pixel_total : 33412 time to create 1 rle with old method : 0.03623175621032715 time for calcul the mask position with numpy : 0.020324230194091797 nb_pixel_total : 31204 time to create 1 rle with old method : 0.034188270568847656 time for calcul the mask position with numpy : 0.03121352195739746 nb_pixel_total : 10346 time to create 1 rle with old method : 0.011326313018798828 time for calcul the mask position with numpy : 0.03233480453491211 nb_pixel_total : 30832 time to create 1 rle with old method : 0.03333282470703125 time for calcul the mask position with numpy : 0.03182721138000488 nb_pixel_total : 12678 time to create 1 rle with old method : 0.01360940933227539 create new chi : 2.9053704738616943 time to delete rle : 0.0018596649169921875 batch 1 Loaded 41 chid ids of type : 3594 +++++++++++++++++++++++Number RLEs to save : 12021 TO DO : save crop sub photo not yet done ! save time : 4.068458318710327 nb_obj : 25 nb_hashtags : 2 time to prepare the origin masks : 7.620522499084473 time for calcul the mask position with numpy : 0.6925876140594482 nb_pixel_total : 6686569 time to create 1 rle with new method : 0.44376516342163086 time for calcul the mask position with numpy : 0.041672468185424805 nb_pixel_total : 10200 time to create 1 rle with old method : 0.013112068176269531 time for calcul the mask position with numpy : 0.02812957763671875 nb_pixel_total : 251 time to create 1 rle with old method : 0.00038361549377441406 time for calcul the mask position with numpy : 0.022588491439819336 nb_pixel_total : 10918 time to create 1 rle with old method : 0.012833356857299805 time for calcul the mask position with numpy : 0.021254301071166992 nb_pixel_total : 6600 time to create 1 rle with old method : 0.007369518280029297 time for calcul the mask position with numpy : 0.024688005447387695 nb_pixel_total : 2550 time to create 1 rle with old method : 0.00415802001953125 time for calcul the mask position with numpy : 0.0250093936920166 nb_pixel_total : 5662 time to create 1 rle with old method : 0.007016658782958984 time for calcul the mask position with numpy : 0.025828123092651367 nb_pixel_total : 26722 time to create 1 rle with old method : 0.03049612045288086 time for calcul the mask position with numpy : 0.02105545997619629 nb_pixel_total : 12037 time to create 1 rle with old method : 0.013341903686523438 time for calcul the mask position with numpy : 0.02115488052368164 nb_pixel_total : 9610 time to create 1 rle with old method : 0.010860443115234375 time for calcul the mask position with numpy : 0.020244598388671875 nb_pixel_total : 9428 time to create 1 rle with old method : 0.011580228805541992 time for calcul the mask position with numpy : 0.02111029624938965 nb_pixel_total : 4451 time to create 1 rle with old method : 0.005081653594970703 time for calcul the mask position with numpy : 0.026766061782836914 nb_pixel_total : 35553 time to create 1 rle with old method : 0.03979778289794922 time for calcul the mask position with numpy : 0.020561933517456055 nb_pixel_total : 8639 time to create 1 rle with old method : 0.00974726676940918 time for calcul the mask position with numpy : 0.02192854881286621 nb_pixel_total : 17204 time to create 1 rle with old method : 0.02229785919189453 time for calcul the mask position with numpy : 0.022232532501220703 nb_pixel_total : 15844 time to create 1 rle with old method : 0.018001794815063477 time for calcul the mask position with numpy : 0.022008895874023438 nb_pixel_total : 39942 time to create 1 rle with old method : 0.04631543159484863 time for calcul the mask position with numpy : 0.021292686462402344 nb_pixel_total : 9501 time to create 1 rle with old method : 0.011299371719360352 time for calcul the mask position with numpy : 0.021561145782470703 nb_pixel_total : 35567 time to create 1 rle with old method : 0.03967642784118652 time for calcul the mask position with numpy : 0.021229028701782227 nb_pixel_total : 19570 time to create 1 rle with old method : 0.023070096969604492 time for calcul the mask position with numpy : 0.022375822067260742 nb_pixel_total : 13525 time to create 1 rle with old method : 0.01530599594116211 time for calcul the mask position with numpy : 0.02097153663635254 nb_pixel_total : 13919 time to create 1 rle with old method : 0.015625953674316406 time for calcul the mask position with numpy : 0.038010597229003906 nb_pixel_total : 18645 time to create 1 rle with old method : 0.025957345962524414 time for calcul the mask position with numpy : 0.03533172607421875 nb_pixel_total : 7202 time to create 1 rle with old method : 0.008168935775756836 time for calcul the mask position with numpy : 0.03537273406982422 nb_pixel_total : 20584 time to create 1 rle with old method : 0.022793292999267578 time for calcul the mask position with numpy : 0.038594961166381836 nb_pixel_total : 9547 time to create 1 rle with old method : 0.010517597198486328 create new chi : 2.2471225261688232 time to delete rle : 0.0014941692352294922 batch 1 Loaded 51 chid ids of type : 3594 +++++++++++++++++++++++++++Number RLEs to save : 9198 TO DO : save crop sub photo not yet done ! save time : 0.8629345893859863 map_output_result : {1349221010: (0.0, 'Should be the crop_list due to order', 0), 1349221002: (0.0, 'Should be the crop_list due to order', 0), 1349220967: (0.0, 'Should be the crop_list due to order', 0), 1349220944: (0.0, 'Should be the crop_list due to order', 0), 1349220354: (0.0, 'Should be the crop_list due to order', 0), 1349220350: (0.0, 'Should be the crop_list due to order', 0), 1349220338: (0.0, 'Should be the crop_list due to order', 0), 1349220330: (0.0, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1349221010, 1349221002, 1349220967, 1349220944, 1349220354, 1349220350, 1349220338, 1349220330] Looping around the photos to save general results len do output : 8 /1349221010.Didn't retrieve data . /1349221002.Didn't retrieve data . /1349220967.Didn't retrieve data . /1349220944.Didn't retrieve data . /1349220354.Didn't retrieve data . /1349220350.Didn't retrieve data . /1349220338.Didn't retrieve data . /1349220330.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221010', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221002', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220967', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220944', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220354', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220350', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220338', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220330', None, None, None, None, None, '2712765') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 24 time used for this insertion : 0.22123408317565918 save_final save missing photos in datou_result : time spend for datou_step_exec : 95.90920519828796 time spend to save output : 0.2217404842376709 total time spend for step 3 : 96.13094568252563 step4:ventilate_hashtags_in_portfolio Tue Apr 1 12:05:12 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 21943741 get user id for portfolio 21943741 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=21943741 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('autre','metal','pehd','mal_croppe','carton','pet_clair','environnement','background','pet_fonce','papier','flou')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=21943741 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('autre','metal','pehd','mal_croppe','carton','pet_clair','environnement','background','pet_fonce','papier','flou')) AND mptpi.`min_score`=0.5 To do Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=21943741 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('autre','metal','pehd','mal_croppe','carton','pet_clair','environnement','background','pet_fonce','papier','flou')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://www.fotonower.com/velours/21943925,21943926,21943927,21943928,21943929,21943930,21943931,21943932,21943933,21943934,21943935?tags=autre,metal,pehd,mal_croppe,carton,pet_clair,environnement,background,pet_fonce,papier,flou Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1349221010, 1349221002, 1349220967, 1349220944, 1349220354, 1349220350, 1349220338, 1349220330] Looping around the photos to save general results len do output : 1 /21943741. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221010', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221002', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220967', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220944', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220354', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220350', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220338', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220330', None, None, None, None, None, '2712765') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 9 time used for this insertion : 0.015542030334472656 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.697037696838379 time spend to save output : 0.01577019691467285 total time spend for step 4 : 1.7128078937530518 step5:final Tue Apr 1 12:05:14 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1349221010: ('0.10793116617306646',), 1349221002: ('0.10793116617306646',), 1349220967: ('0.10793116617306646',), 1349220944: ('0.10793116617306646',), 1349220354: ('0.10793116617306646',), 1349220350: ('0.10793116617306646',), 1349220338: ('0.10793116617306646',), 1349220330: ('0.10793116617306646',)} new output for save of step final : {1349221010: ('0.10793116617306646',), 1349221002: ('0.10793116617306646',), 1349220967: ('0.10793116617306646',), 1349220944: ('0.10793116617306646',), 1349220354: ('0.10793116617306646',), 1349220350: ('0.10793116617306646',), 1349220338: ('0.10793116617306646',), 1349220330: ('0.10793116617306646',)} [1349221010, 1349221002, 1349220967, 1349220944, 1349220354, 1349220350, 1349220338, 1349220330] Looping around the photos to save general results len do output : 8 /1349221010.Didn't retrieve data . /1349221002.Didn't retrieve data . /1349220967.Didn't retrieve data . /1349220944.Didn't retrieve data . /1349220354.Didn't retrieve data . /1349220350.Didn't retrieve data . /1349220338.Didn't retrieve data . /1349220330.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221010', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221002', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220967', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220944', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220354', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220350', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220338', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220330', None, None, None, None, None, '2712765') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 24 time used for this insertion : 0.012455224990844727 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.10402107238769531 time spend to save output : 0.01283717155456543 total time spend for step 5 : 0.11685824394226074 step6:blur_detection Tue Apr 1 12:05:14 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33.jpg resize: (2160, 3264) 1349221010 -3.0587747755477532 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602.jpg resize: (2160, 3264) 1349221002 -3.332728741005224 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d.jpg resize: (2160, 3264) 1349220967 0.7019138776144819 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604.jpg resize: (2160, 3264) 1349220944 -6.270105350744483 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c.jpg resize: (2160, 3264) 1349220354 -2.8564958207661935 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca.jpg resize: (2160, 3264) 1349220350 -2.825570106317873 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690.jpg resize: (2160, 3264) 1349220338 -2.1068937022931418 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8.jpg resize: (2160, 3264) 1349220330 -2.900640462407737 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660738_0.png resize: (410, 69) 1349262433 -0.4110095439579742 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660740_0.png resize: (145, 137) 1349262434 -2.2022023927450824 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660736_0.png resize: (120, 128) 1349262436 -1.1640908056036054 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660734_0.png resize: (107, 197) 1349262438 -2.7638556021275464 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660731_0.png resize: (221, 236) 1349262439 -1.2431743168679381 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660732_0.png resize: (277, 328) 1349262440 -2.293787362737269 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660741_0.png resize: (229, 502) 1349262443 0.07608162043661569 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660735_0.png resize: (352, 197) 1349262444 -3.5877153421776655 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660739_0.png resize: (226, 62) 1349262445 -2.3208300299145206 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660742_0.png resize: (302, 497) 1349262448 -2.9653178724691887 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660737_0.png resize: (528, 263) 1349262449 -2.1389032360623528 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660765_0.png resize: (370, 406) 1349262450 -1.6494934757465258 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660760_0.png resize: (423, 265) 1349262453 -1.9717112230730496 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660757_0.png resize: (306, 454) 1349262454 -2.226020290551659 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660749_0.png resize: (214, 125) 1349262455 -1.8001498017193787 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660750_0.png resize: (140, 357) 1349262458 -1.3302411128717964 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660748_0.png resize: (315, 98) 1349262459 -2.3164892815052984 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660761_0.png resize: (263, 152) 1349262460 -2.8346903416633378 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660759_0.png resize: (401, 397) 1349262461 -1.915502547511707 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660768_0.png resize: (179, 256) 1349262464 -3.9133778496236324 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660763_0.png resize: (210, 230) 1349262465 -1.8443726916995027 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660758_0.png resize: (330, 134) 1349262466 -3.1801731986157273 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660753_0.png resize: (165, 75) 1349262469 -3.23362949308129 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660772_0.png resize: (342, 119) 1349262470 -1.0974747927234927 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660777_0.png resize: (293, 211) 1349262471 -1.8862933918030427 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660775_0.png resize: (204, 164) 1349262474 -1.5303143081584305 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660771_0.png resize: (479, 619) 1349262475 -1.2836970054165813 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660783_0.png resize: (129, 228) 1349262476 0.6005351683045482 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660791_0.png resize: (192, 95) 1349262478 -1.9322232434056028 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660773_0.png resize: (71, 192) 1349262480 0.9836559534645181 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660781_0.png resize: (237, 411) 1349262481 1.384049215510134 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660769_0.png resize: (410, 203) 1349262482 0.6952997142790126 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660780_0.png resize: (74, 56) 1349262485 2.137792218681067 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660779_0.png resize: (62, 58) 1349262486 0.2022717501905281 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660790_0.png resize: (131, 103) 1349262487 -0.6210649434083171 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660786_0.png resize: (238, 136) 1349262490 -1.0395888190863933 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660785_0.png resize: (181, 172) 1349262491 2.4544561194270926 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660774_0.png resize: (268, 156) 1349262492 -0.9910324668158711 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660770_0.png resize: (455, 595) 1349262495 -0.6432686785446692 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660797_0.png resize: (380, 110) 1349262496 -2.709086314668988 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660809_0.png resize: (206, 308) 1349262497 -4.379119074809651 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660808_0.png resize: (223, 176) 1349262498 -3.325169825082265 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660803_0.png resize: (199, 98) 1349262501 -3.665126314476131 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660796_0.png resize: (316, 485) 1349262502 -4.162627478749604 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660804_0.png resize: (77, 184) 1349262504 -2.9629008401005428 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660806_0.png resize: (180, 264) 1349262507 -4.0195079755794545 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660793_0.png resize: (427, 202) 1349262508 -4.682576558472881 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660799_0.png resize: (78, 48) 1349262509 -3.006704712025713 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660801_0.png resize: (257, 173) 1349262510 -5.885828382175497 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660795_0.png resize: (270, 145) 1349262513 -5.176894093933642 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660794_0.png resize: (416, 592) 1349262514 -3.8712293472856047 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660798_0.png resize: (329, 350) 1349262515 -3.091700275191933 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660814_0.png resize: (160, 166) 1349262518 -1.438440044279717 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660811_0.png resize: (125, 160) 1349262519 1.0291624303056206 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660822_0.png resize: (173, 290) 1349262520 -1.1110881828528492 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660810_0.png resize: (101, 164) 1349262521 -2.350222293070039 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660815_0.png resize: (344, 298) 1349262524 -2.6793695047147796 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660819_0.png resize: (125, 95) 1349262525 -0.9486263290565933 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660818_0.png resize: (257, 377) 1349262526 -1.9648903912167206 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660821_0.png resize: (147, 119) 1349262529 -0.983188452757684 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660823_0.png resize: (123, 105) 1349262530 -3.2264716528880486 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660817_0.png resize: (165, 121) 1349262531 -0.15767987451934332 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660813_0.png resize: (165, 249) 1349262533 -0.5753113336821822 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660824_0.png resize: (185, 276) 1349262535 -2.3864865165839833 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660812_0.png resize: (268, 215) 1349262536 -1.532977587140814 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660825_0.png resize: (300, 429) 1349262538 -1.5437604390571706 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660826_0.png resize: (235, 149) 1349262541 -2.685741503109834 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660834_0.png resize: (191, 179) 1349262542 -1.8249609636777382 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660837_0.png resize: (155, 133) 1349262543 -1.5889071009064153 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660842_0.png resize: (168, 158) 1349262547 -2.6530590462101604 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660836_0.png resize: (129, 112) 1349262548 -0.6407426113429292 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660831_0.png resize: (122, 185) 1349262549 -0.6160738283423263 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660844_0.png resize: (130, 159) 1349262552 -1.4830100365605312 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660845_0.png resize: (112, 114) 1349262553 -1.3496437822443226 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660846_0.png resize: (520, 663) 1349262554 -0.3102781132717209 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660838_0.png resize: (175, 203) 1349262556 -1.2105828096820102 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660828_0.png resize: (396, 521) 1349262558 -2.143957535818039 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660835_0.png resize: (214, 227) 1349262559 -1.9323701552446015 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660843_0.png resize: (279, 112) 1349262561 -1.2484869624938484 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660840_0.png resize: (96, 199) 1349262563 -3.621688221080657 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660841_0.png resize: (174, 208) 1349262564 -2.8465511376007115 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660827_0.png resize: (187, 220) 1349262565 -1.9749394112419087 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660830_0.png resize: (311, 172) 1349262567 -2.3759097144482073 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660851_0.png resize: (270, 164) 1349262569 -1.5434546871856738 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660863_0.png resize: (252, 184) 1349262570 -2.031742084320262 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660849_0.png resize: (137, 109) 1349262572 -1.1146598005666888 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660858_0.png resize: (368, 283) 1349262574 -2.0932899226375743 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660860_0.png resize: (194, 209) 1349262575 -2.112065700603717 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660866_0.png resize: (237, 527) 1349262576 -1.71797197752268 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660859_0.png resize: (310, 145) 1349262578 -0.9980679182105439 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660861_0.png resize: (170, 149) 1349262580 0.11537693831527054 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660848_0.png resize: (220, 217) 1349262581 -1.6961187706430683 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660855_0.png resize: (127, 97) 1349262584 -2.4971774277850587 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660850_0.png resize: (227, 197) 1349262585 -2.2018507957096722 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660854_0.png resize: (308, 187) 1349262586 -1.3268925740348152 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660847_0.png resize: (171, 106) 1349262588 0.2224329127841499 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660857_0.png resize: (508, 483) 1349262590 -2.192692565587048 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660875_0.png resize: (107, 145) 1349262591 -2.5483110324424545 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660876_0.png resize: (217, 273) 1349262593 -3.5277427830446926 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660870_0.png resize: (182, 137) 1349262595 -1.7650677716237129 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660877_0.png resize: (146, 148) 1349262596 -2.054979396297492 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660878_0.png resize: (125, 185) 1349262598 -1.6488562445162562 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660874_0.png resize: (302, 187) 1349262600 -2.674839377049805 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660882_0.png resize: (147, 95) 1349262601 -2.154327943955473 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660879_0.png resize: (119, 118) 1349262603 -2.654034577774616 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660872_0.png resize: (117, 172) 1349262604 -2.949872780364347 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660889_0.png resize: (104, 146) 1349262606 -3.028649362755863 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660868_0.png resize: (192, 157) 1349262608 -2.4347460398532443 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660873_0.png resize: (112, 256) 1349262609 -3.111614769970924 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660885_0.png resize: (132, 286) 1349262611 -3.183707887455837 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660881_0.png resize: (92, 84) 1349262612 -3.194373875792363 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660890_0.png resize: (100, 107) 1349262614 -2.75989456324441 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660883_0.png resize: (96, 180) 1349262615 -2.6144799575114286 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660888_0.png resize: (98, 75) 1349262618 0.754078866916133 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660867_0.png resize: (102, 143) 1349262619 -2.963893604966956 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660869_0.png resize: (74, 146) 1349262620 -1.741894893558151 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660755_0.png resize: (90, 237) 1349262694 -3.789999083811913 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660788_0.png resize: (230, 202) 1349262695 -1.0176540550761586 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660805_0.png resize: (226, 200) 1349262698 -2.876915886148385 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660832_0.png resize: (92, 151) 1349262699 -1.7480193701956113 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660829_0.png resize: (151, 149) 1349262701 -0.6579449563789623 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660853_0.png resize: (261, 264) 1349262702 -0.2889056745852016 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660852_0.png resize: (163, 246) 1349262704 -0.874545477654934 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660856_0.png resize: (315, 336) 1349262706 -3.505347561807951 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660862_0.png resize: (181, 187) 1349262707 -0.02103245979354646 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660864_0.png resize: (137, 392) 1349262709 -0.09831528906453578 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660871_0.png resize: (104, 176) 1349262710 -1.0920434669015755 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660886_0.png resize: (126, 67) 1349262712 -3.4207160032107375 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660884_0.png resize: (202, 67) 1349262713 -1.8512682515879644 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660880_0.png resize: (248, 297) 1349262716 -2.6771161085326156 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660891_0.png resize: (137, 111) 1349262717 -3.2064612371206915 treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660887_0.png resize: (65, 57) 1349262718 -1.6242023745082965 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660745_0.png resize: (257, 402) 1349263009 -2.445302242148374 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660744_0.png resize: (664, 280) 1349263011 -1.614205701288956 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660746_0.png resize: (171, 495) 1349263017 0.29596202240766817 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660733_0.png resize: (418, 353) 1349263023 -1.0945073305258655 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660754_0.png resize: (282, 134) 1349263025 -4.463883778240142 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660767_0.png resize: (261, 219) 1349263027 -2.4040238109155485 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660762_0.png resize: (226, 301) 1349263031 -3.055788844534466 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660747_0.png resize: (223, 503) 1349263032 -2.1888738162820065 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660752_0.png resize: (159, 200) 1349263033 -4.2736548432293695 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660751_0.png resize: (174, 330) 1349263036 -2.2185584850278564 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660756_0.png resize: (210, 318) 1349263037 -3.5941815476815933 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660766_0.png resize: (473, 290) 1349263038 -0.564085755327847 treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660764_0.png resize: (160, 100) 1349263041 -0.3016752069216288 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660778_0.png resize: (236, 427) 1349263042 1.2851679818749875 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660782_0.png resize: (249, 101) 1349263044 -0.21658579386556737 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660784_0.png resize: (117, 130) 1349263045 1.6498375758135653 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660789_0.png resize: (261, 161) 1349263048 -0.22458046500697887 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660776_0.png resize: (361, 459) 1349263049 -1.814519733211123 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660787_0.png resize: (361, 416) 1349263051 -1.3438311896363755 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660807_0.png resize: (241, 443) 1349263053 -1.816003727513038 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660802_0.png resize: (361, 455) 1349263054 -5.302719101406412 treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660800_0.png resize: (342, 406) 1349263055 -5.3979113992802885 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660820_0.png resize: (151, 186) 1349263057 -3.1014866018491882 treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660816_0.png resize: (256, 478) 1349263059 -2.477480485784554 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660833_0.png resize: (330, 328) 1349263062 -2.2731312475966337 treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660839_0.png resize: (561, 718) 1349263065 -0.9948576675290143 treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660865_0.png resize: (140, 169) 1349263067 -5.097626015810291 treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660743_0.png resize: (148, 167) 1349263089 -1.585714802683557 treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660792_0.png resize: (92, 131) 1349263109 -0.761602256170386 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 169 time used for this insertion : 0.019052743911743164 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 169 time used for this insertion : 0.03209090232849121 save missing photos in datou_result : time spend for datou_step_exec : 30.611902236938477 time spend to save output : 0.056386470794677734 total time spend for step 6 : 30.668288707733154 step7:brightness Tue Apr 1 12:05:45 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33.jpg treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602.jpg treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d.jpg treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604.jpg treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c.jpg treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca.jpg treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690.jpg treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8.jpg treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660738_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660740_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660736_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660734_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660731_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660732_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660741_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660735_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660739_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660742_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660737_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660765_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660760_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660757_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660749_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660750_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660748_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660761_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660759_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660768_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660763_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660758_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660753_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660772_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660777_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660775_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660771_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660783_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660791_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660773_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660781_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660769_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660780_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660779_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660790_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660786_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660785_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660774_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660770_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660797_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660809_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660808_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660803_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660796_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660804_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660806_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660793_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660799_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660801_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660795_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660794_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660798_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660814_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660811_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660822_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660810_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660815_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660819_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660818_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660821_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660823_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660817_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660813_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660824_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660812_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660825_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660826_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660834_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660837_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660842_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660836_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660831_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660844_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660845_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660846_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660838_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660828_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660835_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660843_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660840_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660841_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660827_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660830_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660851_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660863_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660849_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660858_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660860_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660866_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660859_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660861_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660848_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660855_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660850_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660854_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660847_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660857_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660875_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660876_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660870_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660877_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660878_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660874_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660882_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660879_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660872_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660889_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660868_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660873_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660885_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660881_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660890_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660883_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660888_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660867_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660869_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660755_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660788_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660805_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660832_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660829_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660853_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660852_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660856_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660862_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660864_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660871_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660886_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660884_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660880_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660891_0.png treat image : temp/1743501630_491570_1349220330_d08ae57d7f822b7ad827f6b739c376f8_rle_crop_3742660887_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660745_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660744_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660746_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660733_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660754_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660767_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660762_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660747_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660752_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660751_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660756_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660766_0.png treat image : temp/1743501630_491570_1349221002_ec149d90cc31d2bd12bf56db59f1f602_rle_crop_3742660764_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660778_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660782_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660784_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660789_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660776_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660787_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660807_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660802_0.png treat image : temp/1743501630_491570_1349220944_e39e101c0969c0f43403fe531eed2604_rle_crop_3742660800_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660820_0.png treat image : temp/1743501630_491570_1349220354_3a373627413c41377253fae775c4fa1c_rle_crop_3742660816_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660833_0.png treat image : temp/1743501630_491570_1349220350_c9c912a602a71aba036214c6501f05ca_rle_crop_3742660839_0.png treat image : temp/1743501630_491570_1349220338_c58a643d588d5c17b30a711bc1dad690_rle_crop_3742660865_0.png treat image : temp/1743501630_491570_1349221010_d195529bed81603d49ec9fd902c95b33_rle_crop_3742660743_0.png treat image : temp/1743501630_491570_1349220967_0217a1d7fa385129c27c741f3b27422d_rle_crop_3742660792_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 169 time used for this insertion : 0.21611523628234863 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 169 time used for this insertion : 0.03023982048034668 save missing photos in datou_result : time spend for datou_step_exec : 7.354450941085815 time spend to save output : 0.2514338493347168 total time spend for step 7 : 7.605884790420532 step8:velours_tree Tue Apr 1 12:05:52 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.09474968910217285 time spend to save output : 4.982948303222656e-05 total time spend for step 8 : 0.09479951858520508 step9:send_mail_cod Tue Apr 1 12:05:53 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P21943741_01-04-2025_12_05_53.pdf 21943925 change filename to text .imagette219439251743501953 21943926 imagette219439261743501953 21943927 imagette219439271743501953 21943928 imagette219439281743501953 21943929 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette219439291743501953 21943930 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette219439301743501954 21943932 imagette219439321743501955 21943933 change filename to text .imagette219439331743501955 21943934 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette219439341743501956 21943935 imagette219439351743501957 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=21943741 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/21943925,21943926,21943927,21943928,21943929,21943930,21943931,21943932,21943933,21943934,21943935?tags=autre,metal,pehd,mal_croppe,carton,pet_clair,environnement,background,pet_fonce,papier,flou args[1349221010] : ((1349221010, -3.0587747755477532, 492609224), (1349221010, -0.12613090831379253, 496442774), '0.10793116617306646') We are sending mail with results at report@fotonower.com args[1349221002] : ((1349221002, -3.332728741005224, 492609224), (1349221002, -0.11634249288535638, 496442774), '0.10793116617306646') We are sending mail with results at report@fotonower.com args[1349220967] : ((1349220967, 0.7019138776144819, 492688767), (1349220967, 0.21482270802831457, 2107752395), '0.10793116617306646') We are sending mail with results at report@fotonower.com args[1349220944] : ((1349220944, -6.270105350744483, 492609224), (1349220944, 0.1251817269129396, 2107752395), '0.10793116617306646') We are sending mail with results at report@fotonower.com args[1349220354] : ((1349220354, -2.8564958207661935, 492609224), (1349220354, -0.14696940148488116, 496442774), '0.10793116617306646') We are sending mail with results at report@fotonower.com args[1349220350] : ((1349220350, -2.825570106317873, 492609224), (1349220350, -0.10387728672532069, 496442774), '0.10793116617306646') We are sending mail with results at report@fotonower.com args[1349220338] : ((1349220338, -2.1068937022931418, 492609224), (1349220338, -0.3846777688943911, 496442774), '0.10793116617306646') We are sending mail with results at report@fotonower.com args[1349220330] : ((1349220330, -2.900640462407737, 492609224), (1349220330, -0.6208420763100734, 501862349), '0.10793116617306646') We are sending mail with results at report@fotonower.com refus_total : 0.10793116617306646 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=21943741 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 SELECT photo_id, url FROM MTRBack.photos ph WHERE photo_id IN (1349221002,1349220330,1349220338,1349220350,1349220354,1349220944,1349220967,1349221010) Found this number of photos: 8 begin to download photo : 1349221002 begin to download photo : 1349220338 begin to download photo : 1349220354 begin to download photo : 1349220967 download finish for photo 1349220338 begin to download photo : 1349220350 download finish for photo 1349220967 begin to download photo : 1349221010 download finish for photo 1349221002 begin to download photo : 1349220330 download finish for photo 1349220354 begin to download photo : 1349220944 download finish for photo 1349221010 download finish for photo 1349220350 download finish for photo 1349220330 download finish for photo 1349220944 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P21943741_01-04-2025_12_05_53.pdf results_Auto_P21943741_01-04-2025_12_05_53.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P21943741_01-04-2025_12_05_53.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','21943741','results_Auto_P21943741_01-04-2025_12_05_53.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P21943741_01-04-2025_12_05_53.pdf','pdf','','0.56','0.10793116617306646') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/21943741

https://www.fotonower.com/image?json=false&list_photos_id=1349221010
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1349221002
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1349220967
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1349220944
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1349220354
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1349220350
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1349220338
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1349220330
Bravo, la photo est bien prise.

Dans ces conditions,le taux de refus est: 10.79%
Veuillez trouver les photos des contaminants.

exemples de contaminants: autre: https://www.fotonower.com/view/21943925?limit=200
exemples de contaminants: carton: https://www.fotonower.com/view/21943929?limit=200
exemples de contaminants: pet_clair: https://www.fotonower.com/view/21943930?limit=200
exemples de contaminants: pet_fonce: https://www.fotonower.com/view/21943933?limit=200
exemples de contaminants: papier: https://www.fotonower.com/view/21943934?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P21943741_01-04-2025_12_05_53.pdf.

Lien vers velours :https://www.fotonower.com/velours/21943925,21943926,21943927,21943928,21943929,21943930,21943931,21943932,21943933,21943934,21943935?tags=autre,metal,pehd,mal_croppe,carton,pet_clair,environnement,background,pet_fonce,papier,flou.


L'équipe Fotonower 202 b'' Server: nginx Date: Tue, 01 Apr 2025 10:06:00 GMT Content-Length: 0 Connection: close X-Message-Id: aQ5ckgDQRCCBmLWCB_xMcg Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1349221010, 1349221002, 1349220967, 1349220944, 1349220354, 1349220350, 1349220338, 1349220330] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221010', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221002', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220967', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220944', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220354', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220350', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220338', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220330', None, None, None, None, None, '2712765') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 8 time used for this insertion : 0.012914657592773438 save_final save missing photos in datou_result : time spend for datou_step_exec : 7.627970457077026 time spend to save output : 0.013174295425415039 total time spend for step 9 : 7.641144752502441 step10:split_time_score Tue Apr 1 12:06:00 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('07', 8),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 01042025 21943741 Nombre de photos uploadées : 8 / 23040 (0%) 01042025 21943741 Nombre de photos taguées (types de déchets): 0 / 8 (0%) 01042025 21943741 Nombre de photos taguées (volume) : 0 / 8 (0%) elapsed_time : load_data_split_time_score 1.6689300537109375e-06 elapsed_time : order_list_meta_photo_and_scores 5.0067901611328125e-06 ???????? elapsed_time : fill_and_build_computed_from_old_data 0.0004944801330566406 elapsed_time : insert_dashboard_record_day_entry 0.022923946380615234 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.19774587565145885 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P21941555_01-04-2025_10_34_21.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 21941555 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=21941555 AND mptpi.`type`=3594 To do Qualite : 0.10793116617306646 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P21943741_01-04-2025_12_05_53.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 21943741 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=21943741 AND mptpi.`type`=3594 To do Qualite : 0.06586668547792947 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P21943744_01-04-2025_12_02_21.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 21943744 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 11449 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 11452 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Step 11452 crop_condition have less outputs used (2) than in the step definition (3) : some outputs may be not used ! Step 11453 merge_mask_thcl_custom have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11453 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition ! WARNING : number of inputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11454 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! Step 11478 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 11478 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 11456 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 11455 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 11455 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 11458 send_mail_cod have less inputs used (3) than in the step definition (5) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! WARNING : type of output 2 of step 11449 doesn't seem to be define in the database( WARNING : type of input 2 of step 11452 doesn't seem to be define in the database( WARNING : output 1 of step 11449 have datatype=2 whereas input 1 of step 11453 have datatype=7 WARNING : type of output 2 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11454 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 3 of step 11453 doesn't seem to be define in the database( WARNING : type of input 1 of step 11456 doesn't seem to be define in the database( WARNING : type of output 1 of step 11456 doesn't seem to be define in the database( WARNING : type of input 3 of step 11455 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 11456 have datatype=10 whereas input 3 of step 11458 have datatype=6 WARNING : type of input 5 of step 11458 doesn't seem to be define in the database( WARNING : output 0 of step 11477 have datatype=11 whereas input 5 of step 11458 have datatype=None WARNING : output 0 of step 11456 have datatype=10 whereas input 0 of step 11477 have datatype=18 WARNING : type of input 2 of step 11478 doesn't seem to be define in the database( WARNING : output 1 of step 11454 have datatype=7 whereas input 2 of step 11478 have datatype=None WARNING : type of output 3 of step 11478 doesn't seem to be define in the database( WARNING : type of input 2 of step 11456 doesn't seem to be define in the database( WARNING : output 0 of step 11453 have datatype=1 whereas input 0 of step 11454 have datatype=2 DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=21943744 AND mptpi.`type`=3726 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'01042025': {'nb_upload': 8, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1349221010, 1349221002, 1349220967, 1349220944, 1349220354, 1349220350, 1349220338, 1349220330] Looping around the photos to save general results len do output : 1 /21943741Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221010', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349221002', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220967', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220944', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220354', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220350', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220338', None, None, None, None, None, '2712765') ('3318', None, None, None, None, None, None, None, '2712765') ('3318', '21943741', '1349220330', None, None, None, None, None, '2712765') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 9 time used for this insertion : 0.3633604049682617 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.3661038875579834 time spend to save output : 0.36362552642822266 total time spend for step 10 : 0.729729413986206 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 8 set_done_treatment 130.43user 126.93system 5:35.08elapsed 76%CPU (0avgtext+0avgdata 3021720maxresident)k 1395376inputs+93896outputs (26158major+13596000minor)pagefaults 0swaps