python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 2691908 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['3410775'] with mtr_portfolio_ids : ['25543287'] and first list_photo_ids : [] new path : /proc/2691908/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , BFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 26 ; length of list_pids : 26 ; length of list_args : 26 time to download the photos : 3.4673101902008057 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Thu Jul 31 14:10:32 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 8606 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-07-31 14:10:35.382674: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-07-31 14:10:35.407397: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493035000 Hz 2025-07-31 14:10:35.409481: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f71fc000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-07-31 14:10:35.409517: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-07-31 14:10:35.413297: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-07-31 14:10:35.543625: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x2651c450 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-07-31 14:10:35.543672: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-07-31 14:10:35.545093: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-31 14:10:35.545475: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-31 14:10:35.548393: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-31 14:10:35.551233: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-31 14:10:35.551713: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-31 14:10:35.554357: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-31 14:10:35.555608: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-31 14:10:35.560766: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-31 14:10:35.562435: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-31 14:10:35.562519: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-31 14:10:35.563467: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-07-31 14:10:35.563497: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-07-31 14:10:35.563509: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-07-31 14:10:35.564911: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 7947 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-07-31 14:10:35.818181: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-31 14:10:35.818283: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-31 14:10:35.818311: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-31 14:10:35.818336: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-31 14:10:35.818361: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-31 14:10:35.818386: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-31 14:10:35.818412: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-31 14:10:35.818437: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-31 14:10:35.820412: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-31 14:10:35.821552: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-07-31 14:10:35.821586: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-07-31 14:10:35.821601: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-31 14:10:35.821615: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-07-31 14:10:35.821629: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-07-31 14:10:35.821643: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-07-31 14:10:35.821656: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-07-31 14:10:35.821670: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-31 14:10:35.822822: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-07-31 14:10:35.822848: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-07-31 14:10:35.822856: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-07-31 14:10:35.822863: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-07-31 14:10:35.824113: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 7947 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-07-31 14:10:44.587824: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-07-31 14:10:44.804845: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-07-31 14:10:46.156975: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.157713: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.60G (3865470464 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.158319: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 3.24G (3478923264 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.158938: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 2.92G (3131030784 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.899113: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.899240: W tensorflow/core/common_runtime/bfc_allocator.cc:311] Garbage collection: deallocate free memory regions (i.e., allocations) so that we can re-allocate a larger region to avoid OOM due to memory fragmentation. If you see this message frequently, you are running near the threshold of the available device memory and re-allocation may incur great performance overhead. You may try smaller batch sizes to observe the performance impact. Set TF_ENABLE_GPU_GARBAGE_COLLECTION=false if you'd like to disable this feature. 2025-07-31 14:10:46.963048: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.963222: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.67GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:46.964551: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.964601: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.67GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:46.973350: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.973446: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 3.29GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:46.974361: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.974395: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 3.29GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:46.981765: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.981836: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.78GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:46.982546: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:46.982570: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 1.78GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:47.031658: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.031727: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 63.85MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:47.032433: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.032457: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 63.85MiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:47.033167: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.033189: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.26GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:47.033873: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.033895: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.26GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available. 2025-07-31 14:10:47.041974: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.042664: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.058310: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.059305: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.060211: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.061205: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.065808: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.066643: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.067504: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.068283: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.069369: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.069386: W tensorflow/core/kernels/gpu_utils.cc:49] Failed to allocate memory for convolution redzone checking; skipping this check. This is benign and only means that we won't check cudnn for out-of-bounds reads and writes. This message will only be printed once. 2025-07-31 14:10:47.079607: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.080271: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.089332: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.090010: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.090703: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.091388: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.092095: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory 2025-07-31 14:10:47.092767: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 26 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 34.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 8 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 13 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 23.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 23.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 12.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 23.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 14 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 21.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 12 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 35.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 12 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 26.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 20.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 20.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 31.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 22.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 5 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 14.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 41.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 12 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 25.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 Detection mask done ! Trying to reset tf kernel 2692649 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 3789 tf kernel not reseted sub process len(results) : 26 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 26 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 4982 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.0008137226104736328 nb_pixel_total : 39059 time to create 1 rle with old method : 0.04375147819519043 length of segment : 307 time for calcul the mask position with numpy : 0.0015897750854492188 nb_pixel_total : 89391 time to create 1 rle with old method : 0.10420966148376465 length of segment : 564 time for calcul the mask position with numpy : 0.0002570152282714844 nb_pixel_total : 14243 time to create 1 rle with old method : 0.01620197296142578 length of segment : 170 time for calcul the mask position with numpy : 0.00022935867309570312 nb_pixel_total : 11981 time to create 1 rle with old method : 0.013843536376953125 length of segment : 188 time for calcul the mask position with numpy : 0.0001900196075439453 nb_pixel_total : 11742 time to create 1 rle with old method : 0.013175725936889648 length of segment : 148 time for calcul the mask position with numpy : 0.0003249645233154297 nb_pixel_total : 20612 time to create 1 rle with old method : 0.02291417121887207 length of segment : 212 time for calcul the mask position with numpy : 0.0002033710479736328 nb_pixel_total : 8338 time to create 1 rle with old method : 0.009740114212036133 length of segment : 114 time for calcul the mask position with numpy : 0.0002720355987548828 nb_pixel_total : 19992 time to create 1 rle with old method : 0.023212909698486328 length of segment : 216 time for calcul the mask position with numpy : 0.00017976760864257812 nb_pixel_total : 10739 time to create 1 rle with old method : 0.012439489364624023 length of segment : 139 time for calcul the mask position with numpy : 0.00012826919555664062 nb_pixel_total : 6776 time to create 1 rle with old method : 0.007831335067749023 length of segment : 114 time for calcul the mask position with numpy : 0.0001766681671142578 nb_pixel_total : 10464 time to create 1 rle with old method : 0.011636972427368164 length of segment : 152 time for calcul the mask position with numpy : 0.0002338886260986328 nb_pixel_total : 9432 time to create 1 rle with old method : 0.010749101638793945 length of segment : 178 time for calcul the mask position with numpy : 6.890296936035156e-05 nb_pixel_total : 1994 time to create 1 rle with old method : 0.002549886703491211 length of segment : 37 time for calcul the mask position with numpy : 0.0003154277801513672 nb_pixel_total : 19581 time to create 1 rle with old method : 0.022622108459472656 length of segment : 160 time for calcul the mask position with numpy : 0.00024175643920898438 nb_pixel_total : 11752 time to create 1 rle with old method : 0.013599395751953125 length of segment : 188 time for calcul the mask position with numpy : 8.320808410644531e-05 nb_pixel_total : 2685 time to create 1 rle with old method : 0.003351449966430664 length of segment : 44 time for calcul the mask position with numpy : 9.1552734375e-05 nb_pixel_total : 4056 time to create 1 rle with old method : 0.0048410892486572266 length of segment : 62 time for calcul the mask position with numpy : 0.00011444091796875 nb_pixel_total : 6135 time to create 1 rle with old method : 0.0072994232177734375 length of segment : 75 time for calcul the mask position with numpy : 0.00013947486877441406 nb_pixel_total : 7588 time to create 1 rle with old method : 0.008552074432373047 length of segment : 104 time for calcul the mask position with numpy : 0.0002143383026123047 nb_pixel_total : 13682 time to create 1 rle with old method : 0.015615224838256836 length of segment : 166 time for calcul the mask position with numpy : 0.0001697540283203125 nb_pixel_total : 9682 time to create 1 rle with old method : 0.011204004287719727 length of segment : 130 time for calcul the mask position with numpy : 0.001508951187133789 nb_pixel_total : 115159 time to create 1 rle with old method : 0.12840652465820312 length of segment : 566 time for calcul the mask position with numpy : 0.0014750957489013672 nb_pixel_total : 116876 time to create 1 rle with old method : 0.13205599784851074 length of segment : 553 time for calcul the mask position with numpy : 0.00011467933654785156 nb_pixel_total : 4237 time to create 1 rle with old method : 0.005234718322753906 length of segment : 67 time for calcul the mask position with numpy : 0.010966777801513672 nb_pixel_total : 709816 time to create 1 rle with new method : 0.02089524269104004 length of segment : 911 time for calcul the mask position with numpy : 0.00020241737365722656 nb_pixel_total : 9615 time to create 1 rle with old method : 0.011177539825439453 length of segment : 102 time for calcul the mask position with numpy : 0.0016207695007324219 nb_pixel_total : 114811 time to create 1 rle with old method : 0.13721060752868652 length of segment : 547 time for calcul the mask position with numpy : 0.00021076202392578125 nb_pixel_total : 9845 time to create 1 rle with old method : 0.011866569519042969 length of segment : 104 time for calcul the mask position with numpy : 7.43865966796875e-05 nb_pixel_total : 1537 time to create 1 rle with old method : 0.0019583702087402344 length of segment : 44 time for calcul the mask position with numpy : 0.0001220703125 nb_pixel_total : 3411 time to create 1 rle with old method : 0.00432586669921875 length of segment : 58 time for calcul the mask position with numpy : 0.00029587745666503906 nb_pixel_total : 12754 time to create 1 rle with old method : 0.014908075332641602 length of segment : 189 time for calcul the mask position with numpy : 0.0002453327178955078 nb_pixel_total : 12951 time to create 1 rle with old method : 0.015456438064575195 length of segment : 112 time for calcul the mask position with numpy : 0.0005283355712890625 nb_pixel_total : 10979 time to create 1 rle with old method : 0.013530492782592773 length of segment : 186 time for calcul the mask position with numpy : 0.0007917881011962891 nb_pixel_total : 24279 time to create 1 rle with old method : 0.028982162475585938 length of segment : 225 time for calcul the mask position with numpy : 0.0004754066467285156 nb_pixel_total : 23156 time to create 1 rle with old method : 0.027497053146362305 length of segment : 137 time for calcul the mask position with numpy : 0.00018596649169921875 nb_pixel_total : 2894 time to create 1 rle with old method : 0.0034799575805664062 length of segment : 88 time for calcul the mask position with numpy : 0.001615285873413086 nb_pixel_total : 56843 time to create 1 rle with old method : 0.06611418724060059 length of segment : 207 time for calcul the mask position with numpy : 0.0012292861938476562 nb_pixel_total : 33538 time to create 1 rle with old method : 0.03863239288330078 length of segment : 319 time for calcul the mask position with numpy : 0.0003936290740966797 nb_pixel_total : 6439 time to create 1 rle with old method : 0.00755763053894043 length of segment : 183 time for calcul the mask position with numpy : 0.0004451274871826172 nb_pixel_total : 17306 time to create 1 rle with old method : 0.020077228546142578 length of segment : 202 time for calcul the mask position with numpy : 0.0004887580871582031 nb_pixel_total : 12593 time to create 1 rle with old method : 0.014561653137207031 length of segment : 190 time for calcul the mask position with numpy : 0.000431060791015625 nb_pixel_total : 13932 time to create 1 rle with old method : 0.016418933868408203 length of segment : 174 time for calcul the mask position with numpy : 0.0005168914794921875 nb_pixel_total : 10733 time to create 1 rle with old method : 0.012414216995239258 length of segment : 177 time for calcul the mask position with numpy : 0.0001595020294189453 nb_pixel_total : 2197 time to create 1 rle with old method : 0.0027446746826171875 length of segment : 56 time for calcul the mask position with numpy : 0.013891220092773438 nb_pixel_total : 711645 time to create 1 rle with new method : 0.020516395568847656 length of segment : 997 time for calcul the mask position with numpy : 0.0003428459167480469 nb_pixel_total : 4718 time to create 1 rle with old method : 0.008082389831542969 length of segment : 104 time for calcul the mask position with numpy : 0.0004572868347167969 nb_pixel_total : 12826 time to create 1 rle with old method : 0.02169036865234375 length of segment : 121 time for calcul the mask position with numpy : 0.0020859241485595703 nb_pixel_total : 86428 time to create 1 rle with old method : 0.10052871704101562 length of segment : 485 time for calcul the mask position with numpy : 0.002516031265258789 nb_pixel_total : 105166 time to create 1 rle with old method : 0.13081932067871094 length of segment : 553 time for calcul the mask position with numpy : 0.0004153251647949219 nb_pixel_total : 13190 time to create 1 rle with old method : 0.015285015106201172 length of segment : 163 time for calcul the mask position with numpy : 0.0003039836883544922 nb_pixel_total : 11812 time to create 1 rle with old method : 0.016393423080444336 length of segment : 181 time for calcul the mask position with numpy : 0.014304161071777344 nb_pixel_total : 719842 time to create 1 rle with new method : 0.02406930923461914 length of segment : 947 time for calcul the mask position with numpy : 0.00040650367736816406 nb_pixel_total : 9889 time to create 1 rle with old method : 0.011474609375 length of segment : 172 time for calcul the mask position with numpy : 0.009111881256103516 nb_pixel_total : 679481 time to create 1 rle with new method : 0.014917373657226562 length of segment : 948 time for calcul the mask position with numpy : 0.0005779266357421875 nb_pixel_total : 17827 time to create 1 rle with old method : 0.020644664764404297 length of segment : 205 time for calcul the mask position with numpy : 0.0007333755493164062 nb_pixel_total : 34915 time to create 1 rle with old method : 0.0396575927734375 length of segment : 239 time for calcul the mask position with numpy : 0.00021529197692871094 nb_pixel_total : 3338 time to create 1 rle with old method : 0.003975629806518555 length of segment : 93 time for calcul the mask position with numpy : 0.00011920928955078125 nb_pixel_total : 4803 time to create 1 rle with old method : 0.0056591033935546875 length of segment : 116 time for calcul the mask position with numpy : 0.010278940200805664 nb_pixel_total : 631309 time to create 1 rle with new method : 0.016466617584228516 length of segment : 1142 time for calcul the mask position with numpy : 0.0014843940734863281 nb_pixel_total : 68445 time to create 1 rle with old method : 0.07776880264282227 length of segment : 320 time for calcul the mask position with numpy : 0.000484466552734375 nb_pixel_total : 15398 time to create 1 rle with old method : 0.017470598220825195 length of segment : 203 time for calcul the mask position with numpy : 0.0001266002655029297 nb_pixel_total : 3418 time to create 1 rle with old method : 0.004236936569213867 length of segment : 48 time for calcul the mask position with numpy : 0.002317667007446289 nb_pixel_total : 108111 time to create 1 rle with old method : 0.13397979736328125 length of segment : 544 time for calcul the mask position with numpy : 0.000514984130859375 nb_pixel_total : 15837 time to create 1 rle with old method : 0.018349885940551758 length of segment : 152 time for calcul the mask position with numpy : 0.0003299713134765625 nb_pixel_total : 5656 time to create 1 rle with old method : 0.006732463836669922 length of segment : 142 time for calcul the mask position with numpy : 0.002226114273071289 nb_pixel_total : 96588 time to create 1 rle with old method : 0.10827898979187012 length of segment : 565 time for calcul the mask position with numpy : 0.0012898445129394531 nb_pixel_total : 58442 time to create 1 rle with old method : 0.0728006362915039 length of segment : 387 time for calcul the mask position with numpy : 0.0004487037658691406 nb_pixel_total : 12316 time to create 1 rle with old method : 0.014344453811645508 length of segment : 117 time for calcul the mask position with numpy : 0.0005578994750976562 nb_pixel_total : 14923 time to create 1 rle with old method : 0.01724076271057129 length of segment : 149 time for calcul the mask position with numpy : 0.0004677772521972656 nb_pixel_total : 13263 time to create 1 rle with old method : 0.01578664779663086 length of segment : 134 time for calcul the mask position with numpy : 0.011451244354248047 nb_pixel_total : 530269 time to create 1 rle with new method : 0.017132282257080078 length of segment : 983 time for calcul the mask position with numpy : 0.0007195472717285156 nb_pixel_total : 27635 time to create 1 rle with old method : 0.03165888786315918 length of segment : 342 time for calcul the mask position with numpy : 0.002213001251220703 nb_pixel_total : 114851 time to create 1 rle with old method : 0.1289958953857422 length of segment : 543 time for calcul the mask position with numpy : 0.0002894401550292969 nb_pixel_total : 10885 time to create 1 rle with old method : 0.012812376022338867 length of segment : 126 time for calcul the mask position with numpy : 0.00027441978454589844 nb_pixel_total : 6872 time to create 1 rle with old method : 0.008242368698120117 length of segment : 165 time for calcul the mask position with numpy : 0.00020122528076171875 nb_pixel_total : 4587 time to create 1 rle with old method : 0.005552053451538086 length of segment : 109 time for calcul the mask position with numpy : 0.00019073486328125 nb_pixel_total : 4523 time to create 1 rle with old method : 0.005477190017700195 length of segment : 84 time for calcul the mask position with numpy : 0.001954317092895508 nb_pixel_total : 95917 time to create 1 rle with old method : 0.10867452621459961 length of segment : 489 time for calcul the mask position with numpy : 0.005013227462768555 nb_pixel_total : 245355 time to create 1 rle with new method : 0.006906270980834961 length of segment : 670 time for calcul the mask position with numpy : 0.0014142990112304688 nb_pixel_total : 36171 time to create 1 rle with old method : 0.0449070930480957 length of segment : 191 time for calcul the mask position with numpy : 0.00135040283203125 nb_pixel_total : 62786 time to create 1 rle with old method : 0.07073783874511719 length of segment : 253 time for calcul the mask position with numpy : 0.000484466552734375 nb_pixel_total : 18389 time to create 1 rle with old method : 0.021491527557373047 length of segment : 154 time for calcul the mask position with numpy : 0.0005025863647460938 nb_pixel_total : 13866 time to create 1 rle with old method : 0.01635575294494629 length of segment : 177 time for calcul the mask position with numpy : 0.000148773193359375 nb_pixel_total : 2460 time to create 1 rle with old method : 0.0031244754791259766 length of segment : 72 time spent for convertir_results : 5.681774854660034 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 84 chid ids of type : 3594 Number RLEs to save : 22450 save missing photos in datou_result : time spend for datou_step_exec : 38.61137843132019 time spend to save output : 1.4112763404846191 total time spend for step 1 : 40.02265477180481 step2:crop_condition Thu Jul 31 14:11:12 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 26 ! batch 1 Loaded 84 chid ids of type : 3594 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 41 About to insert : list_path_to_insert length 41 new photo from crops ! About to upload 41 photos upload in portfolio : 3736932 init cache_photo without model_param we have 41 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753963876_2691908 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 41 photos in the portfolio 3736932 time of upload the photos Elapsed time : 9.293057203292847 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 12 About to insert : list_path_to_insert length 12 new photo from crops ! About to upload 12 photos upload in portfolio : 3736932 init cache_photo without model_param we have 12 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753963887_2691908 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 12 photos in the portfolio 3736932 time of upload the photos Elapsed time : 3.1951358318328857 we have finished the crop for the class : carton begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 29 About to insert : list_path_to_insert length 29 new photo from crops ! About to upload 29 photos upload in portfolio : 3736932 init cache_photo without model_param we have 29 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753963911_2691908 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 29 photos in the portfolio 3736932 time of upload the photos Elapsed time : 8.037286520004272 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 2 About to insert : list_path_to_insert length 2 new photo from crops ! About to upload 2 photos upload in portfolio : 3736932 init cache_photo without model_param we have 2 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1753963920_2691908 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 2 photos in the portfolio 3736932 time of upload the photos Elapsed time : 1.0288498401641846 we have finished the crop for the class : autre begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1374577059, 1374577055, 1374576883, 1374576881, 1374576879, 1374576877, 1374576875, 1374576791, 1374576790, 1374576776, 1374576753, 1374576745, 1374576649, 1374576624, 1374576599, 1374576575, 1374576486, 1374576484, 1374576481, 1374576477, 1374576474, 1374576464, 1374576443, 1374576417, 1374576400, 1374576398] Looping around the photos to save general results len do output : 84 /1374598282Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598283Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598284Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598285Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598286Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598287Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598288Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598289Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598290Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598291Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598292Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598293Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598294Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598296Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598297Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598298Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598299Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598300Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598301Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598302Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598303Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598304Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598305Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598306Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598307Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598308Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598309Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598310Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598311Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598312Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598313Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598314Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598315Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598316Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598317Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598318Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598319Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598320Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598321Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598322Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598323Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598325Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598326Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598327Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598328Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598329Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598330Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598331Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598332Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598333Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598334Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598335Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598336Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598362Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598363Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598364Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598365Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598366Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598367Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598368Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598369Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598370Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598371Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598372Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598373Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598374Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598375Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598376Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598377Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598378Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598379Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598380Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598381Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598382Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598383Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598384Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598385Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598386Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598387Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598388Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598389Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598390Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598403Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1374598405Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577059', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577055', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576883', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576881', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576879', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576877', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576875', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576791', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576790', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576776', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576753', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576745', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576649', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576624', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576599', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576575', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576486', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576484', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576481', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576477', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576474', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576464', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576443', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576417', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576400', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576398', None, None, None, None, None, '3410775') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 278 time used for this insertion : 0.026644468307495117 save_final save missing photos in datou_result : time spend for datou_step_exec : 48.81835865974426 time spend to save output : 0.030324697494506836 total time spend for step 2 : 48.84868335723877 step3:rle_unique_nms_with_priority Thu Jul 31 14:12:01 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 84 chid ids of type : 3594 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 1.1274409294128418 time for calcul the mask position with numpy : 0.24883794784545898 nb_pixel_total : 1918926 time to create 1 rle with new method : 0.09971427917480469 time for calcul the mask position with numpy : 0.010080814361572266 nb_pixel_total : 11981 time to create 1 rle with old method : 0.013010025024414062 time for calcul the mask position with numpy : 0.0059871673583984375 nb_pixel_total : 14243 time to create 1 rle with old method : 0.015317201614379883 time for calcul the mask position with numpy : 0.006534576416015625 nb_pixel_total : 89391 time to create 1 rle with old method : 0.09621047973632812 time for calcul the mask position with numpy : 0.006642341613769531 nb_pixel_total : 39059 time to create 1 rle with old method : 0.04248642921447754 create new chi : 0.5558645725250244 time to delete rle : 0.027193784713745117 batch 1 Loaded 9 chid ids of type : 3594 +++++++Number RLEs to save : 3538 TO DO : save crop sub photo not yet done ! save time : 0.2658851146697998 nb_obj : 5 nb_hashtags : 3 time to prepare the origin masks : 0.19214200973510742 time for calcul the mask position with numpy : 0.08566951751708984 nb_pixel_total : 2002177 time to create 1 rle with new method : 0.1902751922607422 time for calcul the mask position with numpy : 0.007554531097412109 nb_pixel_total : 10739 time to create 1 rle with old method : 0.0117034912109375 time for calcul the mask position with numpy : 0.005994081497192383 nb_pixel_total : 19992 time to create 1 rle with old method : 0.021462440490722656 time for calcul the mask position with numpy : 0.006022453308105469 nb_pixel_total : 8338 time to create 1 rle with old method : 0.009052276611328125 time for calcul the mask position with numpy : 0.0061190128326416016 nb_pixel_total : 20612 time to create 1 rle with old method : 0.022619247436523438 time for calcul the mask position with numpy : 0.0065534114837646484 nb_pixel_total : 11742 time to create 1 rle with old method : 0.013024330139160156 create new chi : 0.39978861808776855 time to delete rle : 0.00030350685119628906 batch 1 Loaded 11 chid ids of type : 3594 +++++++Number RLEs to save : 2738 TO DO : save crop sub photo not yet done ! save time : 0.20154142379760742 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.06698846817016602 time for calcul the mask position with numpy : 0.020114898681640625 nb_pixel_total : 2044934 time to create 1 rle with new method : 0.30217838287353516 time for calcul the mask position with numpy : 0.00616455078125 nb_pixel_total : 1994 time to create 1 rle with old method : 0.0022182464599609375 time for calcul the mask position with numpy : 0.00609135627746582 nb_pixel_total : 9432 time to create 1 rle with old method : 0.010586261749267578 time for calcul the mask position with numpy : 0.006325244903564453 nb_pixel_total : 10464 time to create 1 rle with old method : 0.011459589004516602 time for calcul the mask position with numpy : 0.006117820739746094 nb_pixel_total : 6776 time to create 1 rle with old method : 0.007690906524658203 create new chi : 0.38586950302124023 time to delete rle : 0.0003745555877685547 batch 1 Loaded 9 chid ids of type : 3594 +++++Number RLEs to save : 2042 TO DO : save crop sub photo not yet done ! save time : 0.16567254066467285 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.0976717472076416 time for calcul the mask position with numpy : 0.21990227699279785 nb_pixel_total : 2042267 time to create 1 rle with new method : 0.11511421203613281 time for calcul the mask position with numpy : 0.0064487457275390625 nb_pixel_total : 11752 time to create 1 rle with old method : 0.013042926788330078 time for calcul the mask position with numpy : 0.006329774856567383 nb_pixel_total : 19581 time to create 1 rle with old method : 0.02190423011779785 create new chi : 0.3938159942626953 time to delete rle : 0.0002903938293457031 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1776 TO DO : save crop sub photo not yet done ! save time : 0.1465303897857666 nb_obj : 3 nb_hashtags : 3 time to prepare the origin masks : 0.14931750297546387 time for calcul the mask position with numpy : 0.5181667804718018 nb_pixel_total : 2060724 time to create 1 rle with new method : 0.10529160499572754 time for calcul the mask position with numpy : 0.006124019622802734 nb_pixel_total : 6135 time to create 1 rle with old method : 0.0068264007568359375 time for calcul the mask position with numpy : 0.006044864654541016 nb_pixel_total : 4056 time to create 1 rle with old method : 0.0046536922454833984 time for calcul the mask position with numpy : 0.006936550140380859 nb_pixel_total : 2685 time to create 1 rle with old method : 0.003053903579711914 create new chi : 0.6675398349761963 time to delete rle : 0.0002968311309814453 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 1442 TO DO : save crop sub photo not yet done ! save time : 0.12763190269470215 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.05191850662231445 time for calcul the mask position with numpy : 0.16852355003356934 nb_pixel_total : 2042648 time to create 1 rle with new method : 0.1264793872833252 time for calcul the mask position with numpy : 0.006593465805053711 nb_pixel_total : 9682 time to create 1 rle with old method : 0.010722160339355469 time for calcul the mask position with numpy : 0.006687641143798828 nb_pixel_total : 13682 time to create 1 rle with old method : 0.015708208084106445 time for calcul the mask position with numpy : 0.0060842037200927734 nb_pixel_total : 7588 time to create 1 rle with old method : 0.008446931838989258 create new chi : 0.35997700691223145 time to delete rle : 0.0003223419189453125 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 1880 TO DO : save crop sub photo not yet done ! save time : 0.1543588638305664 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.041528940200805664 time for calcul the mask position with numpy : 0.07882070541381836 nb_pixel_total : 1953701 time to create 1 rle with new method : 0.27100467681884766 time for calcul the mask position with numpy : 0.006152153015136719 nb_pixel_total : 4740 time to create 1 rle with old method : 0.005209207534790039 time for calcul the mask position with numpy : 0.0072476863861083984 nb_pixel_total : 115159 time to create 1 rle with old method : 0.12323522567749023 create new chi : 0.5000030994415283 time to delete rle : 0.0004055500030517578 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 2672 TO DO : save crop sub photo not yet done ! save time : 0.20174551010131836 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.2621791362762451 time for calcul the mask position with numpy : 0.05714273452758789 nb_pixel_total : 1359547 time to create 1 rle with new method : 0.7082846164703369 time for calcul the mask position with numpy : 0.011159658432006836 nb_pixel_total : 709816 time to create 1 rle with new method : 0.1859450340270996 time for calcul the mask position with numpy : 0.010199308395385742 nb_pixel_total : 4237 time to create 1 rle with old method : 0.0046596527099609375 create new chi : 0.9866354465484619 time to delete rle : 0.0006153583526611328 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 3036 TO DO : save crop sub photo not yet done ! save time : 0.2320108413696289 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.3256216049194336 time for calcul the mask position with numpy : 0.17302918434143066 nb_pixel_total : 1947268 time to create 1 rle with new method : 0.10308456420898438 time for calcul the mask position with numpy : 0.005908489227294922 nb_pixel_total : 1537 time to create 1 rle with old method : 0.001756429672241211 time for calcul the mask position with numpy : 0.005960941314697266 nb_pixel_total : 369 time to create 1 rle with old method : 0.0005283355712890625 time for calcul the mask position with numpy : 0.006517171859741211 nb_pixel_total : 114811 time to create 1 rle with old method : 0.12409305572509766 time for calcul the mask position with numpy : 0.006552457809448242 nb_pixel_total : 9615 time to create 1 rle with old method : 0.010459423065185547 create new chi : 0.4483048915863037 time to delete rle : 0.0004229545593261719 batch 1 Loaded 9 chid ids of type : 3594 +++++Number RLEs to save : 2530 TO DO : save crop sub photo not yet done ! save time : 0.19495201110839844 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.04908871650695801 time for calcul the mask position with numpy : 0.01964545249938965 nb_pixel_total : 2044484 time to create 1 rle with new method : 0.19484615325927734 time for calcul the mask position with numpy : 0.005879640579223633 nb_pixel_total : 12951 time to create 1 rle with old method : 0.014045476913452148 time for calcul the mask position with numpy : 0.0061740875244140625 nb_pixel_total : 12754 time to create 1 rle with old method : 0.01367640495300293 time for calcul the mask position with numpy : 0.0060236454010009766 nb_pixel_total : 3411 time to create 1 rle with old method : 0.0038950443267822266 create new chi : 0.2708756923675537 time to delete rle : 0.0003139972686767578 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 1798 TO DO : save crop sub photo not yet done ! save time : 0.1375875473022461 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.23348784446716309 time for calcul the mask position with numpy : 0.14879727363586426 nb_pixel_total : 2015186 time to create 1 rle with new method : 0.09742498397827148 time for calcul the mask position with numpy : 0.005986928939819336 nb_pixel_total : 23156 time to create 1 rle with old method : 0.024039030075073242 time for calcul the mask position with numpy : 0.005673885345458984 nb_pixel_total : 24279 time to create 1 rle with old method : 0.025161266326904297 time for calcul the mask position with numpy : 0.005778074264526367 nb_pixel_total : 10979 time to create 1 rle with old method : 0.011680126190185547 create new chi : 0.33451318740844727 time to delete rle : 0.00034356117248535156 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 2176 TO DO : save crop sub photo not yet done ! save time : 0.1833789348602295 nb_obj : 5 nb_hashtags : 3 time to prepare the origin masks : 0.23660731315612793 time for calcul the mask position with numpy : 0.18593096733093262 nb_pixel_total : 1956580 time to create 1 rle with new method : 0.2535724639892578 time for calcul the mask position with numpy : 0.006593227386474609 nb_pixel_total : 17306 time to create 1 rle with old method : 0.020053386688232422 time for calcul the mask position with numpy : 0.007010459899902344 nb_pixel_total : 6439 time to create 1 rle with old method : 0.007266521453857422 time for calcul the mask position with numpy : 0.006333827972412109 nb_pixel_total : 33538 time to create 1 rle with old method : 0.03791022300720215 time for calcul the mask position with numpy : 0.006940603256225586 nb_pixel_total : 56843 time to create 1 rle with old method : 0.06267523765563965 time for calcul the mask position with numpy : 0.006616830825805664 nb_pixel_total : 2894 time to create 1 rle with old method : 0.0032727718353271484 create new chi : 0.6139800548553467 time to delete rle : 0.0005326271057128906 batch 1 Loaded 11 chid ids of type : 3594 ++++++Number RLEs to save : 3078 TO DO : save crop sub photo not yet done ! save time : 0.23368597030639648 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.04541349411010742 time for calcul the mask position with numpy : 0.020354509353637695 nb_pixel_total : 2047075 time to create 1 rle with new method : 0.06341958045959473 time for calcul the mask position with numpy : 0.005966901779174805 nb_pixel_total : 13932 time to create 1 rle with old method : 0.015144586563110352 time for calcul the mask position with numpy : 0.006247282028198242 nb_pixel_total : 12593 time to create 1 rle with old method : 0.014026641845703125 create new chi : 0.12544822692871094 time to delete rle : 0.00029397010803222656 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 1808 TO DO : save crop sub photo not yet done ! save time : 0.1435699462890625 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.31479763984680176 time for calcul the mask position with numpy : 0.16536355018615723 nb_pixel_total : 1349080 time to create 1 rle with new method : 0.10475707054138184 time for calcul the mask position with numpy : 0.01112675666809082 nb_pixel_total : 711590 time to create 1 rle with new method : 0.1945183277130127 time for calcul the mask position with numpy : 0.006318807601928711 nb_pixel_total : 2197 time to create 1 rle with old method : 0.002518892288208008 time for calcul the mask position with numpy : 0.0071446895599365234 nb_pixel_total : 10733 time to create 1 rle with old method : 0.012128353118896484 create new chi : 0.5137939453125 time to delete rle : 0.0007762908935546875 batch 1 Loaded 7 chid ids of type : 3594 ++++++Number RLEs to save : 3523 TO DO : save crop sub photo not yet done ! save time : 0.253415584564209 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.05182337760925293 time for calcul the mask position with numpy : 0.020430326461791992 nb_pixel_total : 1969628 time to create 1 rle with new method : 0.20881104469299316 time for calcul the mask position with numpy : 0.006816387176513672 nb_pixel_total : 86428 time to create 1 rle with old method : 0.09525156021118164 time for calcul the mask position with numpy : 0.006094694137573242 nb_pixel_total : 12826 time to create 1 rle with old method : 0.014172792434692383 time for calcul the mask position with numpy : 0.0057103633880615234 nb_pixel_total : 4718 time to create 1 rle with old method : 0.005334377288818359 create new chi : 0.3723185062408447 time to delete rle : 0.0004913806915283203 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 2500 TO DO : save crop sub photo not yet done ! save time : 0.18322992324829102 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.1168968677520752 time for calcul the mask position with numpy : 0.08967876434326172 nb_pixel_total : 1224310 time to create 1 rle with new method : 0.09626889228820801 time for calcul the mask position with numpy : 0.010988712310791016 nb_pixel_total : 719122 time to create 1 rle with new method : 0.2188282012939453 time for calcul the mask position with numpy : 0.006289482116699219 nb_pixel_total : 11812 time to create 1 rle with old method : 0.013002157211303711 time for calcul the mask position with numpy : 0.005999326705932617 nb_pixel_total : 13190 time to create 1 rle with old method : 0.014362812042236328 time for calcul the mask position with numpy : 0.006743907928466797 nb_pixel_total : 105166 time to create 1 rle with old method : 0.11239361763000488 create new chi : 0.5860843658447266 time to delete rle : 0.0007097721099853516 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 4708 TO DO : save crop sub photo not yet done ! save time : 0.3118422031402588 nb_obj : 2 nb_hashtags : 2 time to prepare the origin masks : 0.041979074478149414 time for calcul the mask position with numpy : 0.015224695205688477 nb_pixel_total : 1384230 time to create 1 rle with new method : 0.030424833297729492 time for calcul the mask position with numpy : 0.010425567626953125 nb_pixel_total : 679481 time to create 1 rle with new method : 0.09494853019714355 time for calcul the mask position with numpy : 0.005947589874267578 nb_pixel_total : 9889 time to create 1 rle with old method : 0.010913610458374023 create new chi : 0.18664979934692383 time to delete rle : 0.0005502700805664062 batch 1 Loaded 5 chid ids of type : 3594 ++Number RLEs to save : 3320 TO DO : save crop sub photo not yet done ! save time : 0.22896385192871094 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.033220529556274414 time for calcul the mask position with numpy : 0.019755125045776367 nb_pixel_total : 2055773 time to create 1 rle with new method : 0.23107051849365234 time for calcul the mask position with numpy : 0.006316184997558594 nb_pixel_total : 17827 time to create 1 rle with old method : 0.019343137741088867 create new chi : 0.2838706970214844 time to delete rle : 0.000263214111328125 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1490 TO DO : save crop sub photo not yet done ! save time : 0.13478755950927734 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.2699153423309326 time for calcul the mask position with numpy : 0.05514407157897949 nb_pixel_total : 1399235 time to create 1 rle with new method : 0.12703633308410645 time for calcul the mask position with numpy : 0.010953664779663086 nb_pixel_total : 631309 time to create 1 rle with new method : 0.09515810012817383 time for calcul the mask position with numpy : 0.006174325942993164 nb_pixel_total : 4803 time to create 1 rle with old method : 0.005407810211181641 time for calcul the mask position with numpy : 0.006102800369262695 nb_pixel_total : 3338 time to create 1 rle with old method : 0.003676891326904297 time for calcul the mask position with numpy : 0.0061719417572021484 nb_pixel_total : 34915 time to create 1 rle with old method : 0.03854560852050781 create new chi : 0.36466217041015625 time to delete rle : 0.0007114410400390625 batch 1 Loaded 9 chid ids of type : 3594 +++++++++Number RLEs to save : 4260 TO DO : save crop sub photo not yet done ! save time : 0.28904104232788086 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.054189443588256836 time for calcul the mask position with numpy : 0.02024102210998535 nb_pixel_total : 1986339 time to create 1 rle with new method : 0.2195119857788086 time for calcul the mask position with numpy : 0.006132841110229492 nb_pixel_total : 3418 time to create 1 rle with old method : 0.0038137435913085938 time for calcul the mask position with numpy : 0.006134748458862305 nb_pixel_total : 15398 time to create 1 rle with old method : 0.017114639282226562 time for calcul the mask position with numpy : 0.0064966678619384766 nb_pixel_total : 68445 time to create 1 rle with old method : 0.07500219345092773 create new chi : 0.3631401062011719 time to delete rle : 0.00038743019104003906 batch 1 Loaded 7 chid ids of type : 3594 +++++Number RLEs to save : 2222 TO DO : save crop sub photo not yet done ! save time : 0.17104673385620117 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.045890092849731445 time for calcul the mask position with numpy : 0.019536972045898438 nb_pixel_total : 1949652 time to create 1 rle with new method : 0.23112964630126953 time for calcul the mask position with numpy : 0.006575584411621094 nb_pixel_total : 15837 time to create 1 rle with old method : 0.01724529266357422 time for calcul the mask position with numpy : 0.00645756721496582 nb_pixel_total : 108111 time to create 1 rle with old method : 0.11676645278930664 create new chi : 0.4061548709869385 time to delete rle : 0.0003936290740966797 batch 1 Loaded 5 chid ids of type : 3594 +++Number RLEs to save : 2472 TO DO : save crop sub photo not yet done ! save time : 0.18262767791748047 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.08440494537353516 time for calcul the mask position with numpy : 0.2085576057434082 nb_pixel_total : 1912914 time to create 1 rle with new method : 0.0994727611541748 time for calcul the mask position with numpy : 0.006623268127441406 nb_pixel_total : 58442 time to create 1 rle with old method : 0.06372213363647461 time for calcul the mask position with numpy : 0.006650686264038086 nb_pixel_total : 96588 time to create 1 rle with old method : 0.10479044914245605 time for calcul the mask position with numpy : 0.006494998931884766 nb_pixel_total : 5656 time to create 1 rle with old method : 0.006333351135253906 create new chi : 0.5136528015136719 time to delete rle : 0.0005831718444824219 batch 1 Loaded 7 chid ids of type : 3594 +++++++Number RLEs to save : 3268 TO DO : save crop sub photo not yet done ! save time : 0.2397904396057129 No data in photo_id : 1374576443 nb_obj : 8 nb_hashtags : 3 time to prepare the origin masks : 0.30776238441467285 time for calcul the mask position with numpy : 0.6293184757232666 nb_pixel_total : 1342586 time to create 1 rle with new method : 0.10550856590270996 time for calcul the mask position with numpy : 0.006422758102416992 nb_pixel_total : 6872 time to create 1 rle with old method : 0.007315635681152344 time for calcul the mask position with numpy : 0.006352901458740234 nb_pixel_total : 10885 time to create 1 rle with old method : 0.011939764022827148 time for calcul the mask position with numpy : 0.007051944732666016 nb_pixel_total : 114851 time to create 1 rle with old method : 0.14977717399597168 time for calcul the mask position with numpy : 0.006930828094482422 nb_pixel_total : 27635 time to create 1 rle with old method : 0.033448219299316406 time for calcul the mask position with numpy : 0.011655807495117188 nb_pixel_total : 530269 time to create 1 rle with new method : 0.13801145553588867 time for calcul the mask position with numpy : 0.006568193435668945 nb_pixel_total : 13263 time to create 1 rle with old method : 0.014806985855102539 time for calcul the mask position with numpy : 0.006168842315673828 nb_pixel_total : 14923 time to create 1 rle with old method : 0.016530513763427734 time for calcul the mask position with numpy : 0.006582736968994141 nb_pixel_total : 12316 time to create 1 rle with old method : 0.013390064239501953 create new chi : 1.1881206035614014 time to delete rle : 0.0009918212890625 batch 1 Loaded 17 chid ids of type : 3594 +++++++++++++Number RLEs to save : 6198 TO DO : save crop sub photo not yet done ! save time : 0.397064208984375 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.061234235763549805 time for calcul the mask position with numpy : 0.01690983772277832 nb_pixel_total : 1723218 time to create 1 rle with new method : 0.0283968448638916 time for calcul the mask position with numpy : 0.00809478759765625 nb_pixel_total : 245355 time to create 1 rle with new method : 0.02717876434326172 time for calcul the mask position with numpy : 0.006173610687255859 nb_pixel_total : 95917 time to create 1 rle with old method : 0.10364127159118652 time for calcul the mask position with numpy : 0.005990028381347656 nb_pixel_total : 4523 time to create 1 rle with old method : 0.005709171295166016 time for calcul the mask position with numpy : 0.006921052932739258 nb_pixel_total : 4587 time to create 1 rle with old method : 0.004915475845336914 create new chi : 0.22121262550354004 time to delete rle : 0.0006387233734130859 batch 1 Loaded 9 chid ids of type : 3594 ++++++++Number RLEs to save : 3784 TO DO : save crop sub photo not yet done ! save time : 0.26207876205444336 nb_obj : 5 nb_hashtags : 2 time to prepare the origin masks : 0.0796363353729248 time for calcul the mask position with numpy : 0.020357370376586914 nb_pixel_total : 1939928 time to create 1 rle with new method : 0.21011853218078613 time for calcul the mask position with numpy : 0.006350278854370117 nb_pixel_total : 2460 time to create 1 rle with old method : 0.0028085708618164062 time for calcul the mask position with numpy : 0.006556987762451172 nb_pixel_total : 13866 time to create 1 rle with old method : 0.015448570251464844 time for calcul the mask position with numpy : 0.006565570831298828 nb_pixel_total : 18389 time to create 1 rle with old method : 0.024242877960205078 time for calcul the mask position with numpy : 0.011207103729248047 nb_pixel_total : 62786 time to create 1 rle with old method : 0.06879377365112305 time for calcul the mask position with numpy : 0.009645462036132812 nb_pixel_total : 36171 time to create 1 rle with old method : 0.039656639099121094 create new chi : 0.4293515682220459 time to delete rle : 0.00040268898010253906 batch 1 Loaded 11 chid ids of type : 3594 +++++Number RLEs to save : 2774 TO DO : save crop sub photo not yet done ! save time : 0.21114540100097656 map_output_result : {1374577059: (0.0, 'Should be the crop_list due to order', 0), 1374577055: (0.0, 'Should be the crop_list due to order', 0), 1374576883: (0.0, 'Should be the crop_list due to order', 0), 1374576881: (0.0, 'Should be the crop_list due to order', 0), 1374576879: (0.0, 'Should be the crop_list due to order', 0), 1374576877: (0.0, 'Should be the crop_list due to order', 0), 1374576875: (0.0, 'Should be the crop_list due to order', 0), 1374576791: (0.0, 'Should be the crop_list due to order', 0), 1374576790: (0.0, 'Should be the crop_list due to order', 0), 1374576776: (0.0, 'Should be the crop_list due to order', 0), 1374576753: (0.0, 'Should be the crop_list due to order', 0), 1374576745: (0.0, 'Should be the crop_list due to order', 0), 1374576649: (0.0, 'Should be the crop_list due to order', 0), 1374576624: (0.0, 'Should be the crop_list due to order', 0), 1374576599: (0.0, 'Should be the crop_list due to order', 0), 1374576575: (0.0, 'Should be the crop_list due to order', 0), 1374576486: (0.0, 'Should be the crop_list due to order', 0), 1374576484: (0.0, 'Should be the crop_list due to order', 0), 1374576481: (0.0, 'Should be the crop_list due to order', 0), 1374576477: (0.0, 'Should be the crop_list due to order', 0), 1374576474: (0.0, 'Should be the crop_list due to order', 0), 1374576464: (0.0, 'Should be the crop_list due to order', 0), 1374576443: (0.0, 'Should be the crop_list due to order', 0.0), 1374576417: (0.0, 'Should be the crop_list due to order', 0), 1374576400: (0.0, 'Should be the crop_list due to order', 0), 1374576398: (0.0, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1374577059, 1374577055, 1374576883, 1374576881, 1374576879, 1374576877, 1374576875, 1374576791, 1374576790, 1374576776, 1374576753, 1374576745, 1374576649, 1374576624, 1374576599, 1374576575, 1374576486, 1374576484, 1374576481, 1374576477, 1374576474, 1374576464, 1374576443, 1374576417, 1374576400, 1374576398] Looping around the photos to save general results len do output : 26 /1374577059.Didn't retrieve data . /1374577055.Didn't retrieve data . /1374576883.Didn't retrieve data . /1374576881.Didn't retrieve data . /1374576879.Didn't retrieve data . /1374576877.Didn't retrieve data . /1374576875.Didn't retrieve data . /1374576791.Didn't retrieve data . /1374576790.Didn't retrieve data . /1374576776.Didn't retrieve data . /1374576753.Didn't retrieve data . /1374576745.Didn't retrieve data . /1374576649.Didn't retrieve data . /1374576624.Didn't retrieve data . /1374576599.Didn't retrieve data . /1374576575.Didn't retrieve data . /1374576486.Didn't retrieve data . /1374576484.Didn't retrieve data . /1374576481.Didn't retrieve data . /1374576477.Didn't retrieve data . /1374576474.Didn't retrieve data . /1374576464.Didn't retrieve data . /1374576443.Didn't retrieve data . /1374576417.Didn't retrieve data . /1374576400.Didn't retrieve data . /1374576398.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577059', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577055', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576883', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576881', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576879', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576877', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576875', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576791', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576790', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576776', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576753', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576745', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576649', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576624', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576599', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576575', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576486', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576484', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576481', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576477', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576474', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576464', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576443', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576417', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576400', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576398', None, None, None, None, None, '3410775') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 78 time used for this insertion : 0.018734455108642578 save_final save missing photos in datou_result : time spend for datou_step_exec : 21.90248703956604 time spend to save output : 0.019539356231689453 total time spend for step 3 : 21.92202639579773 step4:ventilate_hashtags_in_portfolio Thu Jul 31 14:12:23 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 25543287 get user id for portfolio 25543287 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543287 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pet_clair','pehd','background','environnement','papier','carton','mal_croppe','metal','pet_fonce','autre','flou')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543287 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pet_clair','pehd','background','environnement','papier','carton','mal_croppe','metal','pet_fonce','autre','flou')) AND mptpi.`min_score`=0.5 To do To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543287 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('pet_clair','pehd','background','environnement','papier','carton','mal_croppe','metal','pet_fonce','autre','flou')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://www.fotonower.com/velours/25543597,25543598,25543599,25543600,25543601,25543602,25543603,25543604,25543605,25543606,25543607?tags=pet_clair,pehd,background,environnement,papier,carton,mal_croppe,metal,pet_fonce,autre,flou Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1374577059, 1374577055, 1374576883, 1374576881, 1374576879, 1374576877, 1374576875, 1374576791, 1374576790, 1374576776, 1374576753, 1374576745, 1374576649, 1374576624, 1374576599, 1374576575, 1374576486, 1374576484, 1374576481, 1374576477, 1374576474, 1374576464, 1374576443, 1374576417, 1374576400, 1374576398] Looping around the photos to save general results len do output : 1 /25543287. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577059', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577055', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576883', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576881', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576879', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576877', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576875', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576791', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576790', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576776', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576753', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576745', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576649', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576624', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576599', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576575', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576486', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576484', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576481', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576477', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576474', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576464', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576443', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576417', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576400', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576398', None, None, None, None, None, '3410775') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 27 time used for this insertion : 0.01738905906677246 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.7330248355865479 time spend to save output : 0.017733097076416016 total time spend for step 4 : 0.7507579326629639 step5:final Thu Jul 31 14:12:24 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1374577059: ('0.11439766589506177',), 1374577055: ('0.11439766589506177',), 1374576883: ('0.11439766589506177',), 1374576881: ('0.11439766589506177',), 1374576879: ('0.11439766589506177',), 1374576877: ('0.11439766589506177',), 1374576875: ('0.11439766589506177',), 1374576791: ('0.11439766589506177',), 1374576790: ('0.11439766589506177',), 1374576776: ('0.11439766589506177',), 1374576753: ('0.11439766589506177',), 1374576745: ('0.11439766589506177',), 1374576649: ('0.11439766589506177',), 1374576624: ('0.11439766589506177',), 1374576599: ('0.11439766589506177',), 1374576575: ('0.11439766589506177',), 1374576486: ('0.11439766589506177',), 1374576484: ('0.11439766589506177',), 1374576481: ('0.11439766589506177',), 1374576477: ('0.11439766589506177',), 1374576474: ('0.11439766589506177',), 1374576464: ('0.11439766589506177',), 1374576443: ('0.11439766589506177',), 1374576417: ('0.11439766589506177',), 1374576400: ('0.11439766589506177',), 1374576398: ('0.11439766589506177',)} new output for save of step final : {1374577059: ('0.11439766589506177',), 1374577055: ('0.11439766589506177',), 1374576883: ('0.11439766589506177',), 1374576881: ('0.11439766589506177',), 1374576879: ('0.11439766589506177',), 1374576877: ('0.11439766589506177',), 1374576875: ('0.11439766589506177',), 1374576791: ('0.11439766589506177',), 1374576790: ('0.11439766589506177',), 1374576776: ('0.11439766589506177',), 1374576753: ('0.11439766589506177',), 1374576745: ('0.11439766589506177',), 1374576649: ('0.11439766589506177',), 1374576624: ('0.11439766589506177',), 1374576599: ('0.11439766589506177',), 1374576575: ('0.11439766589506177',), 1374576486: ('0.11439766589506177',), 1374576484: ('0.11439766589506177',), 1374576481: ('0.11439766589506177',), 1374576477: ('0.11439766589506177',), 1374576474: ('0.11439766589506177',), 1374576464: ('0.11439766589506177',), 1374576443: ('0.11439766589506177',), 1374576417: ('0.11439766589506177',), 1374576400: ('0.11439766589506177',), 1374576398: ('0.11439766589506177',)} [1374577059, 1374577055, 1374576883, 1374576881, 1374576879, 1374576877, 1374576875, 1374576791, 1374576790, 1374576776, 1374576753, 1374576745, 1374576649, 1374576624, 1374576599, 1374576575, 1374576486, 1374576484, 1374576481, 1374576477, 1374576474, 1374576464, 1374576443, 1374576417, 1374576400, 1374576398] Looping around the photos to save general results len do output : 26 /1374577059.Didn't retrieve data . /1374577055.Didn't retrieve data . /1374576883.Didn't retrieve data . /1374576881.Didn't retrieve data . /1374576879.Didn't retrieve data . /1374576877.Didn't retrieve data . /1374576875.Didn't retrieve data . /1374576791.Didn't retrieve data . /1374576790.Didn't retrieve data . /1374576776.Didn't retrieve data . /1374576753.Didn't retrieve data . /1374576745.Didn't retrieve data . /1374576649.Didn't retrieve data . /1374576624.Didn't retrieve data . /1374576599.Didn't retrieve data . /1374576575.Didn't retrieve data . /1374576486.Didn't retrieve data . /1374576484.Didn't retrieve data . /1374576481.Didn't retrieve data . /1374576477.Didn't retrieve data . /1374576474.Didn't retrieve data . /1374576464.Didn't retrieve data . /1374576443.Didn't retrieve data . /1374576417.Didn't retrieve data . /1374576400.Didn't retrieve data . /1374576398.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577059', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577055', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576883', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576881', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576879', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576877', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576875', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576791', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576790', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576776', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576753', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576745', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576649', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576624', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576599', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576575', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576486', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576484', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576481', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576477', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576474', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576464', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576443', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576417', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576400', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576398', None, None, None, None, None, '3410775') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 78 time used for this insertion : 0.019618749618530273 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.15574121475219727 time spend to save output : 0.020851612091064453 total time spend for step 5 : 0.17659282684326172 step6:blur_detection Thu Jul 31 14:12:24 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8.jpg resize: (1080, 1920) 1374577059 -0.668748884929567 treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4.jpg resize: (1080, 1920) 1374577055 -0.8875702863511941 treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591.jpg resize: (1080, 1920) 1374576883 -2.3897046307170826 treat image : temp/1753963829_2691908_1374576881_4d3a62ba7a1a628d3e860dec38ca5671.jpg resize: (1080, 1920) 1374576881 0.05666185780212569 treat image : temp/1753963829_2691908_1374576879_bbb44a7d9390ec399d139c928a861fc0.jpg resize: (1080, 1920) 1374576879 -0.9090831879731828 treat image : temp/1753963829_2691908_1374576877_c68b7bed87d64bd7dadf66bab310c1cc.jpg resize: (1080, 1920) 1374576877 -0.5715592409635858 treat image : temp/1753963829_2691908_1374576875_53c413659746278c39e82901b42c66ab.jpg resize: (1080, 1920) 1374576875 -1.2287891362584393 treat image : temp/1753963829_2691908_1374576791_1b3e4e139b7f1dc16fcd716488cb63c2.jpg resize: (1080, 1920) 1374576791 1.7463540521998426 treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886.jpg resize: (1080, 1920) 1374576790 -3.7309603889347724 treat image : temp/1753963829_2691908_1374576776_05e5823925967bd340731b44c370f4a5.jpg resize: (1080, 1920) 1374576776 -0.872537471224047 treat image : temp/1753963829_2691908_1374576753_6e277dd75633f124cf37d1c0261b25d2.jpg resize: (1080, 1920) 1374576753 -0.06608934769598261 treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47.jpg resize: (1080, 1920) 1374576745 -0.6361360911957404 treat image : temp/1753963829_2691908_1374576649_f4865b42e30d5b33c01dc1a676100a23.jpg resize: (1080, 1920) 1374576649 -0.33214943287361043 treat image : temp/1753963829_2691908_1374576624_2acb4d0319b1c34e9e243bb8a2402473.jpg resize: (1080, 1920) 1374576624 -0.07853950588384163 treat image : temp/1753963829_2691908_1374576599_acd946c91762a37f70117be6f8c1b742.jpg resize: (1080, 1920) 1374576599 0.7774980242015209 treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5.jpg resize: (1080, 1920) 1374576575 -3.6410676321623123 treat image : temp/1753963829_2691908_1374576486_c7ba33309f65cb7e0a202b1444ef9855.jpg resize: (1080, 1920) 1374576486 0.6052999194067544 treat image : temp/1753963829_2691908_1374576484_2982565d6fb701982b8963d9ad1dea84.jpg resize: (1080, 1920) 1374576484 0.6465153801855525 treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217.jpg resize: (1080, 1920) 1374576481 1.2790905637492123 treat image : temp/1753963829_2691908_1374576477_28d84030495d1e5785e29b0fdaacaa6b.jpg resize: (1080, 1920) 1374576477 -2.738722371493155 treat image : temp/1753963829_2691908_1374576474_09cac672aedccb1b959b2302188b0b5c.jpg resize: (1080, 1920) 1374576474 -1.495705739590426 treat image : temp/1753963829_2691908_1374576464_c8a1e26feb4092b1ea338570119a1a59.jpg resize: (1080, 1920) 1374576464 -4.318182249217837 treat image : temp/1753963829_2691908_1374576443_5970f261292e735a2620db3cb635036b.jpg resize: (1080, 1920) 1374576443 0.21185228699258463 treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f.jpg resize: (1080, 1920) 1374576417 -0.5908094747585899 treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5.jpg resize: (1080, 1920) 1374576400 -0.4101476329128805 treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356.jpg resize: (1080, 1920) 1374576398 -3.2346718112941804 treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8_rle_crop_3899283589_0.png resize: (166, 106) 1374598282 -0.15476691329136277 treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8_rle_crop_3899283590_0.png resize: (188, 104) 1374598283 -0.4754927574861158 treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283591_0.png resize: (148, 120) 1374598284 -0.8838912454862191 treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283592_0.png resize: (211, 161) 1374598285 -1.7403277746805161 treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283593_0.png resize: (114, 117) 1374598286 -1.7756691787554149 treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591_rle_crop_3899283596_0.png resize: (108, 86) 1374598287 0.4305420878087323 treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591_rle_crop_3899283598_0.png resize: (173, 106) 1374598288 -1.6389198434061327 treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591_rle_crop_3899283599_0.png resize: (37, 75) 1374598289 -1.4396387246623488 treat image : temp/1753963829_2691908_1374576881_4d3a62ba7a1a628d3e860dec38ca5671_rle_crop_3899283601_0.png resize: (188, 101) 1374598290 -0.4034016580360539 treat image : temp/1753963829_2691908_1374576879_bbb44a7d9390ec399d139c928a861fc0_rle_crop_3899283603_0.png resize: (62, 98) 1374598291 -1.2490475170171056 treat image : temp/1753963829_2691908_1374576877_c68b7bed87d64bd7dadf66bab310c1cc_rle_crop_3899283605_0.png resize: (104, 96) 1374598292 -0.7569591124422929 treat image : temp/1753963829_2691908_1374576877_c68b7bed87d64bd7dadf66bab310c1cc_rle_crop_3899283607_0.png resize: (130, 129) 1374598293 -1.8631644249895591 treat image : temp/1753963829_2691908_1374576791_1b3e4e139b7f1dc16fcd716488cb63c2_rle_crop_3899283610_0.png resize: (67, 124) 1374598294 -2.5602113884833977 treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886_rle_crop_3899283612_0.png resize: (95, 126) 1374598296 -0.42018606514005946 treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886_rle_crop_3899283615_0.png resize: (38, 53) 1374598297 -0.7600287291487964 treat image : temp/1753963829_2691908_1374576776_05e5823925967bd340731b44c370f4a5_rle_crop_3899283617_0.png resize: (188, 112) 1374598298 -0.38710910675054006 treat image : temp/1753963829_2691908_1374576776_05e5823925967bd340731b44c370f4a5_rle_crop_3899283618_0.png resize: (100, 187) 1374598299 -1.6842753946769016 treat image : temp/1753963829_2691908_1374576753_6e277dd75633f124cf37d1c0261b25d2_rle_crop_3899283619_0.png resize: (185, 98) 1374598300 -0.5840677534100356 treat image : temp/1753963829_2691908_1374576753_6e277dd75633f124cf37d1c0261b25d2_rle_crop_3899283620_0.png resize: (225, 170) 1374598301 -0.8487432618403622 treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283624_0.png resize: (256, 329) 1374598302 -1.7050993782167638 treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283625_0.png resize: (183, 67) 1374598303 -2.7913621653076284 treat image : temp/1753963829_2691908_1374576649_f4865b42e30d5b33c01dc1a676100a23_rle_crop_3899283627_0.png resize: (190, 122) 1374598304 -0.4595838419949352 treat image : temp/1753963829_2691908_1374576649_f4865b42e30d5b33c01dc1a676100a23_rle_crop_3899283628_0.png resize: (174, 128) 1374598305 -1.290851011592349 treat image : temp/1753963829_2691908_1374576624_2acb4d0319b1c34e9e243bb8a2402473_rle_crop_3899283629_0.png resize: (177, 113) 1374598306 -0.5687030557866591 treat image : temp/1753963829_2691908_1374576624_2acb4d0319b1c34e9e243bb8a2402473_rle_crop_3899283630_0.png resize: (47, 64) 1374598307 0.2923866598571052 treat image : temp/1753963829_2691908_1374576599_acd946c91762a37f70117be6f8c1b742_rle_crop_3899283632_0.png resize: (104, 108) 1374598308 -1.8247579794549509 treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5_rle_crop_3899283637_0.png resize: (181, 111) 1374598309 -0.531884201337551 treat image : temp/1753963829_2691908_1374576486_c7ba33309f65cb7e0a202b1444ef9855_rle_crop_3899283639_0.png resize: (165, 128) 1374598310 -1.8062533518909716 treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217_rle_crop_3899283643_0.png resize: (93, 55) 1374598311 -0.994871391861854 treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217_rle_crop_3899283644_0.png resize: (114, 66) 1374598312 -0.584029547525896 treat image : temp/1753963829_2691908_1374576477_28d84030495d1e5785e29b0fdaacaa6b_rle_crop_3899283646_0.png resize: (320, 311) 1374598313 -0.2706252623270653 treat image : temp/1753963829_2691908_1374576477_28d84030495d1e5785e29b0fdaacaa6b_rle_crop_3899283647_0.png resize: (190, 109) 1374598314 -1.11961453156699 treat image : temp/1753963829_2691908_1374576464_c8a1e26feb4092b1ea338570119a1a59_rle_crop_3899283651_0.png resize: (142, 89) 1374598315 -4.0826304227302606 treat image : temp/1753963829_2691908_1374576464_c8a1e26feb4092b1ea338570119a1a59_rle_crop_3899283653_0.png resize: (328, 219) 1374598316 -4.094545497277228 treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283654_0.png resize: (117, 174) 1374598317 -0.8809968672786459 treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283661_0.png resize: (165, 96) 1374598318 -2.2416694407091455 treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5_rle_crop_3899283662_0.png resize: (83, 74) 1374598319 -0.20029546953584756 treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5_rle_crop_3899283663_0.png resize: (84, 84) 1374598320 -1.7589756167725747 treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283667_0.png resize: (251, 307) 1374598321 -4.335133502844506 treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283668_0.png resize: (130, 214) 1374598322 -2.4738706546331604 treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283670_0.png resize: (69, 66) 1374598323 -3.2261728801089204 treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283594_0.png resize: (164, 160) 1374598325 -2.0754315969150663 treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591_rle_crop_3899283597_0.png resize: (143, 88) 1374598326 0.19262618481885116 treat image : temp/1753963829_2691908_1374576877_c68b7bed87d64bd7dadf66bab310c1cc_rle_crop_3899283606_0.png resize: (166, 104) 1374598327 -0.49486766284720923 treat image : temp/1753963829_2691908_1374576875_53c413659746278c39e82901b42c66ab_rle_crop_3899283609_0.png resize: (551, 351) 1374598328 0.3395462624795028 treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886_rle_crop_3899283614_0.png resize: (95, 131) 1374598329 -0.20118791157495342 treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283622_0.png resize: (88, 47) 1374598330 -0.8924759650924128 treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283626_0.png resize: (194, 122) 1374598331 -0.9116295988826078 treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5_rle_crop_3899283636_0.png resize: (160, 103) 1374598332 -0.6764954565764111 treat image : temp/1753963829_2691908_1374576484_2982565d6fb701982b8963d9ad1dea84_rle_crop_3899283641_0.png resize: (204, 136) 1374598333 -1.391432924735568 treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283655_0.png resize: (149, 154) 1374598334 -2.7454089956488508 treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283666_0.png resize: (189, 239) 1374598335 -0.7254203629330851 treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283669_0.png resize: (165, 117) 1374598336 -1.2922885644128623 treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8_rle_crop_3899283587_0.png resize: (307, 207) 1374598362 -0.7488719424887421 treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8_rle_crop_3899283588_0.png resize: (527, 294) 1374598363 -0.7344175959626541 treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283595_0.png resize: (138, 101) 1374598364 -3.1827772578748768 treat image : temp/1753963829_2691908_1374576881_4d3a62ba7a1a628d3e860dec38ca5671_rle_crop_3899283600_0.png resize: (160, 171) 1374598365 -2.368011021318574 treat image : temp/1753963829_2691908_1374576879_bbb44a7d9390ec399d139c928a861fc0_rle_crop_3899283602_0.png resize: (43, 82) 1374598366 -0.6488554929464063 treat image : temp/1753963829_2691908_1374576875_53c413659746278c39e82901b42c66ab_rle_crop_3899283608_0.png resize: (565, 339) 1374598367 0.2953913356826206 treat image : temp/1753963829_2691908_1374576791_1b3e4e139b7f1dc16fcd716488cb63c2_rle_crop_3899283611_0.png resize: (908, 1002) 1374598368 0.427091322852127 treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886_rle_crop_3899283613_0.png resize: (543, 337) 1374598369 0.32459894512349335 treat image : temp/1753963829_2691908_1374576753_6e277dd75633f124cf37d1c0261b25d2_rle_crop_3899283621_0.png resize: (135, 239) 1374598370 -0.6946582560137552 treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283623_0.png resize: (181, 503) 1374598371 -1.5761666628038296 treat image : temp/1753963829_2691908_1374576624_2acb4d0319b1c34e9e243bb8a2402473_rle_crop_3899283631_0.png resize: (962, 1015) 1374598372 -0.012764598329202567 treat image : temp/1753963829_2691908_1374576599_acd946c91762a37f70117be6f8c1b742_rle_crop_3899283633_0.png resize: (119, 133) 1374598373 -0.4126500176427422 treat image : temp/1753963829_2691908_1374576599_acd946c91762a37f70117be6f8c1b742_rle_crop_3899283634_0.png resize: (479, 300) 1374598374 -0.480806758716234 treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5_rle_crop_3899283635_0.png resize: (550, 302) 1374598375 0.44285333612260114 treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5_rle_crop_3899283638_0.png resize: (936, 1088) 1374598376 -0.5751808679650124 treat image : temp/1753963829_2691908_1374576486_c7ba33309f65cb7e0a202b1444ef9855_rle_crop_3899283640_0.png resize: (936, 920) 1374598377 0.3365219507550894 treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217_rle_crop_3899283642_0.png resize: (194, 238) 1374598378 -0.3038402590288333 treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217_rle_crop_3899283645_0.png resize: (943, 938) 1374598379 0.28913921465723913 treat image : temp/1753963829_2691908_1374576477_28d84030495d1e5785e29b0fdaacaa6b_rle_crop_3899283648_0.png resize: (48, 86) 1374598380 -0.13985305971506518 treat image : temp/1753963829_2691908_1374576474_09cac672aedccb1b959b2302188b0b5c_rle_crop_3899283649_0.png resize: (538, 332) 1374598381 0.35277932584889604 treat image : temp/1753963829_2691908_1374576474_09cac672aedccb1b959b2302188b0b5c_rle_crop_3899283650_0.png resize: (151, 153) 1374598382 -1.7704045669421686 treat image : temp/1753963829_2691908_1374576464_c8a1e26feb4092b1ea338570119a1a59_rle_crop_3899283652_0.png resize: (517, 310) 1374598383 0.19566768408718316 treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283656_0.png resize: (122, 155) 1374598384 -1.2057195179237152 treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283657_0.png resize: (797, 910) 1374598385 0.19551416007169423 treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283658_0.png resize: (196, 271) 1374598386 -1.2072421783512972 treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283659_0.png resize: (540, 347) 1374598387 0.2781639245568908 treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283660_0.png resize: (126, 109) 1374598388 -2.118741913686257 treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5_rle_crop_3899283664_0.png resize: (481, 350) 1374598389 -0.43406585630948574 treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5_rle_crop_3899283665_0.png resize: (568, 533) 1374598390 -0.8134621272594956 treat image : temp/1753963829_2691908_1374576879_bbb44a7d9390ec399d139c928a861fc0_rle_crop_3899283604_0.png resize: (73, 96) 1374598403 3.635324520315874 treat image : temp/1753963829_2691908_1374576776_05e5823925967bd340731b44c370f4a5_rle_crop_3899283616_0.png resize: (58, 105) 1374598405 -1.5786164053038536 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 110 time used for this insertion : 0.017386913299560547 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 110 time used for this insertion : 0.028380155563354492 save missing photos in datou_result : time spend for datou_step_exec : 23.27425265312195 time spend to save output : 0.05135655403137207 total time spend for step 6 : 23.32560920715332 step7:brightness Thu Jul 31 14:12:47 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8.jpg treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4.jpg treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591.jpg treat image : temp/1753963829_2691908_1374576881_4d3a62ba7a1a628d3e860dec38ca5671.jpg treat image : temp/1753963829_2691908_1374576879_bbb44a7d9390ec399d139c928a861fc0.jpg treat image : temp/1753963829_2691908_1374576877_c68b7bed87d64bd7dadf66bab310c1cc.jpg treat image : temp/1753963829_2691908_1374576875_53c413659746278c39e82901b42c66ab.jpg treat image : temp/1753963829_2691908_1374576791_1b3e4e139b7f1dc16fcd716488cb63c2.jpg treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886.jpg treat image : temp/1753963829_2691908_1374576776_05e5823925967bd340731b44c370f4a5.jpg treat image : temp/1753963829_2691908_1374576753_6e277dd75633f124cf37d1c0261b25d2.jpg treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47.jpg treat image : temp/1753963829_2691908_1374576649_f4865b42e30d5b33c01dc1a676100a23.jpg treat image : temp/1753963829_2691908_1374576624_2acb4d0319b1c34e9e243bb8a2402473.jpg treat image : temp/1753963829_2691908_1374576599_acd946c91762a37f70117be6f8c1b742.jpg treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5.jpg treat image : temp/1753963829_2691908_1374576486_c7ba33309f65cb7e0a202b1444ef9855.jpg treat image : temp/1753963829_2691908_1374576484_2982565d6fb701982b8963d9ad1dea84.jpg treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217.jpg treat image : temp/1753963829_2691908_1374576477_28d84030495d1e5785e29b0fdaacaa6b.jpg treat image : temp/1753963829_2691908_1374576474_09cac672aedccb1b959b2302188b0b5c.jpg treat image : temp/1753963829_2691908_1374576464_c8a1e26feb4092b1ea338570119a1a59.jpg treat image : temp/1753963829_2691908_1374576443_5970f261292e735a2620db3cb635036b.jpg treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f.jpg treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5.jpg treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356.jpg treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8_rle_crop_3899283589_0.png treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8_rle_crop_3899283590_0.png treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283591_0.png treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283592_0.png treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283593_0.png treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591_rle_crop_3899283596_0.png treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591_rle_crop_3899283598_0.png treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591_rle_crop_3899283599_0.png treat image : temp/1753963829_2691908_1374576881_4d3a62ba7a1a628d3e860dec38ca5671_rle_crop_3899283601_0.png treat image : temp/1753963829_2691908_1374576879_bbb44a7d9390ec399d139c928a861fc0_rle_crop_3899283603_0.png treat image : temp/1753963829_2691908_1374576877_c68b7bed87d64bd7dadf66bab310c1cc_rle_crop_3899283605_0.png treat image : temp/1753963829_2691908_1374576877_c68b7bed87d64bd7dadf66bab310c1cc_rle_crop_3899283607_0.png treat image : temp/1753963829_2691908_1374576791_1b3e4e139b7f1dc16fcd716488cb63c2_rle_crop_3899283610_0.png treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886_rle_crop_3899283612_0.png treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886_rle_crop_3899283615_0.png treat image : temp/1753963829_2691908_1374576776_05e5823925967bd340731b44c370f4a5_rle_crop_3899283617_0.png treat image : temp/1753963829_2691908_1374576776_05e5823925967bd340731b44c370f4a5_rle_crop_3899283618_0.png treat image : temp/1753963829_2691908_1374576753_6e277dd75633f124cf37d1c0261b25d2_rle_crop_3899283619_0.png treat image : temp/1753963829_2691908_1374576753_6e277dd75633f124cf37d1c0261b25d2_rle_crop_3899283620_0.png treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283624_0.png treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283625_0.png treat image : temp/1753963829_2691908_1374576649_f4865b42e30d5b33c01dc1a676100a23_rle_crop_3899283627_0.png treat image : temp/1753963829_2691908_1374576649_f4865b42e30d5b33c01dc1a676100a23_rle_crop_3899283628_0.png treat image : temp/1753963829_2691908_1374576624_2acb4d0319b1c34e9e243bb8a2402473_rle_crop_3899283629_0.png treat image : temp/1753963829_2691908_1374576624_2acb4d0319b1c34e9e243bb8a2402473_rle_crop_3899283630_0.png treat image : temp/1753963829_2691908_1374576599_acd946c91762a37f70117be6f8c1b742_rle_crop_3899283632_0.png treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5_rle_crop_3899283637_0.png treat image : temp/1753963829_2691908_1374576486_c7ba33309f65cb7e0a202b1444ef9855_rle_crop_3899283639_0.png treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217_rle_crop_3899283643_0.png treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217_rle_crop_3899283644_0.png treat image : temp/1753963829_2691908_1374576477_28d84030495d1e5785e29b0fdaacaa6b_rle_crop_3899283646_0.png treat image : temp/1753963829_2691908_1374576477_28d84030495d1e5785e29b0fdaacaa6b_rle_crop_3899283647_0.png treat image : temp/1753963829_2691908_1374576464_c8a1e26feb4092b1ea338570119a1a59_rle_crop_3899283651_0.png treat image : temp/1753963829_2691908_1374576464_c8a1e26feb4092b1ea338570119a1a59_rle_crop_3899283653_0.png treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283654_0.png treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283661_0.png treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5_rle_crop_3899283662_0.png treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5_rle_crop_3899283663_0.png treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283667_0.png treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283668_0.png treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283670_0.png treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283594_0.png treat image : temp/1753963829_2691908_1374576883_ca742d0900679c57d3afabbb4f371591_rle_crop_3899283597_0.png treat image : temp/1753963829_2691908_1374576877_c68b7bed87d64bd7dadf66bab310c1cc_rle_crop_3899283606_0.png treat image : temp/1753963829_2691908_1374576875_53c413659746278c39e82901b42c66ab_rle_crop_3899283609_0.png treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886_rle_crop_3899283614_0.png treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283622_0.png treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283626_0.png treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5_rle_crop_3899283636_0.png treat image : temp/1753963829_2691908_1374576484_2982565d6fb701982b8963d9ad1dea84_rle_crop_3899283641_0.png treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283655_0.png treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283666_0.png treat image : temp/1753963829_2691908_1374576398_77b703884ff7d208ea8a2ca31ad74356_rle_crop_3899283669_0.png treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8_rle_crop_3899283587_0.png treat image : temp/1753963829_2691908_1374577059_f998fa6b3dae01901ea5561145352ed8_rle_crop_3899283588_0.png treat image : temp/1753963829_2691908_1374577055_cb308b1a58d792551aaa8314c11963a4_rle_crop_3899283595_0.png treat image : temp/1753963829_2691908_1374576881_4d3a62ba7a1a628d3e860dec38ca5671_rle_crop_3899283600_0.png treat image : temp/1753963829_2691908_1374576879_bbb44a7d9390ec399d139c928a861fc0_rle_crop_3899283602_0.png treat image : temp/1753963829_2691908_1374576875_53c413659746278c39e82901b42c66ab_rle_crop_3899283608_0.png treat image : temp/1753963829_2691908_1374576791_1b3e4e139b7f1dc16fcd716488cb63c2_rle_crop_3899283611_0.png treat image : temp/1753963829_2691908_1374576790_7725c851a70db03e2d7be31331798886_rle_crop_3899283613_0.png treat image : temp/1753963829_2691908_1374576753_6e277dd75633f124cf37d1c0261b25d2_rle_crop_3899283621_0.png treat image : temp/1753963829_2691908_1374576745_0305c87766edd42ed01b92d54e70bb47_rle_crop_3899283623_0.png treat image : temp/1753963829_2691908_1374576624_2acb4d0319b1c34e9e243bb8a2402473_rle_crop_3899283631_0.png treat image : temp/1753963829_2691908_1374576599_acd946c91762a37f70117be6f8c1b742_rle_crop_3899283633_0.png treat image : temp/1753963829_2691908_1374576599_acd946c91762a37f70117be6f8c1b742_rle_crop_3899283634_0.png treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5_rle_crop_3899283635_0.png treat image : temp/1753963829_2691908_1374576575_69d5390973ea63f9b9cd2727e92bc7b5_rle_crop_3899283638_0.png treat image : temp/1753963829_2691908_1374576486_c7ba33309f65cb7e0a202b1444ef9855_rle_crop_3899283640_0.png treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217_rle_crop_3899283642_0.png treat image : temp/1753963829_2691908_1374576481_965d559275c6da82929dc7e72882a217_rle_crop_3899283645_0.png treat image : temp/1753963829_2691908_1374576477_28d84030495d1e5785e29b0fdaacaa6b_rle_crop_3899283648_0.png treat image : temp/1753963829_2691908_1374576474_09cac672aedccb1b959b2302188b0b5c_rle_crop_3899283649_0.png treat image : temp/1753963829_2691908_1374576474_09cac672aedccb1b959b2302188b0b5c_rle_crop_3899283650_0.png treat image : temp/1753963829_2691908_1374576464_c8a1e26feb4092b1ea338570119a1a59_rle_crop_3899283652_0.png treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283656_0.png treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283657_0.png treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283658_0.png treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283659_0.png treat image : temp/1753963829_2691908_1374576417_50d0ac946d063da2c02097b4d5d5e97f_rle_crop_3899283660_0.png treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5_rle_crop_3899283664_0.png treat image : temp/1753963829_2691908_1374576400_dd47611abee84c961074bb7a378db2a5_rle_crop_3899283665_0.png treat image : temp/1753963829_2691908_1374576879_bbb44a7d9390ec399d139c928a861fc0_rle_crop_3899283604_0.png treat image : temp/1753963829_2691908_1374576776_05e5823925967bd340731b44c370f4a5_rle_crop_3899283616_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 110 time used for this insertion : 0.02144312858581543 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 110 time used for this insertion : 0.03554058074951172 save missing photos in datou_result : time spend for datou_step_exec : 6.379213094711304 time spend to save output : 0.06174802780151367 total time spend for step 7 : 6.440961122512817 step8:velours_tree Thu Jul 31 14:12:54 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.0850515365600586 time spend to save output : 3.600120544433594e-05 total time spend for step 8 : 0.08508753776550293 step9:send_mail_cod Thu Jul 31 14:12:54 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P25543287_31-07-2025_14_12_54.pdf 25543597 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette255435971753963974 25543598 imagette255435981753963975 25543599 imagette255435991753963975 25543601 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette255436011753963975 25543602 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette255436021753963977 25543603 imagette255436031753963978 25543604 imagette255436041753963978 25543605 imagette255436051753963978 25543606 change filename to text .change filename to text .imagette255436061753963978 25543607 imagette255436071753963978 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=25543287 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/25543597,25543598,25543599,25543600,25543601,25543602,25543603,25543604,25543605,25543606,25543607?tags=pet_clair,pehd,background,environnement,papier,carton,mal_croppe,metal,pet_fonce,autre,flou args[1374577059] : ((1374577059, -0.668748884929567, 492688767), (1374577059, 0.5565868006387663, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374577055] : ((1374577055, -0.8875702863511941, 492688767), (1374577055, 0.49854799434988917, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576883] : ((1374576883, -2.3897046307170826, 492609224), (1374576883, 0.32958742924613726, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576881] : ((1374576881, 0.05666185780212569, 492688767), (1374576881, 0.5424930942736816, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576879] : ((1374576879, -0.9090831879731828, 492688767), (1374576879, 0.5522521768077202, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576877] : ((1374576877, -0.5715592409635858, 492688767), (1374576877, 0.3994444693285909, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576875] : ((1374576875, -1.2287891362584393, 492688767), (1374576875, 0.44794378277249564, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576791] : ((1374576791, 1.7463540521998426, 492688767), (1374576791, 0.10980350116643958, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576790] : ((1374576790, -3.7309603889347724, 492609224), (1374576790, 0.5053581476174919, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576776] : ((1374576776, -0.872537471224047, 492688767), (1374576776, 0.5170664265294497, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576753] : ((1374576753, -0.06608934769598261, 492688767), (1374576753, 0.26765444762652724, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576745] : ((1374576745, -0.6361360911957404, 492688767), (1374576745, 0.6448871388439315, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576649] : ((1374576649, -0.33214943287361043, 492688767), (1374576649, 0.6207680117350515, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576624] : ((1374576624, -0.07853950588384163, 492688767), (1374576624, 0.5026408266000927, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576599] : ((1374576599, 0.7774980242015209, 492688767), (1374576599, 0.3387894461483914, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576575] : ((1374576575, -3.6410676321623123, 492609224), (1374576575, 0.4278708577898507, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576486] : ((1374576486, 0.6052999194067544, 492688767), (1374576486, 0.33547966831183285, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576484] : ((1374576484, 0.6465153801855525, 492688767), (1374576484, 0.4669474306767925, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576481] : ((1374576481, 1.2790905637492123, 492688767), (1374576481, 0.4855148173234361, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576477] : ((1374576477, -2.738722371493155, 492609224), (1374576477, 0.4842771415295506, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576474] : ((1374576474, -1.495705739590426, 492688767), (1374576474, 0.6327503580305807, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576464] : ((1374576464, -4.318182249217837, 492609224), (1374576464, 0.43977626266685393, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576443] : ((1374576443, 0.21185228699258463, 492688767), (1374576443, 0.9942032005332888, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576417] : ((1374576417, -0.5908094747585899, 492688767), (1374576417, 0.8158310467521898, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576400] : ((1374576400, -0.4101476329128805, 492688767), (1374576400, 0.4715294543259964, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com args[1374576398] : ((1374576398, -3.2346718112941804, 492609224), (1374576398, 0.30103070902637896, 2107752395), '0.11439766589506177') We are sending mail with results at report@fotonower.com refus_total : 0.11439766589506177 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=25543287 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543287_31-07-2025_14_12_54.pdf results_Auto_P25543287_31-07-2025_14_12_54.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543287_31-07-2025_14_12_54.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','25543287','results_Auto_P25543287_31-07-2025_14_12_54.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543287_31-07-2025_14_12_54.pdf','pdf','','0.64','0.11439766589506177') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/25543287

https://www.fotonower.com/image?json=false&list_photos_id=1374577059
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374577055
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576883
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576881
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576879
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576877
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576875
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576791
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.7463540521998426)
https://www.fotonower.com/image?json=false&list_photos_id=1374576790
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576776
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576753
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576745
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576649
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576624
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576599
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576575
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576486
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576484
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576481
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.2790905637492123)
https://www.fotonower.com/image?json=false&list_photos_id=1374576477
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576474
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576464
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576443
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576417
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576400
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1374576398
Bravo, la photo est bien prise.

Dans ces conditions,le taux de refus est: 11.44%
Veuillez trouver les photos des contaminants.

exemples de contaminants: pet_clair: https://www.fotonower.com/view/25543597?limit=200
exemples de contaminants: papier: https://www.fotonower.com/view/25543601?limit=200
exemples de contaminants: carton: https://www.fotonower.com/view/25543602?limit=200
exemples de contaminants: autre: https://www.fotonower.com/view/25543606?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543287_31-07-2025_14_12_54.pdf.

Lien vers velours :https://www.fotonower.com/velours/25543597,25543598,25543599,25543600,25543601,25543602,25543603,25543604,25543605,25543606,25543607?tags=pet_clair,pehd,background,environnement,papier,carton,mal_croppe,metal,pet_fonce,autre,flou.


L'équipe Fotonower 202 b'' Server: nginx Date: Thu, 31 Jul 2025 12:13:00 GMT Content-Length: 0 Connection: close X-Message-Id: c0FIiJ2hSEegCMGmQgjbXQ Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1374577059, 1374577055, 1374576883, 1374576881, 1374576879, 1374576877, 1374576875, 1374576791, 1374576790, 1374576776, 1374576753, 1374576745, 1374576649, 1374576624, 1374576599, 1374576575, 1374576486, 1374576484, 1374576481, 1374576477, 1374576474, 1374576464, 1374576443, 1374576417, 1374576400, 1374576398] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577059', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577055', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576883', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576881', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576879', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576877', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576875', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576791', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576790', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576776', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576753', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576745', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576649', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576624', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576599', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576575', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576486', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576484', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576481', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576477', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576474', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576464', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576443', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576417', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576400', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576398', None, None, None, None, None, '3410775') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 26 time used for this insertion : 0.019941329956054688 save_final save missing photos in datou_result : time spend for datou_step_exec : 6.750392198562622 time spend to save output : 0.020220279693603516 total time spend for step 9 : 6.770612478256226 step10:split_time_score Thu Jul 31 14:13:00 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('11', 26),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 31072025 25543287 Nombre de photos uploadées : 26 / 23040 (0%) 31072025 25543287 Nombre de photos taguées (types de déchets): 0 / 26 (0%) 31072025 25543287 Nombre de photos taguées (volume) : 0 / 26 (0%) elapsed_time : load_data_split_time_score 1.430511474609375e-06 elapsed_time : order_list_meta_photo_and_scores 5.0067901611328125e-06 ?????????????????????????? elapsed_time : fill_and_build_computed_from_old_data 0.0012040138244628906 Catched exception ! Connect or reconnect ! Catched exception ! Connect or reconnect ! elapsed_time : insert_dashboard_record_day_entry 0.21086668968200684 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.1288892103909465 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25530216_31-07-2025_08_21_35.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25530216 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25530216 AND mptpi.`type`=3594 To do Qualite : 0.0400941679526749 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25532093_31-07-2025_09_51_52.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25532093 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25532093 AND mptpi.`type`=3594 To do Qualite : 0.017316454475308645 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25532109_31-07-2025_09_41_23.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25532109 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25532109 AND mptpi.`type`=3594 To do Qualite : 0.045261622299382735 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25532112_31-07-2025_09_31_06.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25532112 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25532112 AND mptpi.`type`=3594 To do Qualite : 0.10603395061728398 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25537191_31-07-2025_11_41_29.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25537191 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25537191 AND mptpi.`type`=3594 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25543232 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25543235 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25543266 order by id desc limit 1 Qualite : 0.11439766589506177 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P25543287_31-07-2025_14_12_54.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 25543287 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=25543287 AND mptpi.`type`=3594 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'31072025': {'nb_upload': 26, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1374577059, 1374577055, 1374576883, 1374576881, 1374576879, 1374576877, 1374576875, 1374576791, 1374576790, 1374576776, 1374576753, 1374576745, 1374576649, 1374576624, 1374576599, 1374576575, 1374576486, 1374576484, 1374576481, 1374576477, 1374576474, 1374576464, 1374576443, 1374576417, 1374576400, 1374576398] Looping around the photos to save general results len do output : 1 /25543287Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577059', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374577055', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576883', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576881', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576879', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576877', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576875', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576791', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576790', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576776', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576753', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576745', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576649', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576624', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576599', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576575', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576486', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576484', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576481', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576477', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576474', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576464', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576443', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576417', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576400', None, None, None, None, None, '3410775') ('3318', None, None, None, None, None, None, None, '3410775') ('3318', '25543287', '1374576398', None, None, None, None, None, '3410775') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 27 time used for this insertion : 0.01812434196472168 save_final save missing photos in datou_result : time spend for datou_step_exec : 3.5736095905303955 time spend to save output : 0.01847696304321289 total time spend for step 10 : 3.5920865535736084 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 26 set_done_treatment 88.68user 38.65system 2:39.52elapsed 79%CPU (0avgtext+0avgdata 3145224maxresident)k 501792inputs+49224outputs (7major+2609166minor)pagefaults 0swaps