python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 821580 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['3759016'] with mtr_portfolio_ids : ['27096240'] and first list_photo_ids : [] new path : /proc/821580/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , BFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 21 ; length of list_pids : 21 ; length of list_args : 21 time to download the photos : 3.0188088417053223 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Mon Sep 22 16:20:31 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 10586 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-09-22 16:20:33.831818: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-09-22 16:20:33.860569: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz 2025-09-22 16:20:33.862876: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f1250000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-09-22 16:20:33.862934: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-09-22 16:20:33.867045: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-09-22 16:20:34.042003: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x2eedfb20 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-09-22 16:20:34.042067: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-09-22 16:20:34.043569: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-09-22 16:20:34.044017: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-09-22 16:20:34.047568: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-09-22 16:20:34.050722: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-09-22 16:20:34.051255: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-09-22 16:20:34.054276: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-09-22 16:20:34.055337: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-09-22 16:20:34.060025: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-09-22 16:20:34.061575: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-09-22 16:20:34.061667: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-09-22 16:20:34.062463: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-09-22 16:20:34.062478: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-09-22 16:20:34.062487: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-09-22 16:20:34.063785: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9805 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-09-22 16:20:34.324911: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-09-22 16:20:34.325043: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-09-22 16:20:34.325073: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-09-22 16:20:34.325099: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-09-22 16:20:34.325125: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-09-22 16:20:34.325151: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-09-22 16:20:34.325176: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-09-22 16:20:34.325202: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-09-22 16:20:34.326512: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-09-22 16:20:34.327672: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-09-22 16:20:34.327700: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-09-22 16:20:34.327714: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-09-22 16:20:34.327727: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-09-22 16:20:34.327740: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-09-22 16:20:34.327752: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-09-22 16:20:34.327765: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-09-22 16:20:34.327778: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-09-22 16:20:34.328948: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-09-22 16:20:34.328982: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-09-22 16:20:34.328990: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-09-22 16:20:34.328997: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-09-22 16:20:34.330207: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9805 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-09-22 16:20:42.657618: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-09-22 16:20:42.857915: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 21 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 33.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 16.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 17 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 24.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 13 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 17.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 13 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 23.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 17 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 31.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 12 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 20.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 24.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 14 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 35.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 15 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 31.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 10 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 13 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 31.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 9 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 27.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 17.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 1 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 28.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 4 Detection mask done ! Trying to reset tf kernel 822218 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 4495 tf kernel not reseted sub process len(results) : 21 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 21 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 5702 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.0005145072937011719 nb_pixel_total : 6293 time to create 1 rle with old method : 0.007477521896362305 length of segment : 74 time for calcul the mask position with numpy : 0.00044608116149902344 nb_pixel_total : 15375 time to create 1 rle with old method : 0.017748355865478516 length of segment : 214 time for calcul the mask position with numpy : 0.00028133392333984375 nb_pixel_total : 11299 time to create 1 rle with old method : 0.013159751892089844 length of segment : 180 time for calcul the mask position with numpy : 0.0002715587615966797 nb_pixel_total : 11245 time to create 1 rle with old method : 0.013244152069091797 length of segment : 158 time for calcul the mask position with numpy : 0.0006320476531982422 nb_pixel_total : 26685 time to create 1 rle with old method : 0.030123233795166016 length of segment : 290 time for calcul the mask position with numpy : 0.00022864341735839844 nb_pixel_total : 6380 time to create 1 rle with old method : 0.007838964462280273 length of segment : 80 time for calcul the mask position with numpy : 0.0005688667297363281 nb_pixel_total : 25377 time to create 1 rle with old method : 0.02817225456237793 length of segment : 278 time for calcul the mask position with numpy : 0.0002040863037109375 nb_pixel_total : 9491 time to create 1 rle with old method : 0.01608443260192871 length of segment : 123 time for calcul the mask position with numpy : 0.002712249755859375 nb_pixel_total : 122145 time to create 1 rle with old method : 0.17755961418151855 length of segment : 355 time for calcul the mask position with numpy : 0.0002486705780029297 nb_pixel_total : 7217 time to create 1 rle with old method : 0.008857011795043945 length of segment : 111 time for calcul the mask position with numpy : 0.0002980232238769531 nb_pixel_total : 12758 time to create 1 rle with old method : 0.015093564987182617 length of segment : 190 time for calcul the mask position with numpy : 0.00016546249389648438 nb_pixel_total : 6313 time to create 1 rle with old method : 0.007952690124511719 length of segment : 95 time for calcul the mask position with numpy : 0.00014925003051757812 nb_pixel_total : 7047 time to create 1 rle with old method : 0.010201692581176758 length of segment : 67 time for calcul the mask position with numpy : 0.00018453598022460938 nb_pixel_total : 5388 time to create 1 rle with old method : 0.006580352783203125 length of segment : 121 time for calcul the mask position with numpy : 0.00014138221740722656 nb_pixel_total : 6901 time to create 1 rle with old method : 0.008272886276245117 length of segment : 111 time for calcul the mask position with numpy : 0.0003027915954589844 nb_pixel_total : 17500 time to create 1 rle with old method : 0.02010965347290039 length of segment : 194 time for calcul the mask position with numpy : 0.0001671314239501953 nb_pixel_total : 10342 time to create 1 rle with old method : 0.011998414993286133 length of segment : 130 time for calcul the mask position with numpy : 0.0003609657287597656 nb_pixel_total : 23274 time to create 1 rle with old method : 0.02644968032836914 length of segment : 156 time for calcul the mask position with numpy : 0.00022792816162109375 nb_pixel_total : 12024 time to create 1 rle with old method : 0.013698577880859375 length of segment : 177 time for calcul the mask position with numpy : 0.00015997886657714844 nb_pixel_total : 8995 time to create 1 rle with old method : 0.010269403457641602 length of segment : 195 time for calcul the mask position with numpy : 0.0001289844512939453 nb_pixel_total : 6271 time to create 1 rle with old method : 0.007178783416748047 length of segment : 163 time for calcul the mask position with numpy : 8.416175842285156e-05 nb_pixel_total : 4318 time to create 1 rle with old method : 0.0049474239349365234 length of segment : 84 time for calcul the mask position with numpy : 0.00017881393432617188 nb_pixel_total : 9373 time to create 1 rle with old method : 0.010532379150390625 length of segment : 131 time for calcul the mask position with numpy : 0.00022268295288085938 nb_pixel_total : 14712 time to create 1 rle with old method : 0.016523122787475586 length of segment : 144 time for calcul the mask position with numpy : 0.0005211830139160156 nb_pixel_total : 31293 time to create 1 rle with old method : 0.03882598876953125 length of segment : 227 time for calcul the mask position with numpy : 0.00015306472778320312 nb_pixel_total : 1698 time to create 1 rle with old method : 0.003167867660522461 length of segment : 75 time for calcul the mask position with numpy : 0.0002675056457519531 nb_pixel_total : 12323 time to create 1 rle with old method : 0.018099546432495117 length of segment : 180 time for calcul the mask position with numpy : 0.00023102760314941406 nb_pixel_total : 13549 time to create 1 rle with old method : 0.01514291763305664 length of segment : 165 time for calcul the mask position with numpy : 0.0001227855682373047 nb_pixel_total : 5625 time to create 1 rle with old method : 0.006567716598510742 length of segment : 97 time for calcul the mask position with numpy : 0.0005502700805664062 nb_pixel_total : 41229 time to create 1 rle with old method : 0.046601295471191406 length of segment : 237 time for calcul the mask position with numpy : 0.0002536773681640625 nb_pixel_total : 13034 time to create 1 rle with old method : 0.015133857727050781 length of segment : 192 time for calcul the mask position with numpy : 0.0006227493286132812 nb_pixel_total : 36606 time to create 1 rle with old method : 0.042562007904052734 length of segment : 296 time for calcul the mask position with numpy : 0.00032210350036621094 nb_pixel_total : 21140 time to create 1 rle with old method : 0.024937868118286133 length of segment : 176 time for calcul the mask position with numpy : 0.0001628398895263672 nb_pixel_total : 7152 time to create 1 rle with old method : 0.008733272552490234 length of segment : 92 time for calcul the mask position with numpy : 0.0007412433624267578 nb_pixel_total : 43673 time to create 1 rle with old method : 0.050679922103881836 length of segment : 305 time for calcul the mask position with numpy : 9.107589721679688e-05 nb_pixel_total : 3868 time to create 1 rle with old method : 0.0047490596771240234 length of segment : 51 time for calcul the mask position with numpy : 0.00022864341735839844 nb_pixel_total : 12093 time to create 1 rle with old method : 0.014056682586669922 length of segment : 183 time for calcul the mask position with numpy : 0.00038695335388183594 nb_pixel_total : 24748 time to create 1 rle with old method : 0.028406620025634766 length of segment : 142 time for calcul the mask position with numpy : 0.00012373924255371094 nb_pixel_total : 4435 time to create 1 rle with old method : 0.005224466323852539 length of segment : 76 time for calcul the mask position with numpy : 0.00014925003051757812 nb_pixel_total : 9667 time to create 1 rle with old method : 0.010872364044189453 length of segment : 124 time for calcul the mask position with numpy : 0.0005614757537841797 nb_pixel_total : 40187 time to create 1 rle with old method : 0.045774221420288086 length of segment : 154 time for calcul the mask position with numpy : 0.0002639293670654297 nb_pixel_total : 12472 time to create 1 rle with old method : 0.01401662826538086 length of segment : 184 time for calcul the mask position with numpy : 9.894371032714844e-05 nb_pixel_total : 3662 time to create 1 rle with old method : 0.004191160202026367 length of segment : 96 time for calcul the mask position with numpy : 0.000270843505859375 nb_pixel_total : 15660 time to create 1 rle with old method : 0.01764225959777832 length of segment : 250 time for calcul the mask position with numpy : 0.00011944770812988281 nb_pixel_total : 4311 time to create 1 rle with old method : 0.005257129669189453 length of segment : 99 time for calcul the mask position with numpy : 0.00012540817260742188 nb_pixel_total : 5999 time to create 1 rle with old method : 0.006816387176513672 length of segment : 125 time for calcul the mask position with numpy : 0.0001220703125 nb_pixel_total : 5265 time to create 1 rle with old method : 0.006254673004150391 length of segment : 101 time for calcul the mask position with numpy : 0.00021123886108398438 nb_pixel_total : 11836 time to create 1 rle with old method : 0.013516902923583984 length of segment : 177 time for calcul the mask position with numpy : 0.0004947185516357422 nb_pixel_total : 36293 time to create 1 rle with old method : 0.03977036476135254 length of segment : 257 time for calcul the mask position with numpy : 0.001049041748046875 nb_pixel_total : 73798 time to create 1 rle with old method : 0.08170652389526367 length of segment : 355 time for calcul the mask position with numpy : 0.0003712177276611328 nb_pixel_total : 27124 time to create 1 rle with old method : 0.030919313430786133 length of segment : 227 time for calcul the mask position with numpy : 0.00021266937255859375 nb_pixel_total : 11112 time to create 1 rle with old method : 0.012595415115356445 length of segment : 174 time for calcul the mask position with numpy : 0.0001232624053955078 nb_pixel_total : 6724 time to create 1 rle with old method : 0.007936239242553711 length of segment : 87 time for calcul the mask position with numpy : 0.0002644062042236328 nb_pixel_total : 17615 time to create 1 rle with old method : 0.020236730575561523 length of segment : 229 time for calcul the mask position with numpy : 0.0001475811004638672 nb_pixel_total : 6526 time to create 1 rle with old method : 0.007666349411010742 length of segment : 83 time for calcul the mask position with numpy : 0.0001437664031982422 nb_pixel_total : 6330 time to create 1 rle with old method : 0.007443904876708984 length of segment : 115 time for calcul the mask position with numpy : 0.0002353191375732422 nb_pixel_total : 12320 time to create 1 rle with old method : 0.014267921447753906 length of segment : 182 time for calcul the mask position with numpy : 0.00015926361083984375 nb_pixel_total : 8705 time to create 1 rle with old method : 0.009728670120239258 length of segment : 157 time for calcul the mask position with numpy : 0.00010538101196289062 nb_pixel_total : 6333 time to create 1 rle with old method : 0.007447242736816406 length of segment : 60 time for calcul the mask position with numpy : 0.00022983551025390625 nb_pixel_total : 12114 time to create 1 rle with old method : 0.01367950439453125 length of segment : 185 time for calcul the mask position with numpy : 9.012222290039062e-05 nb_pixel_total : 4103 time to create 1 rle with old method : 0.0046236515045166016 length of segment : 110 time for calcul the mask position with numpy : 0.00012922286987304688 nb_pixel_total : 7532 time to create 1 rle with old method : 0.012979984283447266 length of segment : 108 time for calcul the mask position with numpy : 0.00011205673217773438 nb_pixel_total : 4695 time to create 1 rle with old method : 0.005564689636230469 length of segment : 115 time for calcul the mask position with numpy : 7.891654968261719e-05 nb_pixel_total : 3464 time to create 1 rle with old method : 0.003979921340942383 length of segment : 71 time for calcul the mask position with numpy : 0.0004038810729980469 nb_pixel_total : 21245 time to create 1 rle with old method : 0.024811267852783203 length of segment : 170 time for calcul the mask position with numpy : 0.000225067138671875 nb_pixel_total : 11841 time to create 1 rle with old method : 0.013677120208740234 length of segment : 180 time for calcul the mask position with numpy : 0.00014829635620117188 nb_pixel_total : 7423 time to create 1 rle with old method : 0.008304834365844727 length of segment : 115 time for calcul the mask position with numpy : 0.0003113746643066406 nb_pixel_total : 20448 time to create 1 rle with old method : 0.023617982864379883 length of segment : 216 time for calcul the mask position with numpy : 0.00014090538024902344 nb_pixel_total : 8795 time to create 1 rle with old method : 0.010313987731933594 length of segment : 86 time for calcul the mask position with numpy : 0.00015664100646972656 nb_pixel_total : 7814 time to create 1 rle with old method : 0.008880853652954102 length of segment : 110 time for calcul the mask position with numpy : 0.0017654895782470703 nb_pixel_total : 107769 time to create 1 rle with old method : 0.11953401565551758 length of segment : 522 time for calcul the mask position with numpy : 0.0002315044403076172 nb_pixel_total : 12260 time to create 1 rle with old method : 0.014448881149291992 length of segment : 181 time for calcul the mask position with numpy : 0.00029659271240234375 nb_pixel_total : 7593 time to create 1 rle with old method : 0.008824348449707031 length of segment : 152 time for calcul the mask position with numpy : 0.00014853477478027344 nb_pixel_total : 9796 time to create 1 rle with old method : 0.01121830940246582 length of segment : 145 time for calcul the mask position with numpy : 0.00011920928955078125 nb_pixel_total : 6284 time to create 1 rle with old method : 0.007255077362060547 length of segment : 130 time for calcul the mask position with numpy : 0.000225067138671875 nb_pixel_total : 12700 time to create 1 rle with old method : 0.014435768127441406 length of segment : 188 time for calcul the mask position with numpy : 0.00019931793212890625 nb_pixel_total : 11380 time to create 1 rle with old method : 0.0167238712310791 length of segment : 205 time for calcul the mask position with numpy : 0.0003216266632080078 nb_pixel_total : 17223 time to create 1 rle with old method : 0.022649288177490234 length of segment : 236 time for calcul the mask position with numpy : 0.00019979476928710938 nb_pixel_total : 14726 time to create 1 rle with old method : 0.01723313331604004 length of segment : 68 time for calcul the mask position with numpy : 0.00011157989501953125 nb_pixel_total : 5937 time to create 1 rle with old method : 0.0070726871490478516 length of segment : 75 time for calcul the mask position with numpy : 7.033348083496094e-05 nb_pixel_total : 2856 time to create 1 rle with old method : 0.003490924835205078 length of segment : 65 time for calcul the mask position with numpy : 0.0001468658447265625 nb_pixel_total : 7773 time to create 1 rle with old method : 0.00924062728881836 length of segment : 106 time for calcul the mask position with numpy : 0.0006392002105712891 nb_pixel_total : 40732 time to create 1 rle with old method : 0.04671478271484375 length of segment : 328 time for calcul the mask position with numpy : 0.00040912628173828125 nb_pixel_total : 12279 time to create 1 rle with old method : 0.013777494430541992 length of segment : 186 time for calcul the mask position with numpy : 0.0004627704620361328 nb_pixel_total : 12869 time to create 1 rle with old method : 0.014618873596191406 length of segment : 188 time for calcul the mask position with numpy : 0.0004534721374511719 nb_pixel_total : 12654 time to create 1 rle with old method : 0.014886856079101562 length of segment : 164 time for calcul the mask position with numpy : 0.0006880760192871094 nb_pixel_total : 23894 time to create 1 rle with old method : 0.027695417404174805 length of segment : 262 time for calcul the mask position with numpy : 0.00022363662719726562 nb_pixel_total : 15864 time to create 1 rle with old method : 0.01880955696105957 length of segment : 92 time spent for convertir_results : 4.20660400390625 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 88 chid ids of type : 3594 Number RLEs to save : 14310 save missing photos in datou_result : time spend for datou_step_exec : 32.79396057128906 time spend to save output : 0.9339215755462646 total time spend for step 1 : 33.72788214683533 step2:crop_condition Mon Sep 22 16:21:04 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 21 ! batch 1 Loaded 88 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 52 About to insert : list_path_to_insert length 52 new photo from crops ! About to upload 52 photos upload in portfolio : 3736932 init cache_photo without model_param we have 52 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1758550868_821580 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 52 photos in the portfolio 3736932 time of upload the photos Elapsed time : 13.072541236877441 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 4 About to insert : list_path_to_insert length 4 new photo from crops ! About to upload 4 photos upload in portfolio : 3736932 init cache_photo without model_param we have 4 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1758550882_821580 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 4 photos in the portfolio 3736932 time of upload the photos Elapsed time : 2.1411468982696533 we have finished the crop for the class : carton begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 29 About to insert : list_path_to_insert length 29 new photo from crops ! About to upload 29 photos upload in portfolio : 3736932 init cache_photo without model_param we have 29 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1758550888_821580 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 29 photos in the portfolio 3736932 time of upload the photos Elapsed time : 7.870860815048218 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 3736932 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1758550896_821580 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 1 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.7779238224029541 we have finished the crop for the class : autre begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 3736932 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1758550897_821580 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 1 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.6313116550445557 we have finished the crop for the class : pehd begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 1 About to insert : list_path_to_insert length 1 new photo from crops ! About to upload 1 photos upload in portfolio : 3736932 init cache_photo without model_param we have 1 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1758550899_821580 batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 1 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.6832876205444336 we have finished the crop for the class : pet_fonce delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1385461205, 1385461182, 1385460929, 1385460905, 1385460880, 1385460877, 1385460874, 1385460871, 1385460842, 1385460832, 1385460798, 1385460751, 1385460714, 1385460684, 1385459953, 1385459897, 1385457601, 1385457447, 1385456832, 1385456823, 1385456806] Looping around the photos to save general results len do output : 88 /1385483034Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483035Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483036Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483037Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483038Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483039Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483040Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483041Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483042Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483043Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483044Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483045Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483046Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483047Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483048Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483049Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483050Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483051Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483052Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483053Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483054Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483056Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483057Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483058Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483059Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483060Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483061Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483062Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483063Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483064Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483065Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483066Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483067Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483068Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483069Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483070Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483071Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483072Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483073Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483075Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483076Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483077Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483078Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483079Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483081Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483082Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483083Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483084Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483085Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483086Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483087Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483088Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483106Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483107Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483108Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483110Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483116Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483117Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483118Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483119Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483120Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483121Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483122Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483123Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483124Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483125Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483126Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483127Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483128Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483129Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483130Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483131Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483132Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483133Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483134Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483135Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483136Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483137Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483138Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483139Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483140Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483141Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483142Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483143Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483144Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483146Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483148Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1385483149Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461205', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461182', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460929', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460905', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460880', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460877', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460874', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460871', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460842', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460798', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460751', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460714', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460684', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459953', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459897', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457601', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457447', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456823', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456806', None, None, None, None, None, '3759016') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 285 time used for this insertion : 0.027347326278686523 save_final save missing photos in datou_result : time spend for datou_step_exec : 34.50407028198242 time spend to save output : 0.031039953231811523 total time spend for step 2 : 34.53511023521423 step3:rle_unique_nms_with_priority Mon Sep 22 16:21:39 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 88 chid ids of type : 3594 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nb_obj : 9 nb_hashtags : 3 time to prepare the origin masks : 0.6521563529968262 time for calcul the mask position with numpy : 0.17094993591308594 nb_pixel_total : 1873142 time to create 1 rle with new method : 0.12056374549865723 time for calcul the mask position with numpy : 0.007086515426635742 nb_pixel_total : 122121 time to create 1 rle with old method : 0.13632869720458984 time for calcul the mask position with numpy : 0.010266542434692383 nb_pixel_total : 99 time to create 1 rle with old method : 0.0002295970916748047 time for calcul the mask position with numpy : 0.009592294692993164 nb_pixel_total : 961 time to create 1 rle with old method : 0.0012104511260986328 time for calcul the mask position with numpy : 0.010033369064331055 nb_pixel_total : 6380 time to create 1 rle with old method : 0.007061481475830078 time for calcul the mask position with numpy : 0.010009527206420898 nb_pixel_total : 26685 time to create 1 rle with old method : 0.028920888900756836 time for calcul the mask position with numpy : 0.00961160659790039 nb_pixel_total : 11245 time to create 1 rle with old method : 0.012036323547363281 time for calcul the mask position with numpy : 0.009631633758544922 nb_pixel_total : 11299 time to create 1 rle with old method : 0.011911392211914062 time for calcul the mask position with numpy : 0.009423494338989258 nb_pixel_total : 15375 time to create 1 rle with old method : 0.01672077178955078 time for calcul the mask position with numpy : 0.009810447692871094 nb_pixel_total : 6293 time to create 1 rle with old method : 0.0070497989654541016 create new chi : 0.6109023094177246 time to delete rle : 0.021864891052246094 batch 1 Loaded 19 chid ids of type : 3594 ++++++++++Number RLEs to save : 4067 TO DO : save crop sub photo not yet done ! save time : 0.2930312156677246 nb_obj : 9 nb_hashtags : 2 time to prepare the origin masks : 0.12219572067260742 time for calcul the mask position with numpy : 0.18148541450500488 nb_pixel_total : 1976860 time to create 1 rle with new method : 0.07740426063537598 time for calcul the mask position with numpy : 0.006435394287109375 nb_pixel_total : 23274 time to create 1 rle with old method : 0.02777409553527832 time for calcul the mask position with numpy : 0.006354331970214844 nb_pixel_total : 10342 time to create 1 rle with old method : 0.011473417282104492 time for calcul the mask position with numpy : 0.006128787994384766 nb_pixel_total : 17500 time to create 1 rle with old method : 0.01903820037841797 time for calcul the mask position with numpy : 0.005938053131103516 nb_pixel_total : 6901 time to create 1 rle with old method : 0.007800579071044922 time for calcul the mask position with numpy : 0.006543874740600586 nb_pixel_total : 5388 time to create 1 rle with old method : 0.006089925765991211 time for calcul the mask position with numpy : 0.006122112274169922 nb_pixel_total : 7047 time to create 1 rle with old method : 0.007707118988037109 time for calcul the mask position with numpy : 0.006085872650146484 nb_pixel_total : 6313 time to create 1 rle with old method : 0.006702899932861328 time for calcul the mask position with numpy : 0.0060842037200927734 nb_pixel_total : 12758 time to create 1 rle with old method : 0.01344442367553711 time for calcul the mask position with numpy : 0.0061032772064208984 nb_pixel_total : 7217 time to create 1 rle with old method : 0.007812023162841797 create new chi : 0.4333968162536621 time to delete rle : 0.00047469139099121094 batch 1 Loaded 19 chid ids of type : 3594 +++++++++Number RLEs to save : 3430 TO DO : save crop sub photo not yet done ! save time : 0.24387216567993164 nb_obj : 8 nb_hashtags : 2 time to prepare the origin masks : 0.10401177406311035 time for calcul the mask position with numpy : 0.07665395736694336 nb_pixel_total : 1985854 time to create 1 rle with new method : 0.1681346893310547 time for calcul the mask position with numpy : 0.0061724185943603516 nb_pixel_total : 760 time to create 1 rle with old method : 0.0009469985961914062 time for calcul the mask position with numpy : 0.006043672561645508 nb_pixel_total : 31293 time to create 1 rle with old method : 0.03309464454650879 time for calcul the mask position with numpy : 0.00607752799987793 nb_pixel_total : 14712 time to create 1 rle with old method : 0.015755176544189453 time for calcul the mask position with numpy : 0.00587916374206543 nb_pixel_total : 9373 time to create 1 rle with old method : 0.010197162628173828 time for calcul the mask position with numpy : 0.006068229675292969 nb_pixel_total : 4318 time to create 1 rle with old method : 0.004548311233520508 time for calcul the mask position with numpy : 0.005908489227294922 nb_pixel_total : 6271 time to create 1 rle with old method : 0.006667375564575195 time for calcul the mask position with numpy : 0.006142139434814453 nb_pixel_total : 8995 time to create 1 rle with old method : 0.01024007797241211 time for calcul the mask position with numpy : 0.006379842758178711 nb_pixel_total : 12024 time to create 1 rle with old method : 0.013112306594848633 create new chi : 0.3986477851867676 time to delete rle : 0.0004904270172119141 batch 1 Loaded 17 chid ids of type : 3594 +++++++++++++++Number RLEs to save : 3428 TO DO : save crop sub photo not yet done ! save time : 0.2297968864440918 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.05672264099121094 time for calcul the mask position with numpy : 0.019435405731201172 nb_pixel_total : 2000874 time to create 1 rle with new method : 0.1602334976196289 time for calcul the mask position with numpy : 0.007374286651611328 nb_pixel_total : 41229 time to create 1 rle with old method : 0.04521036148071289 time for calcul the mask position with numpy : 0.0063135623931884766 nb_pixel_total : 5625 time to create 1 rle with old method : 0.0059604644775390625 time for calcul the mask position with numpy : 0.006406307220458984 nb_pixel_total : 13549 time to create 1 rle with old method : 0.01481008529663086 time for calcul the mask position with numpy : 0.006381034851074219 nb_pixel_total : 12323 time to create 1 rle with old method : 0.013533830642700195 create new chi : 0.2930941581726074 time to delete rle : 0.00039887428283691406 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 2438 TO DO : save crop sub photo not yet done ! save time : 0.1843888759613037 nb_obj : 6 nb_hashtags : 2 time to prepare the origin masks : 0.0912017822265625 time for calcul the mask position with numpy : 0.1062769889831543 nb_pixel_total : 1948127 time to create 1 rle with new method : 0.10024333000183105 time for calcul the mask position with numpy : 0.00623631477355957 nb_pixel_total : 3868 time to create 1 rle with old method : 0.004212856292724609 time for calcul the mask position with numpy : 0.006562471389770508 nb_pixel_total : 43673 time to create 1 rle with old method : 0.05054426193237305 time for calcul the mask position with numpy : 0.006225109100341797 nb_pixel_total : 7152 time to create 1 rle with old method : 0.00812387466430664 time for calcul the mask position with numpy : 0.006646633148193359 nb_pixel_total : 21140 time to create 1 rle with old method : 0.02435588836669922 time for calcul the mask position with numpy : 0.006848335266113281 nb_pixel_total : 36606 time to create 1 rle with old method : 0.05288362503051758 time for calcul the mask position with numpy : 0.010451078414916992 nb_pixel_total : 13034 time to create 1 rle with old method : 0.017883777618408203 create new chi : 0.41956448554992676 time to delete rle : 0.0006470680236816406 batch 1 Loaded 13 chid ids of type : 3594 +++++++++Number RLEs to save : 3304 TO DO : save crop sub photo not yet done ! save time : 0.2503061294555664 nb_obj : 5 nb_hashtags : 2 time to prepare the origin masks : 0.07783102989196777 time for calcul the mask position with numpy : 0.1334245204925537 nb_pixel_total : 1982470 time to create 1 rle with new method : 0.16865205764770508 time for calcul the mask position with numpy : 0.0061435699462890625 nb_pixel_total : 40187 time to create 1 rle with old method : 0.04236769676208496 time for calcul the mask position with numpy : 0.006005287170410156 nb_pixel_total : 9667 time to create 1 rle with old method : 0.010300159454345703 time for calcul the mask position with numpy : 0.005865812301635742 nb_pixel_total : 4435 time to create 1 rle with old method : 0.0047833919525146484 time for calcul the mask position with numpy : 0.005940914154052734 nb_pixel_total : 24748 time to create 1 rle with old method : 0.026199817657470703 time for calcul the mask position with numpy : 0.005838871002197266 nb_pixel_total : 12093 time to create 1 rle with old method : 0.012586116790771484 create new chi : 0.43674564361572266 time to delete rle : 0.00038242340087890625 batch 1 Loaded 11 chid ids of type : 3594 +++++Number RLEs to save : 2438 TO DO : save crop sub photo not yet done ! save time : 0.1801156997680664 nb_obj : 5 nb_hashtags : 2 time to prepare the origin masks : 0.252413272857666 time for calcul the mask position with numpy : 0.12930917739868164 nb_pixel_total : 2033697 time to create 1 rle with new method : 0.08417725563049316 time for calcul the mask position with numpy : 0.006135463714599609 nb_pixel_total : 5999 time to create 1 rle with old method : 0.006789445877075195 time for calcul the mask position with numpy : 0.006051301956176758 nb_pixel_total : 2110 time to create 1 rle with old method : 0.0024187564849853516 time for calcul the mask position with numpy : 0.0064678192138671875 nb_pixel_total : 15660 time to create 1 rle with old method : 0.01744985580444336 time for calcul the mask position with numpy : 0.006136178970336914 nb_pixel_total : 3662 time to create 1 rle with old method : 0.004119396209716797 time for calcul the mask position with numpy : 0.006147146224975586 nb_pixel_total : 12472 time to create 1 rle with old method : 0.0134429931640625 create new chi : 0.29957056045532227 time to delete rle : 0.00044155120849609375 batch 1 Loaded 11 chid ids of type : 3594 +++++++Number RLEs to save : 2492 TO DO : save crop sub photo not yet done ! save time : 0.17741060256958008 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.1929473876953125 time for calcul the mask position with numpy : 0.019698619842529297 nb_pixel_total : 2020206 time to create 1 rle with new method : 0.3563532829284668 time for calcul the mask position with numpy : 0.006289243698120117 nb_pixel_total : 36293 time to create 1 rle with old method : 0.03942704200744629 time for calcul the mask position with numpy : 0.00639653205871582 nb_pixel_total : 11836 time to create 1 rle with old method : 0.013001203536987305 time for calcul the mask position with numpy : 0.006607532501220703 nb_pixel_total : 5265 time to create 1 rle with old method : 0.00586390495300293 create new chi : 0.4622020721435547 time to delete rle : 0.00038361549377441406 batch 1 Loaded 7 chid ids of type : 3594 +++++Number RLEs to save : 2150 TO DO : save crop sub photo not yet done ! save time : 0.19079995155334473 nb_obj : 7 nb_hashtags : 3 time to prepare the origin masks : 0.3473951816558838 time for calcul the mask position with numpy : 0.1735830307006836 nb_pixel_total : 1924371 time to create 1 rle with new method : 0.07813191413879395 time for calcul the mask position with numpy : 0.005783557891845703 nb_pixel_total : 6330 time to create 1 rle with old method : 0.006659507751464844 time for calcul the mask position with numpy : 0.005860090255737305 nb_pixel_total : 6526 time to create 1 rle with old method : 0.006899833679199219 time for calcul the mask position with numpy : 0.005722761154174805 nb_pixel_total : 17615 time to create 1 rle with old method : 0.018589019775390625 time for calcul the mask position with numpy : 0.006180763244628906 nb_pixel_total : 6724 time to create 1 rle with old method : 0.006974697113037109 time for calcul the mask position with numpy : 0.005806922912597656 nb_pixel_total : 11112 time to create 1 rle with old method : 0.011445760726928711 time for calcul the mask position with numpy : 0.006236553192138672 nb_pixel_total : 27124 time to create 1 rle with old method : 0.02844524383544922 time for calcul the mask position with numpy : 0.0068759918212890625 nb_pixel_total : 73798 time to create 1 rle with old method : 0.08221673965454102 create new chi : 0.46598124504089355 time to delete rle : 0.0006234645843505859 batch 1 Loaded 15 chid ids of type : 3594 ++++++++++++Number RLEs to save : 3620 TO DO : save crop sub photo not yet done ! save time : 0.2900228500366211 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.04758191108703613 time for calcul the mask position with numpy : 0.018213987350463867 nb_pixel_total : 2046242 time to create 1 rle with new method : 0.048577308654785156 time for calcul the mask position with numpy : 0.005877494812011719 nb_pixel_total : 6333 time to create 1 rle with old method : 0.006665945053100586 time for calcul the mask position with numpy : 0.005599260330200195 nb_pixel_total : 8705 time to create 1 rle with old method : 0.009374380111694336 time for calcul the mask position with numpy : 0.00593876838684082 nb_pixel_total : 12320 time to create 1 rle with old method : 0.014557361602783203 create new chi : 0.11520028114318848 time to delete rle : 0.0003573894500732422 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 1878 TO DO : save crop sub photo not yet done ! save time : 0.14146995544433594 nb_obj : 6 nb_hashtags : 2 time to prepare the origin masks : 0.0744774341583252 time for calcul the mask position with numpy : 0.01827836036682129 nb_pixel_total : 2020447 time to create 1 rle with new method : 0.1649484634399414 time for calcul the mask position with numpy : 0.006376743316650391 nb_pixel_total : 21245 time to create 1 rle with old method : 0.02321004867553711 time for calcul the mask position with numpy : 0.005896091461181641 nb_pixel_total : 3464 time to create 1 rle with old method : 0.0036962032318115234 time for calcul the mask position with numpy : 0.006258726119995117 nb_pixel_total : 4695 time to create 1 rle with old method : 0.0052759647369384766 time for calcul the mask position with numpy : 0.006302595138549805 nb_pixel_total : 7532 time to create 1 rle with old method : 0.00838780403137207 time for calcul the mask position with numpy : 0.006950855255126953 nb_pixel_total : 4103 time to create 1 rle with old method : 0.004306316375732422 time for calcul the mask position with numpy : 0.0061724185943603516 nb_pixel_total : 12114 time to create 1 rle with old method : 0.013489246368408203 create new chi : 0.2904796600341797 time to delete rle : 0.000438690185546875 batch 1 Loaded 13 chid ids of type : 3594 ++++++Number RLEs to save : 2598 TO DO : save crop sub photo not yet done ! save time : 0.19489574432373047 nb_obj : 6 nb_hashtags : 4 time to prepare the origin masks : 0.11443281173706055 time for calcul the mask position with numpy : 0.10253190994262695 nb_pixel_total : 1909510 time to create 1 rle with new method : 0.08844709396362305 time for calcul the mask position with numpy : 0.011138677597045898 nb_pixel_total : 107769 time to create 1 rle with old method : 0.11619758605957031 time for calcul the mask position with numpy : 0.00613856315612793 nb_pixel_total : 7814 time to create 1 rle with old method : 0.008402585983276367 time for calcul the mask position with numpy : 0.010001897811889648 nb_pixel_total : 8795 time to create 1 rle with old method : 0.009240388870239258 time for calcul the mask position with numpy : 0.008053779602050781 nb_pixel_total : 20448 time to create 1 rle with old method : 0.02151942253112793 time for calcul the mask position with numpy : 0.0060346126556396484 nb_pixel_total : 7423 time to create 1 rle with old method : 0.00831294059753418 time for calcul the mask position with numpy : 0.00606536865234375 nb_pixel_total : 11841 time to create 1 rle with old method : 0.013350486755371094 create new chi : 0.42641186714172363 time to delete rle : 0.0005822181701660156 batch 1 Loaded 13 chid ids of type : 3594 ++++++++Number RLEs to save : 3538 TO DO : save crop sub photo not yet done ! save time : 0.25757646560668945 nb_obj : 4 nb_hashtags : 3 time to prepare the origin masks : 0.09914565086364746 time for calcul the mask position with numpy : 0.020722389221191406 nb_pixel_total : 2037667 time to create 1 rle with new method : 0.10006380081176758 time for calcul the mask position with numpy : 0.007682085037231445 nb_pixel_total : 6284 time to create 1 rle with old method : 0.00806283950805664 time for calcul the mask position with numpy : 0.0063762664794921875 nb_pixel_total : 9796 time to create 1 rle with old method : 0.010695219039916992 time for calcul the mask position with numpy : 0.006009817123413086 nb_pixel_total : 7593 time to create 1 rle with old method : 0.00832676887512207 time for calcul the mask position with numpy : 0.0061492919921875 nb_pixel_total : 12260 time to create 1 rle with old method : 0.01313018798828125 create new chi : 0.20249462127685547 time to delete rle : 0.00039315223693847656 batch 1 Loaded 9 chid ids of type : 3594 +++++Number RLEs to save : 2296 TO DO : save crop sub photo not yet done ! save time : 0.1733243465423584 nb_obj : 8 nb_hashtags : 2 time to prepare the origin masks : 0.287031888961792 time for calcul the mask position with numpy : 0.058023691177368164 nb_pixel_total : 1960273 time to create 1 rle with new method : 0.16923880577087402 time for calcul the mask position with numpy : 0.006353139877319336 nb_pixel_total : 40732 time to create 1 rle with old method : 0.04480862617492676 time for calcul the mask position with numpy : 0.006085872650146484 nb_pixel_total : 7773 time to create 1 rle with old method : 0.008647680282592773 time for calcul the mask position with numpy : 0.005874156951904297 nb_pixel_total : 2856 time to create 1 rle with old method : 0.0032176971435546875 time for calcul the mask position with numpy : 0.005936622619628906 nb_pixel_total : 5937 time to create 1 rle with old method : 0.006679534912109375 time for calcul the mask position with numpy : 0.006048679351806641 nb_pixel_total : 14726 time to create 1 rle with old method : 0.01573920249938965 time for calcul the mask position with numpy : 0.006177186965942383 nb_pixel_total : 17223 time to create 1 rle with old method : 0.018631696701049805 time for calcul the mask position with numpy : 0.006072521209716797 nb_pixel_total : 11380 time to create 1 rle with old method : 0.012379646301269531 time for calcul the mask position with numpy : 0.006174564361572266 nb_pixel_total : 12700 time to create 1 rle with old method : 0.013479471206665039 create new chi : 0.4111604690551758 time to delete rle : 0.0006043910980224609 batch 1 Loaded 17 chid ids of type : 3594 ++++++++++Number RLEs to save : 3622 TO DO : save crop sub photo not yet done ! save time : 0.27493834495544434 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.0399320125579834 time for calcul the mask position with numpy : 0.019243478775024414 nb_pixel_total : 2061321 time to create 1 rle with new method : 0.027725696563720703 time for calcul the mask position with numpy : 0.006005525588989258 nb_pixel_total : 12279 time to create 1 rle with old method : 0.013338088989257812 create new chi : 0.066558837890625 time to delete rle : 0.00024437904357910156 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1452 TO DO : save crop sub photo not yet done ! save time : 0.13393831253051758 nb_obj : 4 nb_hashtags : 2 time to prepare the origin masks : 0.31253552436828613 time for calcul the mask position with numpy : 0.08230924606323242 nb_pixel_total : 2008319 time to create 1 rle with new method : 0.2056128978729248 time for calcul the mask position with numpy : 0.006344795227050781 nb_pixel_total : 15864 time to create 1 rle with old method : 0.017728805541992188 time for calcul the mask position with numpy : 0.006300210952758789 nb_pixel_total : 23894 time to create 1 rle with old method : 0.025853872299194336 time for calcul the mask position with numpy : 0.006176471710205078 nb_pixel_total : 12654 time to create 1 rle with old method : 0.013969659805297852 time for calcul the mask position with numpy : 0.006110191345214844 nb_pixel_total : 12869 time to create 1 rle with old method : 0.014425039291381836 create new chi : 0.395754337310791 time to delete rle : 0.0004391670227050781 batch 1 Loaded 9 chid ids of type : 3594 ++++Number RLEs to save : 2492 TO DO : save crop sub photo not yet done ! save time : 0.18691635131835938 No data in photo_id : 1385457601 No data in photo_id : 1385457447 No data in photo_id : 1385456832 No data in photo_id : 1385456823 No data in photo_id : 1385456806 map_output_result : {1385461205: (0.0, 'Should be the crop_list due to order', 0), 1385461182: (0.0, 'Should be the crop_list due to order', 0), 1385460929: (0.0, 'Should be the crop_list due to order', 0), 1385460905: (0.0, 'Should be the crop_list due to order', 0), 1385460880: (0.0, 'Should be the crop_list due to order', 0), 1385460877: (0.0, 'Should be the crop_list due to order', 0), 1385460874: (0.0, 'Should be the crop_list due to order', 0), 1385460871: (0.0, 'Should be the crop_list due to order', 0), 1385460842: (0.0, 'Should be the crop_list due to order', 0), 1385460832: (0.0, 'Should be the crop_list due to order', 0), 1385460798: (0.0, 'Should be the crop_list due to order', 0), 1385460751: (0.0, 'Should be the crop_list due to order', 0), 1385460714: (0.0, 'Should be the crop_list due to order', 0), 1385460684: (0.0, 'Should be the crop_list due to order', 0), 1385459953: (0.0, 'Should be the crop_list due to order', 0), 1385459897: (0.0, 'Should be the crop_list due to order', 0), 1385457601: (0.0, 'Should be the crop_list due to order', 0.0), 1385457447: (0.0, 'Should be the crop_list due to order', 0.0), 1385456832: (0.0, 'Should be the crop_list due to order', 0.0), 1385456823: (0.0, 'Should be the crop_list due to order', 0.0), 1385456806: (0.0, 'Should be the crop_list due to order', 0.0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1385461205, 1385461182, 1385460929, 1385460905, 1385460880, 1385460877, 1385460874, 1385460871, 1385460842, 1385460832, 1385460798, 1385460751, 1385460714, 1385460684, 1385459953, 1385459897, 1385457601, 1385457447, 1385456832, 1385456823, 1385456806] Looping around the photos to save general results len do output : 21 /1385461205.Didn't retrieve data . /1385461182.Didn't retrieve data . /1385460929.Didn't retrieve data . /1385460905.Didn't retrieve data . /1385460880.Didn't retrieve data . /1385460877.Didn't retrieve data . /1385460874.Didn't retrieve data . /1385460871.Didn't retrieve data . /1385460842.Didn't retrieve data . /1385460832.Didn't retrieve data . /1385460798.Didn't retrieve data . /1385460751.Didn't retrieve data . /1385460714.Didn't retrieve data . /1385460684.Didn't retrieve data . /1385459953.Didn't retrieve data . /1385459897.Didn't retrieve data . /1385457601.Didn't retrieve data . /1385457447.Didn't retrieve data . /1385456832.Didn't retrieve data . /1385456823.Didn't retrieve data . /1385456806.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461205', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461182', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460929', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460905', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460880', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460877', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460874', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460871', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460842', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460798', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460751', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460714', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460684', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459953', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459897', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457601', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457447', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456823', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456806', None, None, None, None, None, '3759016') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 63 time used for this insertion : 0.016753435134887695 save_final save missing photos in datou_result : time spend for datou_step_exec : 12.562911748886108 time spend to save output : 0.01739025115966797 total time spend for step 3 : 12.580302000045776 step4:ventilate_hashtags_in_portfolio Mon Sep 22 16:21:51 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 27096240 get user id for portfolio 27096240 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27096240 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('mal_croppe','pet_fonce','carton','environnement','background','metal','pehd','flou','autre','papier','pet_clair')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27096240 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('mal_croppe','pet_fonce','carton','environnement','background','metal','pehd','flou','autre','papier','pet_clair')) AND mptpi.`min_score`=0.5 To do Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27096240 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('mal_croppe','pet_fonce','carton','environnement','background','metal','pehd','flou','autre','papier','pet_clair')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://marlene.fotonower.com/velours/27097015,27097016,27097017,27097018,27097019,27097020,27097021,27097022,27097023,27097024,27097025?tags=mal_croppe,pet_fonce,carton,environnement,background,metal,pehd,flou,autre,papier,pet_clair Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1385461205, 1385461182, 1385460929, 1385460905, 1385460880, 1385460877, 1385460874, 1385460871, 1385460842, 1385460832, 1385460798, 1385460751, 1385460714, 1385460684, 1385459953, 1385459897, 1385457601, 1385457447, 1385456832, 1385456823, 1385456806] Looping around the photos to save general results len do output : 1 /27096240. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461205', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461182', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460929', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460905', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460880', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460877', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460874', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460871', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460842', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460798', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460751', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460714', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460684', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459953', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459897', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457601', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457447', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456823', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456806', None, None, None, None, None, '3759016') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 22 time used for this insertion : 0.018950700759887695 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.6846084594726562 time spend to save output : 0.019276857376098633 total time spend for step 4 : 1.7038853168487549 step5:final Mon Sep 22 16:21:53 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1385461205: ('0.03187968474426807',), 1385461182: ('0.03187968474426807',), 1385460929: ('0.03187968474426807',), 1385460905: ('0.03187968474426807',), 1385460880: ('0.03187968474426807',), 1385460877: ('0.03187968474426807',), 1385460874: ('0.03187968474426807',), 1385460871: ('0.03187968474426807',), 1385460842: ('0.03187968474426807',), 1385460832: ('0.03187968474426807',), 1385460798: ('0.03187968474426807',), 1385460751: ('0.03187968474426807',), 1385460714: ('0.03187968474426807',), 1385460684: ('0.03187968474426807',), 1385459953: ('0.03187968474426807',), 1385459897: ('0.03187968474426807',), 1385457601: ('0.03187968474426807',), 1385457447: ('0.03187968474426807',), 1385456832: ('0.03187968474426807',), 1385456823: ('0.03187968474426807',), 1385456806: ('0.03187968474426807',)} new output for save of step final : {1385461205: ('0.03187968474426807',), 1385461182: ('0.03187968474426807',), 1385460929: ('0.03187968474426807',), 1385460905: ('0.03187968474426807',), 1385460880: ('0.03187968474426807',), 1385460877: ('0.03187968474426807',), 1385460874: ('0.03187968474426807',), 1385460871: ('0.03187968474426807',), 1385460842: ('0.03187968474426807',), 1385460832: ('0.03187968474426807',), 1385460798: ('0.03187968474426807',), 1385460751: ('0.03187968474426807',), 1385460714: ('0.03187968474426807',), 1385460684: ('0.03187968474426807',), 1385459953: ('0.03187968474426807',), 1385459897: ('0.03187968474426807',), 1385457601: ('0.03187968474426807',), 1385457447: ('0.03187968474426807',), 1385456832: ('0.03187968474426807',), 1385456823: ('0.03187968474426807',), 1385456806: ('0.03187968474426807',)} [1385461205, 1385461182, 1385460929, 1385460905, 1385460880, 1385460877, 1385460874, 1385460871, 1385460842, 1385460832, 1385460798, 1385460751, 1385460714, 1385460684, 1385459953, 1385459897, 1385457601, 1385457447, 1385456832, 1385456823, 1385456806] Looping around the photos to save general results len do output : 21 /1385461205.Didn't retrieve data . /1385461182.Didn't retrieve data . /1385460929.Didn't retrieve data . /1385460905.Didn't retrieve data . /1385460880.Didn't retrieve data . /1385460877.Didn't retrieve data . /1385460874.Didn't retrieve data . /1385460871.Didn't retrieve data . /1385460842.Didn't retrieve data . /1385460832.Didn't retrieve data . /1385460798.Didn't retrieve data . /1385460751.Didn't retrieve data . /1385460714.Didn't retrieve data . /1385460684.Didn't retrieve data . /1385459953.Didn't retrieve data . /1385459897.Didn't retrieve data . /1385457601.Didn't retrieve data . /1385457447.Didn't retrieve data . /1385456832.Didn't retrieve data . /1385456823.Didn't retrieve data . /1385456806.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461205', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461182', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460929', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460905', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460880', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460877', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460874', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460871', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460842', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460798', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460751', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460714', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460684', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459953', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459897', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457601', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457447', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456823', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456806', None, None, None, None, None, '3759016') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 63 time used for this insertion : 0.014910221099853516 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.12153506278991699 time spend to save output : 0.015814542770385742 total time spend for step 5 : 0.13734960556030273 step6:blur_detection Mon Sep 22 16:21:53 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd.jpg resize: (1080, 1920) 1385461205 0.8686759549066627 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871.jpg resize: (1080, 1920) 1385461182 1.2297226617061001 treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf.jpg resize: (1080, 1920) 1385460929 1.1664486821356526 treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f.jpg resize: (1080, 1920) 1385460905 0.8551138071186913 treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0.jpg resize: (1080, 1920) 1385460880 1.166993636855617 treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4.jpg resize: (1080, 1920) 1385460877 0.8318107206539449 treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523.jpg resize: (1080, 1920) 1385460874 1.3811672416603507 treat image : temp/1758550828_821580_1385460871_1dc2d5a57576ace131d6fbdca766c18e.jpg resize: (1080, 1920) 1385460871 0.42080014125445847 treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc.jpg resize: (1080, 1920) 1385460842 0.9460123569614307 treat image : temp/1758550828_821580_1385460832_5f77530a5fe763cc279fe53a94a7092e.jpg resize: (1080, 1920) 1385460832 0.7895035173069901 treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02.jpg resize: (1080, 1920) 1385460798 0.5981885616980995 treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4.jpg resize: (1080, 1920) 1385460751 1.7286484185826947 treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad.jpg resize: (1080, 1920) 1385460714 0.5766295150098999 treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30.jpg resize: (1080, 1920) 1385460684 2.1530087486270197 treat image : temp/1758550828_821580_1385459953_76f7b35eddfb5433004ce9066fe4502d.jpg resize: (1080, 1920) 1385459953 0.38462438429840506 treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd.jpg resize: (1080, 1920) 1385459897 0.6040977474067101 treat image : temp/1758550828_821580_1385457601_915169b4d33f9c3833437e465447feab.jpg resize: (1080, 1920) 1385457601 8.571043420457677 treat image : temp/1758550828_821580_1385457447_f63d6ec6ebea8d1f796e2e83a2ed7b5f.jpg resize: (1080, 1920) 1385457447 9.528516029004553 treat image : temp/1758550828_821580_1385456832_bb0bf9ecbe80bfdb74fc037d4c88ed7a.jpg resize: (1080, 1920) 1385456832 10.959932080428956 treat image : temp/1758550828_821580_1385456823_4ac947f97b8a36a6c849661b5ceed504.jpg resize: (1080, 1920) 1385456823 9.382960484496445 treat image : temp/1758550828_821580_1385456806_a05b031446a36888fe3864391c422394.jpg resize: (1080, 1920) 1385456806 10.37331237210326 treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159053_0.png resize: (70, 133) 1385483034 -0.22810221371918055 treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159055_0.png resize: (178, 104) 1385483035 -0.49360003301472843 treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159056_0.png resize: (143, 140) 1385483036 -2.0238556768086493 treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159058_0.png resize: (80, 129) 1385483037 -0.3604557638426755 treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159059_0.png resize: (274, 146) 1385483038 -0.5311548367771548 treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159061_0.png resize: (337, 641) 1385483039 -1.6631116461891042 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159062_0.png resize: (111, 130) 1385483040 -1.0399392792991524 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159063_0.png resize: (188, 113) 1385483041 -0.31936179318986513 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159064_0.png resize: (95, 119) 1385483042 -2.465953504487989 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159065_0.png resize: (60, 155) 1385483043 -0.5777665897825127 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159066_0.png resize: (121, 122) 1385483044 -2.5932685799305517 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159067_0.png resize: (111, 74) 1385483045 1.6071954328065619 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159068_0.png resize: (188, 146) 1385483046 -0.4126615239446055 treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159071_0.png resize: (175, 111) 1385483047 -0.37526039627767216 treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159072_0.png resize: (195, 62) 1385483048 -0.563525635532519 treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159073_0.png resize: (162, 57) 1385483049 -0.5523628618473189 treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159074_0.png resize: (70, 76) 1385483050 0.3935872987063442 treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159075_0.png resize: (137, 124) 1385483051 -0.9216068755112916 treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159077_0.png resize: (227, 318) 1385483052 -0.5419444741446399 treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159078_0.png resize: (99, 98) 1385483053 -0.5888469342309256 treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f_rle_crop_3970159079_0.png resize: (179, 114) 1385483054 -0.4079685713220403 treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f_rle_crop_3970159081_0.png resize: (97, 85) 1385483056 -1.3630676465597704 treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159083_0.png resize: (189, 116) 1385483057 -0.4413340011851218 treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159084_0.png resize: (248, 289) 1385483058 -0.0809486893723838 treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159085_0.png resize: (162, 214) 1385483059 -0.69241248186403 treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159087_0.png resize: (216, 350) 1385483060 -1.6952194096755653 treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159089_0.png resize: (181, 106) 1385483061 -0.3767209497070497 treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159091_0.png resize: (73, 117) 1385483062 -0.7105334848279751 treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159094_0.png resize: (183, 115) 1385483063 -0.5973297511624255 treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159095_0.png resize: (96, 65) 1385483064 -2.2998880462358473 treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159096_0.png resize: (140, 204) 1385483065 -1.6293884400698222 treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159097_0.png resize: (99, 93) 1385483066 0.3986697315458064 treat image : temp/1758550828_821580_1385460871_1dc2d5a57576ace131d6fbdca766c18e_rle_crop_3970159099_0.png resize: (101, 92) 1385483067 -1.8927849173971534 treat image : temp/1758550828_821580_1385460871_1dc2d5a57576ace131d6fbdca766c18e_rle_crop_3970159100_0.png resize: (177, 112) 1385483068 -0.43288419621507407 treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159104_0.png resize: (175, 108) 1385483069 -0.8421391552379348 treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159108_0.png resize: (104, 95) 1385483070 -0.4798550651705992 treat image : temp/1758550828_821580_1385460832_5f77530a5fe763cc279fe53a94a7092e_rle_crop_3970159109_0.png resize: (182, 114) 1385483071 -0.5966686890003393 treat image : temp/1758550828_821580_1385460832_5f77530a5fe763cc279fe53a94a7092e_rle_crop_3970159111_0.png resize: (59, 132) 1385483072 2.739214194102674 treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159112_0.png resize: (185, 112) 1385483073 -0.8942158264380632 treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159113_0.png resize: (109, 54) 1385483075 -1.100791535293258 treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159116_0.png resize: (71, 58) 1385483076 3.4166132113231424 treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159118_0.png resize: (177, 110) 1385483077 -0.3463098869093531 treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159121_0.png resize: (83, 141) 1385483078 -0.7693553308641142 treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad_rle_crop_3970159124_0.png resize: (179, 110) 1385483079 -0.4814274062579164 treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad_rle_crop_3970159125_0.png resize: (152, 108) 1385483081 -1.6310571973892658 treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159128_0.png resize: (186, 113) 1385483082 -0.46291280905614157 treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159132_0.png resize: (75, 121) 1385483083 0.778843842370635 treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159133_0.png resize: (60, 67) 1385483084 0.5995382118126081 treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159134_0.png resize: (105, 118) 1385483085 -1.6731041906396695 treat image : temp/1758550828_821580_1385459953_76f7b35eddfb5433004ce9066fe4502d_rle_crop_3970159136_0.png resize: (185, 115) 1385483086 -0.8416121790148458 treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd_rle_crop_3970159137_0.png resize: (186, 116) 1385483087 -0.39110450784515866 treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd_rle_crop_3970159138_0.png resize: (163, 177) 1385483088 -0.5288806909555647 treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159054_0.png resize: (213, 106) 1385483106 -1.4046481786848335 treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f_rle_crop_3970159080_0.png resize: (165, 96) 1385483107 1.2419955541272845 treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159119_0.png resize: (114, 81) 1385483108 0.3304498157486373 treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad_rle_crop_3970159127_0.png resize: (113, 71) 1385483110 0.20907901951103072 treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159057_0.png resize: (289, 137) 1385483116 -0.3831031446907154 treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159060_0.png resize: (113, 135) 1385483117 -1.6683236620754562 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159069_0.png resize: (128, 101) 1385483118 0.9105971856378192 treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159070_0.png resize: (156, 214) 1385483119 0.5318927010410888 treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159076_0.png resize: (138, 156) 1385483120 0.3504995587308338 treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f_rle_crop_3970159082_0.png resize: (209, 273) 1385483121 -2.366534783643945 treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159086_0.png resize: (92, 136) 1385483122 1.1641410984374065 treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159088_0.png resize: (51, 91) 1385483123 5.891589995672206 treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159090_0.png resize: (138, 272) 1385483124 -1.236172797302714 treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159092_0.png resize: (120, 107) 1385483125 0.15276716287807068 treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159093_0.png resize: (154, 324) 1385483126 0.37290259218832533 treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159098_0.png resize: (118, 62) 1385483127 0.791937434003645 treat image : temp/1758550828_821580_1385460871_1dc2d5a57576ace131d6fbdca766c18e_rle_crop_3970159101_0.png resize: (223, 214) 1385483128 -1.2116129399102031 treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159102_0.png resize: (349, 341) 1385483129 -0.6936717127598606 treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159103_0.png resize: (221, 167) 1385483130 -0.7784861228201739 treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159106_0.png resize: (163, 169) 1385483131 -1.9633275011541673 treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159107_0.png resize: (81, 126) 1385483132 -1.4151624930014328 treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159114_0.png resize: (108, 90) 1385483133 -0.19720954309243433 treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159115_0.png resize: (115, 53) 1385483134 -1.752732156657042 treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159117_0.png resize: (149, 226) 1385483135 -2.4915444069306383 treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159120_0.png resize: (185, 198) 1385483136 -0.8116043552348152 treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159123_0.png resize: (509, 320) 1385483137 0.19408889721449615 treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad_rle_crop_3970159126_0.png resize: (145, 87) 1385483138 -0.19951136325851876 treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159129_0.png resize: (155, 120) 1385483139 -2.011900435099721 treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159130_0.png resize: (233, 113) 1385483140 -0.3619025830504706 treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159131_0.png resize: (67, 266) 1385483141 1.2301930922920392 treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159135_0.png resize: (322, 246) 1385483142 -1.388913201971542 treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd_rle_crop_3970159139_0.png resize: (262, 128) 1385483143 -1.1735560590129301 treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd_rle_crop_3970159140_0.png resize: (92, 220) 1385483144 -0.5560920280073456 treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159105_0.png resize: (87, 130) 1385483146 -0.7560736799866461 treat image : temp/1758550828_821580_1385460832_5f77530a5fe763cc279fe53a94a7092e_rle_crop_3970159110_0.png resize: (157, 70) 1385483148 1.530768877116349 treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159122_0.png resize: (99, 102) 1385483149 -0.30433619050970806 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 109 time used for this insertion : 0.02054762840270996 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 109 time used for this insertion : 0.03626704216003418 save missing photos in datou_result : time spend for datou_step_exec : 16.935571908950806 time spend to save output : 0.06254005432128906 total time spend for step 6 : 16.998111963272095 step7:brightness Mon Sep 22 16:22:10 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd.jpg treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871.jpg treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf.jpg treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f.jpg treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0.jpg treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4.jpg treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523.jpg treat image : temp/1758550828_821580_1385460871_1dc2d5a57576ace131d6fbdca766c18e.jpg treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc.jpg treat image : temp/1758550828_821580_1385460832_5f77530a5fe763cc279fe53a94a7092e.jpg treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02.jpg treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4.jpg treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad.jpg treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30.jpg treat image : temp/1758550828_821580_1385459953_76f7b35eddfb5433004ce9066fe4502d.jpg treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd.jpg treat image : temp/1758550828_821580_1385457601_915169b4d33f9c3833437e465447feab.jpg treat image : temp/1758550828_821580_1385457447_f63d6ec6ebea8d1f796e2e83a2ed7b5f.jpg treat image : temp/1758550828_821580_1385456832_bb0bf9ecbe80bfdb74fc037d4c88ed7a.jpg treat image : temp/1758550828_821580_1385456823_4ac947f97b8a36a6c849661b5ceed504.jpg treat image : temp/1758550828_821580_1385456806_a05b031446a36888fe3864391c422394.jpg treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159053_0.png treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159055_0.png treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159056_0.png treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159058_0.png treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159059_0.png treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159061_0.png treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159062_0.png treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159063_0.png treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159064_0.png treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159065_0.png treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159066_0.png treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159067_0.png treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159068_0.png treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159071_0.png treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159072_0.png treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159073_0.png treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159074_0.png treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159075_0.png treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159077_0.png treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159078_0.png treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f_rle_crop_3970159079_0.png treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f_rle_crop_3970159081_0.png treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159083_0.png treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159084_0.png treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159085_0.png treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159087_0.png treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159089_0.png treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159091_0.png treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159094_0.png treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159095_0.png treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159096_0.png treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159097_0.png treat image : temp/1758550828_821580_1385460871_1dc2d5a57576ace131d6fbdca766c18e_rle_crop_3970159099_0.png treat image : temp/1758550828_821580_1385460871_1dc2d5a57576ace131d6fbdca766c18e_rle_crop_3970159100_0.png treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159104_0.png treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159108_0.png treat image : temp/1758550828_821580_1385460832_5f77530a5fe763cc279fe53a94a7092e_rle_crop_3970159109_0.png treat image : temp/1758550828_821580_1385460832_5f77530a5fe763cc279fe53a94a7092e_rle_crop_3970159111_0.png treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159112_0.png treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159113_0.png treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159116_0.png treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159118_0.png treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159121_0.png treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad_rle_crop_3970159124_0.png treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad_rle_crop_3970159125_0.png treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159128_0.png treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159132_0.png treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159133_0.png treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159134_0.png treat image : temp/1758550828_821580_1385459953_76f7b35eddfb5433004ce9066fe4502d_rle_crop_3970159136_0.png treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd_rle_crop_3970159137_0.png treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd_rle_crop_3970159138_0.png treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159054_0.png treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f_rle_crop_3970159080_0.png treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159119_0.png treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad_rle_crop_3970159127_0.png treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159057_0.png treat image : temp/1758550828_821580_1385461205_d6bf341b5e92f485bf7beaa1c22c76cd_rle_crop_3970159060_0.png treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159069_0.png treat image : temp/1758550828_821580_1385461182_bf5effe952e403cb304d30c8c4334871_rle_crop_3970159070_0.png treat image : temp/1758550828_821580_1385460929_374e87da68a08f4620fcad57c30a8abf_rle_crop_3970159076_0.png treat image : temp/1758550828_821580_1385460905_fb34b8bab50e23c8e947c498d606b95f_rle_crop_3970159082_0.png treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159086_0.png treat image : temp/1758550828_821580_1385460880_d9565d825eb0d243786d30d863121df0_rle_crop_3970159088_0.png treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159090_0.png treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159092_0.png treat image : temp/1758550828_821580_1385460877_bd3eb1ee000e2fc85ece320c5d8fadb4_rle_crop_3970159093_0.png treat image : temp/1758550828_821580_1385460874_b013824bb01715c305ac38cf32fa2523_rle_crop_3970159098_0.png treat image : temp/1758550828_821580_1385460871_1dc2d5a57576ace131d6fbdca766c18e_rle_crop_3970159101_0.png treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159102_0.png treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159103_0.png treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159106_0.png treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159107_0.png treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159114_0.png treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159115_0.png treat image : temp/1758550828_821580_1385460798_36f0f71cd8ba8b887662a8eeff105f02_rle_crop_3970159117_0.png treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159120_0.png treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159123_0.png treat image : temp/1758550828_821580_1385460714_8ec5d7ae5c4e93d20b0897495bb0baad_rle_crop_3970159126_0.png treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159129_0.png treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159130_0.png treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159131_0.png treat image : temp/1758550828_821580_1385460684_ce634c04b4570888555265dee1775f30_rle_crop_3970159135_0.png treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd_rle_crop_3970159139_0.png treat image : temp/1758550828_821580_1385459897_421d63444964576a69ce4038f04627bd_rle_crop_3970159140_0.png treat image : temp/1758550828_821580_1385460842_47f34c6948ff4026314289da4b1495dc_rle_crop_3970159105_0.png treat image : temp/1758550828_821580_1385460832_5f77530a5fe763cc279fe53a94a7092e_rle_crop_3970159110_0.png treat image : temp/1758550828_821580_1385460751_2e6777d406c861368aea7b704eec56b4_rle_crop_3970159122_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 109 time used for this insertion : 0.02186298370361328 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 109 time used for this insertion : 0.03586578369140625 save missing photos in datou_result : time spend for datou_step_exec : 4.796584367752075 time spend to save output : 0.06219077110290527 total time spend for step 7 : 4.8587751388549805 step8:velours_tree Mon Sep 22 16:22:15 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.09797477722167969 time spend to save output : 3.266334533691406e-05 total time spend for step 8 : 0.0980074405670166 step9:send_mail_cod Mon Sep 22 16:22:15 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P27096240_22-09-2025_16_22_15.pdf 27097015 imagette270970151758550935 27097016 change filename to text .imagette270970161758550935 27097017 change filename to text .change filename to text .change filename to text .change filename to text .imagette270970171758550935 27097019 imagette270970191758550936 27097020 imagette270970201758550936 27097021 change filename to text .imagette270970211758550936 27097022 imagette270970221758550936 27097023 change filename to text .imagette270970231758550936 27097024 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette270970241758550936 27097025 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette270970251758550937 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=27096240 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://marlene.fotonower.com/velours/27097015,27097016,27097017,27097018,27097019,27097020,27097021,27097022,27097023,27097024,27097025?tags=mal_croppe,pet_fonce,carton,environnement,background,metal,pehd,flou,autre,papier,pet_clair args[1385461205] : ((1385461205, 0.8686759549066627, 492688767), (1385461205, 0.6106441239181805, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385461182] : ((1385461182, 1.2297226617061001, 492688767), (1385461182, 0.19521532381193316, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460929] : ((1385460929, 1.1664486821356526, 492688767), (1385460929, 0.7526602984243109, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460905] : ((1385460905, 0.8551138071186913, 492688767), (1385460905, 0.4789644297022587, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460880] : ((1385460880, 1.166993636855617, 492688767), (1385460880, 0.6121287075203247, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460877] : ((1385460877, 0.8318107206539449, 492688767), (1385460877, 0.3968981053194256, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460874] : ((1385460874, 1.3811672416603507, 492688767), (1385460874, 0.5800593815860611, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460871] : ((1385460871, 0.42080014125445847, 492688767), (1385460871, 0.6487584667957952, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460842] : ((1385460842, 0.9460123569614307, 492688767), (1385460842, 0.5803175533351865, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460832] : ((1385460832, 0.7895035173069901, 492688767), (1385460832, 0.285038584387986, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460798] : ((1385460798, 0.5981885616980995, 492688767), (1385460798, 0.22587877190068897, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460751] : ((1385460751, 1.7286484185826947, 492688767), (1385460751, 0.3785114072842483, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460714] : ((1385460714, 0.5766295150098999, 492688767), (1385460714, 0.713061555281255, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385460684] : ((1385460684, 2.1530087486270197, 492688767), (1385460684, 0.5765569114520674, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385459953] : ((1385459953, 0.38462438429840506, 492688767), (1385459953, 0.6592706974067816, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385459897] : ((1385459897, 0.6040977474067101, 492688767), (1385459897, 0.5749117703981497, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385457601] : ((1385457601, 8.571043420457677, 492688767), (1385457601, 0.4229130899351689, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385457447] : ((1385457447, 9.528516029004553, 492688767), (1385457447, 0.4738682577872463, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385456832] : ((1385456832, 10.959932080428956, 492688767), (1385456832, 0.28160004530265287, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385456823] : ((1385456823, 9.382960484496445, 492688767), (1385456823, 0.37831254991434404, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com args[1385456806] : ((1385456806, 10.37331237210326, 492688767), (1385456806, 0.41644381507567546, 2107752395), '0.03187968474426807') We are sending mail with results at report@fotonower.com refus_total : 0.03187968474426807 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=27096240 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27096240_22-09-2025_16_22_15.pdf results_Auto_P27096240_22-09-2025_16_22_15.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27096240_22-09-2025_16_22_15.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','27096240','results_Auto_P27096240_22-09-2025_16_22_15.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27096240_22-09-2025_16_22_15.pdf','pdf','','0.27','0.03187968474426807') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/27096240

https://www.fotonower.com/image?json=false&list_photos_id=1385461205
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385461182
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.2297226617061001)
https://www.fotonower.com/image?json=false&list_photos_id=1385460929
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.1664486821356526)
https://www.fotonower.com/image?json=false&list_photos_id=1385460905
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385460880
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.166993636855617)
https://www.fotonower.com/image?json=false&list_photos_id=1385460877
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385460874
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.3811672416603507)
https://www.fotonower.com/image?json=false&list_photos_id=1385460871
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385460842
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385460832
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385460798
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385460751
La photo est trop floue, merci de reprendre une photo.(avec le score = 1.7286484185826947)
https://www.fotonower.com/image?json=false&list_photos_id=1385460714
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385460684
La photo est trop floue, merci de reprendre une photo.(avec le score = 2.1530087486270197)
https://www.fotonower.com/image?json=false&list_photos_id=1385459953
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385459897
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1385457601
La photo est trop floue, merci de reprendre une photo.(avec le score = 8.571043420457677)
https://www.fotonower.com/image?json=false&list_photos_id=1385457447
La photo est trop floue, merci de reprendre une photo.(avec le score = 9.528516029004553)
https://www.fotonower.com/image?json=false&list_photos_id=1385456832
La photo est trop floue, merci de reprendre une photo.(avec le score = 10.959932080428956)
https://www.fotonower.com/image?json=false&list_photos_id=1385456823
La photo est trop floue, merci de reprendre une photo.(avec le score = 9.382960484496445)
https://www.fotonower.com/image?json=false&list_photos_id=1385456806
La photo est trop floue, merci de reprendre une photo.(avec le score = 10.37331237210326)

Dans ces conditions,le taux de refus est: 3.19%
Veuillez trouver les photos des contaminants.

exemples de contaminants: pet_fonce: https://www.fotonower.com/view/27097016?limit=200
exemples de contaminants: carton: https://www.fotonower.com/view/27097017?limit=200
exemples de contaminants: pehd: https://www.fotonower.com/view/27097021?limit=200
exemples de contaminants: autre: https://www.fotonower.com/view/27097023?limit=200
exemples de contaminants: papier: https://www.fotonower.com/view/27097024?limit=200
exemples de contaminants: pet_clair: https://www.fotonower.com/view/27097025?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27096240_22-09-2025_16_22_15.pdf.

Lien vers velours :https://marlene.fotonower.com/velours/27097015,27097016,27097017,27097018,27097019,27097020,27097021,27097022,27097023,27097024,27097025?tags=mal_croppe,pet_fonce,carton,environnement,background,metal,pehd,flou,autre,papier,pet_clair.


L'équipe Fotonower 202 b'' Server: nginx Date: Mon, 22 Sep 2025 14:22:21 GMT Content-Length: 0 Connection: close X-Message-Id: 6-SuGNiyRe6oyEH1gHinUw Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1385461205, 1385461182, 1385460929, 1385460905, 1385460880, 1385460877, 1385460874, 1385460871, 1385460842, 1385460832, 1385460798, 1385460751, 1385460714, 1385460684, 1385459953, 1385459897, 1385457601, 1385457447, 1385456832, 1385456823, 1385456806] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461205', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461182', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460929', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460905', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460880', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460877', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460874', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460871', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460842', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460798', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460751', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460714', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460684', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459953', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459897', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457601', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457447', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456823', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456806', None, None, None, None, None, '3759016') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 21 time used for this insertion : 0.017276287078857422 save_final save missing photos in datou_result : time spend for datou_step_exec : 6.174682855606079 time spend to save output : 0.017556428909301758 total time spend for step 9 : 6.192239284515381 step10:split_time_score Mon Sep 22 16:22:21 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('12', 3), ('13', 18)) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 22092025 27096240 Nombre de photos uploadées : 21 / 23040 (0%) 22092025 27096240 Nombre de photos taguées (types de déchets): 0 / 21 (0%) 22092025 27096240 Nombre de photos taguées (volume) : 0 / 21 (0%) elapsed_time : load_data_split_time_score 2.1457672119140625e-06 elapsed_time : order_list_meta_photo_and_scores 7.62939453125e-06 ????????????????????? elapsed_time : fill_and_build_computed_from_old_data 0.0010085105895996094 Catched exception ! Connect or reconnect ! Catched exception ! Connect or reconnect ! elapsed_time : insert_dashboard_record_day_entry 0.21597766876220703 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.05209852430555556 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27081911_22-09-2025_09_51_42.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27081911 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27081911 AND mptpi.`type`=3594 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27086455 order by id desc limit 1 Qualite : 0.11064838927469141 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27086457_22-09-2025_10_51_50.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27086457 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27086457 AND mptpi.`type`=3594 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27096228 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27096232 order by id desc limit 1 Qualite : 0.03187968474426807 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27096240_22-09-2025_16_22_15.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27096240 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27096240 AND mptpi.`type`=3594 To do Qualite : 0.06591898999183013 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27096242_22-09-2025_16_13_39.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27096242 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27096242 AND mptpi.`type`=3594 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'22092025': {'nb_upload': 21, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1385461205, 1385461182, 1385460929, 1385460905, 1385460880, 1385460877, 1385460874, 1385460871, 1385460842, 1385460832, 1385460798, 1385460751, 1385460714, 1385460684, 1385459953, 1385459897, 1385457601, 1385457447, 1385456832, 1385456823, 1385456806] Looping around the photos to save general results len do output : 1 /27096240Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461205', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385461182', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460929', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460905', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460880', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460877', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460874', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460871', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460842', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460798', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460751', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460714', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385460684', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459953', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385459897', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457601', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385457447', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456832', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456823', None, None, None, None, None, '3759016') ('3318', None, None, None, None, None, None, None, '3759016') ('3318', '27096240', '1385456806', None, None, None, None, None, '3759016') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 22 time used for this insertion : 0.014348745346069336 save_final save missing photos in datou_result : time spend for datou_step_exec : 3.516004800796509 time spend to save output : 0.014627695083618164 total time spend for step 10 : 3.530632495880127 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 21 set_done_treatment 57.86user 28.57system 2:00.68elapsed 71%CPU (0avgtext+0avgdata 3173364maxresident)k 461352inputs+22352outputs (4018major+2146409minor)pagefaults 0swaps