python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 3318 ' -s datou_3318 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 2125288 load datou : 3318 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? [(photo_id, hashtag_id, hashtag_type, x0, x1, y0, y1, score, seg_temp, polygons), ...] was removed should we ? chemin de la photo was removed should we ? [ (photo_id_loc, hashtag_id, hashtag_type, x0, x1, y0, y1, score, None), ...] was removed should we ? chemin de la photo was removed should we ? id de la photo (peut être local ou global) was removed should we ? chemin de la photo was removed should we ? (x0, y0, x1, y1) was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? None was removed should we ? donnée sous forme de nombre was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) load thcls load THCL from format json or kwargs add thcl : 2847 in CacheModelConfig load pdts add pdt : 5275 in CacheModelConfig Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 3318, datou_cur_ids : ['3824861'] with mtr_portfolio_ids : ['27463485'] and first list_photo_ids : [] new path : /proc/2125288/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : mask_detect, crop_condition, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, velours_tree, send_mail_cod, split_time_score over limit max, limiting to limit_max 40 list_input_json : [] origin We have 1 , BFBFBFBFBFBFBFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 11 ; length of list_pids : 11 ; length of list_args : 11 time to download the photos : 1.8313140869140625 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 10 step1:mask_detect Fri Oct 3 02:00:33 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 5287 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-10-03 02:00:36.502509: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-10-03 02:00:36.532771: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3492910000 Hz 2025-10-03 02:00:36.534530: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f336c000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-10-03 02:00:36.534566: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-10-03 02:00:36.537473: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-10-03 02:00:36.693684: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x23129020 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices: 2025-10-03 02:00:36.693752: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-10-03 02:00:36.695043: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-03 02:00:36.695509: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-03 02:00:36.698898: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-03 02:00:36.702204: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-03 02:00:36.702764: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-03 02:00:36.706208: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-03 02:00:36.707977: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-03 02:00:36.713511: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-03 02:00:36.714796: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-03 02:00:36.714895: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-03 02:00:36.715541: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-10-03 02:00:36.715558: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-10-03 02:00:36.715568: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-10-03 02:00:36.716675: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 4827 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-10-03 02:00:37.135455: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-03 02:00:37.135537: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-03 02:00:37.135557: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-03 02:00:37.135576: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-03 02:00:37.135594: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-03 02:00:37.135612: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-03 02:00:37.135629: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-03 02:00:37.135648: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-03 02:00:37.136817: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-03 02:00:37.137970: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-10-03 02:00:37.138006: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-10-03 02:00:37.138024: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-03 02:00:37.138042: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-10-03 02:00:37.138059: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-10-03 02:00:37.138076: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-10-03 02:00:37.138093: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-10-03 02:00:37.138111: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-03 02:00:37.139280: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-10-03 02:00:37.139312: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-10-03 02:00:37.139323: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-10-03 02:00:37.139332: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-10-03 02:00:37.140563: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 4827 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl2847 thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (5275, 'learn_RUBBIA_REFUS_AMIENS_23', 16384, 25088, 'learn_RUBBIA_REFUS_AMIENS_23', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2021, 4, 23, 14, 19, 39), datetime.datetime(2021, 4, 23, 14, 19, 39)) {'thcl': {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'], 'list_hashtags_csv': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'svm_hashtag_type_desc': 5275, 'photo_desc_type': 5275, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME learn_RUBBIA_REFUS_AMIENS_23 NUM_CLASSES 9 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES (16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : learn_RUBBIA_REFUS_AMIENS_23 model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-10-03 02:00:44.816336: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-10-03 02:00:45.016062: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-10-03 02:00:46.397277: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 4.00G (4294967296 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory local folder : /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23 /data/models_weight/learn_RUBBIA_REFUS_AMIENS_23/mask_model.h5 size_local : 256009536 size in s3 : 256009536 create time local : 2021-08-09 09:43:22 create time in s3 : 2021-08-06 18:54:04 mask_model.h5 already exist and didn't need to update list_images length : 11 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 31.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 30.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 35.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 32.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 29.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 7 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 22.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 2 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 36.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 6 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 34.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 NEW PHOTO Processing 1 images image shape: (1080, 1920, 3) min: 38.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 17) min: 0.00000 max: 1920.00000 nb d'objets trouves : 3 Detection mask done ! Trying to reset tf kernel 2129092 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 87 tf kernel not reseted sub process len(results) : 11 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 11 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 4968 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl2847 Catched exception ! Connect or reconnect ! thcls : [{'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'}] thcl {'id': 2847, 'mtr_user_id': 31, 'name': 'learn_RUBBIA_REFUS_AMIENS_23', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,papier,carton,metal,pet_clair,autre,pehd,pet_fonce,environnement', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 3594, 'photo_desc_type': 5275, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 5275 ['background', 'papier', 'carton', 'metal', 'pet_clair', 'autre', 'pehd', 'pet_fonce', 'environnement'] time for calcul the mask position with numpy : 0.0003662109375 nb_pixel_total : 11897 time to create 1 rle with old method : 0.013718605041503906 length of segment : 173 time for calcul the mask position with numpy : 0.0003504753112792969 nb_pixel_total : 18650 time to create 1 rle with old method : 0.021603107452392578 length of segment : 106 time for calcul the mask position with numpy : 0.0020537376403808594 nb_pixel_total : 111911 time to create 1 rle with old method : 0.12665939331054688 length of segment : 548 time for calcul the mask position with numpy : 0.15678071975708008 nb_pixel_total : 1006062 time to create 1 rle with new method : 0.06319880485534668 length of segment : 1715 time for calcul the mask position with numpy : 0.00013446807861328125 nb_pixel_total : 5488 time to create 1 rle with old method : 0.006131410598754883 length of segment : 62 time for calcul the mask position with numpy : 0.00010657310485839844 nb_pixel_total : 5581 time to create 1 rle with old method : 0.006208896636962891 length of segment : 67 time for calcul the mask position with numpy : 0.0009813308715820312 nb_pixel_total : 75545 time to create 1 rle with old method : 0.079345703125 length of segment : 338 time for calcul the mask position with numpy : 0.00017571449279785156 nb_pixel_total : 7119 time to create 1 rle with old method : 0.007681846618652344 length of segment : 120 time for calcul the mask position with numpy : 8.177757263183594e-05 nb_pixel_total : 2895 time to create 1 rle with old method : 0.003182649612426758 length of segment : 69 time for calcul the mask position with numpy : 0.0014255046844482422 nb_pixel_total : 86388 time to create 1 rle with old method : 0.09208059310913086 length of segment : 498 time for calcul the mask position with numpy : 0.002831697463989258 nb_pixel_total : 102949 time to create 1 rle with old method : 0.1057884693145752 length of segment : 570 time for calcul the mask position with numpy : 0.0011317729949951172 nb_pixel_total : 35791 time to create 1 rle with old method : 0.0370182991027832 length of segment : 264 time for calcul the mask position with numpy : 0.0001609325408935547 nb_pixel_total : 6396 time to create 1 rle with old method : 0.007030963897705078 length of segment : 85 time for calcul the mask position with numpy : 0.0018362998962402344 nb_pixel_total : 103773 time to create 1 rle with old method : 0.10722756385803223 length of segment : 525 time for calcul the mask position with numpy : 0.0005068778991699219 nb_pixel_total : 15008 time to create 1 rle with old method : 0.016532421112060547 length of segment : 212 time for calcul the mask position with numpy : 0.0006973743438720703 nb_pixel_total : 25780 time to create 1 rle with old method : 0.028624534606933594 length of segment : 189 time spent for convertir_results : 2.236013650894165 Inside saveOutput : final : False verbose : 0 eke 12-6-18 : saveMask need to be cleaned for new output ! Number saved : None batch 1 Loaded 16 chid ids of type : 3594 Number RLEs to save : 5541 save missing photos in datou_result : time spend for datou_step_exec : 24.450759649276733 time spend to save output : 0.8407902717590332 total time spend for step 1 : 25.291549921035767 step2:crop_condition Fri Oct 3 02:00:59 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure Loading chi in step crop with photo_hashtag_type : 3594 Loading chi in step crop for list_pids : 11 ! batch 1 Loaded 16 chid ids of type : 3594 +++++++++++++++++++++ begin to crop the class : papier param for this class : {'min_score': 0.7} filtre for class : papier hashtag_id of this class : 492668766 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 3 About to insert : list_path_to_insert length 3 new photo from crops ! About to upload 3 photos upload in portfolio : 3736932 init cache_photo without model_param we have 3 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1759449659_2125288 INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449660), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976356_0.png', 0, 111, 62, 0, 1759449660,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449660), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb_rle_crop_3984976359_0.png', 0, 101, 120, 0, 1759449660,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449660), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334_rle_crop_3984976364_0.png', 0, 104, 85, 0, 1759449660,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 3 photos in the portfolio 3736932 time of upload the photos Elapsed time : 1.7025587558746338 we have finished the crop for the class : papier begin to crop the class : carton param for this class : {'min_score': 0.7} filtre for class : carton hashtag_id of this class : 492774966 begin to crop the class : metal param for this class : {'min_score': 0.7} filtre for class : metal hashtag_id of this class : 492628673 begin to crop the class : pet_clair param for this class : {'min_score': 0.7} filtre for class : pet_clair hashtag_id of this class : 2107755846 we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 11 About to insert : list_path_to_insert length 11 new photo from crops ! About to upload 11 photos upload in portfolio : 3736932 init cache_photo without model_param we have 11 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1759449668_2125288 INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403527_b93e8f674f5a48fb4eecf065bea21856_rle_crop_3984976352_0.png', 0, 97, 173, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976355_0.png', 0, 1236, 988, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976353_0.png', 0, 208, 103, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976354_0.png', 0, 357, 533, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403493_37b6f0e8bbd93a6779a0edf2ce69e680_rle_crop_3984976358_0.png', 0, 345, 305, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb_rle_crop_3984976361_0.png', 0, 333, 481, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334_rle_crop_3984976362_0.png', 0, 323, 563, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334_rle_crop_3984976363_0.png', 0, 238, 226, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403484_f54788ba910973b5a83c1487ae8d4621_rle_crop_3984976365_0.png', 0, 343, 518, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403481_5aaef4ebf5c1f59e786c76551de6ad4b_rle_crop_3984976367_0.png', 0, 208, 174, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449670), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403481_5aaef4ebf5c1f59e786c76551de6ad4b_rle_crop_3984976366_0.png', 0, 127, 149, 0, 1759449670,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 11 photos in the portfolio 3736932 time of upload the photos Elapsed time : 4.904461860656738 we have finished the crop for the class : pet_clair begin to crop the class : autre param for this class : {'min_score': 0.7} filtre for class : autre hashtag_id of this class : 494826614 we have both polygon and rles Next one ! we have both polygon and rles Next one ! map_result returned by crop_photo_return_map_crop : length : 2 About to insert : list_path_to_insert length 2 new photo from crops ! About to upload 2 photos upload in portfolio : 3736932 init cache_photo without model_param we have 2 photo to upload uploaded to storage server : ovh folder_temporaire : temp/1759449673_2125288 INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449673), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976357_0.png', 0, 105, 67, 0, 1759449673,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! INSERT INTO MTRBack.photos (`timeStamp`, `latitude`, `longitude`, `right_categories`, `tags`, `speed`, `size`, `text`, `altitude`, `width`, `height`, `score`, `created_at`,`source_id`,`place_id`) VALUES (FROM_UNIXTIME(1759449673), 0.0, 0.0, 14, '', 0, 0, '1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb_rle_crop_3984976360_0.png', 0, 55, 69, 0, 1759449673,'0',0) batch_size : 0, verbose : False, strat_bulk_insert : ignore_different_from_first This is a hack ! we have uploaded 2 photos in the portfolio 3736932 time of upload the photos Elapsed time : 0.9844458103179932 we have finished the crop for the class : autre begin to crop the class : pehd param for this class : {'min_score': 0.7} filtre for class : pehd hashtag_id of this class : 628944319 begin to crop the class : pet_fonce param for this class : {'min_score': 0.7} filtre for class : pet_fonce hashtag_id of this class : 2107755900 delete rles from all chi we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles we have 0 chi objets contains the rles Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : crop_condition we use saveGeneral [1387403527, 1387403526, 1387403524, 1387403522, 1387403493, 1387403490, 1387403487, 1387403484, 1387403481, 1387403479, 1387403206] Looping around the photos to save general results len do output : 16 /1387523296Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523297Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523298Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523310Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523311Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523312Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523313Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523314Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523315Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523316Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523317Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523318Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523320Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523321Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523324Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . /1387523325Didn't retrieve data .Didn't retrieve data .Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403527', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403526', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403524', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403522', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403493', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403490', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403487', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403484', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403481', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403479', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403206', None, None, None, None, None, '3824861') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 59 time used for this insertion : 0.03761458396911621 save_final save missing photos in datou_result : time spend for datou_step_exec : 14.896756410598755 time spend to save output : 0.03847193717956543 total time spend for step 2 : 14.93522834777832 step3:rle_unique_nms_with_priority Fri Oct 3 02:01:14 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 16 chid ids of type : 3594 +++++++++++++++++++++nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.16069698333740234 time for calcul the mask position with numpy : 0.2658202648162842 nb_pixel_total : 2061703 time to create 1 rle with new method : 0.08101296424865723 time for calcul the mask position with numpy : 0.0062694549560546875 nb_pixel_total : 11897 time to create 1 rle with old method : 0.012823820114135742 create new chi : 0.3768908977508545 time to delete rle : 0.11618256568908691 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1426 TO DO : save crop sub photo not yet done ! save time : 0.289700984954834 No data in photo_id : 1387403526 No data in photo_id : 1387403524 nb_obj : 5 nb_hashtags : 3 time to prepare the origin masks : 0.6808195114135742 time for calcul the mask position with numpy : 0.03542685508728027 nb_pixel_total : 931253 time to create 1 rle with new method : 0.30818963050842285 time for calcul the mask position with numpy : 0.01001882553100586 nb_pixel_total : 236 time to create 1 rle with old method : 0.00032520294189453125 time for calcul the mask position with numpy : 0.006021976470947266 nb_pixel_total : 5488 time to create 1 rle with old method : 0.005951881408691406 time for calcul the mask position with numpy : 0.015146493911743164 nb_pixel_total : 1006062 time to create 1 rle with new method : 0.19028162956237793 time for calcul the mask position with numpy : 0.007537126541137695 nb_pixel_total : 111911 time to create 1 rle with old method : 0.12320351600646973 time for calcul the mask position with numpy : 0.006737470626831055 nb_pixel_total : 18650 time to create 1 rle with old method : 0.020624637603759766 create new chi : 0.7456479072570801 time to delete rle : 0.0006597042083740234 batch 1 Loaded 11 chid ids of type : 3594 ++++++Number RLEs to save : 5976 TO DO : save crop sub photo not yet done ! save time : 0.8012247085571289 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.0578005313873291 time for calcul the mask position with numpy : 0.026299238204956055 nb_pixel_total : 1998055 time to create 1 rle with new method : 0.03404641151428223 time for calcul the mask position with numpy : 0.006064414978027344 nb_pixel_total : 75545 time to create 1 rle with old method : 0.07914423942565918 create new chi : 0.14582180976867676 time to delete rle : 0.0002465248107910156 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 1756 TO DO : save crop sub photo not yet done ! save time : 0.31343722343444824 nb_obj : 3 nb_hashtags : 3 time to prepare the origin masks : 0.04639935493469238 time for calcul the mask position with numpy : 0.03824257850646973 nb_pixel_total : 1977198 time to create 1 rle with new method : 0.25047922134399414 time for calcul the mask position with numpy : 0.0077250003814697266 nb_pixel_total : 86388 time to create 1 rle with old method : 0.10718297958374023 time for calcul the mask position with numpy : 0.006294727325439453 nb_pixel_total : 2895 time to create 1 rle with old method : 0.0030405521392822266 time for calcul the mask position with numpy : 0.006119966506958008 nb_pixel_total : 7119 time to create 1 rle with old method : 0.007804155349731445 create new chi : 0.4380486011505127 time to delete rle : 0.0003943443298339844 batch 1 Loaded 7 chid ids of type : 3594 +++Number RLEs to save : 2454 TO DO : save crop sub photo not yet done ! save time : 0.4567537307739258 nb_obj : 3 nb_hashtags : 2 time to prepare the origin masks : 0.2346668243408203 time for calcul the mask position with numpy : 0.05817437171936035 nb_pixel_total : 1928464 time to create 1 rle with new method : 0.1683027744293213 time for calcul the mask position with numpy : 0.006535530090332031 nb_pixel_total : 6396 time to create 1 rle with old method : 0.0069580078125 time for calcul the mask position with numpy : 0.006165742874145508 nb_pixel_total : 35791 time to create 1 rle with old method : 0.038553476333618164 time for calcul the mask position with numpy : 0.0064258575439453125 nb_pixel_total : 102949 time to create 1 rle with old method : 0.10904312133789062 create new chi : 0.4105069637298584 time to delete rle : 0.0003676414489746094 batch 1 Loaded 7 chid ids of type : 3594 ++++Number RLEs to save : 2918 TO DO : save crop sub photo not yet done ! save time : 0.452467679977417 nb_obj : 1 nb_hashtags : 1 time to prepare the origin masks : 0.07109713554382324 time for calcul the mask position with numpy : 0.11938810348510742 nb_pixel_total : 1969827 time to create 1 rle with new method : 0.18282389640808105 time for calcul the mask position with numpy : 0.00621795654296875 nb_pixel_total : 103773 time to create 1 rle with old method : 0.10923576354980469 create new chi : 0.4238309860229492 time to delete rle : 0.00034356117248535156 batch 1 Loaded 3 chid ids of type : 3594 +Number RLEs to save : 2130 TO DO : save crop sub photo not yet done ! save time : 0.45569610595703125 nb_obj : 2 nb_hashtags : 1 time to prepare the origin masks : 0.04368448257446289 time for calcul the mask position with numpy : 0.04053926467895508 nb_pixel_total : 2032812 time to create 1 rle with new method : 0.07648682594299316 time for calcul the mask position with numpy : 0.005995035171508789 nb_pixel_total : 25780 time to create 1 rle with old method : 0.026582002639770508 time for calcul the mask position with numpy : 0.00567936897277832 nb_pixel_total : 15008 time to create 1 rle with old method : 0.018596172332763672 create new chi : 0.18408441543579102 time to delete rle : 0.0004191398620605469 batch 1 Loaded 5 chid ids of type : 3594 +++++Number RLEs to save : 1882 TO DO : save crop sub photo not yet done ! save time : 0.3178403377532959 No data in photo_id : 1387403479 No data in photo_id : 1387403206 map_output_result : {1387403527: (0.0, 'Should be the crop_list due to order', 0), 1387403526: (0.0, 'Should be the crop_list due to order', 0.0), 1387403524: (0.0, 'Should be the crop_list due to order', 0.0), 1387403522: (0.0, 'Should be the crop_list due to order', 0), 1387403493: (0.0, 'Should be the crop_list due to order', 0), 1387403490: (0.0, 'Should be the crop_list due to order', 0), 1387403487: (0.0, 'Should be the crop_list due to order', 0), 1387403484: (0.0, 'Should be the crop_list due to order', 0), 1387403481: (0.0, 'Should be the crop_list due to order', 0), 1387403479: (0.0, 'Should be the crop_list due to order', 0.0), 1387403206: (0.0, 'Should be the crop_list due to order', 0.0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1387403527, 1387403526, 1387403524, 1387403522, 1387403493, 1387403490, 1387403487, 1387403484, 1387403481, 1387403479, 1387403206] Looping around the photos to save general results len do output : 11 /1387403527.Didn't retrieve data . /1387403526.Didn't retrieve data . /1387403524.Didn't retrieve data . /1387403522.Didn't retrieve data . /1387403493.Didn't retrieve data . /1387403490.Didn't retrieve data . /1387403487.Didn't retrieve data . /1387403484.Didn't retrieve data . /1387403481.Didn't retrieve data . /1387403479.Didn't retrieve data . /1387403206.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403527', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403526', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403524', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403522', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403493', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403490', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403487', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403484', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403481', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403479', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403206', None, None, None, None, None, '3824861') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 33 time used for this insertion : 0.03567099571228027 save_final save missing photos in datou_result : time spend for datou_step_exec : 7.623699188232422 time spend to save output : 0.03604316711425781 total time spend for step 3 : 7.65974235534668 step4:ventilate_hashtags_in_portfolio Fri Oct 3 02:01:21 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 27463485 get user id for portfolio 27463485 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27463485 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('metal','mal_croppe','autre','flou','pet_clair','environnement','background','pet_fonce','pehd','carton','papier')) AND mptpi.`min_score`=0.5 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27463485 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('metal','mal_croppe','autre','flou','pet_clair','environnement','background','pet_fonce','pehd','carton','papier')) AND mptpi.`min_score`=0.5 To do Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") Catched exception ! Connect or reconnect ! (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ')\n and cspi.crop_hashtag_id = chi.id' at line 3") To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27463485 AND mptpi.`type`=3594 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('metal','mal_croppe','autre','flou','pet_clair','environnement','background','pet_fonce','pehd','carton','papier')) AND mptpi.`min_score`=0.5 To do lien utilise dans velours : https://marlene.fotonower.com/velours/27464910,27464911,27464912,27464913,27464914,27464915,27464916,27464917,27464918,27464920,27464921?tags=metal,mal_croppe,autre,flou,pet_clair,environnement,background,pet_fonce,pehd,carton,papier Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1387403527, 1387403526, 1387403524, 1387403522, 1387403493, 1387403490, 1387403487, 1387403484, 1387403481, 1387403479, 1387403206] Looping around the photos to save general results len do output : 1 /27463485. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403527', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403526', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403524', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403522', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403493', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403490', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403487', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403484', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403481', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403479', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403206', None, None, None, None, None, '3824861') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 12 time used for this insertion : 0.03682208061218262 save_final save missing photos in datou_result : time spend for datou_step_exec : 7.136303186416626 time spend to save output : 0.03708195686340332 total time spend for step 4 : 7.173385143280029 step5:final Fri Oct 3 02:01:28 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Catched exception ! Connect or reconnect ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1387403527: ('0.07084245230078562',), 1387403526: ('0.07084245230078562',), 1387403524: ('0.07084245230078562',), 1387403522: ('0.07084245230078562',), 1387403493: ('0.07084245230078562',), 1387403490: ('0.07084245230078562',), 1387403487: ('0.07084245230078562',), 1387403484: ('0.07084245230078562',), 1387403481: ('0.07084245230078562',), 1387403479: ('0.07084245230078562',), 1387403206: ('0.07084245230078562',)} new output for save of step final : {1387403527: ('0.07084245230078562',), 1387403526: ('0.07084245230078562',), 1387403524: ('0.07084245230078562',), 1387403522: ('0.07084245230078562',), 1387403493: ('0.07084245230078562',), 1387403490: ('0.07084245230078562',), 1387403487: ('0.07084245230078562',), 1387403484: ('0.07084245230078562',), 1387403481: ('0.07084245230078562',), 1387403479: ('0.07084245230078562',), 1387403206: ('0.07084245230078562',)} [1387403527, 1387403526, 1387403524, 1387403522, 1387403493, 1387403490, 1387403487, 1387403484, 1387403481, 1387403479, 1387403206] Looping around the photos to save general results len do output : 11 /1387403527.Didn't retrieve data . /1387403526.Didn't retrieve data . /1387403524.Didn't retrieve data . /1387403522.Didn't retrieve data . /1387403493.Didn't retrieve data . /1387403490.Didn't retrieve data . /1387403487.Didn't retrieve data . /1387403484.Didn't retrieve data . /1387403481.Didn't retrieve data . /1387403479.Didn't retrieve data . /1387403206.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403527', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403526', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403524', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403522', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403493', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403490', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403487', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403484', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403481', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403479', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403206', None, None, None, None, None, '3824861') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 33 time used for this insertion : 0.03861665725708008 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.4130361080169678 time spend to save output : 0.0393223762512207 total time spend for step 5 : 0.4523584842681885 step6:blur_detection Fri Oct 3 02:01:29 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection methode: ratio et variance treat image : temp/1759449632_2125288_1387403527_b93e8f674f5a48fb4eecf065bea21856.jpg resize: (1080, 1920) 1387403527 -2.115730625141723 treat image : temp/1759449632_2125288_1387403526_9e2822e6d763d9845ad22df626710a59.jpg resize: (1080, 1920) 1387403526 -1.6148253149470808 treat image : temp/1759449632_2125288_1387403524_8ddf0f8afb730bd7fb85a12ea5735169.jpg resize: (1080, 1920) 1387403524 -2.5776840405158916 treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2.jpg resize: (1080, 1920) 1387403522 -0.8699040602484222 treat image : temp/1759449632_2125288_1387403493_37b6f0e8bbd93a6779a0edf2ce69e680.jpg resize: (1080, 1920) 1387403493 -0.22030738663888816 treat image : temp/1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb.jpg resize: (1080, 1920) 1387403490 -0.5872535075024725 treat image : temp/1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334.jpg resize: (1080, 1920) 1387403487 -1.5935637156107305 treat image : temp/1759449632_2125288_1387403484_f54788ba910973b5a83c1487ae8d4621.jpg resize: (1080, 1920) 1387403484 -1.8753771763280922 treat image : temp/1759449632_2125288_1387403481_5aaef4ebf5c1f59e786c76551de6ad4b.jpg resize: (1080, 1920) 1387403481 -1.3711356542341835 treat image : temp/1759449632_2125288_1387403479_ee65d63be3e3377873c15e92e7e43d23.jpg resize: (1080, 1920) 1387403479 -1.3716029030143382 treat image : temp/1759449632_2125288_1387403206_1946864c544807978ed26292569ccdd2.jpg resize: (1080, 1920) 1387403206 -1.9487407536288395 treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976356_0.png resize: (62, 111) 1387523296 0.01736024594472706 treat image : temp/1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb_rle_crop_3984976359_0.png resize: (120, 101) 1387523297 -1.2294011594746965 treat image : temp/1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334_rle_crop_3984976364_0.png resize: (85, 104) 1387523298 -0.8765457639072306 treat image : temp/1759449632_2125288_1387403527_b93e8f674f5a48fb4eecf065bea21856_rle_crop_3984976352_0.png resize: (173, 97) 1387523310 -3.6490731819608206 treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976355_0.png resize: (988, 1236) 1387523311 -0.9877896864321227 treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976353_0.png resize: (103, 208) 1387523312 -3.25083010715906 treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976354_0.png resize: (533, 357) 1387523313 0.06313032090644123 treat image : temp/1759449632_2125288_1387403493_37b6f0e8bbd93a6779a0edf2ce69e680_rle_crop_3984976358_0.png resize: (305, 345) 1387523314 -1.2073682681826097 treat image : temp/1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb_rle_crop_3984976361_0.png resize: (481, 333) 1387523315 -0.5058395882139811 treat image : temp/1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334_rle_crop_3984976362_0.png resize: (563, 323) 1387523316 0.12503088765218118 treat image : temp/1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334_rle_crop_3984976363_0.png resize: (226, 238) 1387523317 -2.0627262709892995 treat image : temp/1759449632_2125288_1387403484_f54788ba910973b5a83c1487ae8d4621_rle_crop_3984976365_0.png resize: (518, 343) 1387523318 -0.11364409374456011 treat image : temp/1759449632_2125288_1387403481_5aaef4ebf5c1f59e786c76551de6ad4b_rle_crop_3984976367_0.png resize: (174, 208) 1387523320 -2.2022867632657186 treat image : temp/1759449632_2125288_1387403481_5aaef4ebf5c1f59e786c76551de6ad4b_rle_crop_3984976366_0.png resize: (149, 127) 1387523321 -3.1162503183351125 treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976357_0.png resize: (67, 105) 1387523324 -0.021219253481395407 treat image : temp/1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb_rle_crop_3984976360_0.png resize: (69, 55) 1387523325 1.2619570208345474 Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 27 time used for this insertion : 0.03620576858520508 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 27 time used for this insertion : 0.036440134048461914 save missing photos in datou_result : time spend for datou_step_exec : 9.106381177902222 time spend to save output : 0.08947443962097168 total time spend for step 6 : 9.195855617523193 step7:brightness Fri Oct 3 02:01:38 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness treat image : temp/1759449632_2125288_1387403527_b93e8f674f5a48fb4eecf065bea21856.jpg treat image : temp/1759449632_2125288_1387403526_9e2822e6d763d9845ad22df626710a59.jpg treat image : temp/1759449632_2125288_1387403524_8ddf0f8afb730bd7fb85a12ea5735169.jpg treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2.jpg treat image : temp/1759449632_2125288_1387403493_37b6f0e8bbd93a6779a0edf2ce69e680.jpg treat image : temp/1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb.jpg treat image : temp/1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334.jpg treat image : temp/1759449632_2125288_1387403484_f54788ba910973b5a83c1487ae8d4621.jpg treat image : temp/1759449632_2125288_1387403481_5aaef4ebf5c1f59e786c76551de6ad4b.jpg treat image : temp/1759449632_2125288_1387403479_ee65d63be3e3377873c15e92e7e43d23.jpg treat image : temp/1759449632_2125288_1387403206_1946864c544807978ed26292569ccdd2.jpg treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976356_0.png treat image : temp/1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb_rle_crop_3984976359_0.png treat image : temp/1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334_rle_crop_3984976364_0.png treat image : temp/1759449632_2125288_1387403527_b93e8f674f5a48fb4eecf065bea21856_rle_crop_3984976352_0.png treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976355_0.png treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976353_0.png treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976354_0.png treat image : temp/1759449632_2125288_1387403493_37b6f0e8bbd93a6779a0edf2ce69e680_rle_crop_3984976358_0.png treat image : temp/1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb_rle_crop_3984976361_0.png treat image : temp/1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334_rle_crop_3984976362_0.png treat image : temp/1759449632_2125288_1387403487_70b9bb2b414d00a14a98c4a40c419334_rle_crop_3984976363_0.png treat image : temp/1759449632_2125288_1387403484_f54788ba910973b5a83c1487ae8d4621_rle_crop_3984976365_0.png treat image : temp/1759449632_2125288_1387403481_5aaef4ebf5c1f59e786c76551de6ad4b_rle_crop_3984976367_0.png treat image : temp/1759449632_2125288_1387403481_5aaef4ebf5c1f59e786c76551de6ad4b_rle_crop_3984976366_0.png treat image : temp/1759449632_2125288_1387403522_bc3a498ffe2edb025c4e00f6e38293b2_rle_crop_3984976357_0.png treat image : temp/1759449632_2125288_1387403490_570863d94c7aadf080c1f0569c595afb_rle_crop_3984976360_0.png Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 27 time used for this insertion : 0.10585927963256836 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 27 time used for this insertion : 0.03554034233093262 save missing photos in datou_result : time spend for datou_step_exec : 2.7408182621002197 time spend to save output : 0.15850591659545898 total time spend for step 7 : 2.8993241786956787 step8:velours_tree Fri Oct 3 02:01:41 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 VR 22-3-18 : For now we do not clean correctly the datou structure can't find the photo_desc_type Inside saveOutput : final : False verbose : 0 ouput is None No outpout to save, returning out of save general time spend for datou_step_exec : 0.13811683654785156 time spend to save output : 4.363059997558594e-05 total time spend for step 8 : 0.13816046714782715 step9:send_mail_cod Fri Oct 3 02:01:41 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin/workarea/git/Velours/python in order to get the selector url, please entre the license of selector results_Auto_P27463485_03-10-2025_02_01_41.pdf 27464910 imagette274649101759449701 27464911 imagette274649111759449701 27464912 change filename to text .change filename to text .imagette274649121759449701 27464913 imagette274649131759449701 27464914 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette274649141759449701 27464916 imagette274649161759449702 27464917 imagette274649171759449702 27464918 imagette274649181759449702 27464920 imagette274649201759449702 27464921 change filename to text .change filename to text .change filename to text .imagette274649211759449702 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=27463485 and hashtag_type = 3594 and pcr.hashtag_id = h.hashtag_id; velour_link : https://marlene.fotonower.com/velours/27464910,27464911,27464912,27464913,27464914,27464915,27464916,27464917,27464918,27464920,27464921?tags=metal,mal_croppe,autre,flou,pet_clair,environnement,background,pet_fonce,pehd,carton,papier args[1387403527] : ((1387403527, -2.115730625141723, 492609224), (1387403527, 0.5397981839064981, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403526] : ((1387403526, -1.6148253149470808, 492688767), (1387403526, 0.4587264003839966, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403524] : ((1387403524, -2.5776840405158916, 492609224), (1387403524, 0.62199022550626, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403522] : ((1387403522, -0.8699040602484222, 492688767), (1387403522, 0.5031985617586335, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403493] : ((1387403493, -0.22030738663888816, 492688767), (1387403493, 0.5398915310869529, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403490] : ((1387403490, -0.5872535075024725, 492688767), (1387403490, 0.4720372533629951, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403487] : ((1387403487, -1.5935637156107305, 492688767), (1387403487, 0.4796982889935047, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403484] : ((1387403484, -1.8753771763280922, 492688767), (1387403484, 0.3856750052392971, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403481] : ((1387403481, -1.3711356542341835, 492688767), (1387403481, 0.4706626110872106, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403479] : ((1387403479, -1.3716029030143382, 492688767), (1387403479, 0.4528961464918986, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com args[1387403206] : ((1387403206, -1.9487407536288395, 492688767), (1387403206, 0.8137033101192813, 2107752395), '0.07084245230078562') We are sending mail with results at report@fotonower.com refus_total : 0.07084245230078562 2022-04-13 10:29:59 0 SELECT ph.photo_id,ph.url,ph.username,ph.uploaded_at,ph.text FROM MTRBack.photos_view ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=27463485 AND mpp.hide_status=0 ORDER BY mpp.order LIMIT 0, 1000 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27463485_03-10-2025_02_01_41.pdf results_Auto_P27463485_03-10-2025_02_01_41.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27463485_03-10-2025_02_01_41.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('3318','27463485','results_Auto_P27463485_03-10-2025_02_01_41.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27463485_03-10-2025_02_01_41.pdf','pdf','','0.26','0.07084245230078562') message_in_mail: Bonjour,
Veuillez trouver ci dessous les résultats du service carac on demand pour le portfolio: https://www.fotonower.com/view/27463485

https://www.fotonower.com/image?json=false&list_photos_id=1387403527
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403526
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403524
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403522
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403493
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403490
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403487
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403484
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403481
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403479
Bravo, la photo est bien prise.
https://www.fotonower.com/image?json=false&list_photos_id=1387403206
Bravo, la photo est bien prise.

Dans ces conditions,le taux de refus est: 7.08%
Veuillez trouver les photos des contaminants.

exemples de contaminants: autre: https://www.fotonower.com/view/27464912?limit=200
exemples de contaminants: pet_clair: https://www.fotonower.com/view/27464914?limit=200
exemples de contaminants: papier: https://www.fotonower.com/view/27464921?limit=200
Veuillez trouver le rapport en pdf:https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27463485_03-10-2025_02_01_41.pdf.

Lien vers velours :https://marlene.fotonower.com/velours/27464910,27464911,27464912,27464913,27464914,27464915,27464916,27464917,27464918,27464920,27464921?tags=metal,mal_croppe,autre,flou,pet_clair,environnement,background,pet_fonce,pehd,carton,papier.


L'équipe Fotonower 202 b'' Server: nginx Date: Fri, 03 Oct 2025 00:01:45 GMT Content-Length: 0 Connection: close X-Message-Id: 7JPjTDVET8ucCx5uHPwsiw Access-Control-Allow-Origin: https://sendgrid.api-docs.io Access-Control-Allow-Methods: POST Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl Access-Control-Max-Age: 600 X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html Strict-Transport-Security: max-age=31536000; includeSubDomains Content-Security-Policy: frame-ancestors 'none' Cache-Control: no-cache X-Content-Type-Options: no-sniff Referrer-Policy: strict-origin-when-cross-origin Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1387403527, 1387403526, 1387403524, 1387403522, 1387403493, 1387403490, 1387403487, 1387403484, 1387403481, 1387403479, 1387403206] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403527', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403526', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403524', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403522', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403493', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403490', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403487', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403484', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403481', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403479', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403206', None, None, None, None, None, '3824861') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 11 time used for this insertion : 0.037830352783203125 save_final save missing photos in datou_result : time spend for datou_step_exec : 3.7587811946868896 time spend to save output : 0.03813314437866211 total time spend for step 9 : 3.7969143390655518 step10:split_time_score Fri Oct 3 02:01:45 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score Catched exception ! Connect or reconnect ! TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('13', 11),) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 02102025 27463485 Nombre de photos uploadées : 11 / 23040 (0%) 02102025 27463485 Nombre de photos taguées (types de déchets): 0 / 11 (0%) 02102025 27463485 Nombre de photos taguées (volume) : 0 / 11 (0%) elapsed_time : load_data_split_time_score 2.1457672119140625e-06 elapsed_time : order_list_meta_photo_and_scores 7.867813110351562e-06 ??????????? elapsed_time : fill_and_build_computed_from_old_data 0.0003924369812011719 Catched exception ! Connect or reconnect ! Catched exception ! Connect or reconnect ! elapsed_time : insert_dashboard_record_day_entry 0.660637378692627 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.016331413089225585 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27463472_03-10-2025_01_52_01.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463472 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27463472 AND mptpi.`type`=3594 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463474 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463475 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463477 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463481 order by id desc limit 1 find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463483 order by id desc limit 1 Qualite : 0.07084245230078562 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27463485_03-10-2025_02_01_41.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463485 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27463485 AND mptpi.`type`=3594 To do Qualite : 0.0557065329218107 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27463488_03-10-2025_01_51_57.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463488 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27463488 AND mptpi.`type`=3594 To do find url: select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463504 order by id desc limit 1 Qualite : 0.0 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P27463505_03-10-2025_01_31_09.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 27463505 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 7928 mask_detect is not consistent : 3 used against 2 in the step definition ! Step 8092 crop_condition have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! WARNING : number of outputs for step 8092 crop_condition is not consistent : 4 used against 3 in the step definition ! WARNING : number of inputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7933 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 7935 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 7934 final have less inputs used (2) than in the step definition (3) : maybe we manage optionnal inputs ! Step 7934 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! WARNING : number of outputs for step 13649 velours_tree is not consistent : 2 used against 1 in the step definition ! Step 9283 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 7935 doesn't seem to be define in the database( WARNING : type of input 3 of step 7934 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of input 1 of step 7935 doesn't seem to be define in the database( WARNING : output 1 of step 7933 have datatype=7 whereas input 1 of step 7935 have datatype=None WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 2 of step 8092 doesn't seem to be define in the database( WARNING : type of output 3 of step 8092 doesn't seem to be define in the database( WARNING : type of input 1 of step 7933 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10917 doesn't seem to be define in the database( WARNING : type of output 2 of step 7928 doesn't seem to be define in the database( WARNING : type of input 1 of step 10918 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 7935 have datatype=10 whereas input 3 of step 10916 have datatype=6 WARNING : output 0 of step 7935 have datatype=10 whereas input 0 of step 13649 have datatype=18 WARNING : type of output 1 of step 13649 doesn't seem to be define in the database( WARNING : type of input 5 of step 10916 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=27463505 AND mptpi.`type`=3594 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'02102025': {'nb_upload': 11, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1387403527, 1387403526, 1387403524, 1387403522, 1387403493, 1387403490, 1387403487, 1387403484, 1387403481, 1387403479, 1387403206] Looping around the photos to save general results len do output : 1 /27463485Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403527', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403526', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403524', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403522', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403493', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403490', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403487', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403484', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403481', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403479', None, None, None, None, None, '3824861') ('3318', None, None, None, None, None, None, None, '3824861') ('3318', '27463485', '1387403206', None, None, None, None, None, '3824861') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 12 time used for this insertion : 0.09528255462646484 save_final save missing photos in datou_result : time spend for datou_step_exec : 5.907317876815796 time spend to save output : 0.09552383422851562 total time spend for step 10 : 6.0028417110443115 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 11 set_done_treatment 37.12user 17.89system 1:24.82elapsed 64%CPU (0avgtext+0avgdata 2700472maxresident)k 23480inputs+17368outputs (107major+1368357minor)pagefaults 0swaps