python /home/admin/mtr/script_for_cron.py -j python_test3 -m 12 -a ' --short_python3 -v ' -s python_test3 -M 0 -S 0 -U 100,100,120
import MySQLdb succeeded
Import error (python version)
python version = 3
warning : we can't find thcl infos in json_data
warning : we can't find pdt infos in json_data
list_job_run_as_list : ['mask_detection', 'datou', 'CacheModelData_queries', 'CachePhotoData_queries', 'test_fork', 'prepare_maskdata', 'portfolio_queries', 'sla_mensuel']
python version used : 3
liste_fichiers : [('tests/mask_test', True, 'Test mask-detection ', 'mask_detection'), ('tests/datou_test', True, 'Datou All Test', 'datou', 'all'), ('mtr/database_queries/CacheModelData_queries', True, 'Test Cache Model Data', 'CacheModelData_queries'), ('tests/cache_photo_data_test', True, 'Test local_cache_photo ', 'CachePhotoData_queries'), ('mtr/mask_rcnn/prepare_maskdata', True, 'test prepare mask data', 'prepare_maskdata', 'all'), ('mtr/database_queries/portfolio_queries', True, 'test portfolio queries', 'portfolio_queries'), ('prod/memo/memo', True, 'SLA Mensuel', 'sla_mensuel', 'all')]
#&_# BEGIN OF TEST : tests/mask_test #&_#
/home/admin/workarea/git/Velours/python/tests/mask_test.py Test mask-detection
python version used : 3
############################### TEST memory used ################################
free memory at beginning :
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 7035
run mask_detect
Inside batchDatouExec : verbose : False
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps.
Tree built and cycle checked; now we need to re-order the steps.
We currently have an error because there is no dependency between the last steps in the tile - detect - glue case.
We could keep the dependency as-is, but it is better to keep an order compatible with the step ids when steps have no sons, i.e. a lexical order on (number_son, step_id).
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput.
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput.
List Step Type Loaded in datou : mask_detect
list_input_json : []
origin BF
we have 0 missing photos in the step downloads : photo missing : []
try to delete the photos missing in DB
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
time to download the photos : 0.13683032989501953
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
Calling datou_exec
Inside datou_exec : verbose : False
number of steps : 1
step1:mask_detect Wed May 21 09:35:27 2025
VR 17-11-17 : for now, only for a linear exec dependency tree, some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step (id = -1), built in the code of datou_exec
VR 22-3-18 : we should handle the first-step case here instead of building this step before datou_exec
Beginning of datou step mask_detect !
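The graph handling described above (cycle check, then re-ordering unconstrained steps lexically on (number_son, step_id)) can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the real datou API: the function names and the dict-of-children representation are hypothetical.

```python
# Hypothetical sketch of the datou step-graph handling described in the log:
# a cycle check (in the spirit of checkNoCycle) followed by a re-ordering
# that sorts steps lexically on (number of sons, step_id). Names and data
# shapes are illustrative only.

def has_cycle(sons):
    """sons: dict step_id -> list of child step ids. DFS with 3-color marking."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {}

    def visit(s):
        color[s] = GREY
        for c in sons.get(s, []):
            state = color.get(c, WHITE)
            if state == GREY:           # back edge -> cycle
                return True
            if state == WHITE and visit(c):
                return True
        color[s] = BLACK
        return False

    return any(visit(s) for s in list(sons) if color.get(s, WHITE) == WHITE)

def order_steps(steps, sons):
    """Sort lexically on (number of sons, step_id), as the log commentary suggests."""
    return sorted(steps, key=lambda s: (len(sons.get(s, [])), s))

if __name__ == "__main__":
    sons = {1: [2], 2: [3]}             # e.g. tile -> detect -> glue
    assert not has_cycle(sons)
    print(order_steps([3, 1, 4, 2], sons))   # [3, 4, 1, 2]
```

With this key, steps without sons come first and ties fall back to the step id, which keeps the order stable and "compatible with the id of steps" as the commentary asks.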
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 7035
max_wait_temp : 1
max_wait : 0
gpu_flag : 0
/home/admin/workarea/git/Velours/python/tests/python_tests.py:11: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
  import imp
2025-05-21 09:35:30.516047: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2025-05-21 09:35:30.543056: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493065000 Hz
2025-05-21 09:35:30.544969: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f4714000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2025-05-21 09:35:30.545035: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2025-05-21 09:35:30.548669: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2025-05-21 09:35:30.785600: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x11c5d790 initialized for platform CUDA (this does not guarantee that XLA will be used).
Devices:
2025-05-21 09:35:30.785641: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5
2025-05-21 09:35:30.786663: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties:
pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-21 09:35:30.786944: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-21 09:35:30.789080: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-21 09:35:30.791243: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-21 09:35:30.791592: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-21 09:35:30.794120: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-21 09:35:30.795535: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-21 09:35:30.800912: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-21 09:35:30.802524: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-21 09:35:30.802603: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-21 09:35:30.803448: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-05-21 09:35:30.803468: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108]      0
2025-05-21 09:35:30.803479: I
tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-05-21 09:35:30.804854: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6470 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param To do loadFromThcl(), then load ParamDescType : thcl454 thcls : [{'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}] thcl {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': 
b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 3473 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 
'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 
'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] 
DETECTION_MAX_INSTANCES        100
DETECTION_MIN_CONFIDENCE       0.3
DETECTION_NMS_THRESHOLD        0.3
GPU_COUNT                      1
IMAGES_PER_GPU                 1
IMAGE_MAX_DIM                  640
IMAGE_MIN_DIM                  640
IMAGE_PADDING                  True
IMAGE_SHAPE                    [640 640 3]
LEARNING_MOMENTUM              0.9
LEARNING_RATE                  0.001
LOSS_WEIGHTS                   {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0}
MASK_POOL_SIZE                 14
MASK_SHAPE                     [28, 28]
MAX_GT_INSTANCES               100
MEAN_PIXEL                     [123.7 116.8 103.9]
MINI_MASK_SHAPE                (56, 56)
NAME                           mask_coco_origin
NUM_CLASSES                    81
POOL_SIZE                      7
POST_NMS_ROIS_INFERENCE        1000
POST_NMS_ROIS_TRAINING         2000
ROI_POSITIVE_RATIO             0.33
RPN_ANCHOR_RATIOS              [0.5, 1, 2]
RPN_ANCHOR_SCALES              (16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE              1
2025-05-21 09:35:31.484363: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties:
pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-21 09:35:31.484450: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-21 09:35:31.484474: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-21 09:35:31.484497: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-21 09:35:31.484518: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-21 09:35:31.484539: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-21 09:35:31.484572: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-21 09:35:31.484595: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library
libcudnn.so.7
2025-05-21 09:35:31.485902: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-21 09:35:31.487280: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties:
pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-21 09:35:31.487346: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-21 09:35:31.487379: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-21 09:35:31.487410: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-21 09:35:31.487440: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-21 09:35:31.487470: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-21 09:35:31.487498: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-21 09:35:31.487522: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-21 09:35:31.488775: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-21 09:35:31.488814: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-05-21 09:35:31.488827: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108]      0
2025-05-21 09:35:31.488837: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0:   N
2025-05-21 09:35:31.490022: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6470 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
Using TensorFlow backend.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version.
Instructions for updating: box_ind is deprecated, use box_indices instead
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
RPN_BBOX_STD_DEV               [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD              0.7
RPN_TRAIN_ANCHORS_PER_IMAGE    256
STEPS_PER_EPOCH                1000
TRAIN_ROIS_PER_IMAGE           200
USE_MINI_MASK                  True
USE_RPN_ROIS                   True
VALIDATION_STEPS               50
WEIGHT_DECAY                   0.0001
model_param file did not exist
model_name : mask_coco_origin
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files existing in s3 : ['mask_model.h5']
files missing in s3 : []
2025-05-21 09:35:39.263119: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-21 09:35:39.434971: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
local folder : /data/models_weight/mask_coco_origin
/data/models_weight/mask_coco_origin/mask_model.h5
size_local : 257557808 size in s3 : 257557808
create time local : 2021-08-09 05:27:17 create time in s3 : 2021-08-06 19:45:17
mask_model.h5 already exists and does not need an update
list_images length : 1
NEW PHOTO
Processing 1 images
image shape: (480, 640, 3) min:
0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 89) min: 0.00000 max: 640.00000 nb d'objets trouves : 5 Detection mask done ! Trying to reset tf kernel 1946489 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 1746 tf kernel not reseted sub process len(results) : 1 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 1 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 7035 list_Values should be empty [] To do loadFromThcl(), then load ParamDescType : thcl454 Catched exception ! Connect or reconnect ! thcls : [{'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}] thcl {'id': 454, 'mtr_user_id': 31, 'name': 
'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 3473 ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 
'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush']
time to compute the mask position with numpy : 0.0006411075592041016
nb_pixel_total : 15551
time to create 1 rle with old method : 0.019162416458129883
length of segment : 256
time to compute the mask position with numpy : 0.003081798553466797
nb_pixel_total : 145329
time to create 1 rle with old method : 0.15656757354736328
length of segment : 371
time to compute the mask position with numpy : 0.000278472900390625
nb_pixel_total : 14254
time to create 1 rle with old method : 0.015749692916870117
length of segment : 151
time to compute the mask position with numpy : 0.00012564659118652344
nb_pixel_total : 5613
time to create 1 rle with old method : 0.006569623947143555
length of segment : 48
time to compute the mask position with numpy : 5.626678466796875e-05
nb_pixel_total : 1824
time to create 1 rle with old method : 0.0022869110107421875
length of segment : 39
time spent for convertir_results : 0.9827277660369873
time spent for datou_step_exec : 18.278632164001465
time spent to save output : 5.316734313964844e-05
total time spent for step 1 : 18.278685331344604
caffe_path_current :
About to save ! 1
Inside saveOutput : final : True verbose : False
eke 12-6-18 : saveMask needs to be cleaned for new output !
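The per-object timings above cover two operations: locating the mask pixels with numpy (nb_pixel_total) and building a run-length encoding ("1 rle with old method"). The exact RLE tuple layout datou stores is not documented in this log, so the sketch below uses a generic (row, start_col, run_length) layout as an assumption; only the technique (vectorized run detection per row) reflects what the log times.

```python
# Illustrative sketch of mask-pixel counting and per-row run-length
# encoding for a boolean instance mask. The (row, start_col, run_length)
# segment layout here is an assumption, not datou's actual RLE format.
import numpy as np

def mask_runs(mask):
    """Return (row, start_col, run_length) segments for a 2-D boolean mask."""
    runs = []
    for y in range(mask.shape[0]):
        row = mask[y].astype(np.int8)
        # pad with zeros so runs touching the borders still produce edges;
        # diff() is +1 at a run start and -1 just past a run end
        edges = np.flatnonzero(np.diff(np.concatenate(([0], row, [0]))))
        for start, end in zip(edges[::2], edges[1::2]):
            runs.append((y, int(start), int(end - start)))
    return runs

if __name__ == "__main__":
    mask = np.zeros((3, 8), dtype=bool)
    mask[1, 2:6] = True
    print(int(mask.sum()))   # nb_pixel_total -> 4
    print(mask_runs(mask))   # [(1, 2, 4)]
```

A length-of-segment figure like those in the log would simply be `len(mask_runs(mask))` for the corresponding mask.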
Number saved : None batch 1 Loaded 3331 chid ids of type : 445 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++Number RLEs to save : 0 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.01902937889099121 save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'957285035': [[(957285035, 492601069, 445, 0, 186, 22, 282, 0.99548954, [(140, 26, 6), (135, 27, 15), (133, 28, 18), (131, 29, 22), (127, 30, 27), (10, 31, 1), (120, 31, 35), (8, 32, 13), (27, 32, 3), (115, 32, 41), (7, 33, 52), (109, 33, 48), (6, 34, 70), (103, 34, 55), (5, 35, 154), (4, 36, 155), (3, 37, 156), (3, 38, 156), (3, 39, 156), (2, 40, 157), (2, 41, 157), (2, 42, 157), (2, 43, 157), (2, 44, 157), (2, 45, 157), (1, 46, 158), (1, 47, 158), (1, 48, 158), (1, 49, 157), (1, 50, 157), (1, 51, 156), (1, 52, 156), (1, 53, 155), (1, 54, 154), (1, 55, 152), (1, 56, 149), (1, 57, 145), (1, 58, 141), (1, 59, 136), (1, 60, 133), (1, 61, 130), (1, 62, 127), (1, 63, 126), (1, 64, 124), (1, 65, 123), (1, 66, 121), (1, 67, 120), (1, 68, 118), (1, 69, 117), (1, 70, 116), (1, 71, 115), (1, 72, 114), (1, 73, 113), (1, 74, 112), (1, 75, 111), (1, 76, 110), (1, 77, 108), (1, 78, 108), (1, 79, 107), (1, 80, 106), (1, 81, 105), (2, 82, 104), (2, 83, 103), (2, 84, 103), (2, 85, 102), (2, 86, 102), (2, 87, 101), (2, 88, 100), (2, 89, 99), (2, 90, 99), (2, 91, 98), (2, 92, 97), (2, 93, 96), (2, 94, 95), (2, 95, 93), (2, 96, 91), (2, 97, 90), (2, 98, 89), (2, 99, 87), (2, 100, 86), (2, 101, 86), (2, 102, 85), (2, 
103, 84), (2, 104, 83), (2, 105, 83), (2, 106, 82), (2, 107, 81), (2, 108, 80), (2, 109, 80), (2, 110, 79), (2, 111, 78), (2, 112, 77), (2, 113, 76), (1, 114, 76), (1, 115, 75), (1, 116, 74), (1, 117, 73), (1, 118, 72), (1, 119, 71), (1, 120, 71), (1, 121, 70), (1, 122, 69), (1, 123, 69), (1, 124, 68), (1, 125, 68), (1, 126, 67), (1, 127, 67), (1, 128, 66), (1, 129, 66), (1, 130, 66), (1, 131, 65), (1, 132, 65), (1, 133, 64), (1, 134, 63), (1, 135, 63), (1, 136, 62), (1, 137, 61), (1, 138, 60), (1, 139, 60), (1, 140, 59), (1, 141, 58), (1, 142, 58), (1, 143, 57), (1, 144, 56), (1, 145, 56), (1, 146, 55), (1, 147, 54), (1, 148, 54), (1, 149, 53), (1, 150, 52), (1, 151, 52), (1, 152, 51), (1, 153, 50), (1, 154, 49), (1, 155, 48), (1, 156, 47), (1, 157, 46), (1, 158, 45), (1, 159, 45), (1, 160, 44), (1, 161, 43), (1, 162, 42), (1, 163, 41), (1, 164, 41), (1, 165, 40), (1, 166, 40), (1, 167, 39), (1, 168, 38), (1, 169, 37), (1, 170, 36), (1, 171, 35), (1, 172, 34), (1, 173, 34), (1, 174, 33), (1, 175, 33), (1, 176, 32), (1, 177, 32), (1, 178, 32), (1, 179, 32), (1, 180, 31), (1, 181, 31), (1, 182, 31), (1, 183, 30), (1, 184, 30), (1, 185, 30), (1, 186, 29), (1, 187, 29), (1, 188, 29), (1, 189, 28), (1, 190, 28), (1, 191, 27), (1, 192, 27), (1, 193, 26), (1, 194, 26), (1, 195, 26), (1, 196, 26), (1, 197, 26), (1, 198, 26), (1, 199, 26), (1, 200, 25), (1, 201, 25), (1, 202, 25), (1, 203, 25), (1, 204, 25), (1, 205, 25), (1, 206, 25), (1, 207, 25), (1, 208, 25), (1, 209, 25), (1, 210, 25), (1, 211, 25), (1, 212, 25), (1, 213, 25), (1, 214, 25), (1, 215, 25), (1, 216, 25), (1, 217, 25), (1, 218, 25), (1, 219, 25), (1, 220, 24), (1, 221, 24), (1, 222, 24), (1, 223, 24), (1, 224, 24), (1, 225, 24), (1, 226, 25), (1, 227, 25), (1, 228, 25), (2, 229, 24), (2, 230, 24), (2, 231, 24), (2, 232, 23), (2, 233, 23), (2, 234, 23), (2, 235, 23), (2, 236, 23), (2, 237, 23), (2, 238, 23), (2, 239, 23), (2, 240, 23), (2, 241, 23), (2, 242, 23), (2, 243, 23), (2, 244, 23), (2, 245, 23), 
(2, 246, 23), (2, 247, 23), (2, 248, 23), (2, 249, 24), (2, 250, 24), (2, 251, 23), (2, 252, 23), (2, 253, 23), (2, 254, 23), (2, 255, 23), (2, 256, 23), (2, 257, 23), (2, 258, 23), (2, 259, 23), (2, 260, 23), (2, 261, 23), (3, 262, 22), (3, 263, 22), (3, 264, 22), (3, 265, 22), (4, 266, 21), (4, 267, 21), (5, 268, 20), (5, 269, 20), (6, 270, 19), (7, 271, 17), (8, 272, 16), (8, 273, 16), (9, 274, 13), (11, 275, 9), (15, 276, 2)], ['16,276,8,273,2,261,2,229,1,228,1,114,2,113,2,82,1,81,1,46,3,37,8,32,20,32,21,33,58,33,59,34,75,34,76,35,102,35,114,33,120,31,130,30,135,27,145,26,152,29,158,35,158,48,154,54,141,58,128,61,119,67,105,81,103,86,96,94,89,98,81,109,71,119,65,132,60,138,52,151,45,158,40,166,34,172,29,188,26,193,25,200,25,219,24,232,24,270,23,273']), (957285035, 492601069, 445, 29, 591, 24, 419, 0.99238765, [(315, 37, 25), (272, 38, 86), (253, 39, 130), (238, 40, 151), (199, 41, 196), (189, 42, 213), (180, 43, 238), (175, 44, 250), (172, 45, 257), (169, 46, 265), (166, 47, 274), (162, 48, 284), (159, 49, 294), (157, 50, 304), (155, 51, 311), (153, 52, 317), (151, 53, 323), (149, 54, 330), (148, 55, 334), (146, 56, 337), (144, 57, 341), (142, 58, 344), (140, 59, 347), (138, 60, 350), (136, 61, 353), (134, 62, 356), (132, 63, 358), (130, 64, 361), (128, 65, 364), (126, 66, 367), (124, 67, 370), (122, 68, 373), (120, 69, 376), (118, 70, 379), (117, 71, 381), (115, 72, 385), (114, 73, 387), (113, 74, 389), (112, 75, 391), (112, 76, 393), (111, 77, 395), (110, 78, 397), (109, 79, 399), (109, 80, 400), (108, 81, 402), (107, 82, 404), (107, 83, 404), (106, 84, 406), (105, 85, 408), (105, 86, 409), (104, 87, 410), (104, 88, 411), (103, 89, 413), (102, 90, 415), (101, 91, 417), (100, 92, 420), (98, 93, 423), (97, 94, 426), (96, 95, 428), (94, 96, 431), (93, 97, 433), (92, 98, 435), (91, 99, 437), (90, 100, 439), (89, 101, 441), (89, 102, 441), (89, 103, 442), (89, 104, 443), (89, 105, 444), (89, 106, 444), (89, 107, 445), (89, 108, 446), (89, 109, 447), (89, 110, 
[... per-detection RLE run and polygon coordinate data elided ...], 'temp/1747812927_1946083_957285035_a42482e51c93c8025d243dd179aee85b.jpg']}
free memory after detection :
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 6882
############################### TEST detect object ################################
run mask_detect
Inside batchDatouExec : verbose : False
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependency to the last step in the tile - detect - glue case.
It is better to keep an order compatible with the step ids when a step has no sons, so a lexical order : (number_son, step_id)
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
List Step Type Loaded in datou : mask_detect
list_input_json : []
origin BF
we have 0 missing photos in the step downloads : photos missing : []
try to delete the photos missing in DB
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
time to download the photos : 0.24146103858947754
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
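The step re-ordering described above (cycle check, then ordering by (number_son, step_id) when a step has no sons) can be sketched as a topological sort with a tie-breaking heap. The function name, the `steps`/`deps` shapes, and the tie-break key are illustrative assumptions, not the real datou classes:

```python
import heapq

def order_steps(steps, deps):
    """Order steps so every dependency runs before its dependents.

    steps: dict step_id -> number_of_sons
    deps:  list of (parent_id, child_id) edges
    Ready steps are popped smallest (number_of_sons, step_id) first,
    mirroring the lexical order mentioned in the log.
    """
    indegree = {s: 0 for s in steps}
    children = {s: [] for s in steps}
    for parent, child in deps:
        indegree[child] += 1
        children[parent].append(child)
    # steps with no unmet dependencies, keyed by (number_of_sons, step_id)
    ready = [(steps[s], s) for s in steps if indegree[s] == 0]
    heapq.heapify(ready)
    order = []
    while ready:
        _, s = heapq.heappop(ready)
        order.append(s)
        for c in children[s]:
            indegree[c] -= 1
            if indegree[c] == 0:
                heapq.heappush(ready, (steps[c], c))
    if len(order) != len(steps):  # leftover nodes mean a cycle
        raise ValueError("cycle detected in step graph")
    return order
```

For a single-step graph such as this test's `mask_detect`, the order is trivially that one step; the tie-break only matters when several independent steps are ready at once.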
Calling datou_exec
Inside datou_exec : verbose : False
number of steps : 1
step1:mask_detect
Wed May 21 09:35:48 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree ; some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 6684
max_wait_temp : 1 max_wait : 0 gpu_flag : 0
2025-05-21 09:35:51.240295: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2025-05-21 09:35:51.267159: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493065000 Hz
2025-05-21 09:35:51.269484: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f4718000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2025-05-21 09:35:51.269549: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2025-05-21 09:35:51.274010: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2025-05-21 09:35:51.426068: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x126373a0 initialized for platform CUDA (this does not guarantee that XLA will be used).
Devices:
2025-05-21 09:35:51.426138: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5
2025-05-21 09:35:51.427463: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-21 09:35:51.427877: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-21 09:35:51.430890: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-21 09:35:51.433703: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-21 09:35:51.434280: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-21 09:35:51.436880: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-21 09:35:51.437970: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-21 09:35:51.442116: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-21 09:35:51.443423: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-21 09:35:51.443499: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-21 09:35:51.444147: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-05-21 09:35:51.444162: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-05-21 09:35:51.444171: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-05-21 09:35:51.445324: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 5719 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead.
2025-05-21 09:35:51.527918: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-21 09:35:51.528024: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-21 09:35:51.528050: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-21 09:35:51.528072: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-21 09:35:51.528094: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-21 09:35:51.528116: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-21 09:35:51.528138: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-21 09:35:51.528175: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-21 09:35:51.529478: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-21 09:35:51.530812: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-21 09:35:51.530867: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-21 09:35:51.530892: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-21 09:35:51.530936: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-21 09:35:51.530962: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-21 09:35:51.530990: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-21 09:35:51.531018: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-21 09:35:51.531042: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-21 09:35:51.532130: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-21 09:35:51.532168: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-05-21 09:35:51.532180: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-05-21 09:35:51.532191: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-05-21 09:35:51.533309: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 5719 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0,
compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': 
'0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': 
'0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME mask_coco_origin NUM_CLASSES 81 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES 
(16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE 1
RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD 0.7
RPN_TRAIN_ANCHORS_PER_IMAGE 256
STEPS_PER_EPOCH 1000
TRAIN_ROIS_PER_IMAGE 200
USE_MINI_MASK True
USE_RPN_ROIS True
VALIDATION_STEPS 50
WEIGHT_DECAY 0.0001
model_param file didn't exist
model_name : mask_coco_origin
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files existing in s3 : ['mask_model.h5']
files missing in s3 : []
2025-05-21 09:35:59.163982: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-21 09:35:59.383213: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
local folder : /data/models_weight/mask_coco_origin
/data/models_weight/mask_coco_origin/mask_model.h5
size_local : 257557808 ; size in s3 : 257557808
create time local : 2021-08-09 05:27:17 ; create time in s3 : 2021-08-06 19:45:17
mask_model.h5 already exists and doesn't need to be updated
list_images length : 1
NEW PHOTO
Processing 1 images
image shape: (720, 1280, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 89) min: 0.00000 max: 1280.00000
number of objects found : 4
Detection mask done !
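The recurring "free memory gpu now : N" checks around detection can be sketched by querying `nvidia-smi`. The function names are illustrative assumptions; the script's actual implementation (around "check gpu memory l 3610") is not shown in this log:

```python
import subprocess

def parse_free_mib(nvidia_smi_output):
    """Parse `nvidia-smi --query-gpu=memory.free --format=csv,noheader,nounits`
    output: one integer (MiB) per line, one line per GPU."""
    return [int(line) for line in nvidia_smi_output.strip().splitlines()]

def free_gpu_memory_mib(gpu_index=0):
    """Free memory of one GPU in MiB, like the log's 'free memory gpu now : N'."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.free",
         "--format=csv,noheader,nounits", "-i", str(gpu_index)],
        capture_output=True, text=True, check=True)
    return parse_free_mib(out.stdout)[0]
```

Such a check before launching the model explains the log's `gpu_flag` / `max_wait` loop: the script can wait and retry until enough GPU memory is free.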
Trying to reset tf kernel 1947469
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 813
tf kernel not reset
sub process len(results) : 1 len(list_Values) 0 None
max_time_sub_proc : 3600
parent process len(results) : 1 len(list_Values) 0
process is alive
finished correctly or not : True
after detect
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 1746
list_Values should be empty []
['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush']
time to compute the mask position with numpy : 0.0005469322204589844 ; nb_pixel_total : 16901
time to create 1 rle with old method : 0.02794361114501953 ; length of segment : 107
time to compute the mask position with numpy : 0.023395776748657227 ; nb_pixel_total : 483300
time to create 1 rle with new method : 0.036163330078125 ; length of segment : 633
time to compute the mask position with numpy : 0.0005421638488769531 ; nb_pixel_total : 36642
time to create 1 rle with old method : 0.04208993911743164 ; length of segment : 133
time to compute the mask position with numpy : 0.000125885009765625 ; nb_pixel_total : 4794
time to create 1 rle with old method : 0.0062372684478759766 ; length of segment : 51
time spent for convertir_results : 0.34836435317993164
time spent for datou_step_exec : 17.604896783828735
time spent to save output : 4.029273986816406e-05
total time spent for step 1 : 17.604937076568604
caffe_path_current :
About to save ! 1
Inside saveOutput : final : True verbose : False
eke 12-6-18 : saveMask needs to be cleaned for the new output !
Caught exception ! Connect or reconnect !
Number saved : None
batch 1
Loaded 428 chid ids of type : 445
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Number RLEs to save : 0
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1
time used for this insertion : 0.015354156494140625
save missing photos in datou_result :
After save, about to update current !
datou_cur_ids : []
len(datou.list_steps) : 1
output : {'917855882': [[ ... per-detection RLE run and polygon coordinate data elided ; log truncated ...
810), (70, 434, 809), (70, 435, 808), (71, 436, 806), (71, 437, 805), (71, 438, 804), (71, 439, 804), (71, 440, 803), (71, 441, 802), (71, 442, 801), (71, 443, 801), (72, 444, 799), (72, 445, 799), (73, 446, 797), (73, 447, 797), (73, 448, 796), (74, 449, 795), (74, 450, 794), (75, 451, 793), (75, 452, 792), (76, 453, 791), (76, 454, 790), (76, 455, 789), (77, 456, 788), (77, 457, 787), (77, 458, 787), (78, 459, 785), (78, 460, 785), (78, 461, 784), (79, 462, 782), (79, 463, 782), (79, 464, 781), (80, 465, 779), (80, 466, 778), (80, 467, 778), (81, 468, 776), (81, 469, 775), (81, 470, 774), (82, 471, 772), (82, 472, 771), (82, 473, 769), (83, 474, 767), (83, 475, 766), (83, 476, 765), (84, 477, 762), (84, 478, 761), (85, 479, 759), (85, 480, 758), (85, 481, 757), (86, 482, 756), (86, 483, 755), (86, 484, 754), (87, 485, 752), (87, 486, 751), (87, 487, 751), (88, 488, 749), (88, 489, 748), (88, 490, 748), (89, 491, 746), (89, 492, 746), (90, 493, 744), (90, 494, 744), (91, 495, 742), (92, 496, 741), (92, 497, 740), (93, 498, 739), (94, 499, 737), (95, 500, 736), (95, 501, 735), (96, 502, 734), (97, 503, 732), (98, 504, 731), (99, 505, 729), (100, 506, 727), (101, 507, 726), (102, 508, 724), (103, 509, 723), (104, 510, 721), (105, 511, 720), (107, 512, 717), (108, 513, 715), (109, 514, 714), (111, 515, 711), (112, 516, 709), (113, 517, 708), (115, 518, 705), (116, 519, 703), (117, 520, 701), (119, 521, 698), (120, 522, 696), (121, 523, 694), (123, 524, 691), (124, 525, 689), (126, 526, 686), (127, 527, 684), (128, 528, 681), (130, 529, 678), (131, 530, 676), (133, 531, 673), (134, 532, 672), (136, 533, 669), (137, 534, 667), (139, 535, 664), (141, 536, 661), (142, 537, 660), (144, 538, 657), (145, 539, 655), (147, 540, 653), (149, 541, 650), (150, 542, 649), (152, 543, 646), (154, 544, 644), (155, 545, 642), (157, 546, 640), (158, 547, 638), (160, 548, 636), (162, 549, 633), (163, 550, 632), (165, 551, 629), (166, 552, 628), (168, 553, 625), (169, 554, 624), (171, 
555, 622), (172, 556, 620), (174, 557, 618), (175, 558, 616), (177, 559, 614), (178, 560, 612), (180, 561, 610), (181, 562, 609), (183, 563, 606), (184, 564, 605), (186, 565, 602), (188, 566, 600), (191, 567, 596), (194, 568, 593), (196, 569, 590), (199, 570, 587), (202, 571, 583), (205, 572, 580), (208, 573, 576), (211, 574, 573), (214, 575, 569), (217, 576, 565), (219, 577, 563), (222, 578, 559), (226, 579, 554), (228, 580, 551), (230, 581, 548), (232, 582, 546), (234, 583, 543), (236, 584, 540), (237, 585, 538), (239, 586, 535), (241, 587, 532), (242, 588, 529), (244, 589, 526), (245, 590, 524), (247, 591, 521), (249, 592, 517), (250, 593, 515), (252, 594, 512), (254, 595, 509), (256, 596, 506), (259, 597, 502), (261, 598, 499), (264, 599, 495), (267, 600, 491), (271, 601, 486), (275, 602, 481), (278, 603, 477), (282, 604, 472), (285, 605, 468), (289, 606, 463), (292, 607, 459), (295, 608, 455), (298, 609, 451), (301, 610, 447), (304, 611, 443), (306, 612, 440), (308, 613, 437), (310, 614, 434), (312, 615, 430), (315, 616, 426), (317, 617, 422), (319, 618, 418), (322, 619, 413), (324, 620, 409), (327, 621, 402), (329, 622, 397), (332, 623, 391), (335, 624, 385), (338, 625, 379), (341, 626, 373), (344, 627, 367), (349, 628, 359), (354, 629, 351), (358, 630, 345), (362, 631, 338), (367, 632, 331), (371, 633, 324), (375, 634, 317), (379, 635, 310), (383, 636, 302), (387, 637, 295), (391, 638, 287), (395, 639, 279), (400, 640, 270), (406, 641, 259), (411, 642, 249), (418, 643, 236), (430, 644, 219), (448, 645, 195), (466, 646, 170), (482, 647, 148), (499, 648, 125), (520, 649, 97), (547, 650, 36)], 
['547,650,418,643,344,627,303,610,261,598,235,583,184,564,128,528,95,501,88,490,71,443,64,360,73,314,93,229,132,186,149,173,211,152,301,137,324,127,382,112,439,87,488,46,516,31,537,25,612,21,805,22,860,23,912,21,1011,21,1042,23,1070,34,1092,56,1098,100,1097,156,1088,205,1069,251,1064,279,1050,301,958,366,923,393,895,422,871,442,862,460,837,486,820,517,801,536,781,577,743,614,669,640,616,649']), (917855882, 492601069, 445, 0, 440, 0, 116, 0.9919416, [(127, 1, 141), (94, 2, 206), (384, 2, 2), (59, 3, 273), (340, 3, 57), (22, 4, 381), (19, 5, 387), (16, 6, 392), (15, 7, 394), (14, 8, 396), (14, 9, 397), (13, 10, 399), (12, 11, 400), (12, 12, 400), (11, 13, 402), (10, 14, 403), (11, 15, 403), (11, 16, 404), (12, 17, 403), (12, 18, 404), (12, 19, 405), (12, 20, 405), (12, 21, 406), (12, 22, 406), (12, 23, 407), (12, 24, 407), (12, 25, 408), (12, 26, 408), (12, 27, 408), (12, 28, 408), (12, 29, 409), (12, 30, 409), (12, 31, 409), (12, 32, 409), (12, 33, 409), (12, 34, 410), (12, 35, 410), (12, 36, 410), (12, 37, 410), (12, 38, 410), (12, 39, 410), (12, 40, 410), (12, 41, 411), (12, 42, 411), (12, 43, 411), (12, 44, 411), (12, 45, 411), (12, 46, 410), (12, 47, 410), (12, 48, 410), (12, 49, 410), (12, 50, 410), (12, 51, 410), (12, 52, 409), (12, 53, 408), (12, 54, 408), (12, 55, 407), (12, 56, 406), (12, 57, 404), (12, 58, 403), (11, 59, 403), (11, 60, 402), (11, 61, 401), (11, 62, 400), (11, 63, 400), (11, 64, 399), (11, 65, 398), (11, 66, 397), (11, 67, 397), (11, 68, 396), (11, 69, 395), (11, 70, 395), (11, 71, 394), (11, 72, 394), (11, 73, 394), (11, 74, 393), (11, 75, 393), (11, 76, 393), (11, 77, 393), (11, 78, 393), (11, 79, 393), (11, 80, 392), (10, 81, 394), (10, 82, 394), (10, 83, 395), (9, 84, 396), (9, 85, 262), (279, 85, 126), (9, 86, 75), (98, 86, 28), (142, 86, 117), (292, 86, 112), (9, 87, 71), (152, 87, 103), (294, 87, 110), (8, 88, 68), (161, 88, 91), (296, 88, 107), (8, 89, 63), (176, 89, 73), (297, 89, 106), (7, 90, 61), (205, 90, 40), (298, 90, 104), 
(7, 91, 57), (299, 91, 103), (6, 92, 54), (300, 92, 102), (6, 93, 50), (301, 93, 100), (7, 94, 46), (303, 94, 97), (7, 95, 44), (306, 95, 92), (7, 96, 42), (308, 96, 89), (7, 97, 40), (310, 97, 86), (7, 98, 38), (312, 98, 83), (8, 99, 34), (314, 99, 79), (8, 100, 32), (317, 100, 75), (8, 101, 29), (319, 101, 71), (13, 102, 19), (324, 102, 63), (20, 103, 6), (330, 103, 51), (337, 104, 37), (344, 105, 22), (352, 106, 3)], ['344,105,319,101,301,93,291,85,259,85,244,90,205,90,204,89,176,89,161,88,141,85,126,85,125,86,98,86,84,85,56,92,36,101,26,102,8,101,6,92,11,80,11,59,12,58,12,17,10,14,16,6,22,4,58,4,59,3,93,3,94,2,126,2,127,1,267,1,268,2,331,3,396,3,407,6,411,10,419,25,421,34,421,51,410,62,404,71,402,80,404,85,401,92,394,98,386,102,365,105']), (917855882, 492601069, 445, 390, 550, 0, 54, 0.93914086, [(414, 0, 7), (441, 0, 60), (508, 0, 28), (402, 1, 142), (401, 2, 146), (402, 3, 145), (404, 4, 143), (406, 5, 140), (408, 6, 137), (410, 7, 134), (411, 8, 132), (412, 9, 130), (413, 10, 127), (414, 11, 125), (415, 12, 123), (415, 13, 122), (416, 14, 120), (417, 15, 117), (417, 16, 116), (418, 17, 114), (418, 18, 113), (418, 19, 111), (418, 20, 109), (419, 21, 107), (419, 22, 105), (419, 23, 103), (419, 24, 102), (419, 25, 100), (420, 26, 97), (420, 27, 95), (420, 28, 94), (421, 29, 91), (421, 30, 90), (422, 31, 88), (422, 32, 88), (422, 33, 87), (423, 34, 84), (423, 35, 82), (423, 36, 81), (424, 37, 79), (424, 38, 77), (424, 39, 75), (424, 40, 73), (424, 41, 71), (425, 42, 67), (425, 43, 66), (426, 44, 62), (426, 45, 6), (433, 45, 52), (443, 46, 30), (450, 47, 1)], ['450,47,449,46,443,46,442,45,426,45,424,41,424,37,423,36,422,31,419,25,419,21,418,20,418,17,417,15,409,6,402,3,402,1,413,1,414,0,420,0,421,1,440,1,441,0,500,0,501,1,507,1,508,0,535,0,536,1,543,1,546,2,546,4,542,8,530,18,527,19,525,21,522,22,520,24,512,28,508,33,505,34,502,37,494,41,492,41,490,43,488,43,484,45,473,45,472,46,451,46'])], 
'temp/1747812948_1946083_917855882_da0fa7b7e6b5b551fe26c0ba8713276d.jpg']}
error in position
expected : (917855882, 492601069, 445, 52, 1128, 16, 668, 0.9977477)
got : (917855882, 492601069, 445, 50, 1144, 0, 672, 0.99753404)
ERROR test detect object FAILED
############################### TEST POLYGON ################################
Inside batchDatouExec : verbose : False
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the case tile - detect - glue.
We could keep that dependence, but it is better to keep an order compatible with the step ids when a step has no sons, i.e. a lexical order : (number_son, step_id)
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
List Step Type Loaded in datou : mask_detect
list_input_json : []
origin BF
we have 0 missing photos in the step downloads : photo missing : []
try to delete the photos missing in DB
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
time to download the photos : 0.21096110343933105
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
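The detect-object failure above comes from comparing the expected and actual detection tuples for exact equality, while GPU inference can drift by a few pixels and a fraction of a confidence point between runs and environments. A minimal sketch of a tolerance-aware comparison — the field layout, helper name, and tolerance values are assumptions, not the project's test code:

```python
# Hypothetical sketch: compare detection tuples of the form
# (pid, user_id, hashtag_type, x, y, w, h, score) with tolerances
# instead of exact equality, so small inference drift does not fail the test.

def detections_match(expected, got, pos_tol=20, score_tol=0.005):
    """Return True when the two detections agree within tolerances."""
    if expected[:3] != got[:3]:                 # ids and type must match exactly
        return False
    for e, g in zip(expected[3:7], got[3:7]):   # box coordinates x, y, w, h
        if abs(e - g) > pos_tol:
            return False
    return abs(expected[7] - got[7]) <= score_tol  # confidence score

# The two tuples from the failure above differ by at most 16 px and ~0.0002:
expected = (917855882, 492601069, 445, 52, 1128, 16, 668, 0.9977477)
got      = (917855882, 492601069, 445, 50, 1144, 0, 672, 0.99753404)
```

With these (assumed) tolerances the pair above would be accepted; the tolerance choice is the design decision that trades test strictness against run-to-run GPU nondeterminism.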
Calling datou_exec
Inside datou_exec : verbose : False
number of steps : 1
step1:mask_detect
Wed May 21 09:36:08 2025
VR 17-11-17 : now, only for linear exec dependencies trees, some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Beginning of datou step mask_detect !
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory
haven't enough GPU memory, need / 3000
l 3632 free memory gpu now : 1746
wait 20 seconds
l 3637 free memory gpu now : 1746
max_wait_temp : 1 max_wait : 0 gpu_flag : 0
2025-05-21 09:36:31.771139: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2025-05-21 09:36:31.799314: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493065000 Hz
2025-05-21 09:36:31.801017: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f4710000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2025-05-21 09:36:31.801063: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2025-05-21 09:36:31.807076: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2025-05-21 09:36:31.957950: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x12b59980 initialized for platform CUDA (this does not guarantee that XLA will be used).
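The "check gpu memory … wait 20 seconds" lines above describe a poll-and-retry loop on free GPU memory before launching detection. A hedged sketch of such a loop — the helper names, default thresholds, and the nvidia-smi query are assumptions, not the project's implementation:

```python
# Sketch of a wait-for-GPU-memory loop (assumed helpers, not project code).
import subprocess
import time

def free_gpu_memory_mib(gpu_id=0):
    """Query free memory (MiB) of one GPU via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.free",
         "--format=csv,noheader,nounits", f"--id={gpu_id}"])
    return int(out.decode().strip())

def wait_for_gpu(need_mib=3000, max_wait=3, delay=20,
                 probe=None, sleep=time.sleep):
    """Poll until `probe` reports at least `need_mib` free MiB, retrying
    up to max_wait times with a fixed delay; probe/sleep are injectable
    so the loop can be tested without a GPU."""
    probe = probe or free_gpu_memory_mib
    for _ in range(max_wait + 1):
        if probe() >= need_mib:
            return True
        sleep(delay)
    return False
```

In the run above the probe kept reporting 1746 MiB against a 3000 MiB requirement and the wait budget ran out, which is why TensorFlow started with far less memory than the model needed.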
Devices: 2025-05-21 09:36:31.958010: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-05-21 09:36:31.958798: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-05-21 09:36:31.986010: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-05-21 09:36:31.990758: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-05-21 09:36:31.994182: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-05-21 09:36:31.995591: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-05-21 09:36:31.999601: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-05-21 09:36:32.001098: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-05-21 09:36:32.008423: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-05-21 09:36:32.009457: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-05-21 09:36:32.009563: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-05-21 09:36:32.010128: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-05-21 09:36:32.010145: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-05-21 09:36:32.010154: I 
tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-05-21 09:36:32.010995: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 1369 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-05-21 09:36:32.178210: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-05-21 09:36:32.178359: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-05-21 09:36:32.178389: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-05-21 09:36:32.178416: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-05-21 09:36:32.178440: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-05-21 09:36:32.178474: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-05-21 09:36:32.178497: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-05-21 09:36:32.178522: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-05-21 09:36:32.179354: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-05-21 
09:36:32.180523: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-05-21 09:36:32.180600: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-05-21 09:36:32.180620: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-05-21 09:36:32.180637: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-05-21 09:36:32.180653: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-05-21 09:36:32.180670: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-05-21 09:36:32.180686: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-05-21 09:36:32.180703: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-05-21 09:36:32.181444: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-05-21 09:36:32.181484: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-05-21 09:36:32.181494: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-05-21 09:36:32.181502: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-05-21 09:36:32.182308: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 1369 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, 
compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': 
'0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': 
'0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME mask_coco_origin NUM_CLASSES 81 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES 
(16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE 1
RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD 0.7
RPN_TRAIN_ANCHORS_PER_IMAGE 256
STEPS_PER_EPOCH 1000
TRAIN_ROIS_PER_IMAGE 200
USE_MINI_MASK True
USE_RPN_ROIS True
VALIDATION_STEPS 50
WEIGHT_DECAY 0.0001
model_param file doesn't exist
model_name : mask_coco_origin
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files present in s3 : ['mask_model.h5']
files missing in s3 : []
2025-05-21 09:36:41.202752: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-21 09:36:41.362655: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-21 09:36:43.038931: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-05-21 09:36:43.038990: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-05-21 09:36:43.045480: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-05-21 09:36:43.045504: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
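The "Configurations:" dump above is the standard attribute listing produced by a Matterport-style Mask R-CNN `Config` object: every uppercase class attribute is a hyperparameter, and derived values such as `BATCH_SIZE = GPU_COUNT * IMAGES_PER_GPU` are computed from the others. A toy illustrative sketch of the mechanism, using a few of the values from this run — not the library's actual class:

```python
# Toy stand-in for a Mask R-CNN style config object (illustrative only).
class Config:
    """Uppercase class attributes are hyperparameters; display() renders
    them in the same key/value layout seen in the log above."""
    NAME = "mask_coco_origin"
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1
    IMAGE_MIN_DIM = 640
    IMAGE_MAX_DIM = 640
    DETECTION_MIN_CONFIDENCE = 0.3
    NUM_CLASSES = 81  # 80 COCO classes + background

    @property
    def BATCH_SIZE(self):
        # one image per GPU in this run, hence BATCH_SIZE 1 in the log
        return self.GPU_COUNT * self.IMAGES_PER_GPU

    def display(self):
        """Render every uppercase attribute, mimicking the log layout."""
        return "\n".join("{:30} {}".format(a, getattr(self, a))
                         for a in dir(self) if a.isupper())
```

This is why the dump shows `BATCH_SIZE 1` next to `GPU_COUNT 1` and `IMAGES_PER_GPU 1`: the value is derived, not set directly.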
2025-05-21 09:36:43.097556: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-05-21 09:36:43.097598: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.06GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-05-21 09:36:43.139343: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.09GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-05-21 09:36:43.139376: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.09GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-05-21 09:36:43.185907: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.15GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-05-21 09:36:43.185955: W tensorflow/core/common_runtime/bfc_allocator.cc:245] Allocator (GPU_0_bfc) ran out of memory trying to allocate 2.15GiB with freed_by_count=0. The caller indicates that this is not a failure, but may mean that there could be performance gains if more memory were available.
2025-05-21 09:36:43.212439: W tensorflow/core/common_runtime/bfc_allocator.cc:311] Garbage collection: deallocate free memory regions (i.e., allocations) so that we can re-allocate a larger region to avoid OOM due to memory fragmentation. If you see this message frequently, you are running near the threshold of the available device memory and re-allocation may incur great performance overhead. You may try smaller batch sizes to observe the performance impact. Set TF_ENABLE_GPU_GARBAGE_COLLECTION=false if you'd like to disable this feature.
2025-05-21 09:36:43.232516: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 858.88M (900595712 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-05-21 09:36:43.233453: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 772.99M (810536192 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-05-21 09:36:43.234327: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 695.69M (729482752 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
2025-05-21 09:36:43.235295: I tensorflow/stream_executor/cuda/cuda_driver.cc:763] failed to allocate 626.12M (656534528 bytes) from device: CUDA_ERROR_OUT_OF_MEMORY: out of memory
[the 858.88M (900595712 bytes) CUDA_ERROR_OUT_OF_MEMORY record then repeats dozens of times, timestamps 09:36:43.236 to 09:36:43.482; duplicates elided]
2025-05-21 09:36:43.415712: W tensorflow/core/kernels/gpu_utils.cc:49] Failed to allocate memory for convolution redzone checking; skipping this check. This is benign and only means that we won't check cudnn for out-of-bounds reads and writes. This message will only be printed once.
local folder : /data/models_weight/mask_coco_origin
/data/models_weight/mask_coco_origin/mask_model.h5 size_local : 257557808 size in s3 : 257557808 create time local : 2021-08-09 05:27:17 create time in s3 : 2021-08-06 19:45:17
mask_model.h5 already exists and did not need updating
list_images length : 1
NEW PHOTO
Processing 1 images
image shape: (2448, 2448, 3) min: 0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 89) min: 0.00000 max: 2448.00000
number of objects found : 1
Detection mask done! Trying to reset tf kernel 1949565
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 553
tf kernel not reset
sub process len(results) : 1 len(list_Values) 0 None
max_time_sub_proc : 3600
parent process len(results) : 1 len(list_Values) 0
process is alive; finished correctly or not : True
after detect, begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 1746
list_Values should be empty []
['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable',
'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush']
time to compute the mask position with numpy : 0.4695861339569092
nb_pixel_total : 3697104
time to create 1 rle with new method : 0.6404023170471191
length of segment : 2044
time spent for convertir_results : 2.163172721862793
time spent for datou_step_exec : 41.34025812149048
time spent to save output : 5.221366882324219e-05
total time spent for step 1 : 41.3403103351593
caffe_path_current :
About to save ! 1
Inside saveOutput : final : True verbose : False
eke 12-6-18 : saveMask needs to be cleaned for new output !
Caught exception ! Connect or reconnect !
Number saved : None
batch 1
Loaded 722 chid ids of type : 445
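The step above logs "time to create 1 rle with new method" and a segment length of 2044, and the result dump that follows contains per-row run triples. As an illustration only (the project's actual RLE layout and its "new method" are not documented in this log, so the (col_start, row, run_length) interpretation is an assumption), a minimal NumPy run-length encoding of a binary mask:

```python
import numpy as np

def mask_to_rle_rows(mask: np.ndarray) -> list[tuple[int, int, int]]:
    """Encode a 2-D binary mask as (col_start, row, run_length) triples,
    one triple per run of consecutive 1s in each row. This triple format
    is hypothetical, loosely modeled on the dump in the log above."""
    runs = []
    for row_idx, row in enumerate(mask):
        # Pad with zeros so runs touching the borders are detected too.
        padded = np.concatenate(([0], row.astype(np.int8), [0]))
        diff = np.diff(padded)
        starts = np.flatnonzero(diff == 1)   # 0 -> 1 transitions
        ends = np.flatnonzero(diff == -1)    # 1 -> 0 transitions
        for s, e in zip(starts, ends):
            runs.append((int(s), row_idx, int(e - s)))
    return runs

mask = np.array([[0, 1, 1, 0],
                 [1, 1, 1, 1]])
print(mask_to_rle_rows(mask))  # → [(1, 0, 2), (0, 1, 4)]
```

Working on the padded difference array keeps the inner loop vectorized per row, which is the usual reason an RLE "with numpy" is faster than a pure-Python scan.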
+++++++Number RLEs to save : 0 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.019553422927856445 save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'917877156': [[(917877156, 492601069, 445, 0, 2283, 103, 2222, 0.9817535, [(1251, 110, 28), (653, 111, 289), (1203, 111, 134), (614, 112, 378), (1080, 112, 348), (526, 113, 909), (519, 114, 923), (512, 115, 936), (505, 116, 949), (499, 117, 961), (493, 118, 972), (487, 119, 984), (481, 120, 995), (476, 121, 1005), (471, 122, 1015), (466, 123, 1024), (461, 124, 1034), (456, 125, 1043), (451, 126, 1053), (447, 127, 1061), (443, 128, 1073), (439, 129, 1086), (435, 130, 1099), (431, 131, 1111), (427, 132, 1124), (423, 133, 1136), (420, 134, 1147), (416, 135, 1158), (413, 136, 1169), (410, 137, 1179), (406, 138, 1187), (403, 139, 1193), (400, 140, 1200), (397, 141, 1206), (394, 142, 1212), (391, 143, 1218), (388, 144, 1225), (385, 145, 1231), (382, 146, 1238), (379, 147, 1245), (375, 148, 1253), (372, 149, 1260), (368, 150, 1268), (365, 151, 1275), (363, 152, 1281), (361, 153, 1288), (358, 154, 1295), (356, 155, 1302), (353, 156, 1310), (350, 157, 1318), (348, 158, 1325), (345, 159, 1333), (342, 160, 1342), (339, 161, 1350), (336, 162, 1359), (334, 163, 1367), (330, 164, 1377), (327, 165, 1386), (324, 166, 1395), (321, 167, 1405), (318, 168, 1415), (314, 169, 1426), (311, 170, 1437), (307, 171, 1449), (303, 172, 1461), (300, 173, 1473), (296, 174, 1485), (292, 175, 1498), (288, 176, 1511), (284, 177, 1524), (281, 178, 1536), (277, 179, 1549), (274, 180, 1561), (271, 181, 1568), (269, 182, 1575), (266, 183, 1582), (263, 184, 1589), (260, 185, 1596), (258, 186, 1602), (255, 187, 1608), (253, 188, 1614), (250, 189, 1620), (248, 190, 1625), (246, 191, 1631), (243, 192, 1637), (241, 193, 1642), (239, 194, 1647), (237, 195, 1651), (235, 196, 1656), (233, 197, 1661), (231, 
198, 1665), (229, 199, 1670), (227, 200, 1674), (225, 201, 1679), (223, 202, 1683), (222, 203, 1686), (220, 204, 1690), (218, 205, 1694), (217, 206, 1697), (215, 207, 1701), (213, 208, 1705), (212, 209, 1707), (210, 210, 1710), (209, 211, 1712), (207, 212, 1715), (206, 213, 1718), (204, 214, 1721), (203, 215, 1723), (202, 216, 1725), (200, 217, 1728), (199, 218, 1730), (198, 219, 1732), (197, 220, 1734), (195, 221, 1737), (194, 222, 1739), (193, 223, 1741), (191, 224, 1744), (190, 225, 1746), (189, 226, 1748), (187, 227, 1751), (186, 228, 1753), (185, 229, 1755), (183, 230, 1758), (182, 231, 1760), (180, 232, 1763), (179, 233, 1765), (178, 234, 1767), (176, 235, 1770), (175, 236, 1772), (173, 237, 1776), (172, 238, 1778), (170, 239, 1781), (169, 240, 1783), (167, 241, 1786), (166, 242, 1789), (164, 243, 1792), (163, 244, 1794), (161, 245, 1798), (160, 246, 1800), (158, 247, 1803), (156, 248, 1806), (155, 249, 1809), (153, 250, 1812), (151, 251, 1816), (150, 252, 1818), (148, 253, 1821), (146, 254, 1825), (145, 255, 1827), (143, 256, 1831), (141, 257, 1834), (140, 258, 1837), (138, 259, 1840), (136, 260, 1844), (134, 261, 1848), (132, 262, 1851), (131, 263, 1854), (129, 264, 1858), (127, 265, 1861), (125, 266, 1865), (123, 267, 1869), (122, 268, 1871), (120, 269, 1875), (119, 270, 1878), (118, 271, 1881), (117, 272, 1883), (116, 273, 1885), (114, 274, 1889), (113, 275, 1891), (112, 276, 1893), (111, 277, 1895), (110, 278, 1897), (109, 279, 1899), (108, 280, 1901), (107, 281, 1904), (106, 282, 1906), (106, 283, 1907), (105, 284, 1908), (104, 285, 1910), (103, 286, 1912), (102, 287, 1914), (101, 288, 1916), (101, 289, 1917), (100, 290, 1919), (99, 291, 1921), (99, 292, 1921), (98, 293, 1923), (98, 294, 1923), (97, 295, 1925), (97, 296, 1925), (97, 297, 1926), (96, 298, 1927), (96, 299, 1928), (96, 300, 1928), (95, 301, 1930), (95, 302, 1930), (94, 303, 1932), (94, 304, 1932), (94, 305, 1933), (93, 306, 1934), (93, 307, 1935), (92, 308, 1936), (92, 309, 1937), (92, 
310, 1937), (91, 311, 1939), (91, 312, 1939), (91, 313, 1940), (90, 314, 1941), (90, 315, 1942), (90, 316, 1942), (89, 317, 1944), (89, 318, 1944), (88, 319, 1946), (88, 320, 1946), (88, 321, 1947), (87, 322, 1948), (87, 323, 1949), (87, 324, 1949), (86, 325, 1951), (86, 326, 1951), (86, 327, 1952), (85, 328, 1953), (85, 329, 1954), (85, 330, 1955), (84, 331, 1956), (84, 332, 1957), (84, 333, 1957), (83, 334, 1959), (83, 335, 1959), (83, 336, 1960), (82, 337, 1961), (82, 338, 1962), (81, 339, 1963), (81, 340, 1964), (81, 341, 1964), (80, 342, 1966), (80, 343, 1966), (80, 344, 1967), (79, 345, 1968), (79, 346, 1969), (79, 347, 1969), (78, 348, 1971), (78, 349, 1971), (78, 350, 1972), (78, 351, 1972), (77, 352, 1974), (77, 353, 1974), (77, 354, 1975), (76, 355, 1976), (76, 356, 1977), (76, 357, 1977), (75, 358, 1979), (75, 359, 1979), (75, 360, 1980), (74, 361, 1981), (74, 362, 1982), (74, 363, 1983), (73, 364, 1984), (73, 365, 1985), (73, 366, 1985), (72, 367, 1987), (72, 368, 1987), (72, 369, 1988), (72, 370, 1989), (71, 371, 1990), (71, 372, 1991), (71, 373, 1992), (71, 374, 1993), (70, 375, 1995), (70, 376, 1995), (70, 377, 1996), (70, 378, 1997), (69, 379, 1999), (69, 380, 2000), (69, 381, 2001), (69, 382, 2002), (68, 383, 2004), (68, 384, 2005), (68, 385, 2006), (68, 386, 2007), (67, 387, 2009), (67, 388, 2010), (67, 389, 2011), (66, 390, 2013), (66, 391, 2015), (66, 392, 2016), (66, 393, 2017), (65, 394, 2019), (65, 395, 2020), (65, 396, 2021), (64, 397, 2023), (64, 398, 2024), (64, 399, 2025), (64, 400, 2026), (63, 401, 2028), (63, 402, 2029), (63, 403, 2030), (62, 404, 2032), (62, 405, 2032), (62, 406, 2033), (61, 407, 2035), (61, 408, 2036), (61, 409, 2037), (60, 410, 2038), (60, 411, 2039), (60, 412, 2040), (59, 413, 2042), (59, 414, 2042), (59, 415, 2043), (58, 416, 2045), (58, 417, 2045), (58, 418, 2046), (57, 419, 2047), (57, 420, 2048), (56, 421, 2050), (56, 422, 2050), (56, 423, 2051), (55, 424, 2052), (55, 425, 2053), (55, 426, 2054), (54, 427, 
2055), (54, 428, 2056), (53, 429, 2057), (53, 430, 2058), (53, 431, 2058), (52, 432, 2060), (52, 433, 2060), (51, 434, 2062), (51, 435, 2062), (51, 436, 2063), (50, 437, 2064), (50, 438, 2065), (49, 439, 2066), (49, 440, 2066), (49, 441, 2067), (48, 442, 2068), (48, 443, 2069), (47, 444, 2070), (47, 445, 2070), (47, 446, 2070), (47, 447, 2071), (46, 448, 2072), (46, 449, 2072), (46, 450, 2072), (46, 451, 2072), (46, 452, 2073), (45, 453, 2074), (45, 454, 2074), (45, 455, 2074), (45, 456, 2074), (44, 457, 2076), (44, 458, 2076), (44, 459, 2076), (44, 460, 2076), (43, 461, 2077), (43, 462, 2078), (43, 463, 2078), (43, 464, 2078), (42, 465, 2079), (42, 466, 2079), (42, 467, 2080), (42, 468, 2080), (41, 469, 2081), (41, 470, 2081), (41, 471, 2082), (41, 472, 2082), (40, 473, 2083), (40, 474, 2083), (40, 475, 2083), (40, 476, 2084), (39, 477, 2085), (39, 478, 2085), (39, 479, 2085), (39, 480, 2086), (38, 481, 2087), (38, 482, 2087), (38, 483, 2087), (38, 484, 2087), (37, 485, 2089), (37, 486, 2089), (37, 487, 2089), (37, 488, 2089), (36, 489, 2091), (36, 490, 2091), (36, 491, 2091), (36, 492, 2091), (36, 493, 2092), (35, 494, 2093), (35, 495, 2093), (35, 496, 2093), (35, 497, 2094), (34, 498, 2095), (34, 499, 2095), (34, 500, 2095), (34, 501, 2096), (34, 502, 2096), (33, 503, 2097), (33, 504, 2097), (33, 505, 2098), (33, 506, 2098), (33, 507, 2098), (32, 508, 2099), (32, 509, 2100), (32, 510, 2100), (32, 511, 2100), (32, 512, 2100), (31, 513, 2102), (31, 514, 2102), (31, 515, 2102), (31, 516, 2102), (31, 517, 2103), (30, 518, 2104), (30, 519, 2104), (30, 520, 2104), (30, 521, 2105), (30, 522, 2105), (30, 523, 2105), (30, 524, 2105), (29, 525, 2106), (29, 526, 2106), (29, 527, 2107), (29, 528, 2107), (29, 529, 2107), (29, 530, 2107), (29, 531, 2107), (29, 532, 2107), (29, 533, 2108), (28, 534, 2109), (28, 535, 2109), (28, 536, 2109), (28, 537, 2109), (28, 538, 2110), (28, 539, 2110), (28, 540, 2110), (28, 541, 2110), (28, 542, 2110), (28, 543, 2111), (27, 544, 2112), 
[... ~1,900 (count, x, y) polygon-point tuples, the encoded contour string, and the temp file path 'temp/1747812967_1946083_917877156_a9c2d4b99270c9302def4ed40606e685.jpg' elided from the mask-detection output ...]
nb pixel non reg : 3692295
nb pixel common : 3678100
proportion of common points : 0.9961555076178907
[('test release memory', 'SUCCESS', True), ('test detect objet', "ERROR in object's position", False), ('test polygone', 'SUCCESS', True)]
res_total : False
#&_# TEST FAILED #&_# : tests/mask_test #&_#
/home/admin/workarea/git/Velours/python/tests/python_tests.py refs/heads/master_f651ad69392893e66d2fa3324c867782c5710d2e
SQL :INSERT INTO MTRAdmin.monitor_sys (name, type, server, version_code, result_str, result_bool, lien , test_group ,test_name) VALUES ('python_test3','1','marlene','refs/heads/master_f651ad69392893e66d2fa3324c867782c5710d2e','{"mask_detection": "fail"}','0','http://marlene.fotonower-preprod.com/job/2025/May/21052025/python_test3//data_2/data_log/job/2025/May/21052025/python_test3/log-python3----short_python3--v--marlene-09:35:01.txt','mask_detection','unknown');
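The "proportion of common points" above is the share of pixels that agree with the non-regression reference mask. A minimal sketch of that metric, using the counts from the log (the function name is hypothetical, not from the codebase):

```python
def common_pixel_ratio(nb_common, nb_ref):
    """Share of pixels that agree with the non-regression reference mask."""
    return nb_common / nb_ref

# Values taken from the log lines above.
ratio = common_pixel_ratio(3678100, 3692295)
print(ratio)  # ~0.99616, matching the logged proportion
```

Note the overall test still fails here: the ratio passes, but the object-position check ('test detect objet') returns False, which drives res_total to False.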
#&_# END OF TEST #&_# : tests/mask_test #&_#
#&_# BEGIN OF TEST : tests/datou_test #&_#
/home/admin/workarea/git/Velours/python/tests/datou_test.py Datou All Test
python version used : 3
############################### TEST sam ################################
TEST SAM
Inside batchDatouExec : verbose : True
##### loading datou
SELECT name, created_at,limit_max FROM MTRDatou.mtr_datou WHERE id=4573
SELECT mtd.id, mtdt.`type`, mtd.`param`, mtd.param_json, mtdt.nb_input, mtdt.nb_output, mtdt.prod, mtdt.is_local, mtdt.is_datou_depend, mtdt.is_photo_id_local FROM MTRDatou.mtr_datou_step mtd, MTRDatou.mtr_datou_step_types mtdt WHERE mtdt.`id`=mtd.`type` AND mtd.mtd_id=4573
SELECT mtd.id, mtd.mtd_id, mdsdt.id, mdsdt.name, mdsdt.description, msid.output_or_input, msid.data_order_id, mdsdt.type FROM MTRDatou.mtr_datou_step mtd, MTRDatou.mtr_datou_steptype_io_datatypes msid, MTRDatou.mtr_datou_step_data_types mdsdt WHERE mtd.`type`=msid.`mtr_datou_step_type` AND mtd.mtd_id= 4573 AND msid.data_type=mdsdt.id
SELECT mts_id_output, id_output, mts_id_input, id_input FROM MTRDatou.mtr_datou_step_by_step WHERE mtd_id=4573
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked; now we need to re-order the steps !
We currently have an error because there is no dependence on the last step in the tile - detect - glue case.
We can either keep the dependence or, when steps have no sons, keep an order compatible with the step ids, i.e. a lexical order: (number_son, step_id)
DONE and to test : checkNoCycle !
We are managing only one step so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step so we do not consider checkConsistencyTypeOutputInput !
no param json to modify
List Step Type Loaded in datou : sam
list_input_json : []
##### end of datou loading
##### data loading
##### Call load_data_input : nb_thread : 5
origin
SELECT photo_id, url FROM MTRBack.photos ph WHERE photo_id IN (1189321094)
Found this number of photos: 1
##### Call download_photos : nb_thread : 5
begin to download photo : 1189321094
download finish for photo 1189321094
we have 0 missing photos in the download step : photo missing : []
try to delete the photos missing in DB
##### After download_photos
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
##### After load_data_input
time to download the photos : 0.18970298767089844
#### end of data loading
Blocking on flush ? No, continuing
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
WARNING : we have an input that is not a photo, we should get rid of it
Calling datou_exec
Inside datou_exec : verbose : True
number of steps : 1
step1:sam
Wed May 21 09:36:55 2025
VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
After prepare type args :
Here we display some params of map_info !
map_filenames : {'temp/1747813015_1946083_1189321094_9626af7f95d010f2a4fd524688d4ea22_76896585.png': 1189321094}
map_photo_id_path_extension : {1189321094: {'path': 'temp/1747813015_1946083_1189321094_9626af7f95d010f2a4fd524688d4ea22_76896585.png', 'extension': 'png'}}
map_subphoto_mainphoto : {}
Beginning of datou step sam !
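The map_photo_id_path_extension structure logged above pairs each photo_id with its downloaded temp path and file extension, derived from map_filenames. A minimal sketch of how such a map could be built (the helper name and example path are hypothetical, not from the codebase):

```python
import os

def build_photo_map(map_filenames):
    """map_filenames: {temp_path: photo_id} -> {photo_id: {'path': ..., 'extension': ...}}"""
    out = {}
    for path, photo_id in map_filenames.items():
        # splitext keeps the leading dot, so strip it to match the logged format.
        ext = os.path.splitext(path)[1].lstrip(".")
        out[photo_id] = {"path": path, "extension": ext}
    return out

m = build_photo_map({"temp/example_1189321094.png": 1189321094})
print(m)  # {1189321094: {'path': 'temp/example_1189321094.png', 'extension': 'png'}}
```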
pht : 4677
Inside sam : nb paths : 1
(640, 960, 3)
ERROR in datou_step_exec, will save and exit !
CUDA out of memory. Tried to allocate 768.00 MiB (GPU 0; 10.76 GiB total capacity; 443.59 MiB already allocated; 230.88 MiB free; 530.00 MiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2329, in datou_exec
    output = datou_step_exec(sNext, args, cache, context, map_info, verbose, mtr_user_id)
File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2430, in datou_step_exec
    return lib_process.datou_step_sam(param, json_param, args, cache, context, map_info, verbose)
File "/home/admin/workarea/git/Velours/python/mtr/datou/lib_step_exec/lib_step_process.py", line 396, in datou_step_sam
    masks = mask_generator.generate(image)
File "/home/admin/.local/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
File "/home/admin/workarea/install/segment-anything/segment_anything/automatic_mask_generator.py", line 163, in generate
    mask_data = self._generate_masks(image)
File "/home/admin/workarea/install/segment-anything/segment_anything/automatic_mask_generator.py", line 206, in _generate_masks
    crop_data = self._process_crop(image, crop_box, layer_idx, orig_size)
File "/home/admin/workarea/install/segment-anything/segment_anything/automatic_mask_generator.py", line 236, in _process_crop
    self.predictor.set_image(cropped_im)
File "/home/admin/workarea/install/segment-anything/segment_anything/predictor.py", line 60, in set_image
    self.set_torch_image(input_image_torch, image.shape[:2])
File "/home/admin/.local/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
File "/home/admin/workarea/install/segment-anything/segment_anything/predictor.py", line 89, in set_torch_image
    self.features = self.model.image_encoder(input_image)
File "/home/admin/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
File "/home/admin/workarea/install/segment-anything/segment_anything/modeling/image_encoder.py", line 112, in forward
    x = blk(x)
File "/home/admin/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
File "/home/admin/workarea/install/segment-anything/segment_anything/modeling/image_encoder.py", line 174, in forward
    x = self.attn(x)
File "/home/admin/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
File "/home/admin/workarea/install/segment-anything/segment_anything/modeling/image_encoder.py", line 231, in forward
    attn = (q * self.scale) @ k.transpose(-2, -1)
[1189321094]
map_info['map_portfolio_photo'] : {}
final : True
mtd_id 4573
list_pids : [1189321094]
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 1
insert ignore into MTRPhoto.mtr_datou_result (mtd_id, mtr_portfolio_id,mtr_photo_id,result,result_long,result_double,hashtag_id,proba, mtr_current_id) values (%s,%s,%s,%s,%s,%s,%s,%s,%s) on duplicate key update mtr_portfolio_id = mtr_portfolio_id
list_values : [('4573', None, '1189321094', "[>, , , , , 'CUDA out of memory. Tried to allocate 768.00 MiB (GPU 0; 10.76 GiB total capacity; 443.59 MiB already allocated; 230.88 MiB free; 530.00 MiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.
See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF']", '-1', '-1.0', '501120777', '1.0', None)]
time used for this insertion : 0.016500234603881836
save_final
ERROR in last step sam, CUDA out of memory. Tried to allocate 768.00 MiB (GPU 0; 10.76 GiB total capacity; 443.59 MiB already allocated; 230.88 MiB free; 530.00 MiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
time spent for datou_step_exec : 6.970123529434204
time spent to save output : 0.02746725082397461
total time spent for step 0 : 6.997590780258179
need to delete datou_research and reload, so keep current state 1
need to delete datou_research and reload, so keep current state 1
need to delete datou_research and reload, so keep current state 1
caffe_path_current :
About to save ! 2
After save, about to update current !
datou_cur_ids : []
len(datou.list_steps) : 1
output : None
ERROR nb objects expected : 98 ; nb objects detected : 0
ERROR sam FAILED
############################### TEST frcnn ################################
test frcnn
Inside batchDatouExec : verbose : True
##### loading datou
SELECT name, created_at, limit_max FROM MTRDatou.mtr_datou WHERE id=4184
SELECT mtd.id, mtdt.`type`, mtd.`param`, mtd.param_json, mtdt.nb_input, mtdt.nb_output, mtdt.prod, mtdt.is_local, mtdt.is_datou_depend, mtdt.is_photo_id_local FROM MTRDatou.mtr_datou_step mtd, MTRDatou.mtr_datou_step_types mtdt WHERE mtdt.`id`=mtd.`type` AND mtd.mtd_id=4184
SELECT mtd.id, mtd.mtd_id, mdsdt.id, mdsdt.name, mdsdt.description, msid.output_or_input, msid.data_order_id, mdsdt.type FROM MTRDatou.mtr_datou_step mtd, MTRDatou.mtr_datou_steptype_io_datatypes msid, MTRDatou.mtr_datou_step_data_types mdsdt WHERE mtd.`type`=msid.`mtr_datou_step_type` AND mtd.mtd_id=4184 AND msid.data_type=mdsdt.id
SELECT mts_id_output, id_output, mts_id_input, id_input FROM
MTRDatou.mtr_datou_step_by_step WHERE mtd_id=4184
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case
We can either keep the dependence or, better, keep an order compatible with the ids of the steps when there are no sons, so a lexical order : (number_son, step_id)
DONE, and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
no param json to modify
List Step Type Loaded in datou : frcnn
list_input_json : []
##### end of datou loading
##### loading data
##### Call load_data_input : nb_thread : 5
origin
SELECT photo_id, url FROM MTRBack.photos ph WHERE photo_id IN (917754606)
Found this number of photos: 1
##### Call download_photos : nb_thread : 5
begin to download photo : 917754606
download finished for photo 917754606
we have 0 missing photos in the download step ; photos missing : []
try to delete the missing photos in DB
##### After download_photos
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
##### After load_data_input
time to download the photos : 0.15611815452575684
#### end of data loading
Blocking on flush ? No, continuing
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
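The graph check described above (cycle detection via `checkNoCycle`, then ordering steps lexically by `(number_son, step_id)` where no dependence constrains them) can be sketched as a Kahn topological sort. The data shapes below are assumptions for illustration, not the real `mtr_datou_step` schema:

```python
def order_steps(steps, edges):
    """steps: iterable of step ids; edges: list of (parent_id, child_id)
    dependencies. Returns an execution order; ties among ready steps are
    broken by (number_of_sons, step_id), as the log's comment describes.
    Raises ValueError on a cycle (the checkNoCycle case)."""
    sons = {s: 0 for s in steps}        # number of children per step
    indeg = {s: 0 for s in steps}       # unmet dependencies per step
    children = {s: [] for s in steps}
    for parent, child in edges:
        children[parent].append(child)
        sons[parent] += 1
        indeg[child] += 1
    ordered = []
    ready = [s for s in steps if indeg[s] == 0]
    while ready:
        # Lexical tie-break: (number_son, step_id)
        ready.sort(key=lambda s: (sons[s], s))
        s = ready.pop(0)
        ordered.append(s)
        for c in children[s]:
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)
    if len(ordered) != len(sons):
        raise ValueError("cycle detected in datou step graph")
    return ordered
```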
Calling datou_exec
Inside datou_exec : verbose : True
number of steps : 1
step1:frcnn
Wed May 21 09:37:02 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : we now test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases
VR 22-3-18 : but we use the first code path for the first step, id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage the case when we are at the first step here, instead of building this step before datou_exec
After prepare type args :
Here we display some params of map_info !
map_filenames : {'temp/1747813022_1946083_917754606_35f3c9ae49686a6be16030c6ec25c9ee.jpg': 917754606}
map_photo_id_path_extension : {917754606: {'path': 'temp/1747813022_1946083_917754606_35f3c9ae49686a6be16030c6ec25c9ee.jpg', 'extension': 'jpg'}}
map_subphoto_mainphoto : {}
Beginning of datou step Faster rcnn !
classes : ['background', 'plaque']
pht : 4370
caffemodel_name (should be vgg16_immat_307 but not used because the net is loaded outside the function) : {'id': 3375, 'mtr_user_id': 31, 'name': 'detection_plaque_valcor_010622', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,plaque', 'svm_portfolios_learning': '0,0', 'photo_hashtag_type': 4370, 'photo_desc_type': 5676, 'type_classification': 'caffe_faster_rcnn', 'hashtag_id_list': '0,0'}
To loadFromThcl()
model_param file didn't exist
model_name : detection_plaque_valcor_010622
model_type : caffe_faster_rcnn
list of files needed : ['caffemodel', 'test.prototxt']
files present in s3 : ['caffemodel', 'test.prototxt']
files missing in s3 : []
WARNING: Logging before InitGoogleLogging() is written to STDERR
F0521 09:37:05.603628 1946083 syncedmem.cpp:71] Check failed: error == cudaSuccess (2 vs.
0) out of memory
*** Check failure stack trace: ***
Command terminated by signal 6
37.40user 26.18system 1:40.99elapsed 62%CPU (0avgtext+0avgdata 3510592maxresident)k
3520224inputs+4648outputs (7968major+3083064minor)pagefaults 0swaps
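Both the SAM step and the Caffe frcnn step above die on GPU out-of-memory, and the PyTorch message itself suggests setting `max_split_size_mb`. A minimal sketch of how the harness could apply that hint and retry a failed step once, assuming the env var is set before the first CUDA allocation; `step_fn` is a hypothetical stand-in for `datou_step_exec` (PyTorch raises CUDA OOM as a `RuntimeError` whose message contains "out of memory"):

```python
import os

# Allocator hint from the OOM message above; must be set before torch
# initializes CUDA, e.g. at the top of script_for_cron.py.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:128")

def run_step_with_oom_retry(step_fn, payload, retries=1):
    """Run a GPU step, retrying after a CUDA OOM instead of failing the
    whole datou run. Non-OOM errors are re-raised immediately."""
    for attempt in range(retries + 1):
        try:
            return step_fn(payload)
        except RuntimeError as err:
            if "out of memory" not in str(err) or attempt == retries:
                raise
            # In the real pipeline we would call torch.cuda.empty_cache()
            # and/or downscale the input image here before retrying.
```

Whether a retry can succeed depends on what else holds GPU memory; the log shows only ~230 MiB free against a 768 MiB allocation, so freeing the cache or shrinking the input would be needed in practice.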