python /home/admin/mtr/script_for_cron.py -j python_test3 -m 12 -a ' --short_python3 -v ' -s python_test3 -M 0 -S 0 -U 100,100,120
import MySQLdb succeeded
Import error (python version)
python version = 3
warning: we can't find thcl infos in json_data
warning: we can't find pdt infos in json_data
list_job_run_as_list : ['mask_detection', 'datou', 'CacheModelData_queries', 'CachePhotoData_queries', 'test_fork', 'prepare_maskdata', 'portfolio_queries', 'sla_mensuel']
python version used : 3
liste_fichiers : [('tests/mask_test', True, 'Test mask-detection', 'mask_detection'), ('tests/datou_test', True, 'Datou All Test', 'datou', 'all'), ('mtr/database_queries/CacheModelData_queries', True, 'Test Cache Model Data', 'CacheModelData_queries'), ('tests/cache_photo_data_test', True, 'Test local_cache_photo', 'CachePhotoData_queries'), ('mtr/mask_rcnn/prepare_maskdata', True, 'test prepare mask data', 'prepare_maskdata', 'all'), ('mtr/database_queries/portfolio_queries', True, 'test portfolio queries', 'portfolio_queries'), ('prod/memo/memo', True, 'SLA Mensuel', 'sla_mensuel', 'all')]

#&_# BEGIN OF TEST : tests/mask_test #&_#
/home/admin/workarea/git/Velours/python/tests/mask_test.py Test mask-detection
python version used : 3

############################### TEST memory used ################################
free memory at beginning:
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 7035
run mask_detect
Inside batchDatouExec : verbose : False
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps.
Tree built and checked for cycles; now we need to re-order the steps.
We currently get an error because there is no dependence between the last steps in the tile - detect - glue case.
We could keep that dependence, but it is better to keep an order compatible with the step ids when a step has no sons, hence a lexical order on (number_son, step_id).
DONE and to test : checkNoCycle !
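The re-ordering described above (cycle check first, then a lexical order on (number_son, step_id)) could be sketched roughly as follows. This is a minimal illustration, not the actual datou data structures: the `deps` mapping and the function name `order_steps` are hypothetical, and the tie-break key is my reading of "(number_son, step_id)" as (number of dependants, step id).

```python
# Hypothetical sketch of the step re-ordering described above: a step becomes
# "ready" once all its prerequisites are placed; ties among ready steps are
# broken by the lexical key (number of sons still waiting on it, step id).
def order_steps(deps):
    """deps maps step_id -> set of prerequisite step_ids."""
    remaining = dict(deps)
    ordered = []
    while remaining:
        done = set(ordered)
        ready = [s for s, pre in remaining.items() if pre <= done]
        if not ready:
            # every remaining step still waits on another remaining step
            raise ValueError("cycle detected in step graph (checkNoCycle)")
        ready.sort(key=lambda s: (sum(s in p for p in remaining.values()), s))
        nxt = ready[0]
        ordered.append(nxt)
        del remaining[nxt]
    return ordered
```

For a linear chain like the tile - detect - glue case mentioned in the log, this degenerates to the expected linear execution order.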
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
List Step Type Loaded in datou : mask_detect
list_input_json : []
origin BF
we have 0 missing photos in the step downloads : photo missing : []
try to delete the photos missing in DB
length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1
time to download the photos : 0.10530233383178711
About to test input to load
we should then remove the video here; this would fix the bug of datou_current !
Calling datou_exec
Inside datou_exec : verbose : False
number of steps : 1
step1:mask_detect Wed May 7 15:35:27 2025
VR 17-11-17 : for now, only for linear exec dependency trees; some output goes to fill the input of the next step.
VR 22-3-18 : we now test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases.
VR 22-3-18 : but we use the first code path for the first step (id = -1), built in the code of datou_exec.
VR 22-3-18 : we should manage the first-step case here instead of building that step before datou_exec.
Beginning of datou step mask_detect !
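The repeated "free memory gpu now" checks in this log could be implemented by shelling out to nvidia-smi. A minimal sketch, assuming nvidia-smi is on the PATH; the query flags are standard nvidia-smi options, but the function names here are hypothetical and not taken from the codebase:

```python
import subprocess

def parse_free_mib(csv_output):
    """Parse the output of
    `nvidia-smi --query-gpu=memory.free --format=csv,noheader,nounits`:
    one integer (MiB) per line, one line per GPU."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def gpu_free_memory():
    """Free memory in MiB per visible GPU; empty list if nvidia-smi is unavailable."""
    try:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.free",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    except (OSError, subprocess.CalledProcessError):
        return []
    return parse_free_mib(out)
```

On the machine in this log, the first element would correspond to the "free memory gpu now : 7035" readings.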
save_polygon : True
begin detect
begin to check gpu status
inside check gpu memory l 3637
free memory gpu now : 7035
max_wait_temp : 1
max_wait : 0
gpu_flag : 0
/home/admin/workarea/git/Velours/python/tests/python_tests.py:11: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
  import imp
2025-05-07 15:35:30.044963: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2025-05-07 15:35:30.071418: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493065000 Hz
2025-05-07 15:35:30.073402: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f1774000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2025-05-07 15:35:30.073480: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2025-05-07 15:35:30.078322: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2025-05-07 15:35:30.347678: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x19443fc0 initialized for platform CUDA (this does not guarantee that XLA will be used).
Devices:
2025-05-07 15:35:30.347740: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5
2025-05-07 15:35:30.348488: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-07 15:35:30.348926: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-07 15:35:30.351490: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-07 15:35:30.354250: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-07 15:35:30.354705: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-07 15:35:30.358805: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-07 15:35:30.360423: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-07 15:35:30.365439: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-07 15:35:30.366723: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-07 15:35:30.366815: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-07 15:35:30.367563: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-05-07 15:35:30.367584: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-05-07 15:35:30.367594: I
tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-05-07 15:35:30.368735: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6470 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead.
Inside mask_sub_process
Inside mask_detect
About to load cache.load_thcl_param
To do loadFromThcl(), then load ParamDescType : thcl454
thcls : [{'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}]
thcl {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live':
b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 3473 FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 
'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 
'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] 
DETECTION_MAX_INSTANCES 100
DETECTION_MIN_CONFIDENCE 0.3
DETECTION_NMS_THRESHOLD 0.3
GPU_COUNT 1
IMAGES_PER_GPU 1
IMAGE_MAX_DIM 640
IMAGE_MIN_DIM 640
IMAGE_PADDING True
IMAGE_SHAPE [640 640 3]
LEARNING_MOMENTUM 0.9
LEARNING_RATE 0.001
LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0}
MASK_POOL_SIZE 14
MASK_SHAPE [28, 28]
MAX_GT_INSTANCES 100
MEAN_PIXEL [123.7 116.8 103.9]
MINI_MASK_SHAPE (56, 56)
NAME mask_coco_origin
NUM_CLASSES 81
POOL_SIZE 7
POST_NMS_ROIS_INFERENCE 1000
POST_NMS_ROIS_TRAINING 2000
ROI_POSITIVE_RATIO 0.33
RPN_ANCHOR_RATIOS [0.5, 1, 2]
RPN_ANCHOR_SCALES (16, 32, 64, 128, 256)
RPN_ANCHOR_STRIDE 1
2025-05-07 15:35:30.956782: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-07 15:35:30.956891: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-07 15:35:30.956924: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-07 15:35:30.956953: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-07 15:35:30.956982: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-07 15:35:30.957011: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-07 15:35:30.957062: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-07 15:35:30.957092: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library
libcudnn.so.7
2025-05-07 15:35:30.958866: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-07 15:35:30.960319: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2025-05-07 15:35:30.960378: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2025-05-07 15:35:30.960394: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-07 15:35:30.960409: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2025-05-07 15:35:30.960423: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2025-05-07 15:35:30.960438: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2025-05-07 15:35:30.960452: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2025-05-07 15:35:30.960467: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2025-05-07 15:35:30.961526: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2025-05-07 15:35:30.961555: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2025-05-07 15:35:30.961564: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2025-05-07 15:35:30.961572: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2025-05-07 15:35:30.962681: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device
(/job:localhost/replica:0/task:0/device:GPU:0 with 6470 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5)
Using TensorFlow backend.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version.
Instructions for updating: box_ind is deprecated, use box_indices instead
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating: Use `tf.cast` instead.
RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2]
RPN_NMS_THRESHOLD 0.7
RPN_TRAIN_ANCHORS_PER_IMAGE 256
STEPS_PER_EPOCH 1000
TRAIN_ROIS_PER_IMAGE 200
USE_MINI_MASK True
USE_RPN_ROIS True
VALIDATION_STEPS 50
WEIGHT_DECAY 0.0001
model_param file didn't exist
model_name : mask_coco_origin
model_type : mask_rcnn
list of files needed : ['mask_model.h5']
files existing in s3 : ['mask_model.h5']
files missing in s3 : []
2025-05-07 15:35:38.987658: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2025-05-07 15:35:39.209973: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
local folder : /data/models_weight/mask_coco_origin
/data/models_weight/mask_coco_origin/mask_model.h5
size_local : 257557808 ; size in s3 : 257557808
create time local : 2021-08-09 05:27:17 ; create time in s3 : 2021-08-06 19:45:17
mask_model.h5 already exists and doesn't need updating
list_images length : 1
NEW PHOTO
Processing 1 images
image shape: (480, 640, 3) min:
0.00000 max: 255.00000
molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000
image_metas shape: (1, 89) min: 0.00000 max: 640.00000
number of objects found : 5
Detection mask done !
Trying to reset tf kernel 1251467
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 1746
tf kernel not reset
sub process len(results) : 1 len(list_Values) 0 None
max_time_sub_proc : 3600
parent process len(results) : 1 len(list_Values) 0
process is alive
finished correctly or not : True
after detect
begin to check gpu status
inside check gpu memory l 3610
free memory gpu now : 7035
list_Values should be empty []
To do loadFromThcl(), then load ParamDescType : thcl454
Caught exception ! Connect or reconnect !
thcls : [{'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}]
thcl {'id': 454, 'mtr_user_id': 31, 'name':
'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'} Update svm_hashtag_type_desc : 3473 ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 
'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush']
time to compute the mask positions with numpy : 0.000579833984375 ; nb_pixel_total : 15551
time to create 1 rle with the old method : 0.017942190170288086 ; length of segment : 256
time to compute the mask positions with numpy : 0.002864837646484375 ; nb_pixel_total : 145328
time to create 1 rle with the old method : 0.16350769996643066 ; length of segment : 371
time to compute the mask positions with numpy : 0.00024318695068359375 ; nb_pixel_total : 14255
time to create 1 rle with the old method : 0.016272306442260742 ; length of segment : 151
time to compute the mask positions with numpy : 0.00012111663818359375 ; nb_pixel_total : 5613
time to create 1 rle with the old method : 0.007088899612426758 ; length of segment : 48
time to compute the mask positions with numpy : 6.198883056640625e-05 ; nb_pixel_total : 1824
time to create 1 rle with the old method : 0.0024170875549316406 ; length of segment : 39
time spent for convertir_results : 0.9892816543579102
time spent for datou_step_exec : 18.899258375167847
time spent to save output : 3.814697265625e-05
total time spent for step 1 : 18.899296522140503
caffe_path_current :
About to save ! 1
Inside saveOutput : final : True verbose : False
eke 12-6-18 : saveMask needs to be cleaned for the new output !
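The timings above suggest each detected mask is run-length encoded before saving. A vectorised numpy sketch of such an encoder follows; the exact on-disk RLE format used by saveMask is not shown in the log, so this only illustrates the general technique (COCO-style column-major runs, background counted first):

```python
import numpy as np

def mask_to_rle(mask):
    """Run-length encode a binary mask (column-major flattening; the first
    run counts background pixels, runs then alternate foreground/background)."""
    pixels = np.asarray(mask, dtype=bool).flatten(order="F")
    # indices where the pixel value changes, plus the two ends
    change = np.flatnonzero(pixels[1:] != pixels[:-1]) + 1
    bounds = np.concatenate(([0], change, [pixels.size]))
    runs = np.diff(bounds).tolist()
    if pixels.size and pixels[0]:
        runs = [0] + runs  # keep the background-first convention
    return runs

def rle_pixel_total(runs):
    """Foreground pixel count (the log's nb_pixel_total)."""
    return sum(runs[1::2])
```

Computing the change points with numpy rather than a per-pixel Python loop is what makes the "mask positions with numpy" step in the log orders of magnitude faster than the per-run "old method".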
Number saved : None batch 1 Loaded 3331 chid ids of type : 445 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++Number RLEs to save : 0 begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 time used for this insertion : 0.012902021408081055 save missing photos in datou_result : After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : {'957285035': [[(957285035, 492601069, 445, 0, 186, 22, 282, 0.99548054, [(140, 26, 6), (135, 27, 15), (133, 28, 18), (131, 29, 22), (127, 30, 27), (10, 31, 1), (120, 31, 35), (8, 32, 13), (27, 32, 3), (115, 32, 41), (7, 33, 52), (109, 33, 48), (6, 34, 70), (103, 34, 55), (5, 35, 154), (4, 36, 155), (3, 37, 156), (3, 38, 156), (3, 39, 156), (2, 40, 157), (2, 41, 157), (2, 42, 157), (2, 43, 157), (2, 44, 157), (2, 45, 157), (1, 46, 158), (1, 47, 158), (1, 48, 158), (1, 49, 157), (1, 50, 157), (1, 51, 156), (1, 52, 156), (1, 53, 155), (1, 54, 154), (1, 55, 152), (1, 56, 149), (1, 57, 145), (1, 58, 141), (1, 59, 136), (1, 60, 133), (1, 61, 130), (1, 62, 127), (1, 63, 126), (1, 64, 124), (1, 65, 123), (1, 66, 121), (1, 67, 120), (1, 68, 118), (1, 69, 117), (1, 70, 116), (1, 71, 115), (1, 72, 114), (1, 73, 113), (1, 74, 112), (1, 75, 111), (1, 76, 110), (1, 77, 108), (1, 78, 108), (1, 79, 107), (1, 80, 106), (1, 81, 105), (2, 82, 104), (2, 83, 103), (2, 84, 103), (2, 85, 102), (2, 86, 102), (2, 87, 101), (2, 88, 100), (2, 89, 99), (2, 90, 99), (2, 91, 98), (2, 92, 97), (2, 93, 96), (2, 94, 95), (2, 95, 93), (2, 96, 91), (2, 97, 90), (2, 98, 89), (2, 99, 87), (2, 100, 86), (2, 101, 86), (2, 102, 85), (2, 
103, 84), (2, 104, 83), (2, 105, 83), (2, 106, 82), (2, 107, 81), (2, 108, 80), (2, 109, 80), (2, 110, 79), (2, 111, 78), (2, 112, 77), (2, 113, 76), (1, 114, 76), (1, 115, 75), (1, 116, 74), (1, 117, 73), (1, 118, 72), (1, 119, 71), (1, 120, 71), (1, 121, 70), (1, 122, 69), (1, 123, 69), (1, 124, 68), (1, 125, 68), (1, 126, 67), (1, 127, 67), (1, 128, 66), (1, 129, 66), (1, 130, 66), (1, 131, 65), (1, 132, 65), (1, 133, 64), (1, 134, 63), (1, 135, 63), (1, 136, 62), (1, 137, 61), (1, 138, 60), (1, 139, 60), (1, 140, 59), (1, 141, 58), (1, 142, 58), (1, 143, 57), (1, 144, 56), (1, 145, 56), (1, 146, 55), (1, 147, 54), (1, 148, 54), (1, 149, 53), (1, 150, 52), (1, 151, 52), (1, 152, 51), (1, 153, 50), (1, 154, 49), (1, 155, 48), (1, 156, 47), (1, 157, 46), (1, 158, 45), (1, 159, 45), (1, 160, 44), (1, 161, 43), (1, 162, 42), (1, 163, 41), (1, 164, 41), (1, 165, 40), (1, 166, 40), (1, 167, 39), (1, 168, 38), (1, 169, 37), (1, 170, 36), (1, 171, 35), (1, 172, 34), (1, 173, 34), (1, 174, 33), (1, 175, 33), (1, 176, 32), (1, 177, 32), (1, 178, 32), (1, 179, 32), (1, 180, 31), (1, 181, 31), (1, 182, 31), (1, 183, 30), (1, 184, 30), (1, 185, 30), (1, 186, 29), (1, 187, 29), (1, 188, 29), (1, 189, 28), (1, 190, 28), (1, 191, 27), (1, 192, 27), (1, 193, 26), (1, 194, 26), (1, 195, 26), (1, 196, 26), (1, 197, 26), (1, 198, 26), (1, 199, 26), (1, 200, 25), (1, 201, 25), (1, 202, 25), (1, 203, 25), (1, 204, 25), (1, 205, 25), (1, 206, 25), (1, 207, 25), (1, 208, 25), (1, 209, 25), (1, 210, 25), (1, 211, 25), (1, 212, 25), (1, 213, 25), (1, 214, 25), (1, 215, 25), (1, 216, 25), (1, 217, 25), (1, 218, 25), (1, 219, 25), (1, 220, 24), (1, 221, 24), (1, 222, 24), (1, 223, 24), (1, 224, 24), (1, 225, 24), (1, 226, 25), (1, 227, 25), (1, 228, 25), (2, 229, 24), (2, 230, 24), (2, 231, 24), (2, 232, 23), (2, 233, 23), (2, 234, 23), (2, 235, 23), (2, 236, 23), (2, 237, 23), (2, 238, 23), (2, 239, 23), (2, 240, 23), (2, 241, 23), (2, 242, 23), (2, 243, 23), (2, 244, 23), (2, 245, 23), 
(2, 246, 23), (2, 247, 23), (2, 248, 23), (2, 249, 24), (2, 250, 24), (2, 251, 23), (2, 252, 23), (2, 253, 23), (2, 254, 23), (2, 255, 23), (2, 256, 23), (2, 257, 23), (2, 258, 23), (2, 259, 23), (2, 260, 23), (2, 261, 23), (3, 262, 22), (3, 263, 22), (3, 264, 22), (3, 265, 22), (4, 266, 21), (4, 267, 21), (5, 268, 20), (5, 269, 20), (6, 270, 19), (7, 271, 17), (8, 272, 16), (8, 273, 16), (9, 274, 13), (11, 275, 9), (15, 276, 2)], ['16,276,8,273,2,261,2,229,1,228,1,114,2,113,2,82,1,81,1,46,3,37,8,32,20,32,21,33,58,33,59,34,75,34,76,35,102,35,114,33,120,31,130,30,135,27,145,26,152,29,158,35,158,48,154,54,141,58,128,61,119,67,105,81,103,86,96,94,89,98,81,109,71,119,65,132,60,138,52,151,45,158,40,166,34,172,29,188,26,193,25,200,25,219,24,232,24,270,23,273']), (957285035, 492601069, 445, 29, 591, 24, 419, 0.99238104, [(315, 37, 25), (272, 38, 86), (253, 39, 130), (238, 40, 151), (199, 41, 196), (189, 42, 213), (180, 43, 238), (175, 44, 250), (172, 45, 257), (169, 46, 265), (166, 47, 274), (162, 48, 284), (159, 49, 294), (157, 50, 304), (155, 51, 310), (153, 52, 317), (151, 53, 323), (149, 54, 330), (148, 55, 334), (146, 56, 337), (144, 57, 341), (142, 58, 344), (140, 59, 347), (138, 60, 350), (136, 61, 353), (134, 62, 356), (132, 63, 358), (130, 64, 361), (128, 65, 364), (126, 66, 367), (124, 67, 370), (122, 68, 373), (120, 69, 376), (118, 70, 379), (117, 71, 381), (115, 72, 385), (114, 73, 387), (113, 74, 389), (112, 75, 391), (112, 76, 393), (111, 77, 395), (110, 78, 397), (109, 79, 399), (109, 80, 400), (108, 81, 402), (107, 82, 404), (107, 83, 404), (106, 84, 406), (105, 85, 408), (105, 86, 409), (104, 87, 410), (104, 88, 411), (103, 89, 413), (102, 90, 415), (101, 91, 417), (100, 92, 420), (98, 93, 423), (97, 94, 426), (96, 95, 428), (94, 96, 431), (93, 97, 433), (92, 98, 435), (91, 99, 437), (90, 100, 439), (89, 101, 441), (89, 102, 441), (89, 103, 442), (89, 104, 443), (89, 105, 444), (89, 106, 444), (89, 107, 445), (89, 108, 446), (89, 109, 447), (89, 110, 
448), (89, 111, 449), (89, 112, 450), (89, 113, 451), (89, 114, 453), (89, 115, 454), (89, 116, 455), (88, 117, 456), (88, 118, 457), (87, 119, 459), (87, 120, 459), (86, 121, 461), (85, 122, 462), (85, 123, 463), (84, 124, 464), (84, 125, 465), (83, 126, 466), (82, 127, 468), (82, 128, 468), (81, 129, 470), (80, 130, 471), (78, 131, 473), (76, 132, 476), (75, 133, 477), (73, 134, 480), (71, 135, 482), (70, 136, 484), (68, 137, 486), (67, 138, 488), (65, 139, 490), (64, 140, 492), (63, 141, 493), (61, 142, 496), (60, 143, 497), (59, 144, 499), (58, 145, 501), (58, 146, 501), (57, 147, 503), (57, 148, 504), (57, 149, 505), (56, 150, 507), (56, 151, 507), (55, 152, 509), (55, 153, 510), (54, 154, 511), (54, 155, 512), (54, 156, 513), (53, 157, 514), (53, 158, 514), (52, 159, 515), (52, 160, 516), (52, 161, 516), (51, 162, 517), (51, 163, 517), (50, 164, 518), (50, 165, 518), (49, 166, 519), (49, 167, 520), (48, 168, 521), (48, 169, 521), (47, 170, 522), (47, 171, 522), (46, 172, 523), (46, 173, 523), (46, 174, 523), (45, 175, 524), (45, 176, 523), (44, 177, 524), (44, 178, 524), (44, 179, 524), (43, 180, 525), (43, 181, 525), (42, 182, 525), (42, 183, 525), (42, 184, 525), (41, 185, 526), (41, 186, 526), (40, 187, 526), (39, 188, 526), (39, 189, 525), (38, 190, 526), (38, 191, 525), (37, 192, 525), (37, 193, 523), (36, 194, 523), (36, 195, 522), (36, 196, 522), (35, 197, 522), (35, 198, 521), (34, 199, 521), (34, 200, 521), (34, 201, 520), (34, 202, 520), (34, 203, 520), (34, 204, 519), (34, 205, 519), (33, 206, 520), (33, 207, 519), (33, 208, 519), (33, 209, 519), (33, 210, 518), (33, 211, 518), (33, 212, 518), (33, 213, 517), (32, 214, 518), (32, 215, 517), (32, 216, 517), (32, 217, 516), (32, 218, 515), (32, 219, 514), (32, 220, 513), (32, 221, 512), (32, 222, 511), (32, 223, 510), (32, 224, 508), (32, 225, 507), (32, 226, 505), (32, 227, 504), (32, 228, 503), (32, 229, 502), (32, 230, 502), (32, 231, 501), (32, 232, 500), (32, 233, 499), (32, 234, 498), (32, 235, 
497), (31, 236, 496), (31, 237, 495), (31, 238, 494), (31, 239, 493), (31, 240, 491), (31, 241, 490), (31, 242, 488), (31, 243, 487), (31, 244, 486), (31, 245, 485), (31, 246, 483), (31, 247, 482), (31, 248, 480), (31, 249, 479), (31, 250, 477), (31, 251, 475), (31, 252, 473), (31, 253, 472), (31, 254, 470), (31, 255, 468), (31, 256, 467), (31, 257, 465), (31, 258, 464), (31, 259, 463), (31, 260, 462), (31, 261, 461), (31, 262, 459), (31, 263, 458), (31, 264, 456), (31, 265, 455), (31, 266, 453), (31, 267, 451), (31, 268, 449), (31, 269, 448), (31, 270, 447), (31, 271, 445), (31, 272, 444), (31, 273, 443), (32, 274, 441), (32, 275, 440), (32, 276, 438), (32, 277, 437), (32, 278, 435), (32, 279, 434), (32, 280, 432), (33, 281, 429), (33, 282, 427), (33, 283, 426), (33, 284, 424), (33, 285, 423), (34, 286, 421), (34, 287, 420), (34, 288, 419), (35, 289, 416), (35, 290, 415), (35, 291, 414), (36, 292, 411), (36, 293, 410), (37, 294, 407), (37, 295, 406), (38, 296, 403), (38, 297, 401), (39, 298, 399), (39, 299, 397), (41, 300, 394), (42, 301, 392), (43, 302, 389), (44, 303, 387), (45, 304, 385), (46, 305, 382), (47, 306, 380), (47, 307, 378), (48, 308, 376), (49, 309, 373), (50, 310, 370), (51, 311, 368), (51, 312, 367), (52, 313, 365), (54, 314, 362), (55, 315, 360), (56, 316, 359), (58, 317, 356), (61, 318, 352), (64, 319, 349), (67, 320, 345), (70, 321, 341), (73, 322, 338), (75, 323, 335), (78, 324, 332), (80, 325, 329), (82, 326, 327), (84, 327, 324), (86, 328, 322), (88, 329, 320), (90, 330, 317), (93, 331, 314), (96, 332, 311), (99, 333, 307), (102, 334, 304), (105, 335, 300), (108, 336, 297), (111, 337, 294), (113, 338, 291), (115, 339, 289), (117, 340, 286), (119, 341, 283), (121, 342, 281), (123, 343, 278), (125, 344, 275), (127, 345, 272), (129, 346, 269), (132, 347, 266), (135, 348, 262), (138, 349, 258), (141, 350, 255), (143, 351, 252), (146, 352, 249), (147, 353, 247), (149, 354, 245), (151, 355, 242), (152, 356, 241), (154, 357, 239), (156, 358, 237), 
(159, 359, 233), (161, 360, 231), (163, 361, 229), (165, 362, 227), (167, 363, 224), (169, 364, 222), (170, 365, 221), (172, 366, 219), (173, 367, 218), (174, 368, 216), (175, 369, 215), (177, 370, 213), (178, 371, 212), (180, 372, 209), (183, 373, 206), (185, 374, 204), (188, 375, 200), (191, 376, 197), (194, 377, 193), (196, 378, 191), (199, 379, 188), (201, 380, 185), (203, 381, 183), (205, 382, 180), (207, 383, 178), (208, 384, 176), (210, 385, 174), (212, 386, 171), (213, 387, 169), (215, 388, 166), (218, 389, 162), (221, 390, 158), (225, 391, 153), (228, 392, 149), (232, 393, 144), (235, 394, 140), (238, 395, 136), (241, 396, 133), (245, 397, 128), (248, 398, 124), (252, 399, 119), (257, 400, 113), (263, 401, 105), (272, 402, 94), (283, 403, 82), (296, 404, 66), (306, 405, 53), (313, 406, 38), (321, 407, 23)], ['321,407,305,404,263,401,215,388,206,382,178,371,168,363,145,351,129,346,110,336,90,330,56,316,39,299,31,273,31,236,34,199,58,145,79,131,89,116,89,101,104,88,115,72,159,49,180,43,199,41,237,41,272,38,339,37,382,39,402,43,417,43,481,55,504,76,543,116,556,143,566,156,568,167,566,186,554,199,548,216,528,235,496,256,471,275,460,281,414,315,403,339,392,355,383,385,369,400,358,405']), (957285035, 492601069, 445, 485, 636, 23, 174, 0.9711345, [(540, 24, 21), (626, 24, 3), (531, 25, 49), (594, 25, 40), (527, 26, 107), (523, 27, 111), (520, 28, 114), (517, 29, 118), (516, 30, 119), (515, 31, 120), (513, 32, 122), (512, 33, 123), (510, 34, 125), (509, 35, 126), (507, 36, 128), (506, 37, 129), (504, 38, 131), (503, 39, 132), (501, 40, 134), (500, 41, 135), (499, 42, 136), (498, 43, 137), (497, 44, 138), (496, 45, 139), (496, 46, 139), (495, 47, 140), (495, 48, 140), (494, 49, 141), (493, 50, 142), (492, 51, 143), (491, 52, 144), (491, 53, 144), (490, 54, 145), (490, 55, 145), (490, 56, 145), (490, 57, 146), (490, 58, 146), (490, 59, 146), (491, 60, 145), (491, 61, 145), (491, 62, 145), (492, 63, 144), (493, 64, 143), (494, 65, 142), (495, 66, 141), (496, 67, 
140), (497, 68, 138), (498, 69, 138), (499, 70, 137), (500, 71, 136), (501, 72, 135), (503, 73, 133), (503, 74, 133), (505, 75, 131), (506, 76, 130), (507, 77, 129), (508, 78, 128), (509, 79, 127), (510, 80, 126), (511, 81, 125), (512, 82, 124), (513, 83, 123), (514, 84, 122), (515, 85, 121), (516, 86, 120), (517, 87, 119), (518, 88, 118), (519, 89, 117), (521, 90, 115), (521, 91, 115), (522, 92, 114), (523, 93, 113), (524, 94, 112), (525, 95, 111), (526, 96, 110), (527, 97, 109), (529, 98, 107), (530, 99, 106), (532, 100, 104), (533, 101, 103), (534, 102, 102), (535, 103, 101), (536, 104, 100), (538, 105, 98), (540, 106, 96), (541, 107, 95), (543, 108, 93), (546, 109, 90), (548, 110, 88), (549, 111, 87), (551, 112, 84), (552, 113, 83), (553, 114, 82), (555, 115, 80), (556, 116, 79), (556, 117, 79), (557, 118, 78), (558, 119, 77), (559, 120, 76), (560, 121, 75), (560, 122, 75), (561, 123, 74), (561, 124, 74), (561, 125, 74), (562, 126, 73), (562, 127, 73), (563, 128, 72), (563, 129, 72), (564, 130, 70), (564, 131, 70), (565, 132, 69), (565, 133, 68), (565, 134, 68), (565, 135, 67), (566, 136, 65), (566, 137, 64), (566, 138, 64), (566, 139, 62), (566, 140, 61), (566, 141, 59), (566, 142, 57), (566, 143, 56), (566, 144, 55), (566, 145, 54), (567, 146, 53), (567, 147, 52), (567, 148, 51), (568, 149, 50), (568, 150, 49), (568, 151, 48), (568, 152, 47), (569, 153, 45), (569, 154, 44), (570, 155, 42), (570, 156, 42), (570, 157, 41), (571, 158, 39), (571, 159, 39), (572, 160, 37), (572, 161, 37), (573, 162, 35), (573, 163, 34), (573, 164, 34), (574, 165, 32), (575, 166, 30), (577, 167, 28), (578, 168, 26), (581, 169, 22), (584, 170, 19), (587, 171, 15), (591, 172, 8)], 
['598,172,591,172,590,171,578,168,573,164,573,162,568,152,568,149,566,145,566,136,565,132,561,125,560,121,556,116,547,109,543,108,536,104,531,99,527,97,491,62,490,54,495,48,496,45,501,40,514,32,517,29,531,25,539,25,540,24,560,24,561,25,579,25,580,26,593,26,594,25,633,25,634,29,634,56,635,57,635,111,634,112,634,129,632,134,629,138,623,141,619,145,617,149,611,155,608,161,604,166']), (957285035, 492601069, 445, 280, 481, 2, 55, 0.8297425, [(292, 3, 128), (284, 4, 146), (282, 5, 151), (281, 6, 154), (281, 7, 156), (281, 8, 157), (281, 9, 158), (281, 10, 160), (281, 11, 162), (281, 12, 165), (281, 13, 167), (281, 14, 169), (281, 15, 171), (281, 16, 173), (281, 17, 174), (281, 18, 175), (281, 19, 177), (281, 20, 178), (281, 21, 179), (281, 22, 180), (281, 23, 181), (281, 24, 182), (281, 25, 183), (281, 26, 184), (281, 27, 185), (281, 28, 185), (281, 29, 185), (282, 30, 185), (283, 31, 27), (337, 31, 131), (371, 32, 97), (401, 33, 68), (409, 34, 61), (419, 35, 52), (424, 36, 48), (429, 37, 44), (432, 38, 41), (434, 39, 40), (436, 40, 39), (438, 41, 37), (441, 42, 35), (444, 43, 32), (448, 44, 29), (452, 45, 25), (454, 46, 23), (459, 47, 17), (463, 48, 12), (468, 49, 5)], ['472,49,468,49,467,48,459,47,458,46,454,46,451,44,448,44,447,43,444,43,440,41,438,41,428,36,424,36,423,35,419,35,418,34,409,34,408,33,401,33,400,32,371,32,370,31,337,31,336,30,283,31,281,29,281,6,284,4,291,4,292,3,419,3,420,4,429,4,430,5,432,5,436,7,441,11,445,12,453,16,456,19,457,19,465,27,465,29,472,37,476,44,476,46']), (957285035, 492601069, 445, 456, 547, 6, 45, 0.74069184, [(482, 8, 19), (464, 9, 3), (481, 9, 44), (457, 10, 12), (479, 10, 50), (457, 11, 13), (476, 11, 56), (457, 12, 15), (475, 12, 65), (457, 13, 84), (457, 14, 85), (457, 15, 89), (457, 16, 89), (458, 17, 88), (459, 18, 87), (460, 19, 86), (461, 20, 80), (464, 21, 71), (466, 22, 63), (467, 23, 59), (468, 24, 55), (469, 25, 52), (469, 26, 51), (470, 27, 48), (471, 28, 46), (471, 29, 44), (472, 30, 42), (473, 31, 39), (473, 32, 38), 
(474, 33, 36), (475, 34, 33), (475, 35, 32), (476, 36, 30), (476, 37, 29), (477, 38, 26), (478, 39, 23), (479, 40, 20), (480, 41, 17), (488, 42, 5)], ['492,42,488,42,487,41,480,41,476,37,475,34,473,32,469,25,465,21,461,20,457,16,457,10,463,10,464,9,466,9,470,12,474,13,476,11,480,10,482,8,500,8,501,9,524,9,525,10,528,10,532,12,539,12,542,15,545,15,545,19,535,20,534,21,529,21,525,23,523,23,513,30,512,30,504,37,496,41,493,41'])], 'temp/1746624927_1251121_957285035_a42482e51c93c8025d243dd179aee85b.jpg']} free memory after detection : begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 7035 ############################### TEST detect object ################################ run mask_detect Inside batchDatouExec : verbose : False # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : mask_detect list_input_json : [] origin BFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 time to download the photos : 0.15259075164794922 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! 
Calling datou_exec Inside datou_exec : verbose : False number of steps : 1 step1:mask_detect Wed May 7 15:35:47 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Beginning of datou step mask_detect ! save_polygon : True begin detect begin to check gpu status inside check gpu memory l 3637 free memory gpu now : 7035 max_wait_temp : 1 max_wait : 0 gpu_flag : 0 2025-05-07 15:35:50.683220: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA 2025-05-07 15:35:50.711312: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 3493065000 Hz 2025-05-07 15:35:50.712907: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f1778000b60 initialized for platform Host (this does not guarantee that XLA will be used). Devices: 2025-05-07 15:35:50.712930: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version 2025-05-07 15:35:50.716086: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1 2025-05-07 15:35:50.963782: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1999e560 initialized for platform CUDA (this does not guarantee that XLA will be used). 
Devices: 2025-05-07 15:35:50.963830: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA GeForce RTX 2080 Ti, Compute Capability 7.5 2025-05-07 15:35:50.964762: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-05-07 15:35:50.965087: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-05-07 15:35:50.967094: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-05-07 15:35:50.969182: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-05-07 15:35:50.969504: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-05-07 15:35:50.971761: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-05-07 15:35:50.972784: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-05-07 15:35:50.977300: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-05-07 15:35:50.978588: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-05-07 15:35:50.978674: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-05-07 15:35:50.979369: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-05-07 15:35:50.979386: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-05-07 15:35:50.979396: I 
tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-05-07 15:35:50.980520: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6470 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, compute capability: 7.5) WARNING:tensorflow:From /home/admin/workarea/git/Velours/python/mtr/mask_rcnn/mask_detection.py:69: The name tf.keras.backend.set_session is deprecated. Please use tf.compat.v1.keras.backend.set_session instead. 2025-05-07 15:35:51.066562: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-05-07 15:35:51.066724: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-05-07 15:35:51.066754: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-05-07 15:35:51.066780: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-05-07 15:35:51.066800: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-05-07 15:35:51.066824: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-05-07 15:35:51.066849: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-05-07 15:35:51.066892: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-05-07 15:35:51.067910: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-05-07 
15:35:51.068940: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties: pciBusID: 0000:41:00.0 name: NVIDIA GeForce RTX 2080 Ti computeCapability: 7.5 coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s 2025-05-07 15:35:51.068982: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1 2025-05-07 15:35:51.069002: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-05-07 15:35:51.069022: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10 2025-05-07 15:35:51.069044: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10 2025-05-07 15:35:51.069065: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10 2025-05-07 15:35:51.069082: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10 2025-05-07 15:35:51.069100: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 2025-05-07 15:35:51.070128: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0 2025-05-07 15:35:51.070164: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix: 2025-05-07 15:35:51.070172: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0 2025-05-07 15:35:51.070180: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N 2025-05-07 15:35:51.071283: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6470 MB memory) -> physical GPU (device: 0, name: NVIDIA GeForce RTX 2080 Ti, pci bus id: 0000:41:00.0, 
compute capability: 7.5) Using TensorFlow backend. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:396: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. Instructions for updating: box_ind is deprecated, use box_indices instead WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:703: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. WARNING:tensorflow:From /home/admin/workarea/install/Mask_RCNN/model.py:729: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. Instructions for updating: Use `tf.cast` instead. Inside mask_sub_process Inside mask_detect About to load cache.load_thcl_param FOUND : 1 Here is data_from_sql_as_vec to set the ParamDescriptorType : (3473, 'mask_coco_origin', 16384, 25088, 'mask_coco_origin', 'pool5', 10.0, None, None, 256, None, 0, None, 8, None, None, -1000.0, 1, datetime.datetime(2018, 3, 19, 10, 42, 21), datetime.datetime(2018, 3, 19, 10, 42, 21)) {'thcl': {'id': 454, 'mtr_user_id': 31, 'name': 'mask_coco_origin', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': 
'0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'photo_desc_type': 3473, 'type_classification': 'mask_rcnn', 'hashtag_id_list': '0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0'}, 'list_hashtags': ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'], 'list_hashtags_csv': 'backgroud,person,bicycle,car,motorcycle,airplane,bus,train,truck,boat,trafficlight,firehydrant,stopsign,parkingmeter,bench,bird,cat,dog,horse,sheep,cow,elephant,bear,zebra,giraffe,backpack,umbrella,handbag,tie,suitcase,frisbee,skis,snowboard,sportsball,kite,baseballbat,baseballglove,skateboard,surfboard,tennisracket,bottle,wineglass,cup,fork,knife,spoon,bowl,banana,apple,sandwich,orange,broccoli,carrot,hotdog,pizza,donut,cake,chair,couch,pottedplant,bed,diningtable,toilet,tv,laptop,mouse,remote,keyboard,cellphone,microwave,oven,toaster,sink,refrigerator,book,clock,vase,scissors,teddybear,hairdrier,toothbrush', 'svm_portfolios_learning': 
'0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0', 'photo_hashtag_type': 445, 'svm_hashtag_type_desc': 3473, 'photo_desc_type': 3473, 'pb_hashtag_id_or_classifier': 0} list_class_names : ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] Configurations: BACKBONE resnet101 BACKBONE_SHAPES [[160 160] [ 80 80] [ 40 40] [ 20 20] [ 10 10]] BACKBONE_STRIDES [4, 8, 16, 32, 64] BATCH_SIZE 1 BBOX_STD_DEV [0.1 0.1 0.2 0.2] DETECTION_MAX_INSTANCES 100 DETECTION_MIN_CONFIDENCE 0.3 DETECTION_NMS_THRESHOLD 0.3 GPU_COUNT 1 IMAGES_PER_GPU 1 IMAGE_MAX_DIM 640 IMAGE_MIN_DIM 640 IMAGE_PADDING True IMAGE_SHAPE [640 640 3] LEARNING_MOMENTUM 0.9 LEARNING_RATE 0.001 LOSS_WEIGHTS {'rpn_class_loss': 1.0, 'rpn_bbox_loss': 1.0, 'mrcnn_class_loss': 1.0, 'mrcnn_bbox_loss': 1.0, 'mrcnn_mask_loss': 1.0} MASK_POOL_SIZE 14 MASK_SHAPE [28, 28] MAX_GT_INSTANCES 100 MEAN_PIXEL [123.7 116.8 103.9] MINI_MASK_SHAPE (56, 56) NAME mask_coco_origin NUM_CLASSES 81 POOL_SIZE 7 POST_NMS_ROIS_INFERENCE 1000 POST_NMS_ROIS_TRAINING 2000 ROI_POSITIVE_RATIO 0.33 RPN_ANCHOR_RATIOS [0.5, 1, 2] RPN_ANCHOR_SCALES 
(16, 32, 64, 128, 256) RPN_ANCHOR_STRIDE 1 RPN_BBOX_STD_DEV [0.1 0.1 0.2 0.2] RPN_NMS_THRESHOLD 0.7 RPN_TRAIN_ANCHORS_PER_IMAGE 256 STEPS_PER_EPOCH 1000 TRAIN_ROIS_PER_IMAGE 200 USE_MINI_MASK True USE_RPN_ROIS True VALIDATION_STEPS 50 WEIGHT_DECAY 0.0001 model_param file didn't exist model_name : mask_coco_origin model_type : mask_rcnn list file need : ['mask_model.h5'] file exist in s3 : ['mask_model.h5'] file manque in s3 : [] 2025-05-07 15:35:58.502135: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10 2025-05-07 15:35:58.685112: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7 local folder : /data/models_weight/mask_coco_origin /data/models_weight/mask_coco_origin/mask_model.h5 size_local : 257557808 size in s3 : 257557808 create time local : 2021-08-09 05:27:17 create time in s3 : 2021-08-06 19:45:17 mask_model.h5 already exist and didn't need to update list_images length : 1 NEW PHOTO Processing 1 images image shape: (720, 1280, 3) min: 0.00000 max: 255.00000 molded_images shape: (1, 640, 640, 3) min: -123.70000 max: 151.10000 image_metas shape: (1, 89) min: 0.00000 max: 1280.00000 nb d'objets trouves : 4 Detection mask done ! 
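Before loading weights, the log compares the local copy of `mask_model.h5` against S3 by size and creation time and concludes it "already exist and didn't need to update". A sketch of that decision, assuming the S3 size and timestamp would come from a HEAD request in the real code (`needs_update` is a hypothetical name):

```python
import os
from datetime import datetime

def needs_update(local_path, s3_size, s3_mtime):
    """Decide whether a cached model-weight file must be re-fetched.
    Mirrors the log's check: same byte size and a local copy at least
    as recent as the S3 object means no download is needed."""
    if not os.path.exists(local_path):
        return True                       # nothing cached yet
    if os.path.getsize(local_path) != s3_size:
        return True                       # sizes differ: stale or partial
    local_mtime = datetime.fromtimestamp(os.path.getmtime(local_path))
    return local_mtime < s3_mtime         # local older than S3 copy
```

In the run above both sizes are 257557808 and the local file (2021-08-09) is newer than the S3 object (2021-08-06), so the download is skipped.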
Trying to reset tf kernel 1252466 begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 1746 tf kernel not reseted sub process len(results) : 1 len(list_Values) 0 None max_time_sub_proc : 3600 parent process len(results) : 1 len(list_Values) 0 process is alive finish correctly or not : True after detect begin to check gpu status inside check gpu memory l 3610 free memory gpu now : 7035 list_Values should be empty [] ['backgroud', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus', 'train', 'truck', 'boat', 'trafficlight', 'firehydrant', 'stopsign', 'parkingmeter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow', 'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag', 'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sportsball', 'kite', 'baseballbat', 'baseballglove', 'skateboard', 'surfboard', 'tennisracket', 'bottle', 'wineglass', 'cup', 'fork', 'knife', 'spoon', 'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot', 'hotdog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'pottedplant', 'bed', 'diningtable', 'toilet', 'tv', 'laptop', 'mouse', 'remote', 'keyboard', 'cellphone', 'microwave', 'oven', 'toaster', 'sink', 'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddybear', 'hairdrier', 'toothbrush'] time for calcul the mask position with numpy : 0.0006308555603027344 nb_pixel_total : 16902 time to create 1 rle with old method : 0.0217282772064209 length of segment : 107 time for calcul the mask position with numpy : 0.018344879150390625 nb_pixel_total : 480739 time to create 1 rle with new method : 0.03702664375305176 length of segment : 632 time for calcul the mask position with numpy : 0.0005159378051757812 nb_pixel_total : 36641 time to create 1 rle with old method : 0.041498661041259766 length of segment : 133 time for calcul the mask position with numpy : 0.00012636184692382812 nb_pixel_total : 4793 time to create 1 rle with old method : 0.0056667327880859375 length of 
segment : 51 time spent for convertir_results : 0.3076646327972412 time spend for datou_
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
+Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J +Jn#6ڴĹGNG'Z!WiNJ@AZ|[$q}iҷQbtTEC$mmoLD;%g?wŷovH0a5*ؒl͛SiyrO7%L]%hk>v1HBd\(e oIx>36BS%( f$h eԎH`ݶ f{FoY@00uMbz-XI$&gf7Ӵu|'K.oP PF.o9B<~.[<٭${1A.bKxL'u8n5e,]HVWw$Cel|zysKi-qݬbk,wnG;~ er͒~' 1`V⦫-*[LK'2@仪n2NƶGi/U'E@`H;J 
[... long run of numeric (a, b, c) triples elided — periodic monitoring samples emitted during the mask_detect step; the output is truncated mid-list ...]
293), (854, 2119, 288), (858, 2120, 282), (861, 2121, 277), (864, 2122, 273), (866, 2123, 269), (869, 2124, 264), (872, 2125, 260), (875, 2126, 255), (877, 2127, 251), (880, 2128, 247), (883, 2129, 242), (886, 2130, 237), (890, 2131, 231), (893, 2132, 226), (896, 2133, 221), (899, 2134, 216), (903, 2135, 210), (906, 2136, 204), (909, 2137, 199), (913, 2138, 193), (917, 2139, 187), (920, 2140, 181), (924, 2141, 175), (928, 2142, 166), (932, 2143, 155), (936, 2144, 143), (946, 2145, 125), (956, 2146, 107), (966, 2147, 89), (977, 2148, 69), (988, 2149, 50), (1000, 2150, 29), (1012, 2151, 8)], ['1000,2150,935,2143,771,2092,694,2075,610,2037,344,1984,215,1963,128,1971,54,1825,39,1678,39,1455,29,1245,28,875,21,698,27,541,38,462,92,312,117,279,206,209,286,181,370,133,523,124,1426,124,1604,132,1680,145,1823,193,1904,206,2002,286,2094,411,2150,539,2168,613,2171,718,2164,841,2132,905,2113,991,2080,1069,2028,1139,1930,1370,1878,1446,1845,1675,1789,1844,1756,1920,1718,1974,1662,2015,1578,2016,1496,2039,1419,2046,1339,2070,1181,2100,1093,2142'])], 'temp/1746624966_1251121_917877156_a9c2d4b99270c9302def4ed40606e685.jpg']} nb pixel non reg : 3692295 nb pixel common : 3686864 proportion of common points : 0.9985290991104449 [('test release memory', 'SUCCESS', True), ('test detect objet', 'SUCCESS', True), ('test polygone', 'SUCCESS', True)] res_total : True #&_# TEST SUCCEEDED #&_# : tests/mask_test #&_# /home/admin/workarea/git/Velours/python/tests/python_tests.py refs/heads/master_6b796098f0a7c88b7d6a90fb4c0df56eec821fbf SQL :INSERT INTO MTRAdmin.monitor_sys (name, type, server, version_code, result_str, result_bool, lien , test_group ,test_name) VALUES ('python_test3','1','marlene','refs/heads/master_6b796098f0a7c88b7d6a90fb4c0df56eec821fbf','{"mask_detection": 
"success"}','1','http://marlene.fotonower-preprod.com/job/2025/May/07052025/python_test3//data_2/data_log/job/2025/May/07052025/python_test3/log-python3----short_python3--v--marlene-15:35:00.txt','mask_detection','unknown'); #&_# END OF TEST #&_# : tests/mask_test #&_# #&_# BEGIN OF TEST : tests/datou_test #&_# /home/admin/workarea/git/Velours/python/tests/datou_test.py Datou All Test python version used : 3 ############################### TEST sam ################################ TEST SAM Inside batchDatouExec : verbose : True ##### chargement datou SELECT name, created_at,limit_max FROM MTRDatou.mtr_datou WHERE id=4573 SELECT mtd.id, mtdt.`type`, mtd.`param`, mtd.param_json, mtdt.nb_input, mtdt.nb_output, mtdt.prod, mtdt.is_local, mtdt.is_datou_depend, mtdt.is_photo_id_local FROM MTRDatou.mtr_datou_step mtd, MTRDatou.mtr_datou_step_types mtdt WHERE mtdt.`id`=mtd.`type` AND mtd.mtd_id=4573 SELECT mtd.id, mtd.mtd_id, mdsdt.id, mdsdt.name, mdsdt.description, msid.output_or_input, msid.data_order_id, mdsdt.type FROM MTRDatou.mtr_datou_step mtd, MTRDatou.mtr_datou_steptype_io_datatypes msid, MTRDatou.mtr_datou_step_data_types mdsdt WHERE mtd.`type`=msid.`mtr_datou_step_type` AND mtd.mtd_id= 4573 AND msid.data_type=mdsdt.id SELECT mts_id_output, id_output, mts_id_input, id_input FROM MTRDatou.mtr_datou_step_by_step WHERE mtd_id=4573 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! 
no param json to modify List Step Type Loaded in datou : sam list_input_json : [] ##### end of datou loading ##### data loading ##### Call load_data_input : nb_thread : 5 origin SELECT photo_id, url FROM MTRBack.photos ph WHERE photo_id IN (1189321094) Found this number of photos: 1 ##### Call download_photos : nb_thread : 5 begin to download photo : 1189321094 download finished for photo 1189321094 we have 0 missing photos in the download step : photo missing : [] try to delete the missing photos in DB ##### After download_photos length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 ##### After load_data_input time to download the photos : 0.17729854583740234 #### end of data loading Blocking on flush ? No, continuing About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! WARNING : we have an input that is not a photo, we should get rid of it Calling datou_exec Inside datou_exec : verbose : True number of steps : 1 step1:sam Wed May 7 15:36:31 2025 VR 17-11-17 : now, only for linear exec dependency trees, some output goes to fill the input of the next VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec After prepare type args : Here we display some params of map_info ! map_filenames : {'temp/1746624991_1251121_1189321094_9626af7f95d010f2a4fd524688d4ea22_76896585.png': 1189321094} map_photo_id_path_extension : {1189321094: {'path': 'temp/1746624991_1251121_1189321094_9626af7f95d010f2a4fd524688d4ea22_76896585.png', 'extension': 'png'}} map_subphoto_mainphoto : {} Beginning of datou step sam !
pht : 4677 Inside sam : nb paths : 1 (640, 960, 3) ERROR in datou_step_exec, will save and exit ! CUDA out of memory. Tried to allocate 768.00 MiB (GPU 0; 10.76 GiB total capacity; 443.59 MiB already allocated; 230.88 MiB free; 530.00 MiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2329, in datou_exec output = datou_step_exec(sNext, args, cache, context, map_info, verbose, mtr_user_id) File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2430, in datou_step_exec return lib_process.datou_step_sam(param, json_param, args, cache, context, map_info, verbose) File "/home/admin/workarea/git/Velours/python/mtr/datou/lib_step_exec/lib_step_process.py", line 396, in datou_step_sam masks = mask_generator.generate(image) File "/home/admin/.local/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context return func(*args, **kwargs) File "/home/admin/workarea/install/segment-anything/segment_anything/automatic_mask_generator.py", line 163, in generate mask_data = self._generate_masks(image) File "/home/admin/workarea/install/segment-anything/segment_anything/automatic_mask_generator.py", line 206, in _generate_masks crop_data = self._process_crop(image, crop_box, layer_idx, orig_size) File "/home/admin/workarea/install/segment-anything/segment_anything/automatic_mask_generator.py", line 236, in _process_crop self.predictor.set_image(cropped_im) File "/home/admin/workarea/install/segment-anything/segment_anything/predictor.py", line 60, in set_image self.set_torch_image(input_image_torch, image.shape[:2]) File "/home/admin/.local/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context return func(*args, **kwargs) File 
"/home/admin/workarea/install/segment-anything/segment_anything/predictor.py", line 89, in set_torch_image self.features = self.model.image_encoder(input_image) File "/home/admin/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl return forward_call(*input, **kwargs) File "/home/admin/workarea/install/segment-anything/segment_anything/modeling/image_encoder.py", line 112, in forward x = blk(x) File "/home/admin/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl return forward_call(*input, **kwargs) File "/home/admin/workarea/install/segment-anything/segment_anything/modeling/image_encoder.py", line 174, in forward x = self.attn(x) File "/home/admin/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl return forward_call(*input, **kwargs) File "/home/admin/workarea/install/segment-anything/segment_anything/modeling/image_encoder.py", line 231, in forward attn = (q * self.scale) @ k.transpose(-2, -1) [1189321094] map_info['map_portfolio_photo'] : {} final : True mtd_id 4573 list_pids : [1189321094] begin to insert list_values into mtr_datou_result : length of list_values in save_final : 1 insert ignore into MTRPhoto.mtr_datou_result (mtd_id, mtr_portfolio_id,mtr_photo_id,result,result_long,result_double,hashtag_id,proba, mtr_current_id) values (%s,%s,%s,%s,%s,%s,%s,%s,%s) on duplicate key update mtr_portfolio_id = mtr_portfolio_id list_values : [('4573', None, '1189321094', "[>, , , , , 'CUDA out of memory. Tried to allocate 768.00 MiB (GPU 0; 10.76 GiB total capacity; 443.59 MiB already allocated; 230.88 MiB free; 530.00 MiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. 
See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF']", '-1', '-1.0', '501120777', '1.0', None)] time used for this insertion : 0.014345407485961914 save_final ERROR in last step sam, CUDA out of memory. Tried to allocate 768.00 MiB (GPU 0; 10.76 GiB total capacity; 443.59 MiB already allocated; 230.88 MiB free; 530.00 MiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF time spent for datou_step_exec : 6.506886959075928 time spent to save output : 0.8979043960571289 total time spent for step 0 : 7.404791355133057 need to delete datou_research and reload, so keep current state 1 need to delete datou_research and reload, so keep current state 1 need to delete datou_research and reload, so keep current state 1 caffe_path_current : About to save ! 2 After save, about to update current ! datou_cur_ids : [] len(datou.list_steps) : 1 output : None ERROR nb objects expected : 98 nb objects detected : 0 ERROR sam FAILED ############################### TEST frcnn ################################ test frcnn Inside batchDatouExec : verbose : True ##### datou loading SELECT name, created_at, limit_max FROM MTRDatou.mtr_datou WHERE id=4184 SELECT mtd.id, mtdt.`type`, mtd.`param`, mtd.param_json, mtdt.nb_input, mtdt.nb_output, mtdt.prod, mtdt.is_local, mtdt.is_datou_depend, mtdt.is_photo_id_local FROM MTRDatou.mtr_datou_step mtd, MTRDatou.mtr_datou_step_types mtdt WHERE mtdt.`id`=mtd.`type` AND mtd.mtd_id=4184 SELECT mtd.id, mtd.mtd_id, mdsdt.id, mdsdt.name, mdsdt.description, msid.output_or_input, msid.data_order_id, mdsdt.type FROM MTRDatou.mtr_datou_step mtd, MTRDatou.mtr_datou_steptype_io_datatypes msid, MTRDatou.mtr_datou_step_data_types mdsdt WHERE mtd.`type`=msid.`mtr_datou_step_type` AND mtd.mtd_id=4184 AND msid.data_type=mdsdt.id SELECT mts_id_output, id_output, mts_id_input, id_input FROM
MTRDatou.mtr_datou_step_by_step WHERE mtd_id=4184 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder the steps ! Tree built and cycles checked, now we need to re-order the steps ! We currently have an error because there is no dependency on the last step in the tile - detect - glue case. We can either keep that dependency or, better, keep an order compatible with the step ids when a step has no sons, i.e. a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step, so we do not consider checkConsistencyTypeOutputInput ! no param json to modify List Step Type Loaded in datou : frcnn list_input_json : [] ##### end of datou loading ##### data loading ##### Call load_data_input : nb_thread : 5 origin SELECT photo_id, url FROM MTRBack.photos ph WHERE photo_id IN (917754606) Found this number of photos: 1 ##### Call download_photos : nb_thread : 5 begin to download photo : 917754606 download finished for photo 917754606 we have 0 missing photos in the download step : photo missing : [] try to delete the missing photos in DB ##### After download_photos length of list_filenames : 1 ; length of list_pids : 1 ; length of list_args : 1 ##### After load_data_input time to download the photos : 0.15955233573913574 #### end of data loading Blocking on flush ? No, continuing About to test input to load we should then remove the video here, and this would fix the bug of datou_current !
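The download step logged above ("Call download_photos : nb_thread : 5", followed by a count of missing photos) can be sketched with a small thread pool; `download_photos` and `fetch_one` are hypothetical stand-ins for the real per-photo download code, shown only to illustrate the threading and missing-photo bookkeeping.

```python
from concurrent.futures import ThreadPoolExecutor

def download_photos(photo_urls, fetch_one, nb_thread=5):
    """photo_urls: {photo_id: url}. fetch_one(photo_id, url) returns a local path
    or raises on failure. Returns ({photo_id: path}, [missing photo_ids])."""
    paths, missing = {}, []

    def worker(item):
        photo_id, url = item
        try:
            return photo_id, fetch_one(photo_id, url)
        except Exception:
            return photo_id, None  # flagged as missing below

    with ThreadPoolExecutor(max_workers=nb_thread) as pool:
        for photo_id, path in pool.map(worker, photo_urls.items()):
            if path is None:
                missing.append(photo_id)
            else:
                paths[photo_id] = path
    return paths, missing
```

Catching the failure inside the worker keeps one bad URL from aborting the batch, which matches the log's behavior of reporting missing photos and continuing.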
Calling datou_exec Inside datou_exec : verbose : True number of steps : 1 step1:frcnn Wed May 7 15:36:39 2025 VR 17-11-17 : now, only for linear exec dependency trees, some output goes to fill the input of the next VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec After prepare type args : Here we display some params of map_info ! map_filenames : {'temp/1746624998_1251121_917754606_35f3c9ae49686a6be16030c6ec25c9ee.jpg': 917754606} map_photo_id_path_extension : {917754606: {'path': 'temp/1746624998_1251121_917754606_35f3c9ae49686a6be16030c6ec25c9ee.jpg', 'extension': 'jpg'}} map_subphoto_mainphoto : {} Beginning of datou step Faster rcnn ! classes : ['background', 'plaque'] pht : 4370 caffemodel_name (should be vgg16_immat_307, but not used because the net is loaded outside this function) : {'id': 3375, 'mtr_user_id': 31, 'name': 'detection_plaque_valcor_010622', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'background,plaque', 'svm_portfolios_learning': '0,0', 'photo_hashtag_type': 4370, 'photo_desc_type': 5676, 'type_classification': 'caffe_faster_rcnn', 'hashtag_id_list': '0,0'} To loadFromThcl() model_param file doesn't exist model_name : detection_plaque_valcor_010622 model_type : caffe_faster_rcnn list of files needed : ['caffemodel', 'test.prototxt'] files present in s3 : ['caffemodel', 'test.prototxt'] files missing in s3 : [] WARNING: Logging before InitGoogleLogging() is written to STDERR F0507 15:36:42.124058 1251121 syncedmem.cpp:71] Check failed: error == cudaSuccess (2 vs.
0) out of memory *** Check failure stack trace: *** Command terminated by signal 6 38.79user 24.89system 1:18.28elapsed 81%CPU (0avgtext+0avgdata 3504196maxresident)k 3284080inputs+4608outputs (6241major+3019724minor)pagefaults 0swaps
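Both failures above (the sam step and the frcnn step) are GPU out-of-memory errors. The PyTorch traceback itself suggests setting max_split_size_mb via PYTORCH_CUDA_ALLOC_CONF to limit allocator fragmentation. A minimal sketch of doing that: the helper name and the value 128 are illustrative assumptions, and the variable only takes effect if set before CUDA is initialized, i.e. in practice before the first `import torch` in the entry script.

```python
import os

def configure_cuda_allocator(max_split_size_mb=128):
    # Must run before torch initializes CUDA (so before `import torch` in the
    # entry script, e.g. script_for_cron.py). 128 is a starting point to tune,
    # not a value recommended by the log itself.
    os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:%d" % max_split_size_mb
    return os.environ["PYTORCH_CUDA_ALLOC_CONF"]
```

If fragmentation is not the cause (here only about 230 MiB were free against a 768 MiB request, so the card may simply be too loaded), the other levers are shrinking the working set, e.g. lowering the `points_per_batch` argument of segment-anything's `SamAutomaticMaskGenerator` for the sam step, or calling `torch.cuda.empty_cache()` between steps; both are assumptions about what applies to this pipeline, not fixes verified against it.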