python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 4189' -s datou_current_4189 -M 0 -S 0 -U 95,95,120
import MySQLdb succeeded
Import error (python version)
['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 1898325
load datou : 4189
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case.
We could keep the dependence, but it is better to keep an order compatible with the step ids when steps have no sons, i.e. a lexical order : (number_son, step_id)
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string : Expecting value: line 1 column 1 (char 0)
Tried to parse : None
was removed, should we ?
data given as text was removed, should we ?
[ptf_id0,ptf_id1...] was removed, should we ?
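The re-ordering policy described in the banner above (fall back to a lexical key (number_son, step_id) when the dependency tree does not impose an order) can be sketched as follows. The step layout (dicts with a "step_id" and a "sons" list) and the function name order_steps are illustrative assumptions, not the real datou schema:

```python
# Minimal sketch of the ordering policy, assuming each step is a dict
# {"step_id": int, "sons": [step_ids]} -- hypothetical layout.
def order_steps(steps):
    """Sort steps by the lexical key (number_son, step_id)."""
    return sorted(steps, key=lambda s: (len(s["sons"]), s["step_id"]))

steps = [
    {"step_id": 12494, "sons": [12495]},        # one son
    {"step_id": 12489, "sons": [12492, 12493]}, # two sons
    {"step_id": 12495, "sons": []},             # leaf step
]
ordered = order_steps(steps)
```

Steps without sons come first and, among equals, the smaller step id wins, which keeps the order compatible with step ids as the log requests.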
load thcls
load pdts
Running datou job : batch_current
TODO : datou_current to load; maybe to move outside batchDatouExec
no input labels
no input values
updating current state to 1
list_input_json: {}
Date 2026-04-24 10:30:30.417749
Current got : datou_id : 4189, datou_cur_ids : ['4377144'] with mtr_portfolio_ids : ['31020520'] and first list_photo_ids : []
new path : /proc/1898325/
Inside batchDatouExec : verbose : 0
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case.
We could keep the dependence, but it is better to keep an order compatible with the step ids when steps have no sons, i.e. a lexical order : (number_son, step_id)
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
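The earlier "can't parse json string ... Tried to parse : None" error followed by an empty list_input_json suggests the raw input is parsed defensively, falling back to {}. A minimal sketch of that guard; the helper name parse_input_json and the fallback value are assumptions, not the pipeline's real API:

```python
import json

def parse_input_json(raw):
    """Parse a raw JSON input string, falling back to an empty dict.

    Hypothetical helper mirroring the logged behaviour: json.loads(None)
    would raise TypeError, and a non-JSON string raises ValueError
    ("Expecting value: line 1 column 1 (char 0)").
    """
    if raw is None:
        return {}
    try:
        return json.loads(raw)
    except (ValueError, TypeError) as exc:
        print("ERROR or WARNING : can't parse json string :", exc)
        print("Tried to parse :", raw)
        return {}

list_input_json = parse_input_json(None)
```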
List Step Type Loaded in datou : split_time_score
over limit max, limiting to limit_max 100
list_input_json : {}
origin
We have 1 ; we have 0 photos missing in the step downloads : photo missing : []
try to delete the photos missing in DB
time to download the photos : 0.02144145965576172
About to test input to load
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 1
step1:split_time_score Fri Apr 24 10:30:30 2026
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the input of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case where we are at the first step, instead of building this step before datou_exec
begin split time score 2022-04-13 10:29:59 0
TODO : Insert select and so on
Begin split_port_in_batch_balle
thcls : [{'id': 3379, 'mtr_user_id': 31, 'name': 'learn_classif_flux_maj_generique_effnet_v2_s_02062022', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'aluminium,ela,film_pedb,flux_dev,jrm,pcm,pcnc,pehd_pp,pet_clair,refus,tapis_vide', 'svm_portfolios_learning': '5515864,5515840,5515844,5515850,6244400,6237996,6237998,5515847,5515841,5515868,5515866', 'photo_hashtag_type': 4374, 'photo_desc_type': 5680, 'type_classification': 'tf_classification2', 'hashtag_id_list': '493546845,492741797,2107760237,2107760238,495916461,560181804,1284539308,2107760239,2107755846,538914404,2107748999'}]
thcls : [{'id': 3513, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2_tf', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 4557, 'photo_desc_type': 5767, 'type_classification': 'tf_classification2', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('10', 17),)
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223
{1: 11}
24042026 31020520 Number of photos uploaded : 17 / 23040 (0%)
24042026 31020520 Number of photos tagged (waste types) : 11 / 17 (64%)
24042026 31020520 Number of photos tagged (volume) : 11 / 17 (64%)
elapsed_time : load_data_split_time_score 2.1457672119140625e-06
elapsed_time : order_list_meta_photo_and_scores 1.430511474609375e-05
elapsed_time : fill_and_build_computed_from_old_data 0.0011675357818603516
Caught exception ! Connect or reconnect !
Caught exception ! Connect or reconnect !
elapsed_time : insert_dashboard_record_day_entry 0.24692130088806152
Creating list_photo_by_hashtags
Hashtag is None
Hashtag is None
Hashtag is None
Hashtag is None
Hashtag is None
Hashtag is None
elapsed_time : list_photo_by_hashtags 0.022022008895874023
Creating list_photo_total
elapsed_time : select_descriptors 1.0881006717681885
24042026 31020520 Number of photos with descriptors (type 5680) : 11 / 17 (64%)
ERROR : descriptors have different sizes (ignoring the difference) : 0 vs 1280 photo_id : 1416344376 photo_id_prec : 0
ERROR : descriptors have different sizes (ignoring the difference) : 1280 vs 0 photo_id : 1416347723 photo_id_prec : 1416345890
Missing descriptors for photos 1416347723 and 1416347835
Missing descriptors for photos 1416347835 and 1416347943
Missing descriptors for photos 1416347943 and 1416348048
Missing descriptors for photos 1416348048 and 1416348138
Missing descriptors for photos 1416348138 and 1416348139
24042026 Removing 0 photos because of the 'same image' condition
Total on : 0
Total off : 0.0
list_time_off
Warning in study_and_display_distrib_list : min=max : 0.0 0.0 dist_desc
begin to insert list_values into photo_hashtag_ids : length of list_values in save_photo_hashtag_id_type : 17
time used for this insertion : 0.014721870422363281
photos_removed : len 0
elapsed_time : remove_photo_duplicate 0.04871702194213867
To do, maybe not at the correct place !
force hashtag to JRM
elapsed_time : CREATE_PORT_BATCH_BY_HOUR 0.006146669387817383
NUMBER BATCH : 1
list_ponderation used : [1e-05, 1e-05, 1e-05, 1e-05, 1e-05] , list_hashtag_class_create_as_list : ['jrm']
ERROR missing amount info
ERROR missing amount info
ERROR missing amount info
ERROR missing amount info
ERROR missing amount info
ERROR missing amount info
result_one_balle_Type_JRM : {'day': '24042026', 'map_nb_amount': {0: 3, 1: 7, 2: 1, 3: 0, 4: 0}, 'map_time_amount': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0}, 'duration': 119.41723489761353, 'nb_balles_papier': 0.00011, 'begin_time_port': 'image_24042026_10_00_02_763674_0000.jpg'}
Production hashtag (incorrect ponderation at 20-10-18) : 0.00011
We have rejected 0 photos because of the batch_size condition !
NUMBER BATCH list_of_portfolios_to_create : 1
list_same_port_ids : [31020520]
found a portfolio which already exists : 31020520 , we will use it
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the tile - detect - glue case.
We could keep the dependence, but it is better to keep an order compatible with the step ids when steps have no sons, i.e. a lexical order : (number_son, step_id)
All sons are already in current list ! (repeated 5 times)
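The descriptor errors above ("different sizes ... 0 vs 1280", then "Missing descriptors") compare each photo's descriptor to the previous one. A small sketch of that walk, assuming photos arrive as (photo_id, descriptor) pairs with photo_id_prec starting at 0; the data layout and function name are illustrative, not the real pipeline code:

```python
def compare_consecutive_descriptors(photos):
    """Walk (photo_id, descriptor) pairs in order and collect the pairs
    whose descriptor lengths differ (e.g. 0 vs 1280 when a descriptor
    is missing), mirroring the warnings in the log above.
    Hypothetical helper; the real schema may differ."""
    issues = []
    prev_id, prev_desc = 0, []  # photo_id_prec starts at 0 in the log
    for photo_id, desc in photos:
        if len(desc) != len(prev_desc):
            issues.append((prev_id, photo_id, len(prev_desc), len(desc)))
        prev_id, prev_desc = photo_id, desc
    return issues

# Toy data: photo 102 has no descriptor, photos 101 and 103 have 2 floats.
issues = compare_consecutive_descriptors(
    [(101, [1.0, 2.0]), (102, []), (103, [1.0, 2.0])])
```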
All sons are already in current list ! (repeated 13 times)
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the DB !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 12489 mask_detect is not consistent : 3 used against 2 in the step definition !
WARNING : number of outputs for step 12499 brightness is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 12500 blur_detection is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 12492 crop_condition is not consistent : 3 used against 2 in the step definition !
Step 12492 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used !
WARNING : number of outputs for step 12493 merge_mask_thcl_custom is not consistent : 4 used against 2 in the step definition !
WARNING : number of inputs for step 12494 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 12494 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 12502 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 12502 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of inputs for step 12496 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 12496 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 12495 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 12495 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
WARNING : type of output 2 of step 12489 doesn't seem to be defined in the database
WARNING : type of input 2 of step 12492 doesn't seem to be defined in the database
WARNING : output 1 of step 12489 has datatype=2 whereas input 1 of step 12493 has datatype=7
WARNING : type of output 2 of step 12493 doesn't seem to be defined in the database
WARNING : type of input 1 of step 12494 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 3 of step 12493 doesn't seem to be defined in the database
WARNING : type of input 1 of step 12496 doesn't seem to be defined in the database
WARNING : type of output 1 of step 12496 doesn't seem to be defined in the database
WARNING : type of input 3 of step 12495 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 2 of step 12489 doesn't seem to be defined in the database
WARNING : type of input 1 of step 12500 doesn't seem to be defined in the database
WARNING : type of output 2 of step 12489 doesn't seem to be defined in the database
WARNING : type of input 1 of step 12499 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 12496 has datatype=10 whereas input 3 of step 12498 has datatype=6
WARNING : type of input 5 of step 12498 doesn't seem to be defined in the database
WARNING : output 0 of step 12501 has datatype=11 whereas input 5 of step 12498 has datatype=None
WARNING : output 0 of step 12496 has datatype=10 whereas input 0 of step 12501 has datatype=18
WARNING : type of input 2 of step 12502 doesn't seem to be defined in the database
WARNING : output 1 of step 12494 has datatype=7 whereas input 2 of step 12502 has datatype=None
WARNING : type of output 3 of step 12502 doesn't seem to be defined in the database
WARNING : type of input 2 of step 12496 doesn't seem to be defined in the database
WARNING : type of output 1 of step 12499 doesn't seem to be defined in the database
WARNING : type of input 3 of step 12492 doesn't seem to be defined in the database
WARNING : type of output 1 of step 12500 doesn't seem to be defined in the database
WARNING : type of input 3 of step 12492 doesn't seem to be defined in the database
WARNING : output 0 of step 12493 has datatype=1 whereas input 0 of step 12494 has datatype=2
DataTypes for each output/input checked !
TODO : Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
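The two checks logged above compare the number of used inputs/outputs against each step's definition, and the datatype of a producing output against the consuming input. A minimal re-implementation sketch; the dict layout (used_outputs, def_nb_outputs, out_types/in_types keyed by index) is an illustrative assumption, not the real datou step schema:

```python
def check_io_consistency(steps, connections):
    """Sketch of checkConsistencyNbInputNbOutput and
    checkConsistencyTypeOutputInput as described in the log; returns
    warning strings instead of printing them."""
    warnings = []
    # 1) used output count vs the step definition
    for step_id, s in steps.items():
        if s["used_outputs"] != s["def_nb_outputs"]:
            warnings.append(
                "WARNING : number of outputs for step %d %s is not consistent"
                " : %d used against %d in the step definition !"
                % (step_id, s["name"], s["used_outputs"], s["def_nb_outputs"]))
    # 2) datatype of each connection's producing output vs consuming input
    for (src, out_idx), (dst, in_idx) in connections:
        dt_out = steps[src]["out_types"].get(out_idx)
        dt_in = steps[dst]["in_types"].get(in_idx)
        if dt_out != dt_in:
            warnings.append(
                "WARNING : output %d of step %d has datatype=%s whereas"
                " input %d of step %d has datatype=%s"
                % (out_idx, src, dt_out, in_idx, dst, dt_in))
    return warnings

# Toy data echoing two warnings from the log above.
steps = {
    12489: {"name": "mask_detect", "used_outputs": 3, "def_nb_outputs": 2,
            "out_types": {1: 2}, "in_types": {}},
    12493: {"name": "merge_mask_thcl_custom", "used_outputs": 2,
            "def_nb_outputs": 2, "out_types": {}, "in_types": {1: 7}},
}
connections = [((12489, 1), (12493, 1))]
warnings = check_io_consistency(steps, connections)
```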
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id,
       mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at,
       mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id = mptpi.hashtag_id
  AND mptpi.`mtr_portfolio_id_1` = 31020520
  AND mptpi.`type` = 3726
To do
elapsed_time : count_nb_balles_and_create_portfolio 0.7630336284637451
# DISPLAY ALL COLLECTED DATA : {'24042026': {'nb_upload': 17, 'nb_taggue_class': 11, 'nb_taggue_densite': 11, 'nb_descriptors': 11}}
Inside saveOutput : final : True verbose : 0
saveOutput not yet implemented for datou_step.type : split_time_score ; we use saveGeneral
[1416348139, 1416348138, 1416348048, 1416347943, 1416347835, 1416347723, 1416345890, 1416345889, 1416345888, 1416345887, 1416345886, 1416345870, 1416344381, 1416344380, 1416344378, 1416344377, 1416344376]
Looping over the photos to save general results
len do output : 1
/31020520
Didn't retrieve data.
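The query above inlines the values 31020520 and 3726 as literals. With a DB-API 2.0 driver such as MySQLdb (paramstyle "format", i.e. %s placeholders), the same query can be parameterized so the driver handles quoting and escaping. A sketch; the trimmed column list and the commented cursor call are illustrative:

```python
# Parameterized version of the logged query. The %s markers are DB-API
# placeholders for MySQLdb, not Python string formatting.
QUERY = (
    "SELECT mptpi.id, h.hashtag "
    "FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h "
    "WHERE h.hashtag_id = mptpi.hashtag_id "
    "AND mptpi.`mtr_portfolio_id_1` = %s AND mptpi.`type` = %s"
)
params = (31020520, 3726)
# With a live MySQL connection:
# cursor.execute(QUERY, params)
```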
before output type
Here is an output not treated by saveGeneral : managing all outputs in save final without adding information to mtr_datou_result
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416348139', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416348138', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416348048', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416347943', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416347835', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416347723', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416345890', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416345889', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416345888', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416345887', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416345886', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416345870', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416344381', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416344380', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416344378', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416344377', None, None, None, None, None, '4377144')
('4189', None, None, None, None, None, None, None, '4377144')
('4189', '31020520', '1416344376', None, None, None, None, None, '4377144')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 18
time used for this insertion : 0.022477388381958008
save_final
save missing photos in datou_result :
time spent for datou_step_exec : 2.2679238319396973
time spent to save output : 0.02277970314025879
total time spent for step 1 : 2.290703535079956
caffe_path_current :
About to save ! 2
After save, about to update current ! ret : 2
len(input) + len(total_photo_id_missing) : 1
set_done_treatment
1.35user 0.68system 0:06.96elapsed 29%CPU (0avgtext+0avgdata 110700maxresident)k
128inputs+136outputs (6major+52691minor)pagefaults 0swaps
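The "begin to insert list_values into mtr_datou_result" message above describes a bulk insert of the tuple rows in one call, which is what DB-API executemany does. A runnable stand-in using sqlite3 so the sketch works without a MySQL server; the 4-column table and row layout are assumptions (the real mtr_datou_result schema and the MySQLdb connection are not shown in the log):

```python
import sqlite3
import time

# In-memory stand-in for the real MySQL table (hypothetical columns).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE mtr_datou_result ("
    " datou_id TEXT, mtr_portfolio_id TEXT, photo_id TEXT, datou_cur_id TEXT)")

# Two of the logged rows, trimmed to the stand-in columns.
list_values = [
    ("4189", "31020520", "1416348139", "4377144"),
    ("4189", "31020520", "1416348138", "4377144"),
]

t0 = time.time()
# executemany sends every row in one call, like the logged bulk insert.
conn.executemany("INSERT INTO mtr_datou_result VALUES (?, ?, ?, ?)",
                 list_values)
conn.commit()
print("time used for this insertion :", time.time() - t0)

n_rows = conn.execute("SELECT COUNT(*) FROM mtr_datou_result").fetchone()[0]
```

Note that sqlite3 uses "?" placeholders where MySQLdb uses "%s"; the pattern is otherwise the same.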