python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 4876' -s datou_current_4876 -M 0 -S 0 -U 95,95,120
import MySQLdb succeeded
Import error (python version)
['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 2228987
load datou : 4876
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps.
Tree built and cycle checked; now we need to re-order the steps.
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case.
Rather than keeping that dependence, it is better to keep an order compatible with the step ids when a step has no sons, i.e. a lexical order on (number_son, step_id).
DONE and to test : checkNoCycle
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string
Expecting value: line 1 column 1 (char 0)
Tried to parse : None
was removed, should we ?
"data in text form" was removed, should we ?
[ptf_id0,ptf_id1...] was removed, should we ?
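The re-ordering rule described in the notes above (check the step graph for cycles, then order steps lexically by (number_son, step_id)) can be sketched as follows. This is a minimal reconstruction, not the real datou code: the graph representation (a dict mapping each step id to the set of its son step ids) and the function names are assumptions.

```python
# Sketch of the step re-ordering described in the log notes.
# steps: {step_id: set of son step_ids} -- a hypothetical representation.

def check_no_cycle(steps):
    # Kahn's algorithm: if every node can be drained, there is no cycle.
    indegree = {s: 0 for s in steps}
    for sons in steps.values():
        for son in sons:
            indegree[son] += 1
    queue = [s for s, d in indegree.items() if d == 0]
    seen = 0
    while queue:
        node = queue.pop()
        seen += 1
        for son in steps[node]:
            indegree[son] -= 1
            if indegree[son] == 0:
                queue.append(son)
    return seen == len(steps)  # False means a cycle exists

def order_steps(steps):
    if not check_no_cycle(steps):
        raise ValueError("cycle detected in datou step graph")
    # Lexical order on (number_son, step_id): steps with no sons come
    # first, ties broken by the step id, as the note suggests.
    return sorted(steps, key=lambda s: (len(steps[s]), s))
```

For example, `order_steps({1: {2}, 2: set(), 3: set()})` places the two sonless steps 2 and 3 before step 1.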
load thcls
load pdts
Running datou job : batch_current
TODO : datou_current to load; maybe to take outside batchDatouExec
no input labels
no input values
updating current state to 1
we have a portfolio with more photos than the limit : 1059 > 1000
please execute split_portfolio.py -i 25799757 -l 1000
size over the limit : we load only `limit` photos; the rest are not treated
list_input_json: {}
Current got : datou_id : 4876, datou_cur_ids : ['3483426'] with mtr_portfolio_ids : ['25799757'] and first list_photo_ids : []
new path : /proc/2228987/
Inside batchDatouExec : verbose : 0
List Step Type Loaded in datou : split_time_score
over limit max, limiting to limit_max 100
list_input_json : {}
origin : we have 1, WARNING: data may be incomplete, need to offset and complete !
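The `split_portfolio.py -i 25799757 -l 1000` suggestion above amounts to splitting an oversized portfolio (1059 photos against a limit of 1000) into sub-portfolios of at most `limit` photos each. A minimal sketch of that chunking, where the function name and signature are assumptions rather than the real script's interface:

```python
def split_portfolio(photo_ids, limit=1000):
    # Split a list of photo ids into sub-portfolios of at most `limit`
    # photos, mirroring what `split_portfolio.py -i <id> -l <limit>` does
    # conceptually. Hypothetical helper, not the actual script.
    return [photo_ids[i:i + limit] for i in range(0, len(photo_ids), limit)]

# A 1059-photo portfolio yields one chunk of 1000 and one of 59.
chunks = split_portfolio(list(range(1059)), limit=1000)
```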
we have 0 missing photos in the step downloads : photo missing : []
try to delete the photos missing in DB
time to download the photos : 0.1332247257232666
About to test input to load
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 1
step1:split_time_score
Fri Aug 8 09:52:29 2025
VR 17-11-17 : for now, only for linear exec dependency trees; some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step (id = -1), built in the code of datou_exec
VR 22-3-18 : we should manage the first-step case here instead of building this step before datou_exec
begin split time score 2022-04-13 10:29:59 0
TODO : Insert select and so on
Begin split_port_in_batch_balle
thcls : [{'id': 3379, 'mtr_user_id': 31, 'name': 'learn_classif_flux_maj_generique_effnet_v2_s_02062022', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'aluminium,ela,film_pedb,flux_dev,jrm,pcm,pcnc,pehd_pp,pet_clair,refus,tapis_vide', 'svm_portfolios_learning': '5515864,5515840,5515844,5515850,6244400,6237996,6237998,5515847,5515841,5515868,5515866', 'photo_hashtag_type': 4374, 'photo_desc_type': 5680, 'type_classification': 'tf_classification2', 'hashtag_id_list': '493546845,492741797,2107760237,2107760238,495916461,560181804,1284539308,2107760239,2107755846,538914404,2107748999'}]
thcls : [{'id': 3513, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2_tf', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 4557, 'photo_desc_type': 5767, 'type_classification':
'tf_classification2', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('00', 52), ('01', 30), ('02', 40), ('03', 34), ('04', 6), ('05', 314), ('06', 365), ('07', 170), ('08', 47), ('09', 1))
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223
{1: 625}
08082025 25799757 Number of photos uploaded : 68 / 23040 (0%)
08082025 25799757 Number of photos tagged (waste types) : 625 / 68 (919%)
08082025 25799757 Number of photos tagged (volume) : 625 / 68 (919%)
elapsed_time : load_data_split_time_score 1.9073486328125e-06
elapsed_time : order_list_meta_photo_and_scores 0.0002524852752685547
elapsed_time : fill_and_build_computed_from_old_data 0.0038301944732666016
Caught exception ! Connect or reconnect !
Caught exception ! Connect or reconnect !
elapsed_time : insert_dashboard_record_day_entry 0.2093825340270996
Creating list_photo_by_hashtags
Hashtag is None (repeated 7 times)
elapsed_time : list_photo_by_hashtags 0.005866527557373047
photos_removed : len 0
elapsed_time : load_duplicate_info 0.010860681533813477
***** BEGIN SPLIT TIME *****
list printed: [[0], [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19], [20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58], [59, 60], [61, 62], [63, 64, 65, 66], [67]]
forced_hashtag: JRM
force hashtag to JRM
elapsed_time : SPLIT_TIME 0.006873369216918945
***** END SPLIT TIME *****
NUMBER BATCH : 7
list_ponderation used : [1e-05, 1e-05, 1e-05, 1e-05, 1e-05] , list_hashtag_class_create_as_list : ['jrm']
Caught exception ! Connect or reconnect !
ERROR in datou_step_exec, will save and exit !
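The SPLIT TIME output above groups 68 consecutive photo indices into 7 batches. A plausible sketch of such time-based batching is below; note that the idea that a new batch starts whenever the gap between consecutive timestamps exceeds a threshold, as well as the function name and `max_gap` parameter, are assumptions about what this step does, not the real split_time code:

```python
def split_time(timestamps, max_gap=60.0):
    # Group indices of time-sorted photos into batches, starting a new
    # batch whenever the gap to the previous photo exceeds max_gap
    # seconds. Hypothetical reconstruction of the SPLIT TIME step.
    batches = []
    for i, t in enumerate(timestamps):
        if i == 0 or t - timestamps[i - 1] > max_gap:
            batches.append([])
        batches[-1].append(i)
    return batches
```

With timestamps `[0, 10, 20, 200, 210]` and `max_gap=60`, this yields two batches, `[[0, 1, 2], [3, 4]]`, the same nested-index shape as the `list printed:` line above.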
(1213, 'Deadlock found when trying to get lock; try restarting transaction')
  File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2329, in datou_exec
    output = datou_step_exec(sNext, args, cache, context, map_info, verbose, mtr_user_id)
  File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2513, in datou_step_exec
    return lib_process.datou_step_split_time_score(param, json_param, args, context, map_info, verbose, mtr_user_id)
  File "/home/admin/workarea/git/Velours/python/mtr/datou/lib_step_exec/lib_step_process.py", line 2295, in datou_step_split_time_score
    one_result = split_port_in_batch_balle(
  File "/home/admin/workarea/git/Velours/python/misc/split_time_score.py", line 816, in split_port_in_batch_balle
    map_amount_per_hashtag, dict_nb_balles, list_of_portfolios_to_create = count_nb_balles(ds,
  File "/home/admin/workarea/git/Velours/python/mtr/math_fotonower/timeseries/lib_split_time_score.py", line 2783, in count_nb_balles
    context.pq.update_text_in_photos(list_photo_id_text, verbose)
  File "/home/admin/workarea/git/Velours/python/mtr/database_queries/portfolio_queries.py", line 366, in update_text_in_photos
    nb1 = self.gq.insert_many(query1, list_photo_id_text)
  File "/home/admin/workarea/git/Velours/python/mtr/database_queries/general_queries.py", line 167, in insert_many
    cursor.executemany(query, args[:limit])
  File "/home/admin/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 241, in executemany
    return self._do_execute_many(
  File "/home/admin/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 275, in _do_execute_many
    rows += self.execute(sql + postfix)
  File "/home/admin/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 179, in execute
    res = self._query(mogrified_query)
  File "/home/admin/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 330, in _query
    db.query(q)
  File "/home/admin/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 280, in query
    _mysql.connection.query(self, query)
[1375830339, 1375823652, 1375823351, 1375822654, 1375822566, 1375822282, 1375821967, 1375821650, 1375821632, 1375821415, 1375818234, 1375817610, 1375817545, 1375816918, 1375816856, 1375816801, 1375816751, 1375816611, 1375816587, 1375816561, 1375816537, 1375816511, 1375816486, 1375816386, 1375816351, 1375816327, 1375816300, 1375816271, 1375816242, 1375816087, 1375816081, 1375816036, 1375816035, 1375815990, 1375815983, 1375815944, 1375815939, 1375815892, 1375815889, 1375815843, 1375815841, 1375815320, 1375815275, 1375815227, 1375815182, 1375815135, 1375815108, 1375814957, 1375814954, 1375814952, 1375814950, 1375814948, 1375814946, 1375814944, 1375814942, 1375814940, 1375814937, 1375814921, 1375814900, 1375814892, 1375814871, 1375814859, 1375814827, 1375814820, 1375814575, 1375814562, 1375814545, 1375814451, 1375813306, 1375813302, 1375813297, 1375813289, 1375813279, 1375813223, 1375813219, 1375813216, 1375813214, 1375813213, 1375813179, 1375813177, 1375812211, 1375812173, 1375812169, 1375812166, 1375812153, 1375812145, 1375812125, 1375812037, 1375812017, 1375812003, 1375811995, 1375811991, 1375811987, 1375811715, 1375811708, 1375811689, 1375811685, 1375811617, 1375811613, 1375811576]
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 100
time used for this insertion : 0.5170855522155762
save_final
ERROR in last step split_time_score, (1213, 'Deadlock found when trying to get lock; try restarting transaction')
time spent for datou_step_exec : 0.7086515426635742
time spent to save output : 0.5260307788848877
total time spent for step 0 : 1.234682321548462
need to delete datou_research and reload, so keep current state 1
caffe_path_current :
About to save ! 2
After save, about to update current !
1.34user 0.80system 0:05.41elapsed 39%CPU (0avgtext+0avgdata 103812maxresident)k
1504inputs+24outputs (0major+49157minor)pagefaults 0swaps
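MySQL error 1213 explicitly says "try restarting transaction": a deadlock victim can simply roll back and retry the whole transaction. A minimal retry wrapper around an `executemany`-style insert is sketched below; the helper name, the retry/backoff policy, and the generic connection interface are assumptions, not the real `general_queries.insert_many`:

```python
import time

DEADLOCK = 1213  # MySQL ER_LOCK_DEADLOCK: transaction is safe to retry

def insert_many_with_retry(conn, query, rows, retries=3, backoff=0.5):
    # Retry the whole transaction on deadlock, as the 1213 message
    # suggests. Hypothetical helper over any DB-API connection whose
    # errors carry the MySQL error code in args[0].
    for attempt in range(retries):
        try:
            cur = conn.cursor()
            cur.executemany(query, rows)
            conn.commit()
            return cur.rowcount
        except Exception as e:
            conn.rollback()
            code = e.args[0] if getattr(e, "args", None) else None
            if code != DEADLOCK or attempt == retries - 1:
                raise
            # Exponential backoff before restarting the transaction.
            time.sleep(backoff * (2 ** attempt))
```

The key point is that the rollback and retry must cover the entire transaction, not just the failing statement, since InnoDB rolls back the deadlock victim's transaction.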