python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 20 -a ' -a 4883' -s datou_current_4883 -M 0 -S 0 -U 95,95,120 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 3390229 load datou : 4883 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? load thcls load pdts Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 we have a portfolio with more photos than limit : 2247>1000 please execute split_portfolio.py -i 26419647 -l 1000 size over we load limit photo not treated list_input_json: [] Current got : datou_id : 4883, datou_cur_ids : ['3652591'] with mtr_portfolio_ids : ['26419647'] and first list_photo_ids : [] new path : /proc/3390229/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) DONE and to test : checkNoCycle ! We are managing only one step so we do not consider checkConsistencyNbInputNbOutput ! We are managing only one step so we do not consider checkConsistencyTypeOutputInput ! List Step Type Loaded in datou : split_time_score over limit max, limiting to limit_max 100 list_input_json : [] origin We have 1 , WARNING: data may be incomplete, need to offset and complete ! we have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB time to download the photos : 0.028521299362182617 About to test input to load Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 1 step1:split_time_score Tue Sep 2 16:00:28 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec begin split time score 2022-04-13 10:29:59 0 TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 3379, 'mtr_user_id': 31, 'name': 'learn_classif_flux_maj_generique_effnet_v2_s_02062022', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'aluminium,ela,film_pedb,flux_dev,jrm,pcm,pcnc,pehd_pp,pet_clair,refus,tapis_vide', 'svm_portfolios_learning': '5515864,5515840,5515844,5515850,6244400,6237996,6237998,5515847,5515841,5515868,5515866', 'photo_hashtag_type': 4374, 'photo_desc_type': 5680, 'type_classification': 'tf_classification2', 'hashtag_id_list': '493546845,492741797,2107760237,2107760238,495916461,560181804,1284539308,2107760239,2107755846,538914404,2107748999'}] thcls : [{'id': 3513, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2_tf', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 4557, 'photo_desc_type': 5767, 'type_classification': 'tf_classification2', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('00', 41), ('01', 37), ('02', 35), ('03', 34), ('04', 7), ('05', 233), ('06', 253), ('07', 161), ('08', 201), ('09', 106), ('10', 317), ('11', 262), ('12', 28), ('13', 28), ('14', 245), ('15', 259)) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {1: 1611} 02092025 26419647 Nombre de photos uploadées : 505 / 23040 (2%) 02092025 26419647 Nombre de photos taguées (types de déchets): 1611 / 505 (319%) 02092025 26419647 Nombre de photos taguées (volume) : 1611 / 505 (319%) Catched exception ! (1213, 'Deadlock found when trying to get lock; try restarting transaction') Connect or reconnect ! ERROR in datou_step_exec, will save and exit ! (1213, 'Deadlock found when trying to get lock; try restarting transaction') File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2329, in datou_exec output = datou_step_exec(sNext, args, cache, context, map_info, verbose, mtr_user_id) File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2513, in datou_step_exec return lib_process.datou_step_split_time_score(param, json_param, args, context, map_info, verbose, mtr_user_id) File "/home/admin/workarea/git/Velours/python/mtr/datou/lib_step_exec/lib_step_process.py", line 2295, in datou_step_split_time_score one_result = split_port_in_batch_balle( File "/home/admin/workarea/git/Velours/python/misc/split_time_score.py", line 346, in split_port_in_batch_balle context.pq.order_portfolio_for_veolia(mtr_portfolio) File "/home/admin/workarea/git/Velours/python/mtr/database_queries/portfolio_queries.py", line 355, in order_portfolio_for_veolia nb = self.gq.insert(query) File "/home/admin/workarea/git/Velours/python/mtr/database_queries/general_queries.py", line 120, in insert number = cursor.execute(query, args) File "/home/admin/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 179, in execute res = self._query(mogrified_query) File "/home/admin/.local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 330, in _query db.query(q) File "/home/admin/.local/lib/python3.8/site-packages/MySQLdb/connections.py", line 280, in query _mysql.connection.query(self, query) [1380862319, 1380862318, 1380862316, 1380862297, 1380862296, 1380862294, 1380861596, 1380861595, 1380861594, 1380861553, 1380861552, 1380861551, 1380861546, 1380861545, 1380861544, 1380861543, 1380861542, 1380861541, 1380861501, 1380861500, 1380861499, 1380861498, 1380861431, 1380861430, 1380861383, 1380861382, 1380861381, 1380861380, 1380861327, 1380861326, 1380861325, 1380861324, 1380861323, 1380861275, 1380861274, 1380861272, 1380861270, 1380861206, 1380861205, 1380861204, 1380861203, 1380861202, 1380861201, 1380861126, 1380861125, 1380861124, 1380861123, 1380861121, 1380860883, 1380860881, 1380860702, 1380860701, 1380860700, 1380860699, 1380860567, 1380860565, 1380860562, 1380860560, 1380860533, 1380860532, 1380860531, 1380860530, 1380860529, 1380860528, 1380860384, 1380860382, 1380860380, 1380860378, 1380860376, 1380860192, 1380860191, 1380860190, 1380860189, 1380860188, 1380860092, 1380860091, 1380860090, 1380860035, 1380860033, 1380860031, 1380860029, 1380859948, 1380859947, 1380859946, 1380859945, 1380859944, 1380859943, 1380859879, 1380859877, 1380859875, 1380859873, 1380859742, 1380859741, 1380859740, 1380859739, 1380859738, 1380859737, 1380859671, 1380859670, 1380859669] begin to insert list_values into mtr_datou_result : length of list_values in save_final : 100 time used for this insertion : 0.41577911376953125 save_final ERROR in last step split_time_score, (1213, 'Deadlock found when trying to get lock; try restarting transaction') time spend for datou_step_exec : 1.5357849597930908 time spend to save output : 0.4257826805114746 total time spend for step 0 : 1.9615676403045654 need to delete datou_research and reload, so keep current state 1 caffe_path_current : About to save ! 2 After save, about to update current ! 1.50user 0.67system 0:04.97elapsed 43%CPU (0avgtext+0avgdata 103428maxresident)k 976inputs+24outputs (15major+49472minor)pagefaults 0swaps