python /home/admin/mtr/script_for_cron.py -j default -m 20 -a 'python3 ~/workarea/git/Velours/python/prod/datou.py -j batch_current -C 2743412' -s traitement_sts -M 0 -S 0 -U 100,80,95
import MySQLdb succeeded
Import error (python version)
['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/home/admin/workarea/git/apy', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 3012297
load datou : 0
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependency between the last steps for the tile - detect - glue case.
We can either keep the dependency, or better keep an order compatible with the step ids when a step has no sons, i.e. a lexical order on (number_son, step_id). DONE; still to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the DB !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : step 0 init_dummy_multi_datou is not linked in the step_by_step architecture !
WARNING : step 1294 init_dummy_multi_datou is not linked in the step_by_step architecture !
Number of inputs / outputs for each step checked !
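The "import MySQLdb succeeded" / "Import error (python version)" lines followed by a dump of the module search path suggest the cron wrapper probes its imports at startup. A minimal sketch of such a probe, assuming the wrapper works roughly this way (the function name and fallback behaviour are assumptions, not the actual script):

```python
import importlib
import sys


def probe_import(module_name):
    """Try to import a module and report the result, as the cron wrapper appears to do."""
    try:
        importlib.import_module(module_name)
        print("import %s succeeded" % module_name)
        return True
    except ImportError:
        # Dump the search path to help diagnose which install is being picked up.
        print("Import error (python version)")
        print(sys.path)
        return False
```

Printing `sys.path` on failure is what makes the log above useful: it shows the mix of Caffe, darknet, and Velours paths the interpreter searches before the system `dist-packages`.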
Here we check the consistency of output/input types across step connections !
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
DataTypes for each output/input checked !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0)
Tried to parse :
(photo_id, hashtag_id, score_max) was removed, should we ?
(x0, y0, x1, y1) was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(x0, y0, x1, y1) was removed, should we ?
photo path was removed, should we ?
load thcls
load pdts
Running datou job : batch_current
TODO : datou_current to load; maybe move it outside batchDatouExec
updating current state to 1
we have a portfolio with more photos than the limit : 11967 > 1000
please execute split_portfolio.py -i 21477534 -l 1000
size over the limit; we load only photos not yet treated, up to the limit
list_input_json: []
Current got : datou_id : 4323, datou_cur_ids : ['2743412'] with mtr_portfolio_ids : ['21477534'] and first list_photo_ids : []
new path : /proc/3012297/
Inside batchDatouExec : verbose : 0
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to re-order the steps !
We currently have an error because there is no dependency between the last steps for the tile - detect - glue case.
We can either keep the dependency, or better keep an order compatible with the step ids when a step has no sons, i.e. a lexical order on (number_son, step_id). DONE; still to test : checkNoCycle !
We are managing only one step, so we do not run checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not run checkConsistencyTypeOutputInput !
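The "can't parse json string Expecting value: line 1 column 1 (char 0)" line is what `json.loads` raises on non-JSON input such as the tuple descriptions shown above, after which the pipeline falls back to `list_input_json: []`. A hedged sketch of that fallback (the function name is an assumption; only the printed messages mirror the log):

```python
import json


def parse_list_input_json(raw):
    """Parse list_input_json, falling back to an empty list on bad input."""
    try:
        return json.loads(raw)
    except (TypeError, ValueError) as exc:
        # json.JSONDecodeError is a subclass of ValueError.
        print("ERROR or WARNING : can't parse json string", exc)
        print("Tried to parse :", raw)
        return []
```

With this shape, a legacy value like `"(photo_id, hashtag_id, score_max)"` degrades to `[]` instead of aborting the job.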
List Step Type Loaded in datou : split_time_score
over limit max, limiting to limit_max 100
list_input_json : []
origin
We have 1, WARNING: data may be incomplete, need to offset and complete !
we are missing 0 photos in the step downloads : photos missing : []
try to delete the photos missing in DB
time to download the photos : 12.349514722824097
About to test input to load
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 1
step 1 : split_time_score Fri Apr 11 10:19:37 2025
VR 17-11-17 : for now, only for linear exec dependency trees; some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependency tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case where we are at the first step, instead of building this step before datou_exec
begin split time score 2022-04-13 10:29:59 0
TODO : Insert select and so on
Begin split_port_in_batch_balle
thcls : [{'id': 3847, 'mtr_user_id': 31, 'name': 'learn_MM_generique_050224', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'aluminium,ela,emr,film_pedb,flux_dev,jrm,pcm,pcnc,pehd_pp,pet_clair,refus,tapis_vide', 'svm_portfolios_learning': '13096157,13096155,13096163,13096159,13301956,13095886,13096162,13096160,13358264,13096158,5515868,13276803', 'photo_hashtag_type': 4932, 'photo_desc_type': 6032, 'type_classification': 'tf_classification2', 'hashtag_id_list': '493546845,492741797,616987804,2107760237,2107760238,495916461,560181804,1284539308,2107760239,2107755846,538914404,2107748999'}]
thcls : [{'id': 3513, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2_tf', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 4557, 'photo_desc_type': 5767, 'type_classification': 'tf_classification2', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('00', 289), ('01', 280), ('02', 236), ('03', 309), ('04', 234), ('05', 279), ('06', 550), ('07', 574), ('08', 585), ('09', 593), ('23', 561), ('22', 544), ('21', 541), ('20', 547), ('19', 586), ('18', 539), ('17', 585), ('16', 586), ('15', 572), ('14', 591), ('13', 606), ('12', 589), ('11', 594), ('10', 597))
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {1: 11967}
17032025 21477534 Number of photos uploaded : 11967 / 23040 (51%)
17032025 21477534 Number of photos tagged (waste types) : 11967 / 11967 (100%)
17032025 21477534 Number of photos tagged (volume) : 11967 / 11967 (100%)
Caught exception ! Connect or reconnect !
ERROR in datou_step_exec, will save and exit !
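Earlier in the run, the pipeline refused the oversized portfolio (11967 > 1000 photos) and asked for `split_portfolio.py -i 21477534 -l 1000` to be run. A minimal sketch of the chunking that script presumably performs; the function name mirrors the script name but is an assumption, and the real script also creates the sub-portfolios in the DB:

```python
def split_portfolio(photo_ids, limit):
    """Split an oversized portfolio's photo list into batches of at most `limit` photos."""
    # Slice the list into consecutive chunks; the last chunk may be shorter.
    return [photo_ids[i:i + limit] for i in range(0, len(photo_ids), limit)]
```

For the portfolio above, 11967 photos with a limit of 1000 would yield 12 batches, the last holding 967 photos.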
(1205, 'Lock wait timeout exceeded; try restarting transaction')
  File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2329, in datou_exec
    output = datou_step_exec(sNext, args, cache, context, map_info, verbose, mtr_user_id)
  File "/home/admin/workarea/git/Velours/python/mtr/datou/datou_lib.py", line 2513, in datou_step_exec
    return lib_process.datou_step_split_time_score(param, json_param, args, context, map_info, verbose, mtr_user_id)
  File "/home/admin/workarea/git/Velours/python/mtr/datou/lib_step_exec/lib_step_process.py", line 2294, in datou_step_split_time_score
    one_result = split_port_in_batch_balle(
  File "/home/admin/workarea/git/Velours/python/misc/split_time_score.py", line 345, in split_port_in_batch_balle
    context.pq.order_portfolio_for_veolia(mtr_portfolio)
  File "/home/admin/workarea/git/Velours/python/mtr/database_queries/portfolio_queries.py", line 352, in order_portfolio_for_veolia
    nb = self.gq.insert(query)
  File "/home/admin/workarea/git/Velours/python/mtr/database_queries/general_queries.py", line 112, in insert
    number = cursor.execute(query, args)
  File "/usr/local/lib/python3.8/dist-packages/MySQLdb/cursors.py", line 209, in execute
    res = self._query(query)
  File "/usr/local/lib/python3.8/dist-packages/MySQLdb/cursors.py", line 315, in _query
    db.query(q)
  File "/usr/local/lib/python3.8/dist-packages/MySQLdb/connections.py", line 239, in query
    _mysql.connection.query(self, query)
[1350161710, 1350161709, 1350161707, 1350161704, 1350161701, 1350161699, 1350161687, 1350161685, 1350161684, 1350161683, 1350161681, 1350161680, 1350161674, 1350161672, 1350161668, 1350161663, 1350161661, 1350161659, 1350161657, 1350161655, 1350161653, 1350161650, 1350161647, 1350161641, 1350161630, 1350161628, 1350161627, 1350161626, 1350161624, 1350161622, 1350161611, 1350161607, 1350161604, 1350161600, 1350161598, 1350161596, 1350161594, 1350161593, 1350161592, 1350161589, 1350161586, 1350161583, 1350161566, 1350161565, 1350161564, 1350161561, 1350161556, 1350161553, 1350161537, 1350161536, 1350161535, 1350161532, 1350161529, 1350161526, 1350161515, 1350161514, 1350161512, 1350161511, 1350161505, 1350161502, 1350161492, 1350161490, 1350161488, 1350161487, 1350161485, 1350161484, 1350161477, 1350161474, 1350161470, 1350161466, 1350161464, 1350161462, 1350161459, 1350161458, 1350161456, 1350161453, 1350161450, 1350161447, 1350161437, 1350161434, 1350161433, 1350161432, 1350161430, 1350161429, 1350161423, 1350161419, 1350161416, 1350161412, 1350161408, 1350161406, 1350161403, 1350161402, 1350161400, 1350161397, 1350161395, 1350161391, 1350161384, 1350161382, 1350161381, 1350161380]
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 100
time used for this insertion : 0.658698558807373
save_final
ERROR in last step split_time_score, (1205, 'Lock wait timeout exceeded; try restarting transaction')
time spent in datou_step_exec : 83.6911187171936
time spent saving output : 0.6624276638031006
total time spent on step 0 : 84.3535463809967
caffe_path_current :
About to save ! 2
After save, about to update current !
ret : 2
len(input) + len(total_photo_id_missing) : 1
set_done_treatment
1.11user 0.86system 1:40.24elapsed 1%CPU (0avgtext+0avgdata 122096maxresident)k
80inputs+24outputs (1major+54423minor)pagefaults 0swaps
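The step failed on MySQL error 1205, whose message itself suggests restarting the transaction. One common mitigation is to wrap the insert in a retry loop with backoff. A hedged sketch under the assumption that the caller can re-issue the statement safely; `execute_insert` is a stand-in for the real `general_queries.insert` call, and the retry policy is an assumption:

```python
import time

# MySQL error code for "Lock wait timeout exceeded; try restarting transaction".
LOCK_WAIT_TIMEOUT = 1205


def insert_with_retry(execute_insert, query, retries=3, delay=1.0):
    """Run execute_insert(query), restarting on lock-wait timeouts with exponential backoff."""
    for attempt in range(retries):
        try:
            return execute_insert(query)
        except Exception as exc:
            # MySQLdb errors carry the numeric code as the first args element.
            code = exc.args[0] if exc.args else None
            if code != LOCK_WAIT_TIMEOUT or attempt == retries - 1:
                raise  # not a lock timeout, or out of retries: propagate
            time.sleep(delay * (2 ** attempt))  # back off before retrying
```

Retrying only helps if the holding transaction eventually commits; a timeout that recurs on every attempt usually points to a long-running transaction or missing index that should be fixed instead.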