python /home/admin/mtr/script_for_cron.py -j default -m 20 -a 'python3 ~/workarea/git/Velours/python/prod/datou.py -j batch_current -a 4742 ' -s traitement_4742 -M 0 -S 0 -U 100,80,95 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/home/admin/workarea/git/apy', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 616976 load datou : 4742 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition ! WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13168 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 13171 blur_detection have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13172 brightness have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13173 send_mail_cod have less inputs used (4) than in the step definition (5) : maybe we manage optionnal inputs ! Step 13175 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 13169 have datatype=11 whereas input 0 of step 13167 have datatype=2 WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 3 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 1 of step 13167 doesn't seem to be define in the database( WARNING : output 0 of step 13174 have datatype=10 whereas input 3 of step 13173 have datatype=6 WARNING : type of input 1 of step 13174 doesn't seem to be define in the database( WARNING : output 1 of step 13167 have datatype=7 whereas input 1 of step 13174 have datatype=None WARNING : type of output 1 of step 13174 doesn't seem to be define in the database( WARNING : type of input 4 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13169 doesn't seem to be define in the database( WARNING : type of input 1 of step 13170 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : id de la photo (peut être local ou global) was removed should we ? donnée sous forme de nombre was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? load thcls load pdts Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 4742, datou_cur_ids : ['2812250'] with mtr_portfolio_ids : ['22731302'] and first list_photo_ids : [] new path : /proc/616976/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition ! WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13168 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 13171 blur_detection have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13172 brightness have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13173 send_mail_cod have less inputs used (4) than in the step definition (5) : maybe we manage optionnal inputs ! Step 13175 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 13169 have datatype=11 whereas input 0 of step 13167 have datatype=2 WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 3 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 1 of step 13167 doesn't seem to be define in the database( WARNING : output 0 of step 13174 have datatype=10 whereas input 3 of step 13173 have datatype=6 WARNING : type of input 1 of step 13174 doesn't seem to be define in the database( WARNING : output 1 of step 13167 have datatype=7 whereas input 1 of step 13174 have datatype=None WARNING : type of output 1 of step 13174 doesn't seem to be define in the database( WARNING : type of input 4 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13169 doesn't seem to be define in the database( WARNING : type of input 1 of step 13170 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : copy_chis, consolidate_hashtags_from_manual_portfolio, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, send_mail_cod, split_time_score over limit max, limiting to limit_max 100 list_input_json : [] origin We have 1 , BFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 5 ; length of list_pids : 5 ; length of list_args : 5 time to download the photos : 1.469677209854126 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 9 step1:copy_chis Fri May 9 14:34:19 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Begin step datou_step_copy_crop batch 1 Loaded 46 chid ids of type : 4855 batch 1 Loaded 46 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : copy_chis we use saveGeneral [1356446088, 1356446064, 1356445493, 1356445457, 1356445406] Looping around the photos to save general results len do output : 5 /1356446088Didn't retrieve data . /1356446064Didn't retrieve data . /1356445493Didn't retrieve data . /1356445457Didn't retrieve data . /1356445406Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446088', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446064', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445493', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445457', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445406', None, None, None, None, None, '2812250') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 10 time used for this insertion : 0.016568660736083984 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.09345769882202148 time spend to save output : 0.016947507858276367 total time spend for step 1 : 0.11040520668029785 step2:consolidate_hashtags_from_manual_portfolio Fri May 9 14:34:19 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step consolidate_hashtags_from_manual_portfolio Iterating over portfolio : 22731302 on est dans le IF portfolio mere 26T SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22731302 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou')) To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22731302 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou')) To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22731302 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou')) To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=9745827 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou')) To do TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736613 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 272 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 59 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736614 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 272 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 107 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736615 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736616 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 272 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 123 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736617 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 80 chid ids of type : 4855 begin to find the sub_photo_id : batch 1 Loaded 38 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736622 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736624 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736625 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736626 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736627 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736628 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736629 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736631 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 213 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 100 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736632 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 279 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 136 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736634 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 59 chid ids of type : 4855 begin to find the sub_photo_id : batch 1 Loaded 29 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736635 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 279 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 166 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736636 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 272 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 184 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736637 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 272 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 199 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22736638 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 272 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 211 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! To test ! Use context local managing function ! Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : consolidate_hashtags_from_manual_portfolio we use saveGeneral [1356446088, 1356446064, 1356445493, 1356445457, 1356445406] Looping around the photos to save general results len do output : 6 /Tassign value string error string indices must be integers invalid literal for int() with base 10: 'T' begin to insert list_values into mtr_datou_result : length of list_values in save_final : 5 time used for this insertion : 0.016963720321655273 save_final save missing photos in datou_result : time spend for datou_step_exec : 3.793484687805176 time spend to save output : 0.017142772674560547 total time spend for step 2 : 3.8106274604797363 step3:rle_unique_nms_with_priority Fri May 9 14:34:23 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 158 chid ids of type : 4857 seulement à utiliser dans la step consolidation batch 1 Loaded 47 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.03287625312805176 seulement à utiliser dans la step consolidation batch 1 Loaded 49 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.033089637756347656 seulement à utiliser dans la step consolidation batch 1 Loaded 51 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.0328984260559082 seulement à utiliser dans la step consolidation batch 1 Loaded 64 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.0381929874420166 seulement à utiliser dans la step consolidation batch 1 Loaded 7 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.027161598205566406 map_output_result : {1356446088: (0.08213998155228935, 'Should be the crop_list due to order', 0.016648852109876194), 1356446064: (0.08213998155228935, 'Should be the crop_list due to order', 0.18779518621886213), 1356445493: (0.08213998155228935, 'Should be the crop_list due to order', 0.10702379471243852), 1356445457: (0.08213998155228935, 'Should be the crop_list due to order', 0.09923207472026996), 1356445406: (0.08213998155228935, 'Should be the crop_list due to order', 0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1356446088, 1356446064, 1356445493, 1356445457, 1356445406] Looping around the photos to save general results len do output : 5 /1356446088.Didn't retrieve data . /1356446064.Didn't retrieve data . /1356445493.Didn't retrieve data . /1356445457.Didn't retrieve data . /1356445406.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446088', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446064', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445493', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445457', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445406', None, None, None, None, None, '2812250') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 15 time used for this insertion : 0.015860795974731445 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.7306721210479736 time spend to save output : 0.01618504524230957 total time spend for step 3 : 1.7468571662902832 step4:ventilate_hashtags_in_portfolio Fri May 9 14:34:24 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 22731302 get user id for portfolio 22731302 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22731302 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22731302 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1 To do To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22731302 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1 To do lien utilise dans velours : https://www.fotonower.com/velours/22736613,22736614,22736615,22736616,22736617,22736622,22736624,22736625,22736626,22736627,22736628,22736629,22736631,22736632,22736634,22736635,22736636,22736637,22736638?tags=papier,carton,metal,pet_clair,pehd,ela,textiles,verre,organique,dasri,masque,encombrant,autre_non_emballage,environnement,flux_dev,mal_croppe,flou,film_dev_souple,sac_om_plein&datou_id_consolidate=4742&port_consolidate=22731302 Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1356446088, 1356446064, 1356445493, 1356445457, 1356445406] Looping around the photos to save general results len do output : 1 /22731302. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446088', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446064', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445493', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445457', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445406', None, None, None, None, None, '2812250') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 6 time used for this insertion : 0.016575336456298828 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.2003464698791504 time spend to save output : 0.016811370849609375 total time spend for step 4 : 1.2171578407287598 step5:final Fri May 9 14:34:26 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1356446088: ('0.25459009411533207',), 1356446064: ('0.25459009411533207',), 1356445493: ('0.25459009411533207',), 1356445457: ('0.25459009411533207',), 1356445406: ('0.25459009411533207',)} new output for save of step final : {1356446088: ('0.25459009411533207',), 1356446064: ('0.25459009411533207',), 1356445493: ('0.25459009411533207',), 1356445457: ('0.25459009411533207',), 1356445406: ('0.25459009411533207',)} [1356446088, 1356446064, 1356445493, 1356445457, 1356445406] Looping around the photos to save general results len do output : 5 /1356446088.Didn't retrieve data . /1356446064.Didn't retrieve data . /1356445493.Didn't retrieve data . /1356445457.Didn't retrieve data . /1356445406.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446088', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446064', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445493', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445457', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445406', None, None, None, None, None, '2812250') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 15 time used for this insertion : 0.016266822814941406 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.04510688781738281 time spend to save output : 0.016620635986328125 total time spend for step 5 : 0.06172752380371094 step6:blur_detection Fri May 9 14:34:26 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection toutes les photos sont déjà traitées, on saute les calculs Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 5 time used for this insertion : 0.009809255599975586 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 5 time used for this insertion : 0.009899616241455078 save missing photos in datou_result : time spend for datou_step_exec : 0.023215770721435547 time spend to save output : 0.023699283599853516 total time spend for step 6 : 0.04691505432128906 step7:brightness Fri May 9 14:34:26 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness toutes les photos sont déjà traitées, on saute les calculs Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 5 time used for this insertion : 0.007756471633911133 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 5 time used for this insertion : 0.00907135009765625 save missing photos in datou_result : time spend for datou_step_exec : 0.024484634399414062 time spend to save output : 0.0216977596282959 total time spend for step 7 : 0.04618239402770996 step8:send_mail_cod Fri May 9 14:34:26 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin in order to get the selector url, please entre the license of selector results_COD_P22731302_09-05-2025_14_34_26.pdf 22826877 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette228268771746794066 22826878 imagette228268781746794069 22826879 imagette228268791746794069 22826880 imagette228268801746794069 22826881 imagette228268811746794069 22826882 imagette228268821746794069 22826883 imagette228268831746794069 22826884 change filename to text .change filename to text .change filename to text .imagette228268841746794069 22826893 imagette228268931746794069 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=22731302 and hashtag_type = 4857 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/22736613,22736614,22736615,22736616,22736617,22736622,22736624,22736625,22736626,22736627,22736628,22736629,22736631,22736632,22736634,22736635,22736636,22736637,22736638?tags=papier,carton,metal,pet_clair,pehd,ela,textiles,verre,organique,dasri,masque,encombrant,autre_non_emballage,environnement,flux_dev,mal_croppe,flou,film_dev_souple,sac_om_plein&datou_id_consolidate=4742&port_consolidate=22731302 your option no_mail is active, we will not send the real mail to your client args[1356446088] : ((1356446088, 3396.413624889299, 492609224), (1356446088, -0.2916178789411884, 496442774), '0.25459009411533207') We are sending mail with results at kexin@fotonower.com args[1356446064] : ((1356446064, 2397.008330113544, 492609224), (1356446064, -0.3939260028864529, 496442774), '0.25459009411533207') We are sending mail with results at kexin@fotonower.com args[1356445493] : ((1356445493, 1063.5541864861912, 2107751945), (1356445493, -0.21523270463443878, 496442774), '0.25459009411533207') We are sending mail with results at kexin@fotonower.com args[1356445457] : ((1356445457, 2309.046559119176, 492609224), (1356445457, -0.24409830392719728, 496442774), '0.25459009411533207') We are sending mail with results at kexin@fotonower.com args[1356445406] : ((1356445406, 307.41366627401385, 492688767), (1356445406, -1.1439963094382846, 501862349), '0.25459009411533207') We are sending mail with results at kexin@fotonower.com refus_total : 0.25459009411533207 2022-04-13 10:29:59 0 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22731302_09-05-2025_14_34_26.pdf results_COD_P22731302_09-05-2025_14_34_26.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22731302_09-05-2025_14_34_26.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('4742','22731302','results_COD_P22731302_09-05-2025_14_34_26.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22731302_09-05-2025_14_34_26.pdf','pdf','','0.46','0.25459009411533207') Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1356446088, 1356446064, 1356445493, 1356445457, 1356445406] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446088', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446064', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445493', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445457', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445406', None, None, None, None, None, '2812250') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 5 time used for this insertion : 0.012758255004882812 save_final save missing photos in datou_result : time spend for datou_step_exec : 5.413686752319336 time spend to save output : 0.012940645217895508 total time spend for step 8 : 5.4266273975372314 step9:split_time_score Fri May 9 14:34:31 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('34', 4), ('35', 1)) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 06052025 22731302 Nombre de photos uploadées : 5 / 23040 (0%) 06052025 22731302 Nombre de photos taguées (types de déchets): 0 / 5 (0%) 06052025 22731302 Nombre de photos taguées (volume) : 0 / 5 (0%) elapsed_time : load_data_split_time_score 1.9073486328125e-06 elapsed_time : order_list_meta_photo_and_scores 5.245208740234375e-06 ????? elapsed_time : fill_and_build_computed_from_old_data 0.0003266334533691406 elapsed_time : insert_dashboard_record_day_entry 0.03165149688720703 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.5180177153219047 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22731300_09-05-2025_14_31_53.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22731300 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition ! WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13168 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 13171 blur_detection have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13172 brightness have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13173 send_mail_cod have less inputs used (4) than in the step definition (5) : maybe we manage optionnal inputs ! Step 13175 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 13169 have datatype=11 whereas input 0 of step 13167 have datatype=2 WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 3 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 1 of step 13167 doesn't seem to be define in the database( WARNING : output 0 of step 13174 have datatype=10 whereas input 3 of step 13173 have datatype=6 WARNING : type of input 1 of step 13174 doesn't seem to be define in the database( WARNING : output 1 of step 13167 have datatype=7 whereas input 1 of step 13174 have datatype=None WARNING : type of output 1 of step 13174 doesn't seem to be define in the database( WARNING : type of input 4 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13169 doesn't seem to be define in the database( WARNING : type of input 1 of step 13170 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22731300 AND mptpi.`type`=4857 To do Qualite : 0.18675044049694492 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22731301_09-05-2025_14_32_23.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22731301 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition ! WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13168 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 13171 blur_detection have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13172 brightness have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13173 send_mail_cod have less inputs used (4) than in the step definition (5) : maybe we manage optionnal inputs ! Step 13175 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 13169 have datatype=11 whereas input 0 of step 13167 have datatype=2 WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 3 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 1 of step 13167 doesn't seem to be define in the database( WARNING : output 0 of step 13174 have datatype=10 whereas input 3 of step 13173 have datatype=6 WARNING : type of input 1 of step 13174 doesn't seem to be define in the database( WARNING : output 1 of step 13167 have datatype=7 whereas input 1 of step 13174 have datatype=None WARNING : type of output 1 of step 13174 doesn't seem to be define in the database( WARNING : type of input 4 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13169 doesn't seem to be define in the database( WARNING : type of input 1 of step 13170 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22731301 AND mptpi.`type`=4857 To do Qualite : 0.25459009411533207 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22731302_09-05-2025_14_34_26.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22731302 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition ! WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13168 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 13171 blur_detection have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13172 brightness have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13173 send_mail_cod have less inputs used (4) than in the step definition (5) : maybe we manage optionnal inputs ! Step 13175 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 13169 have datatype=11 whereas input 0 of step 13167 have datatype=2 WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 3 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 1 of step 13167 doesn't seem to be define in the database( WARNING : output 0 of step 13174 have datatype=10 whereas input 3 of step 13173 have datatype=6 WARNING : type of input 1 of step 13174 doesn't seem to be define in the database( WARNING : output 1 of step 13167 have datatype=7 whereas input 1 of step 13174 have datatype=None WARNING : type of output 1 of step 13174 doesn't seem to be define in the database( WARNING : type of input 4 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13169 doesn't seem to be define in the database( WARNING : type of input 1 of step 13170 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22731302 AND mptpi.`type`=4857 To do find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P22731303_06-05-2025_22_14_07.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22731303 order by id desc limit 1 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P22731304_06-05-2025_21_53_37.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22731304 order by id desc limit 1 NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'06052025': {'nb_upload': 5, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1356446088, 1356446064, 1356445493, 1356445457, 1356445406] Looping around the photos to save general results len do output : 1 /22731302Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446088', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356446064', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445493', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445457', None, None, None, None, None, '2812250') ('4742', None, None, None, None, None, None, None, '2812250') ('4742', '22731302', '1356445406', None, None, None, None, None, '2812250') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 6 time used for this insertion : 0.015596389770507812 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.0961823463439941 time spend to save output : 0.015816926956176758 total time spend for step 9 : 1.111999273300171 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 5 set_done_treatment 3.23user 1.89system 0:17.31elapsed 29%CPU (0avgtext+0avgdata 222564maxresident)k 1600inputs+51288outputs (30major+222531minor)pagefaults 0swaps