python /home/admin/mtr/script_for_cron.py -j default -m 20 -a 'python3 ~/workarea/git/Velours/python/prod/datou.py -j batch_current -a 4742' -s traitement_4742 -M 0 -S 0 -U 100,80,95
import MySQLdb succeeded
Import error (python version)
['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/home/admin/workarea/git/apy', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 2286185
load datou : 4742
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycles checked; now we need to reorder the steps !
We currently have an error because there is no dependence between the last steps in the tile - detect - glue case
We can either keep the dependence or, preferably, keep an order compatible with the step ids when a step has no sons, i.e. a lexical order : (number_son, step_id)
All sons are already in the current list ! (x6)
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
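The reordering described above (a cycle check, then a deterministic order keyed on (number_son, step_id) for steps with no dependence between them) can be sketched roughly as follows. This is a hypothetical reconstruction, not the actual datou code: the function name, the edge representation, and the tie-break direction are all assumptions.

```python
# Hypothetical sketch of the cycle check + step reordering described in the
# log: steps are ordered topologically, and ties (steps with no dependence
# between them) are broken by the lexical key (number_of_sons, step_id).
import heapq

def reorder_steps(steps, edges):
    """steps: iterable of step ids; edges: (parent, son) dependency pairs."""
    sons = {s: set() for s in steps}
    indeg = {s: 0 for s in steps}
    for parent, son in edges:
        if son not in sons[parent]:
            sons[parent].add(son)
            indeg[son] += 1
    # Min-heap keyed by (number_of_sons, step_id) for a deterministic order.
    ready = [(len(sons[s]), s) for s in steps if indeg[s] == 0]
    heapq.heapify(ready)
    order = []
    while ready:
        _, step = heapq.heappop(ready)
        order.append(step)
        for son in sons[step]:
            indeg[son] -= 1
            if indeg[son] == 0:
                heapq.heappush(ready, (len(sons[son]), son))
    if len(order) != len(sons):
        raise ValueError("cycle detected in the datou graph")
    return order
```

If the graph has a cycle, some steps never reach in-degree zero and the length check fails, which matches the "checkNoCycle" behavior the log mentions.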
WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition !
WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13168 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 13171 blur_detection has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13172 brightness has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13173 send_mail_cod has fewer inputs used (4) than in the step definition (5) : maybe we manage optional inputs !
Step 13175 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 13169 has datatype=11 whereas input 0 of step 13167 has datatype=2
WARNING : the type of output 1 of step 13170 does not seem to be defined in the database
WARNING : the type of input 3 of step 13168 does not seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final ! (x2)
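A checkConsistencyNbInputNbOutput-style check, producing warnings shaped like the ones above, could look as follows. This is a minimal sketch under assumed names; only the message wording is taken from the log.

```python
# Hypothetical sketch: compare the number of inputs/outputs actually wired to
# a step against the step definition, emitting the log's warning messages.
def check_io_counts(step_id, step_type, n_used_in, n_used_out, n_def_in, n_def_out):
    msgs = []
    if n_used_in > n_def_in:
        msgs.append("WARNING : number of inputs for step %d %s is not consistent : "
                    "%d used against %d in the step definition !"
                    % (step_id, step_type, n_used_in, n_def_in))
    elif n_used_in < n_def_in:
        msgs.append("Step %d %s has fewer inputs used (%d) than in the step "
                    "definition (%d) : maybe we manage optional inputs !"
                    % (step_id, step_type, n_used_in, n_def_in))
    if n_used_out > n_def_out:
        msgs.append("WARNING : number of outputs for step %d %s is not consistent : "
                    "%d used against %d in the step definition !"
                    % (step_id, step_type, n_used_out, n_def_out))
    elif n_used_out < n_def_out:
        msgs.append("Step %d %s has fewer outputs used (%d) than in the step "
                    "definition (%d) : some outputs may not be used !"
                    % (step_id, step_type, n_used_out, n_def_out))
    return msgs
```

Note the asymmetry the log also shows: using more inputs/outputs than defined is a hard warning, while using fewer is only flagged as possibly-optional.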
WARNING : the type of output 1 of step 13170 does not seem to be defined in the database
WARNING : the type of input 1 of step 13167 does not seem to be defined in the database
WARNING : output 0 of step 13174 has datatype=10 whereas input 3 of step 13173 has datatype=6
WARNING : the type of input 1 of step 13174 does not seem to be defined in the database
WARNING : output 1 of step 13167 has datatype=7 whereas input 1 of step 13174 has datatype=None
WARNING : the type of output 1 of step 13174 does not seem to be defined in the database
WARNING : the type of input 4 of step 13168 does not seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : the type of output 1 of step 13169 does not seem to be defined in the database
WARNING : the type of input 1 of step 13170 does not seem to be defined in the database
DataTypes for each output/input checked !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0)
Tried to parse :
photo id (can be local or global) was removed, should we ?
data given as a number was removed, should we ?
None was removed, should we ?
data given as text was removed, should we ?
photo path was removed, should we ?
data given as text was removed, should we ?
[ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed, should we ?
None was removed, should we ?
data given as text was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo id (can be local or global) was removed, should we ?
data given as text was removed, should we ? (x3)
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
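The JSON-parsing failure above (free-text descriptions such as "photo id (can be local or global)" are not valid JSON, so the parser logs "can't parse json string" and drops the value) can be sketched like this. The function name and log-list parameter are assumptions for illustration.

```python
# Hypothetical sketch of tolerant JSON parsing: on failure, record the
# "ERROR or WARNING : can't parse json string ..." messages from the log
# and return None instead of raising.
import json

def parse_json_or_drop(text, log):
    try:
        return json.loads(text)
    except (ValueError, TypeError) as exc:
        log.append("ERROR or WARNING : can't parse json string %s" % exc)
        log.append("Tried to parse : %s" % text)
        return None
```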
(photo_id, hashtag_id, score_max) was removed, should we ? (x5)
data given as text was removed, should we ?
None was removed, should we ?
data given as text was removed, should we ?
[ptf_id0,ptf_id1...] was removed, should we ?
load thcls
load pdts
Running datou job : batch_current
TODO datou_current to load; to do, maybe to take outside batchDatouExec
updating current state to 1
list_input_json: []
Current got : datou_id : 4742, datou_cur_ids : ['2838051'] with mtr_portfolio_ids : ['22889869'] and first list_photo_ids : []
new path : /proc/2286185/
Inside batchDatouExec : verbose : 0
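The "new path : /proc/2286185/" line suggests the job is tracked through its procfs entry; on Linux a process is alive exactly when that directory exists. A minimal sketch, assuming this is how the path is used (the helper names are not from the actual code):

```python
# Hypothetical sketch: build the /proc/<pid>/ path reported in the log and
# use its existence as a liveness check (Linux procfs convention).
import os

def proc_path(pid):
    return "/proc/%d/" % pid

def is_alive(pid):
    # On Linux, /proc/<pid>/ exists iff the process exists.
    return os.path.isdir(proc_path(pid))
```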
List Step Type Loaded in datou : copy_chis, consolidate_hashtags_from_manual_portfolio, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, send_mail_cod, split_time_score
over limit max, limiting to limit_max 100
list_input_json : []
origin We have 1 , BF BF BF BF BF BF
we have 0 missing photos in the step downloads :
photo missing : []
try to delete the photos missing in DB
length of list_filenames : 6 ; length of list_pids : 6 ; length of list_args : 6
time to download the photos : 1.4791662693023682
About to test input to load
we should then remove the video here; this would fix the bug of datou_current !
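The "over limit max, limiting to limit_max 100" message (presumably driven by the `-U 100,80,95` option) implies the photo list is truncated before download. A minimal sketch under that assumption:

```python
# Hypothetical sketch of the limit_max clamp suggested by the log message
# "over limit max, limiting to limit_max 100".
def apply_limit(photo_ids, limit_max, log):
    if len(photo_ids) > limit_max:
        log.append("over limit max, limiting to limit_max %d" % limit_max)
        return photo_ids[:limit_max]
    return photo_ids
```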
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 9
step1:copy_chis Tue May 13 11:08:45 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree; some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Begin step datou_step_copy_crop
batch 1 Loaded 85 chid ids of type : 4855
batch 1 Loaded 85 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : copy_chis, we use saveGeneral
[1357676358, 1357676347, 1357676341, 1357676331, 1357676324, 1357676318]
Looping over the photos to save general results
len of output : 6
/1357676358 Didn't retrieve data.
/1357676347 Didn't retrieve data.
/1357676341 Didn't retrieve data.
/1357676331 Didn't retrieve data.
/1357676324 Didn't retrieve data.
/1357676318 Didn't retrieve data.
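The "saveOutput not yet implemented for datou_step.type : ... we use saveGeneral" lines describe a dispatch-with-fallback pattern: step types without a dedicated save routine fall back to a generic one. A minimal sketch, with all names and the fallback behavior assumed for illustration:

```python
# Hypothetical sketch of the saveOutput dispatch seen in the log: look up a
# specialised saver per step type, else log the fallback and use saveGeneral.
def save_general(output):
    # Generic fallback: keep only rows where data was actually retrieved.
    return [row for row in output if row is not None]

def save_output(step_type, output, log, savers=None):
    savers = savers or {}            # step_type -> specialised save function
    saver = savers.get(step_type)
    if saver is None:
        log.append("saveOutput not yet implemented for datou_step.type : %s, "
                   "we use saveGeneral" % step_type)
        return save_general(output)
    return saver(output)
```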
before output type
Here is an output not treated by saveGeneral : Managing all output in save_final without adding information to mtr_datou_result
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676358', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676347', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676341', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676331', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676324', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676318', None, None, None, None, None, '2838051')
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 12
time used for this insertion : 0.02079916000366211
save_final : save missing photos in datou_result :
time spent for datou_step_exec : 0.12865638732910156
time spent to save output : 0.021280288696289062
total time spent for step 1 : 0.14993667602539062
step2:consolidate_hashtags_from_manual_portfolio Tue May 13 11:08:45 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree; some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependencies information, which could maybe be correctly interpreted
with default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies between the steps are analysed
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : For now we do not clean the datou structure correctly
beginning of datou step consolidate_hashtags_from_manual_portfolio
Iterating over portfolio : 22889869
we are in the IF of the parent portfolio 26T
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22889869 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou'))
To do
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22889869 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou'))
To do
To do
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id,
mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22889869 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou'))
To do
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=9745827 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou'))
To do
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895625 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 379 chid ids of type : 4855
begin to find the sub_photo_id : (x6)
batch 1 Loaded 121 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
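The SELECT statements above are built with the portfolio id, type, and hashtag list inlined into the SQL string. A sketch of how such a query could instead be assembled with DB-API placeholders (the `%s` paramstyle used by MySQLdb); the table and column names are taken from the log, but the helper itself and its reduced column list are assumptions:

```python
# Hypothetical sketch: build one of the mtr_port_to_port_ids queries with
# placeholders instead of string interpolation, returning (sql, params).
def build_port_query(portfolio_id, type_id, hashtags):
    placeholders = ",".join(["%s"] * len(hashtags))
    sql = ("SELECT mptpi.id, h.hashtag "
           "FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h "
           "WHERE h.hashtag_id=mptpi.hashtag_id "
           "AND mptpi.`mtr_portfolio_id_1`=%s AND mptpi.`type`=%s "
           "AND mptpi.`hashtag_id` IN (SELECT hashtag_id FROM MTRBack.hashtags "
           "WHERE hashtag IN (" + placeholders + "))")
    return sql, [portfolio_id, type_id] + list(hashtags)
```

Passing the values separately lets the driver do the quoting, which avoids both injection issues and the escaping mistakes that inline lists invite.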
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895626 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 369 chid ids of type : 4855
begin to find the sub_photo_id : (x5)
batch 1 Loaded 166 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895627 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 369 chid ids of type : 4855
begin to find the sub_photo_id : (x5)
batch 1 Loaded 204 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895628 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 62 chid ids of type : 4855
begin to find the sub_photo_id : (x2)
batch 1 Loaded 36 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895629 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 69 chid ids of type : 4855
begin to find the sub_photo_id :
batch 1 Loaded 45 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895634 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895636 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895637 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895638 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895639 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895640 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895641 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895643 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 369 chid ids of type : 4855
begin to find the sub_photo_id : (x5)
batch 1 Loaded 224 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895644 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 379 chid ids of type : 4855
begin to find the sub_photo_id : (x6)
batch 1 Loaded 231 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895646 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 96 chid ids of type : 4855
begin to find the sub_photo_id :
batch 1 Loaded 53 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895647 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895648 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895649 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chis from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22895650 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 379 chid ids of type : 4855
begin to find the sub_photo_id : (x6)
batch 1 Loaded 286 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
To test !
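The recurring "begin to insert list_values into mtr_datou_result" messages describe a batched insert of the per-photo result tuples. A sketch of that pattern, shown here against an in-memory SQLite table with a reduced, assumed schema (the real code presumably targets MySQL and the full nine-column table):

```python
# Hypothetical sketch of the batched insert into mtr_datou_result, using
# executemany on an in-memory SQLite table with a reduced schema.
import sqlite3

def save_final(list_values):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE mtr_datou_result "
                "(datou_id TEXT, mtr_portfolio_id TEXT, photo_id TEXT, datou_cur_id TEXT)")
    # One round-trip for the whole batch, as the timing lines in the log
    # ("time used for this insertion : 0.02...") suggest.
    con.executemany("INSERT INTO mtr_datou_result VALUES (?, ?, ?, ?)", list_values)
    con.commit()
    n = con.execute("SELECT COUNT(*) FROM mtr_datou_result").fetchone()[0]
    con.close()
    return n
```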
Use context local managing function !
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : consolidate_hashtags_from_manual_portfolio, we use saveGeneral
[1357676358, 1357676347, 1357676341, 1357676331, 1357676324, 1357676318]
Looping over the photos to save general results
len of output : 6
/T assign value string error : string indices must be integers
invalid literal for int() with base 10: 'T'
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 6
time used for this insertion : 0.014686346054077148
save_final : save missing photos in datou_result :
time spent for datou_step_exec : 5.939743995666504
time spent to save output : 0.014880180358886719
total time spent for step 2 : 5.954624176025391
step3:rle_unique_nms_with_priority Tue May 13 11:08:51 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree; some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies between the steps are analysed
complete output_args for input 0
We expect there is only one output; this code path is used when the outputs are not tuples or arrays (x3)
We expect there is only one output and this part is
used when the outputs are not tuples or arrays
We expect there is only one output; this code path is used when the outputs are not tuples or arrays (x2)
VR 22-3-18 : For now we do not clean the datou structure correctly
Begin step rle-unique-nms
batch 1 Loaded 286 chid ids of type : 4857
only to be used in the consolidation step
batch 1 Loaded 50 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
save time : 0.03216719627380371
only to be used in the consolidation step
batch 1 Loaded 65 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
save time : 0.7042162418365479
only to be used in the consolidation step
batch 1 Loaded 54 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
save time : 0.03357052803039551
only to be used in the consolidation step
batch 1 Loaded 70 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
save time : 0.0345003604888916
only to be used in the consolidation step
batch 1 Loaded 39 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
save time : 0.03235983848571777
only to be used in the consolidation step
batch 1 Loaded 8 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
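The rle-unique-nms step name suggests a non-maximum-suppression pass that keeps a single detection among overlapping ones, with a priority tie-break. The datou implementation is not shown in the log, so the following is only a generic sketch of NMS-with-priority over axis-aligned boxes; the IoU threshold, box format, and ordering rule are all assumptions.

```python
# Hypothetical sketch of an rle_unique_nms_with_priority-style step: among
# overlapping boxes keep one detection, preferring higher priority, then
# higher score. Boxes are (x0, y0, x1, y1).
def iou(a, b):
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def nms_with_priority(dets, thr=0.5):
    """dets: list of (box, score, priority); higher priority wins first."""
    kept = []
    for det in sorted(dets, key=lambda d: (d[2], d[1]), reverse=True):
        if all(iou(det[0], k[0]) < thr for k in kept):
            kept.append(det)
    return kept
```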
save time : 0.09896183013916016
map_output_result : {1357676358: (0.2230466100455236, 'Should be the crop_list due to order', 0.20578395381163964), 1357676347: (0.2230466100455236, 'Should be the crop_list due to order', 0.14773732366534872), 1357676341: (0.2230466100455236, 'Should be the crop_list due to order', 0.06853557003101353), 1357676331: (0.2230466100455236, 'Should be the crop_list due to order', 0.1376326482536981), 1357676324: (0.2230466100455236, 'Should be the crop_list due to order', 0.10274277087420733), 1357676318: (0.2230466100455236, 'Should be the crop_list due to order', 0.675847393637234)}
End step rle-unique-nms
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority, we use saveGeneral
[1357676358, 1357676347, 1357676341, 1357676331, 1357676324, 1357676318]
Looping over the photos to save general results
len of output : 6
/1357676358. Didn't retrieve data.
/1357676347. Didn't retrieve data.
/1357676341. Didn't retrieve data.
/1357676331. Didn't retrieve data.
/1357676324. Didn't retrieve data.
/1357676318. Didn't retrieve data.
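Each step ends with three timing lines (exec, save, total). That bookkeeping can be sketched as a small wrapper; the function names and the two-phase split are assumptions matching the shape of the log messages.

```python
# Hypothetical sketch of the per-step timing seen in the log: time the exec
# phase and the save phase separately, then report their sum.
import time

def timed(fn, *args):
    t0 = time.time()
    result = fn(*args)
    return result, time.time() - t0

def run_step(step_no, exec_fn, save_fn, log):
    output, t_exec = timed(exec_fn)
    _, t_save = timed(save_fn, output)
    log.append("time spent for datou_step_exec : %s" % t_exec)
    log.append("time spent to save output : %s" % t_save)
    log.append("total time spent for step %d : %s" % (step_no, t_exec + t_save))
    return output
```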
before output type
Used above
Here is an output not treated by saveGeneral : Managing all output in save_final without adding information to mtr_datou_result
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676358', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676347', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676341', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676331', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676324', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676318', None, None, None, None, None, '2838051')
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 18
time used for this insertion : 0.019801855087280273
save_final : save missing photos in datou_result :
time spent for datou_step_exec : 3.9678196907043457
time spent to save output : 0.020310640335083008
total time spent for step 3 : 3.9881303310394287
step4:ventilate_hashtags_in_portfolio Tue May 13 11:08:55 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree; some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependencies information, which could maybe be correctly interpreted
with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 22889869 get user id for portfolio 22889869 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22889869 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22889869 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in 
('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1 To do To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22889869 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1 To do lien utilise dans velours : https://www.fotonower.com/velours/22895625,22895626,22895627,22895628,22895629,22895634,22895636,22895637,22895638,22895639,22895640,22895641,22895643,22895644,22895646,22895647,22895648,22895649,22895650?tags=papier,carton,metal,pet_clair,pehd,ela,textiles,verre,organique,dasri,masque,encombrant,autre_non_emballage,environnement,flux_dev,mal_croppe,flou,film_dev_souple,sac_om_plein&datou_id_consolidate=4742&port_consolidate=22889869 Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1357676358, 1357676347, 1357676341, 1357676331, 1357676324, 1357676318] Looping around the photos to save general results len do output : 1 /22889869. 
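The save phase above reports the batch size and the elapsed time for each insertion into mtr_datou_result ("begin to insert list_values ... time used for this insertion"). A hedged sketch of that timed batch insert; the real pipeline talks to MySQL via MySQLdb (per the import at the top of the log), so the sqlite3 in-memory database and the simplified column list here are stand-ins chosen only to keep the sketch self-contained:

```python
import sqlite3
import time

# Hedged sketch of the timed batch insert reported in the log.
# Table and column names are simplified assumptions; the real target is
# a MySQL table MTRPhoto.mtr_datou_result with more columns.

def save_final(conn, list_values):
    print("begin to insert list_values into mtr_datou_result : "
          "length of list_values in save_final :", len(list_values))
    start = time.time()
    conn.executemany(
        "INSERT INTO mtr_datou_result (datou_id, portfolio_id, photo_id) "
        "VALUES (?, ?, ?)",
        list_values)
    conn.commit()
    print("time used for this insertion :", time.time() - start)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mtr_datou_result "
             "(datou_id TEXT, portfolio_id TEXT, photo_id TEXT)")
save_final(conn, [("4742", "22889869", "1357676358"),
                  ("4742", "22889869", "1357676347")])
```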
before output type
Here is an output not treated by saveGeneral : managing all outputs in save_final without adding information to mtr_datou_result
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676358', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676347', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676341', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676331', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676324', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676318', None, None, None, None, None, '2838051')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 7
time used for this insertion : 0.08386611938476562
save_final save missing photos in datou_result :
time spent for datou_step_exec : 1.2149603366851807
time spent to save output : 0.08406805992126465
total time spent for step 4 : 1.2990283966064453
step5:final Tue May 13 11:08:56 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependencies information; it could maybe be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
complete output_args for input 2
VR 22-3-18 : for now we do not clean the datou structure correctly
Beginning of datou step final !
Inside saveOutput : final : False verbose : 0
original output for save of step final : {1357676358: ('0.3307354527311239',), 1357676347: ('0.3307354527311239',), 1357676341: ('0.3307354527311239',), 1357676331: ('0.3307354527311239',), 1357676324: ('0.3307354527311239',), 1357676318: ('0.3307354527311239',)}
new output for save of step final : {1357676358: ('0.3307354527311239',), 1357676347: ('0.3307354527311239',), 1357676341: ('0.3307354527311239',), 1357676331: ('0.3307354527311239',), 1357676324: ('0.3307354527311239',), 1357676318: ('0.3307354527311239',)}
[1357676358, 1357676347, 1357676341, 1357676331, 1357676324, 1357676318]
Looping over the photos to save general results
len of output : 6
/1357676358. Didn't retrieve data.
/1357676347. Didn't retrieve data.
/1357676341. Didn't retrieve data.
/1357676331. Didn't retrieve data.
/1357676324. Didn't retrieve data.
/1357676318. Didn't retrieve data.
before output type
Used above
Used above
Managing all outputs in save_final without adding information to mtr_datou_result
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676358', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676347', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676341', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676331', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676324', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676318', None, None, None, None, None, '2838051')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 18
time used for this insertion : 0.06569409370422363
save_final save missing photos in datou_result :
time spent for datou_step_exec : 0.13600850105285645
time spent to save output : 0.06605935096740723
total time spent for step 5 : 0.20206785202026367
step6:blur_detection Tue May 13 11:08:56 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependencies information; it could maybe be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : for now we do not clean the datou structure correctly
inside step blur_detection
all photos are already processed, we skip the computation
Inside saveOutput : final : False verbose : 0
begin to insert list_values into class_photo_scores : length of list_values in save_photo_hashtag_id_thcl_score : 6
time used for this insertion : 0.008934497833251953
begin to insert list_values into photo_hahstag_ids : length of list_values in save_photo_hashtag_id_type : 6
time used for this insertion : 0.009890079498291016
save missing photos in datou_result :
time spent for datou_step_exec : 0.023091554641723633
time spent to save output : 0.022791385650634766
total time spent for step 6 : 0.0458829402923584
step7:brightness Tue May 13 11:08:56 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependencies information; it could maybe be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : for now we do not clean the datou structure correctly
inside step brightness computation
all photos are already processed, we skip the computation
Inside saveOutput : final : False verbose : 0
begin to insert list_values into class_photo_scores : length of list_values in save_photo_hashtag_id_thcl_score : 6
time used for this insertion : 0.008717060089111328
begin to insert list_values into photo_hahstag_ids : length of list_values in save_photo_hashtag_id_type : 6
time used for this insertion : 0.009113788604736328
save missing photos in datou_result :
time spent for datou_step_exec : 0.025808334350585938
time spent to save output : 0.022272825241088867
total time spent for step 7 : 0.048081159591674805
step8:send_mail_cod Tue May 13 11:08:56 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependencies information; it could maybe be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
complete output_args for input 0
complete output_args for input 1
complete output_args for input 2
Inconsistent number of inputs and outputs : a step which parallelizes and handles input errors by not emitting an output for that data can't be used in a tree of input/output dependencies
complete output_args for input 3
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
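Both blur_detection and brightness short-circuit when every photo already has a stored score ("all photos are already processed, we skip the computation"). A hedged sketch of that skip pattern; in the real pipeline the set of already-scored photo ids would come from the database, and compute_score is a hypothetical stand-in for the actual per-photo computation:

```python
# Hedged sketch of the "all photos are already processed, we skip the
# computation" behavior logged by blur_detection and brightness.
# already_scored is a plain set here (assumed: a DB lookup in reality);
# compute_score is a hypothetical stand-in.

def run_step(photo_ids, already_scored, compute_score):
    todo = [p for p in photo_ids if p not in already_scored]
    if not todo:
        print("all photos are already processed, we skip the computation")
        return {}
    return {p: compute_score(p) for p in todo}

# Every photo already has a score, so nothing is computed:
skipped = run_step([1357676358, 1357676347],
                   {1357676358, 1357676347},
                   compute_score=lambda p: 0.0)
# One photo is missing a score, so only that one is computed:
partial = run_step([1357676358, 1357676347],
                   {1357676358},
                   compute_score=lambda p: 0.5)
```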
VR 22-3-18 : for now we do not clean the datou structure correctly
inside step send_mail_cod
work_area: /home/admin
in order to get the selector url, please enter the license of selector
results_COD_P22889869_13-05-2025_11_08_56.pdf
22915897
change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette229158971747127336
22915898 imagette229158981747127338
22915899 imagette229158991747127338
22915900 imagette229159001747127338
22915901 imagette229159011747127338
22915902 imagette229159021747127338
22915903 imagette229159031747127338
22915904
change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette229159041747127338
22915913 imagette229159131747127340
SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=22889869 and hashtag_type = 4857 and pcr.hashtag_id = h.hashtag_id;
velour_link : https://www.fotonower.com/velours/22895625,22895626,22895627,22895628,22895629,22895634,22895636,22895637,22895638,22895639,22895640,22895641,22895643,22895644,22895646,22895647,22895648,22895649,22895650?tags=papier,carton,metal,pet_clair,pehd,ela,textiles,verre,organique,dasri,masque,encombrant,autre_non_emballage,environnement,flux_dev,mal_croppe,flou,film_dev_souple,sac_om_plein&datou_id_consolidate=4742&port_consolidate=22889869
your option no_mail is active, we will not send the real mail to your client
args[1357676358] : ((1357676358, 3770.4256844931706, 492609224), (1357676358, -0.27258079765348414, 496442774), '0.3307354527311239')
We are sending mail with results to kexin@fotonower.com
args[1357676347] : ((1357676347, 3524.6544553090457, 492609224), (1357676347, -0.22440133322366962, 496442774), '0.3307354527311239')
We are sending mail with results to kexin@fotonower.com
args[1357676341] : ((1357676341, 3302.9432031563097, 492609224), (1357676341, -0.40106026799779704, 496442774), '0.3307354527311239')
We are sending mail with results to kexin@fotonower.com
args[1357676331] : ((1357676331, 5268.757815833344, 492609224), (1357676331, -0.3006546947640278, 496442774), '0.3307354527311239')
We are sending mail with results to kexin@fotonower.com
args[1357676324] : ((1357676324, 998.3043450320747, 2107751945), (1357676324, -0.2827465239508777, 496442774), '0.3307354527311239')
We are sending mail with results to kexin@fotonower.com
args[1357676318] : ((1357676318, 427.09759103342196, 492688767), (1357676318, -0.6761987583071036, 501862349), '0.3307354527311239')
We are sending mail with results to kexin@fotonower.com
refus_total : 0.3307354527311239 2022-04-13 10:29:59 0
start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22889869_13-05-2025_11_08_56.pdf
results_COD_P22889869_13-05-2025_11_08_56.pdf uploaded to url
https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22889869_13-05-2025_11_08_56.pdf
start insert file to database
insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('4742','22889869','results_COD_P22889869_13-05-2025_11_08_56.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22889869_13-05-2025_11_08_56.pdf','pdf','','0.53','0.3307354527311239')
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : send_mail_cod, we use saveGeneral
[1357676358, 1357676347, 1357676341, 1357676331, 1357676324, 1357676318]
Looping over the photos to save general results
len of output : 0
before output type
Used above
Managing all outputs in save_final without adding information to mtr_datou_result
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676358', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676347', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676341', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676331', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676324', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676318', None, None, None, None, None, '2838051')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 6
time used for this insertion : 0.04576754570007324
save_final save missing photos in datou_result :
time spent for datou_step_exec : 5.465163707733154
time spent to save output : 0.045940399169921875
total time spent for step 8 : 5.511104106903076
step9:split_time_score Tue May 13 11:09:02 2025
VR 17-11-17 : for now, only for a linear exec dependencies tree, some outputs go to fill the inputs of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, cleaned, and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependencies information; it could maybe be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR but same_nb_input_output==True : this should be an optional input !
complete output_args for input 1
VR 22-3-18 : for now we do not clean the datou structure correctly
begin split time score
TODO : Insert select and so on
Begin split_port_in_batch_balle
thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}]
thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('04', 3), ('05', 3))
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223
{}
12052025 22889869 Number of photos uploaded : 6 / 23040 (0%)
12052025 22889869 Number of photos tagged (waste types) : 0 / 6 (0%)
12052025 22889869 Number of photos tagged (volume) : 0 / 6 (0%)
elapsed_time : load_data_split_time_score 2.1457672119140625e-06
elapsed_time : order_list_meta_photo_and_scores 5.7220458984375e-06
??????
elapsed_time : fill_and_build_computed_from_old_data 0.0003676414489746094
elapsed_time : insert_dashboard_record_day_entry 0.054857730865478516
We will return after consolidate, but for now we need the day; how to get it? For now it depends on the previous heavy steps
Quality : 0.7803717202126839
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22889868_13-05-2025_11_08_45.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22889868 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycles checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the case tile - detect - glue
We can either keep the dependence or not; it is better to keep an order compatible with the step ids when there are no sons, so a lexical order : (number_son, step_id)
All sons are already in the current list !
All sons are already in the current list !
All sons are already in the current list !
All sons are already in the current list !
All sons are already in the current list !
All sons are already in the current list !
DONE and to test : checkNoCycle !
Here we check the consistency of input/output counts between the given ones and those in the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition !
WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13168 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 13171 blur_detection has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13172 brightness has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13173 send_mail_cod has fewer inputs used (4) than in the step definition (5) : maybe we manage optional inputs !
Step 13175 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 13169 has datatype=11 whereas input 0 of step 13167 has datatype=2
WARNING : type of output 1 of step 13170 doesn't seem to be defined in the database
WARNING : type of input 3 of step 13168 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13170 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13167 doesn't seem to be defined in the database
WARNING : output 0 of step 13174 has datatype=10 whereas input 3 of step 13173 has datatype=6
WARNING : type of input 1 of step 13174 doesn't seem to be defined in the database
WARNING : output 1 of step 13167 has datatype=7 whereas input 1 of step 13174 has datatype=None
WARNING : type of output 1 of step 13174 doesn't seem to be defined in the database
WARNING : type of input 4 of step 13168 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13169 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13170 doesn't seem to be defined in the database
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
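The checkConsistencyNbInputNbOutput pass described in the log compares the inputs/outputs actually wired in the graph against the step definition: more used than defined is flagged as a WARNING, fewer used is tolerated as possible optional inputs. A hedged sketch of that comparison; the step-dictionary structure is an assumption, only the check logic and the message shapes come from the log:

```python
# Hedged sketch of checkConsistencyNbInputNbOutput: compare the
# inputs/outputs actually used for each step with the counts in the
# step definition. The dict-based step structure is an assumption.

def check_nb_input_output(steps):
    warnings = []
    for s in steps:
        for kind in ("inputs", "outputs"):
            used = s["used_" + kind]
            defined = s["defined_" + kind]
            if used > defined:
                warnings.append(
                    "WARNING : number of %s for step %d %s is not "
                    "consistent : %d used against %d in the step "
                    "definition !" % (kind, s["id"], s["type"], used, defined))
            elif used < defined:
                warnings.append(
                    "Step %d %s has fewer %s used (%d) than in the step "
                    "definition (%d) : maybe optional %s !"
                    % (s["id"], s["type"], kind, used, defined, kind))
    return warnings

msgs = check_nb_input_output([
    {"id": 13167, "type": "rle_unique_nms_with_priority",
     "used_inputs": 3, "defined_inputs": 1,
     "used_outputs": 2, "defined_outputs": 1},
])
```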
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22889868 AND mptpi.`type`=4857
To do
Quality : 0.3307354527311239
find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22889869_13-05-2025_11_08_56.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22889869 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycles checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the case tile - detect - glue
We can either keep the dependence or not; it is better to keep an order compatible with the step ids when there are no sons, so a lexical order : (number_son, step_id)
All sons are already in the current list !
All sons are already in the current list !
All sons are already in the current list !
All sons are already in the current list !
All sons are already in the current list !
All sons are already in the current list !
DONE and to test : checkNoCycle !
Here we check the consistency of input/output counts between the given ones and those in the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition !
WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13168 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 13171 blur_detection has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13172 brightness has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13173 send_mail_cod has fewer inputs used (4) than in the step definition (5) : maybe we manage optional inputs !
Step 13175 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 13169 has datatype=11 whereas input 0 of step 13167 has datatype=2
WARNING : type of output 1 of step 13170 doesn't seem to be defined in the database
WARNING : type of input 3 of step 13168 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13170 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13167 doesn't seem to be defined in the database
WARNING : output 0 of step 13174 has datatype=10 whereas input 3 of step 13173 has datatype=6
WARNING : type of input 1 of step 13174 doesn't seem to be defined in the database
WARNING : output 1 of step 13167 has datatype=7 whereas input 1 of step 13174 has datatype=None
WARNING : type of output 1 of step 13174 doesn't seem to be defined in the database
WARNING : type of input 4 of step 13168 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13169 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13170 doesn't seem to be defined in the database
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22889869 AND mptpi.`type`=4857
To do
NUMBER BATCH : 0
# DISPLAY ALL COLLECTED DATA : {'12052025': {'nb_upload': 6, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}}
Inside saveOutput : final : True verbose : 0
saveOutput not yet implemented for datou_step.type : split_time_score, we use saveGeneral
[1357676358, 1357676347, 1357676341, 1357676331, 1357676324, 1357676318]
Looping over the photos to save general results
len of output : 1
/22889869 Didn't retrieve data.
before output type
Here is an output not treated by saveGeneral : managing all outputs in save_final without adding information to mtr_datou_result
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676358', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676347', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676341', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676331', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676324', None, None, None, None, None, '2838051')
('4742', None, None, None, None, None, None, None, '2838051')
('4742', '22889869', '1357676318', None, None, None, None, None, '2838051')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 7
time used for this insertion : 0.07859945297241211
save_final save missing photos in datou_result :
time spent for datou_step_exec : 0.2797048091888428
time spent to save output : 0.07880258560180664
total time spent for step 9 : 0.3585073947906494
caffe_path_current :
About to save ! 2
After save, about to update current !
ret : 2
len(input) + len(total_photo_id_missing) : 6
set_done_treatment
3.54user 2.39system 0:20.94elapsed 28%CPU (0avgtext+0avgdata 223508maxresident)k
416inputs+61176outputs (7major+256964minor)pagefaults 0swaps
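Every step in the log ends with the same three-line timing summary ("time spent for datou_step_exec", "time spent to save output", "total time spent for step N"). A hedged sketch of a wrapper producing that summary; exec_fn and save_fn are hypothetical stand-ins for the real step-execution and save routines:

```python
import time

# Hedged sketch of the per-step timing summary printed at the end of
# every step in the log. exec_fn and save_fn are hypothetical
# stand-ins for datou_step_exec and saveOutput.

def timed_step(step_number, exec_fn, save_fn):
    t0 = time.time()
    output = exec_fn()          # run the step itself
    t1 = time.time()
    save_fn(output)             # persist its output
    t2 = time.time()
    print("time spent for datou_step_exec :", t1 - t0)
    print("time spent to save output :", t2 - t1)
    print("total time spent for step %d :" % step_number, t2 - t0)
    return output

out = timed_step(3, lambda: {"photos": 6}, lambda o: None)
```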