python /home/admin/mtr/script_for_cron.py -j default -m 20 -a 'python3 ~/workarea/git/Velours/python/prod/datou.py -j batch_current -a 4742 ' -s traitement_4742 -M 0 -S 0 -U 100,80,95
import MySQLdb succeeded
Import error (python version)
['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/home/admin/workarea/git/apy', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 588391
load datou : 4742
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the case tile - detect - glue
We can either keep the dependence order; it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of inputs/outputs number between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
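The reordering the log describes (cycle check first, then ordering steps by a lexical key of child count and step id) can be sketched as a Kahn-style topological sort with a heap tie-break. This is a hypothetical stand-in, not the datou code: `deps`, `reorder_steps`, and the sample step ids are made up for illustration.

```python
import heapq

def reorder_steps(step_ids, deps):
    """Topologically order steps; ties broken by (number_of_sons, step_id).

    deps maps step_id -> set of parent step_ids that must run first.
    Hypothetical sketch of the reordering described in the datou log.
    """
    sons = {s: 0 for s in step_ids}            # number of children per step
    for s, parents in deps.items():
        for p in parents:
            sons[p] += 1
    remaining = {s: set(deps.get(s, ())) for s in step_ids}
    ready = [(sons[s], s) for s in step_ids if not remaining[s]]
    heapq.heapify(ready)                       # lexical order (number_son, step_id)
    order = []
    while ready:
        _, s = heapq.heappop(ready)
        order.append(s)
        for t in step_ids:                     # release steps that waited on s
            if s in remaining[t]:
                remaining[t].discard(s)
                if not remaining[t]:
                    heapq.heappush(ready, (sons[t], t))
    if len(order) != len(step_ids):
        raise ValueError("cycle detected")     # checkNoCycle-style failure
    return order
```

With no dependency edge between two ready steps, the heap falls back to the (number_son, step_id) key, which is exactly the tie-break the log mentions for the tile - detect - glue case.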
WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition !
WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13168 final has fewer outputs used (1) than in the step definition (2) : some outputs may be not used !
Step 13171 blur_detection has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13172 brightness has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13173 send_mail_cod has fewer inputs used (4) than in the step definition (5) : maybe we manage optional inputs !
Step 13175 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of outputs/inputs types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 13169 has datatype=11 whereas input 0 of step 13167 has datatype=2
WARNING : type of output 1 of step 13170 doesn't seem to be defined in the database
WARNING : type of input 3 of step 13168 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
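The arity warnings above compare how many inputs/outputs are actually wired to each step against the step definition, warning on excess and treating a shortfall as possibly optional. A minimal sketch of that comparison (function and parameter names are hypothetical):

```python
def check_arity(step_id, name, used_in, used_out, def_in, def_out):
    """Return warning strings comparing used vs defined input/output counts.

    Hypothetical sketch of the checkConsistencyNbInputNbOutput messages.
    """
    msgs = []
    if used_out > def_out:
        msgs.append(f"WARNING : number of outputs for step {step_id} {name} "
                    f"is not consistent : {used_out} used against {def_out} "
                    f"in the step definition !")
    elif used_out < def_out:
        msgs.append(f"Step {step_id} {name} has fewer outputs used ({used_out}) "
                    f"than in the step definition ({def_out}) : "
                    f"some outputs may be not used !")
    if used_in > def_in:
        msgs.append(f"WARNING : number of inputs for step {step_id} {name} "
                    f"is not consistent : {used_in} used against {def_in} "
                    f"in the step definition !")
    elif used_in < def_in:
        msgs.append(f"Step {step_id} {name} has fewer inputs used ({used_in}) "
                    f"than in the step definition ({def_in}) : "
                    f"maybe we manage optional inputs !")
    return msgs
```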
WARNING : type of output 1 of step 13170 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13167 doesn't seem to be defined in the database
WARNING : output 0 of step 13174 has datatype=10 whereas input 3 of step 13173 has datatype=6
WARNING : type of input 1 of step 13174 doesn't seem to be defined in the database
WARNING : output 1 of step 13167 has datatype=7 whereas input 1 of step 13174 has datatype=None
WARNING : type of output 1 of step 13174 doesn't seem to be defined in the database
WARNING : type of input 4 of step 13168 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13169 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13170 doesn't seem to be defined in the database
DataTypes for each output/input checked !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0)
Tried to parse :
photo id (can be local or global) was removed, should we ?
data as a number was removed, should we ?
None was removed, should we ?
data as text was removed, should we ?
photo path was removed, should we ?
data as text was removed, should we ?
[ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed, should we ?
None was removed, should we ?
data as text was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo id (can be local or global) was removed, should we ?
data as text was removed, should we ?
data as text was removed, should we ?
data as text was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
photo path was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
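The "can't parse json string Expecting value: line 1 column 1 (char 0)" error above is what `json.loads` raises when handed an empty or non-JSON string. A defensive wrapper that logs the failure and falls back to an empty list could look like this (`parse_input_json` is a hypothetical helper, not the datou function):

```python
import json

def parse_input_json(raw, default=None):
    """Parse list_input_json defensively, falling back on a default.

    Hypothetical helper matching the "can't parse json string" log line.
    """
    try:
        return json.loads(raw)
    except (TypeError, ValueError) as exc:
        # json.JSONDecodeError is a ValueError; TypeError covers raw=None
        print(f"ERROR or WARNING : can't parse json string {exc}")
        print(f"Tried to parse : {raw!r}")
        return [] if default is None else default
```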
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
(photo_id, hashtag_id, score_max) was removed, should we ?
data as text was removed, should we ?
None was removed, should we ?
data as text was removed, should we ?
[ptf_id0,ptf_id1...] was removed, should we ?
load thcls
load pdts
Running datou job : batch_current
TODO datou_current to load
to do maybe to take outside batchDatouExec
updating current state to 1
list_input_json: []
Current got : datou_id : 4742, datou_cur_ids : ['2812241'] with mtr_portfolio_ids : ['22713349'] and first list_photo_ids : []
new path : /proc/588391/
Inside batchDatouExec : verbose : 0
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the case tile - detect - glue
We can either keep the dependence order; it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id)
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
All sons are already in current list !
DONE and to test : checkNoCycle !
Here we check the consistency of inputs/outputs number between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition !
WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13168 final has fewer outputs used (1) than in the step definition (2) : some outputs may be not used !
Step 13171 blur_detection has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13172 brightness has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13173 send_mail_cod has fewer inputs used (4) than in the step definition (5) : maybe we manage optional inputs !
Step 13175 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of outputs/inputs types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 13169 has datatype=11 whereas input 0 of step 13167 has datatype=2
WARNING : type of output 1 of step 13170 doesn't seem to be defined in the database
WARNING : type of input 3 of step 13168 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13170 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13167 doesn't seem to be defined in the database
WARNING : output 0 of step 13174 has datatype=10 whereas input 3 of step 13173 has datatype=6
WARNING : type of input 1 of step 13174 doesn't seem to be defined in the database
WARNING : output 1 of step 13167 has datatype=7 whereas input 1 of step 13174 has datatype=None
WARNING : type of output 1 of step 13174 doesn't seem to be defined in the database
WARNING : type of input 4 of step 13168 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13169 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13170 doesn't seem to be defined in the database
DataTypes for each output/input checked !
List Step Type Loaded in datou : copy_chis, consolidate_hashtags_from_manual_portfolio, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, send_mail_cod, split_time_score
over limit max, limiting to limit_max 100
list_input_json : []
origin We have 1 , BFBFBFBF
we are missing 0 photos in the step downloads :
photo missing : []
try to delete the photos missing in DB
length of list_filenames : 4 ; length of list_pids : 4 ; length of list_args : 4
time to download the photos : 1.560704231262207
About to test input to load
we should then remove the video here, and this would fix the bug of datou_current !
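The datatype warnings compare the declared type of each step output against the input it feeds, with `None` meaning the type is not defined in the database. A sketch of that per-connection check (names and signature are hypothetical):

```python
def check_link_types(out_step, out_idx, out_type, in_step, in_idx, in_type):
    """Warn when a connected output/input pair disagree on datatype.

    Sketch of the checkConsistencyTypeOutputInput messages; None means
    the datatype is not defined in the database.
    """
    msgs = []
    if out_type is None:
        msgs.append(f"WARNING : type of output {out_idx} of step {out_step} "
                    f"doesn't seem to be defined in the database")
    if in_type is None:
        msgs.append(f"WARNING : type of input {in_idx} of step {in_step} "
                    f"doesn't seem to be defined in the database")
    if out_type is not None and in_type is not None and out_type != in_type:
        msgs.append(f"WARNING : output {out_idx} of step {out_step} has "
                    f"datatype={out_type} whereas input {in_idx} of step "
                    f"{in_step} has datatype={in_type}")
    return msgs
```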
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 9
step1:copy_chis Fri May 9 14:26:16 2025
VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Begin step datou_step_copy_crop
batch 1 Loaded 35 chid ids of type : 4855
batch 1 Loaded 35 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : copy_chis, we use saveGeneral
[1356331766, 1356331762, 1356331757, 1356331746]
Looping around the photos to save general results
len do output : 4
/1356331746 Didn't retrieve data .
/1356331757 Didn't retrieve data .
/1356331762 Didn't retrieve data .
/1356331766 Didn't retrieve data .
before output type
Here is an output not treated by saveGeneral :
Managing all output in save final without adding information in the mtr_datou_result
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331766', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331762', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331757', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331746', None, None, None, None, None, '2812241')
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 8
time used for this insertion : 0.015813350677490234
save_final save missing photos in datou_result :
time spent for datou_step_exec : 0.09270644187927246
time spent to save output : 0.016069650650024414
total time spent for step 1 : 0.10877609252929688
step2:consolidate_hashtags_from_manual_portfolio Fri May 9 14:26:16 2025
VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
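The "begin to insert list_values into mtr_datou_result" block suggests the per-photo result tuples are written in one batched insert, timed around the call. A sketch of that pattern, using sqlite3 as a stand-in for the MySQL table (the reduced schema and `save_final` signature are hypothetical; only the column names visible in the log are kept):

```python
import sqlite3
import time

def save_final(conn, list_values):
    """Insert the per-photo result tuples in one executemany call, timed.

    Hypothetical sketch of the batched mtr_datou_result insertion.
    """
    t0 = time.time()
    conn.executemany(
        "INSERT INTO mtr_datou_result "
        "(datou_id, mtr_portfolio_id, photo_id, datou_cur_id) "
        "VALUES (?, ?, ?, ?)",
        list_values,
    )
    conn.commit()
    print(f"length of list_values in save_final : {len(list_values)}")
    print(f"time used for this insertion : {time.time() - t0}")

# in-memory stand-in for the real database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mtr_datou_result "
             "(datou_id, mtr_portfolio_id, photo_id, datou_cur_id)")
save_final(conn, [("4742", "22713349", "1356331766", "2812241"),
                  ("4742", "22713349", "1356331762", "2812241")])
```

A single `executemany` round-trip is what keeps the logged insertion times in the 0.015 s range even as `list_values` grows.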
VR 22-3-18 : For now we do not clean correctly the datou structure
beginning of datou step consolidate_hashtags_from_manual_portfolio
Iterating over portfolio : 22713349
we are in the IF for the mother portfolio
26T
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713349 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou'))
To do
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713349 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou'))
To do
To do
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713349 AND
mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou'))
To do
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=9745827 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou'))
To do
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713969 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 260 chid ids of type : 4855
begin to find the sub_photo_id :
begin to find the sub_photo_id :
begin to find the sub_photo_id :
batch 1 Loaded 35 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
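The repeated SELECTs above inline the portfolio id and the whole hashtag list directly into the SQL string. A common alternative is to build the query with bound parameters so the driver escapes the values; this is a sketch only (`build_port_to_port_query` is hypothetical, the column list is abbreviated, and `%s` is the MySQLdb placeholder style):

```python
def build_port_to_port_query(portfolio_id, type_id, hashtags):
    """Build the mtr_port_to_port_ids lookup with bound parameters.

    Hypothetical sketch of parameterizing the query seen in the log;
    %s is the MySQLdb/PyMySQL placeholder style.
    """
    placeholders = ", ".join(["%s"] * len(hashtags))
    sql = (
        "SELECT mptpi.id, mptpi.hashtag_id, h.hashtag "
        "FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h "
        "WHERE h.hashtag_id = mptpi.hashtag_id "
        "AND mptpi.`mtr_portfolio_id_1` = %s "
        "AND mptpi.`type` = %s "
        "AND mptpi.`hashtag_id` IN (SELECT hashtag_id FROM MTRBack.hashtags "
        f"WHERE hashtag IN ({placeholders}))"
    )
    params = (portfolio_id, type_id, *hashtags)
    return sql, params
```

The pair would then be passed as `cursor.execute(sql, params)`, keeping one query shape for every portfolio id and hashtag set instead of a freshly formatted string each time.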
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713970 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 260 chid ids of type : 4855
begin to find the sub_photo_id :
begin to find the sub_photo_id :
begin to find the sub_photo_id :
batch 1 Loaded 99 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713971 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713972 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 268 chid ids of type : 4855
begin to find the sub_photo_id :
begin to find the sub_photo_id :
begin to find the sub_photo_id :
begin to find the sub_photo_id :
batch 1 Loaded 121 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713973 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713978 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713980 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713981 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713982 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713983 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713984 AND mpp.hide_status=0 ORDER BY
ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713985 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713987 AND mpp.hide_status=0 ORDER BY ph.size desc
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713988 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 180 chid ids of type : 4855
begin to find the sub_photo_id :
begin to find the sub_photo_id :
begin to find the sub_photo_id :
batch 1 Loaded 85 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713990 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 172 chid ids of type : 4855
begin to find the sub_photo_id :
begin to find the sub_photo_id :
batch 1 Loaded 81 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713991 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 268 chid ids of type : 4855
begin to find the sub_photo_id :
begin to find the sub_photo_id :
begin to find the sub_photo_id :
begin to find the sub_photo_id :
batch 1 Loaded 209 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713992 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 187 chid ids of type : 4855
begin to find the sub_photo_id :
begin to find the sub_photo_id :
batch 1 Loaded 149 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713993 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 88 chid ids of type : 4855
begin to find the sub_photo_id :
batch 1 Loaded 76 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
TODO : # We must therefore build the chi from the information in the child photos
query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22713994 AND mpp.hide_status=0 ORDER BY ph.size desc
batch 1 Loaded 260 chid ids of type : 4855
begin to find the sub_photo_id :
begin to find the sub_photo_id :
begin to find the sub_photo_id :
batch 1 Loaded 226 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
To test !
Use context local managing function !
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : consolidate_hashtags_from_manual_portfolio, we use saveGeneral
[1356331766, 1356331762, 1356331757, 1356331746]
Looping around the photos to save general results
len do output : 6
/T
assign value string error : string indices must be integers
invalid literal for int() with base 10: 'T'
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 4
time used for this insertion : 0.014065265655517578
save_final save missing photos in datou_result :
time spent for datou_step_exec : 3.5358572006225586
time spent to save output : 0.014220476150512695
total time spent for step 2 : 3.5500776767730713
step3:rle_unique_nms_with_priority Fri May 9 14:26:19 2025
VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
complete output_args for input 0
We expect there is only one output and this part is used while all outputs are not tuple or array
We expect there is only one output and this part is used while all outputs are not tuple or array
We expect there is only one output and this part is used while all outputs are not tuple or array
We expect there is only one output and this part is used while all outputs are not tuple or array
VR 22-3-18 : For now we do not clean correctly the datou structure
Begin step rle-unique-nms
batch 1 Loaded 149 chid ids of type : 4857
only to be used in the consolidation step
batch 1 Loaded 10 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
save time : 0.026678800582885742
only to be used in the consolidation step
batch 1 Loaded 83 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
save time : 0.04087686538696289
only to be used in the consolidation step
batch 1 Loaded 63 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
save time : 0.03527379035949707
only to be used in the consolidation step
batch 1 Loaded 80 chid ids of type : 4857
Number RLEs to save : 0
TO DO : save crop sub photo not yet done !
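The step name rle_unique_nms_with_priority suggests a non-maximum suppression pass that keeps a single detection per overlapping region, with some priority ordering. The log does not show the actual algorithm, so the following is a generic NMS sketch over boxes, not the datou implementation; the detection tuple layout `(priority, score, box)` is assumed for illustration:

```python
def iou(a, b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms_with_priority(dets, iou_thresh=0.5):
    """Keep at most one detection per overlapping group.

    dets: list of (priority, score, box); lower priority wins first,
    then higher score. Generic sketch, not the datou implementation.
    """
    kept = []
    for det in sorted(dets, key=lambda d: (d[0], -d[1])):
        if all(iou(det[2], k[2]) < iou_thresh for k in kept):
            kept.append(det)
    return kept
```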
save time : 0.03537464141845703
map_output_result : {1356331766: (0.11776921688199785, 'Should be the crop_list due to order', 0.0), 1356331762: (0.11776921688199785, 'Should be the crop_list due to order', 0.12848918914505444), 1356331757: (0.11776921688199785, 'Should be the crop_list due to order', 0.21625869161335992), 1356331746: (0.11776921688199785, 'Should be the crop_list due to order', 0.12632898676957702)}
End step rle-unique-nms
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority, we use saveGeneral
[1356331766, 1356331762, 1356331757, 1356331746]
Looping around the photos to save general results
len do output : 4
/1356331766. Didn't retrieve data .
/1356331762. Didn't retrieve data .
/1356331757. Didn't retrieve data .
/1356331746. Didn't retrieve data .
before output type
Used above
Here is an output not treated by saveGeneral :
Managing all output in save final without adding information in the mtr_datou_result
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331766', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331762', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331757', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331746', None, None, None, None, None, '2812241')
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 12
time used for this insertion : 0.015442371368408203
save_final save missing photos in datou_result :
time spent for datou_step_exec : 1.9871759414672852
time spent to save output : 0.01573777198791504
total time spent for step 3 : 2.0029137134552
step4:ventilate_hashtags_in_portfolio Fri May 9 14:26:21 2025
VR 17-11-17 : now, only for linear
exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
Currently we do not manage missing dependencies information, which could maybe be correctly interpreted with a default behavior
Some of the work done at execution of the step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have FATAL ERROR but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : For now we do not clean correctly the datou structure
beginning of datou step ventilate_hashtags_in_portfolio : To implement !
Iterating over portfolio : 22713349
get user id for portfolio 22713349
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713349 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1
To do
To do
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score,
mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713349 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1
To do
To do
! Use context local managing function !
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713349 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1
To do
link used in velours :
https://www.fotonower.com/velours/22713969,22713970,22713971,22713972,22713973,22713978,22713980,22713981,22713982,22713983,22713984,22713985,22713987,22713988,22713990,22713991,22713992,22713993,22713994?tags=papier,carton,metal,pet_clair,pehd,ela,textiles,verre,organique,dasri,masque,encombrant,autre_non_emballage,environnement,flux_dev,mal_croppe,flou,film_dev_souple,sac_om_plein&datou_id_consolidate=4742&port_consolidate=22713349 Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1356331766, 1356331762, 1356331757, 1356331746] Looping around the photos to save general results len do output : 1 /22713349. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812241') ('4742', '22713349', '1356331766', None, None, None, None, None, '2812241') ('4742', None, None, None, None, None, None, None, '2812241') ('4742', '22713349', '1356331762', None, None, None, None, None, '2812241') ('4742', None, None, None, None, None, None, None, '2812241') ('4742', '22713349', '1356331757', None, None, None, None, None, '2812241') ('4742', None, None, None, None, None, None, None, '2812241') ('4742', '22713349', '1356331746', None, None, None, None, None, '2812241') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 5 time used for this insertion : 0.01580214500427246 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.1732730865478516 time spend to save output : 0.016028165817260742 total time spend for step 4 : 1.1893012523651123 step5:final Fri May 9 14:26:23 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for 
datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1356331766: ('0.31625000328337116',), 1356331762: ('0.31625000328337116',), 1356331757: ('0.31625000328337116',), 1356331746: ('0.31625000328337116',)} new output for save of step final : {1356331766: ('0.31625000328337116',), 1356331762: ('0.31625000328337116',), 1356331757: ('0.31625000328337116',), 1356331746: ('0.31625000328337116',)} [1356331766, 1356331762, 1356331757, 1356331746] Looping around the photos to save general results len do output : 4 /1356331766.Didn't retrieve data . /1356331762.Didn't retrieve data . /1356331757.Didn't retrieve data . /1356331746.Didn't retrieve data . 
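The SELECT on MTRPhoto.mtr_port_to_port_ids above appears to be built by interpolating the portfolio id, type, hashtag list, and min_score directly into the SQL text. A minimal sketch of a parameterized alternative using MySQLdb-style `%s` placeholders (the helper name and its exact column list are hypothetical, not taken from the datou code):

```python
def build_port_to_port_query(portfolio_id, type_id, hashtags, min_score):
    """Build a parameterized version of the mtr_port_to_port_ids query.

    Hypothetical helper: the logged query interpolates values into the
    SQL string; placeholders delegate quoting to the DB driver instead.
    """
    # one %s placeholder per hashtag for the IN (...) clause
    placeholders = ", ".join(["%s"] * len(hashtags))
    sql = (
        "SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, "
        "mptpi.type, mptpi.hashtag_id, mptpi.min_score, h.hashtag "
        "FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h "
        "WHERE h.hashtag_id = mptpi.hashtag_id "
        "AND mptpi.mtr_portfolio_id_1 = %s AND mptpi.type = %s "
        "AND mptpi.hashtag_id IN (SELECT hashtag_id FROM MTRBack.hashtags "
        f"WHERE hashtag IN ({placeholders})) "
        "AND mptpi.min_score = %s"
    )
    params = (portfolio_id, type_id, *hashtags, min_score)
    return sql, params

sql, params = build_port_to_port_query(22713349, 4857, ["papier", "carton"], 0.1)
# the pair would be executed as cursor.execute(sql, params)
```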
before output type
Used above
Used above
Managing all outputs in save final without adding information in the mtr_datou_result
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331766', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331762', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331757', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331746', None, None, None, None, None, '2812241')
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 12
time used for this insertion : 0.01581883430480957
save_final
save missing photos in datou_result :
time spent for datou_step_exec : 0.04203033447265625
time spent to save output : 0.016125917434692383
total time spent for step 5 : 0.05815625190734863
step6:blur_detection Fri May 9 14:26:23 2025
VR 17-11-17 : now, only for linear
exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, cleaned up, and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could perhaps be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR, but same_nb_input_output==True : this should be an optional input !
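The recurring message "saveOutput not yet implemented for datou_step.type : … we use saveGeneral" suggests a dispatch-with-fallback pattern: a per-step-type saver when one exists, a generic one otherwise. A sketch under that assumption (all names hypothetical; this is not the datou implementation):

```python
def save_step_output(step_type, output, savers, save_general):
    """Dispatch the save of a step's output by step type.

    savers maps a step type to a dedicated save function; when no
    dedicated saver exists we log the fallback, as seen in this run,
    and use the generic saveGeneral-style function instead.
    """
    saver = savers.get(step_type)
    if saver is None:
        print("saveOutput not yet implemented for datou_step.type : "
              f"{step_type}, we use saveGeneral")
        return save_general(output)
    return saver(output)
```

Adding a dedicated saver is then just registering a function under its step type, without touching the dispatch.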
VR 22-3-18 : for now we do not clean the datou structure correctly
inside step blur_detection
all photos are already processed, skipping the computations
Inside saveOutput : final : False verbose : 0
begin to insert list_values into class_photo_scores :
length of list_values in save_photo_hashtag_id_thcl_score : 4
time used for this insertion : 0.01135563850402832
begin to insert list_values into photo_hahstag_ids :
length of list_values in save_photo_hashtag_id_type : 4
time used for this insertion : 0.008281230926513672
save missing photos in datou_result :
time spent for datou_step_exec : 0.023389101028442383
time spent to save output : 0.024277925491333008
total time spent for step 6 : 0.04766702651977539
step7:brightness Fri May 9 14:26:23 2025
VR 17-11-17 : now, only for linear
exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, cleaned up, and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could perhaps be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR, but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : for now we do not clean the datou structure correctly
inside step brightness calculation
all photos are already processed, skipping the computations
Inside saveOutput : final : False verbose : 0
begin to insert list_values into class_photo_scores :
length of list_values in save_photo_hashtag_id_thcl_score : 4
time used for this insertion : 0.009851932525634766
begin to insert list_values into photo_hahstag_ids :
length of list_values in save_photo_hashtag_id_type : 4
time used for this insertion : 0.0077724456787109375
save missing photos in datou_result :
time spent for datou_step_exec : 0.02636122703552246
time spent to save output : 0.0225985050201416
total time spent for step 7 : 0.04895973205566406
step8:send_mail_cod Fri May 9 14:26:23 2025
VR 17-11-17 : now, only for linear
exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, cleaned up, and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could perhaps be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
complete output_args for input 0
complete output_args for input 1
complete output_args for input 2
Inconsistent number of inputs and outputs : a step which parallelizes and manages input errors by not emitting an output for that data can't be used in the input/output dependency tree
complete output_args for input 3
We should have a FATAL ERROR, but same_nb_input_output==True : this should be an optional input !
VR 22-3-18 : for now we do not clean the datou structure correctly
in the send_mail_cod step
work_area: /home/admin
in order to get the selector url, please enter the license of selector
results_COD_P22713349_09-05-2025_14_26_23.pdf
change filename to text . (×20)
22826125 imagette228261251746793583
22826126 imagette228261261746793585
22826127 imagette228261271746793585
22826128 imagette228261281746793585
22826129 imagette228261291746793585
22826130 imagette228261301746793585
22826131 imagette228261311746793585
22826132 imagette228261321746793585
22826141 imagette228261411746793586
SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=22713349 and hashtag_type = 4857 and pcr.hashtag_id = h.hashtag_id;
velour_link : https://www.fotonower.com/velours/22713969,22713970,22713971,22713972,22713973,22713978,22713980,22713981,22713982,22713983,22713984,22713985,22713987,22713988,22713990,22713991,22713992,22713993,22713994?tags=papier,carton,metal,pet_clair,pehd,ela,textiles,verre,organique,dasri,masque,encombrant,autre_non_emballage,environnement,flux_dev,mal_croppe,flou,film_dev_souple,sac_om_plein&datou_id_consolidate=4742&port_consolidate=22713349
your option no_mail is active, we will not send the real mail to your client
args[1356331766] : ((1356331766, 775.2209858895532, 492688767), (1356331766, 0.6322115447213636, 2107752395), '0.31625000328337116')
We are sending mail with results at kexin@fotonower.com
args[1356331762] : ((1356331762, 3906.4472245192687, 492609224), (1356331762, -0.08212493018719456, 496442774), '0.31625000328337116')
We are sending mail with results at kexin@fotonower.com
args[1356331757] : ((1356331757, 1690.0543494897859, 2107751945), (1356331757, -0.24954691426707581, 496442774), '0.31625000328337116')
We are sending mail with results at kexin@fotonower.com
args[1356331746] : ((1356331746, 4005.691277388425, 492609224), (1356331746, -0.0544659874955837, 2107752395), '0.31625000328337116')
We are sending mail with results at kexin@fotonower.com
refus_total : 0.31625000328337116 2022-04-13 10:29:59 0
start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22713349_09-05-2025_14_26_23.pdf
results_COD_P22713349_09-05-2025_14_26_23.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22713349_09-05-2025_14_26_23.pdf
start insert file to database
insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('4742','22713349','results_COD_P22713349_09-05-2025_14_26_23.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22713349_09-05-2025_14_26_23.pdf','pdf','','0.45','0.31625000328337116')
Inside saveOutput : final : False verbose : 0
saveOutput not yet implemented for datou_step.type : send_mail_cod, we use saveGeneral
[1356331766, 1356331762, 1356331757, 1356331746]
Looping over the photos to save general results
len do output : 0
before output type
Used above
Managing all outputs in save final without adding information in the mtr_datou_result
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331766', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331762', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331757', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331746', None, None, None, None, None, '2812241')
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 4
time used for this insertion : 0.013579845428466797
save_final
save missing photos in datou_result :
time spent for datou_step_exec : 4.815824508666992
time spent to save output : 0.01380014419555664
total time spent for step 8 : 4.829624652862549
step9:split_time_score Fri May 9 14:26:28 2025
VR 17-11-17 : now, only for linear
exec dependencies tree, some output goes to fill the input of the next
VR 22-3-18 : now we test the dependencies tree, but keep two separate codes for datou_prepare_output_input until the code is correctly tested, cleaned up, and works in both cases
VR 22-3-18 : but we use the first code for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step, instead of building this step before datou_exec
Currently we do not manage missing dependency information, which could perhaps be correctly interpreted with a default behavior
Some of the work done at execution of a step could be done earlier, when the execution tree is built and the dependencies of the different steps are analysed
We should have a FATAL ERROR, but same_nb_input_output==True : this should be an optional input !
complete output_args for input 1
VR 22-3-18 : for now we do not clean the datou structure correctly
begin split time score
TODO : Insert select and so on
Begin split_port_in_batch_balle
thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}]
thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('11', 1), ('13', 2), ('14', 1))
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {}
01052025 22713349 Number of photos uploaded : 4 / 23040 (0%)
01052025 22713349 Number of photos tagged (waste types) : 0 / 4 (0%)
01052025 22713349 Number of photos tagged (volume) : 0 / 4 (0%)
elapsed_time : load_data_split_time_score 4.291534423828125e-06
elapsed_time : order_list_meta_photo_and_scores 1.3113021850585938e-05
????
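The dashboard lines such as "Number of photos uploaded : 4 / 23040 (0%)" follow a simple done/total pattern with an integer-truncated percentage. A sketch of that formatting (the function name is hypothetical):

```python
def completion_line(label, done, total):
    """Format a dashboard completion line like
    'Number of photos uploaded : 4 / 23040 (0%)'.

    The percentage is truncated to an integer, which is why 4/23040
    shows as 0% in the log; a zero total is guarded to avoid division
    by zero.
    """
    pct = 0 if total == 0 else int(100 * done / total)
    return f"{label} : {done} / {total} ({pct}%)"
```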
elapsed_time : fill_and_build_computed_from_old_data 0.0003693103790283203
elapsed_time : insert_dashboard_record_day_entry 0.024722814559936523
We will return after consolidate, but for now we need the day; how to get it? for now it depends on the previous heavy steps
Quality : 0.31625000328337116
find url : https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22713349_09-05-2025_14_26_23.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22713349 order by id desc limit 1
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder the steps !
Tree built and cycle checked, now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the case tile - detect - glue
We could keep the dependence, but it is better to keep an order compatible with the step ids when there are no sons, so a lexical order : (number_son, step_id)
All sons are already in the current list ! (×6)
DONE and to test : checkNoCycle !
Here we check the consistency of the number of inputs/outputs between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition !
WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
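The graph check described above (build the tree, reject cycles, then re-order steps with the lexical tie-break (number_son, step_id)) can be sketched as a Kahn-style topological sort. All structures here are hypothetical stand-ins for the datou graph, not the real code:

```python
import heapq

def reorder_steps(steps, edges):
    """Re-order datou steps so each step runs after its parents.

    Ties between ready steps are broken by the lexical key
    (number_of_sons, step_id) mentioned in the log, so steps without
    sons keep an order compatible with their ids. A leftover node
    means a cycle (the checkNoCycle case) and raises.

    steps: iterable of step ids; edges: list of (parent, child) pairs.
    """
    sons = {s: 0 for s in steps}       # number of children per step
    indeg = {s: 0 for s in steps}      # number of unprocessed parents
    children = {s: [] for s in steps}
    for parent, child in edges:
        children[parent].append(child)
        sons[parent] += 1
        indeg[child] += 1
    # steps with no pending parents, ordered by (number_of_sons, step_id)
    heap = [(sons[s], s) for s in steps if indeg[s] == 0]
    heapq.heapify(heap)
    order = []
    while heap:
        _, s = heapq.heappop(heap)
        order.append(s)
        for c in children[s]:
            indeg[c] -= 1
            if indeg[c] == 0:
                heapq.heappush(heap, (sons[c], c))
    if len(order) != len(sons):
        raise ValueError("cycle detected in datou graph")
    return order
```

On Python 3.9+ the acyclicity part could also lean on `graphlib.TopologicalSorter`, but the custom tie-break above is what the log describes.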
WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13168 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 13171 blur_detection has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13172 brightness has fewer inputs used (0) than in the step definition (1) : maybe we manage optional inputs !
Step 13173 send_mail_cod has fewer inputs used (4) than in the step definition (5) : maybe we manage optional inputs !
Step 13175 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of output/input types across step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 13169 has datatype=11 whereas input 0 of step 13167 has datatype=2
WARNING : type of output 1 of step 13170 does not seem to be defined in the database
WARNING : type of input 3 of step 13168 does not seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
We ignore checkConsistencyTypeOutputInput for datou_step final !
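The two validations logged here, checkConsistencyNbInputNbOutput and checkConsistencyTypeOutputInput, reduce to simple comparisons per step and per connection. A simplified sketch reproducing the warning logic (function names and signatures are hypothetical; the real checks live in the datou code):

```python
def check_nb_used(step_id, step_type, n_used, n_defined, kind):
    """Sketch of checkConsistencyNbInputNbOutput for one step.

    More slots used than defined is flagged as a WARNING; fewer may
    simply mean optional or unused slots, as the log suggests.
    kind is 'inputs' or 'outputs'.
    """
    if n_used > n_defined:
        return (f"WARNING : number of {kind} for step {step_id} {step_type} is not "
                f"consistent : {n_used} used against {n_defined} in the step definition !")
    if n_used < n_defined:
        return (f"Step {step_id} {step_type} has fewer {kind} used ({n_used}) "
                f"than in the step definition ({n_defined})")
    return None

def check_connection_types(out_step, out_idx, out_type, in_step, in_idx, in_type):
    """Sketch of checkConsistencyTypeOutputInput for one connection:
    warn when a datatype is undefined (None) or when the two ends
    disagree."""
    warnings = []
    if out_type is None:
        warnings.append(f"WARNING : type of output {out_idx} of step {out_step} "
                        "does not seem to be defined in the database")
    if in_type is None:
        warnings.append(f"WARNING : type of input {in_idx} of step {in_step} "
                        "does not seem to be defined in the database")
    if out_type != in_type:
        warnings.append(f"WARNING : output {out_idx} of step {out_step} has "
                        f"datatype={out_type} whereas input {in_idx} of step "
                        f"{in_step} has datatype={in_type}")
    return warnings
```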
WARNING : type of output 1 of step 13170 does not seem to be defined in the database
WARNING : type of input 1 of step 13167 does not seem to be defined in the database
WARNING : output 0 of step 13174 has datatype=10 whereas input 3 of step 13173 has datatype=6
WARNING : type of input 1 of step 13174 does not seem to be defined in the database
WARNING : output 1 of step 13167 has datatype=7 whereas input 1 of step 13174 has datatype=None
WARNING : type of output 1 of step 13174 does not seem to be defined in the database
WARNING : type of input 4 of step 13168 does not seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13169 does not seem to be defined in the database
WARNING : type of input 1 of step 13170 does not seem to be defined in the database
DataTypes for each output/input checked !
TODO
Duplicate data, are they consistent 3 ?
Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713349 AND mptpi.`type`=4857
To do
find url : https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P22713611_06-05-2025_13_14_57.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22713611 order by id desc limit 1
find url : https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P22713612_06-05-2025_12_01_26.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22713612 order by id desc limit 1
find url :
https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_Auto_P22713613_06-05-2025_12_16_22.pdf
select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22713613 order by id desc limit 1
NUMBER BATCH : 0
# DISPLAY ALL COLLECTED DATA : {'01052025': {'nb_upload': 4, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}}
Inside saveOutput : final : True verbose : 0
saveOutput not yet implemented for datou_step.type : split_time_score, we use saveGeneral
[1356331766, 1356331762, 1356331757, 1356331746]
Looping over the photos to save general results
len do output : 1
/22713349 Didn't retrieve data.
before output type
Here is an output not treated by saveGeneral :
Managing all outputs in save final without adding information in the mtr_datou_result
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331766', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331762', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331757', None, None, None, None, None, '2812241')
('4742', None, None, None, None, None, None, None, '2812241')
('4742', '22713349', '1356331746', None, None, None, None, None, '2812241')
begin to insert list_values into mtr_datou_result :
length of list_values in save_final : 5
time used for this insertion : 0.01360321044921875
save_final
save missing photos in datou_result :
time spent for datou_step_exec : 1.329857349395752
time spent to save output : 0.013816595077514648
total time spent for step 9 : 1.3436739444732666
caffe_path_current :
About to save ! 2
After save, about to update current !
ret : 2
len(input) + len(total_photo_id_missing) : 4
set_done_treatment
3.29user 2.19system 0:16.81elapsed 32%CPU (0avgtext+0avgdata 214812maxresident)k 1560inputs+44104outputs (25major+189370minor)pagefaults 0swaps
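Throughout the run, each step reports "time spent for datou_step_exec", "time spent to save output", and their total. That per-step instrumentation pattern can be sketched as a small wrapper (all names are hypothetical stand-ins, not the datou code):

```python
import time

def timed_step(step_exec, save_output, *args):
    """Run one pipeline step and report timings the way this log does.

    step_exec computes the step result from *args; save_output
    persists it. Both are stand-ins for the real datou callables.
    """
    t0 = time.time()
    result = step_exec(*args)
    t_exec = time.time() - t0

    t1 = time.time()
    save_output(result)
    t_save = time.time() - t1

    total = t_exec + t_save
    print(f"time spent for datou_step_exec : {t_exec}")
    print(f"time spent to save output : {t_save}")
    print(f"total time spent for step : {total}")
    return result, t_exec, t_save, total
```

For finer-grained intervals, `time.perf_counter()` is the usual choice over `time.time()`, since it is monotonic and higher resolution.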