python /home/admin/mtr/script_for_cron.py -j default -m 20 -a 'python3 ~/workarea/git/Velours/python/prod/datou.py -j batch_current -a 4742 ' -s traitement_4742 -M 0 -S 0 -U 100,80,95 import MySQLdb succeeded Import error (python version) ['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/home/admin/workarea/git/apy', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages'] process id : 576058 load datou : 4742 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition ! WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13168 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 13171 blur_detection have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13172 brightness have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13173 send_mail_cod have less inputs used (4) than in the step definition (5) : maybe we manage optionnal inputs ! Step 13175 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 13169 have datatype=11 whereas input 0 of step 13167 have datatype=2 WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 3 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 1 of step 13167 doesn't seem to be define in the database( WARNING : output 0 of step 13174 have datatype=10 whereas input 3 of step 13173 have datatype=6 WARNING : type of input 1 of step 13174 doesn't seem to be define in the database( WARNING : output 1 of step 13167 have datatype=7 whereas input 1 of step 13174 have datatype=None WARNING : type of output 1 of step 13174 doesn't seem to be define in the database( WARNING : type of input 4 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13169 doesn't seem to be define in the database( WARNING : type of input 1 of step 13170 doesn't seem to be define in the database( DataTypes for each output/input checked ! Unexpected type for variable list_input_json ERROR or WARNING : can't parse json string Expecting value: line 1 column 1 (char 0) Tried to parse : id de la photo (peut être local ou global) was removed should we ? donnée sous forme de nombre was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? donnée sous forme de texte was removed should we ? [ (photo_id, photo_id_loc, hashtag_type, x0, x1, y0, y1, score), ...] was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? id de la photo (peut être local ou global) was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? donnée sous forme de texte was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? chemin de la photo was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? (photo_id, hashtag_id, score_max) was removed should we ? donnée sous forme de texte was removed should we ? None was removed should we ? donnée sous forme de texte was removed should we ? [ptf_id0,ptf_id1...] was removed should we ? load thcls load pdts Running datou job : batch_current TODO datou_current to load to do maybe to take outside batchDatouExec updating current state to 1 list_input_json: [] Current got : datou_id : 4742, datou_cur_ids : ['2812240'] with mtr_portfolio_ids : ['22713557'] and first list_photo_ids : [] new path : /proc/576058/ Inside batchDatouExec : verbose : 0 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition ! WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13168 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 13171 blur_detection have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13172 brightness have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13173 send_mail_cod have less inputs used (4) than in the step definition (5) : maybe we manage optionnal inputs ! Step 13175 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 13169 have datatype=11 whereas input 0 of step 13167 have datatype=2 WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 3 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 1 of step 13167 doesn't seem to be define in the database( WARNING : output 0 of step 13174 have datatype=10 whereas input 3 of step 13173 have datatype=6 WARNING : type of input 1 of step 13174 doesn't seem to be define in the database( WARNING : output 1 of step 13167 have datatype=7 whereas input 1 of step 13174 have datatype=None WARNING : type of output 1 of step 13174 doesn't seem to be define in the database( WARNING : type of input 4 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13169 doesn't seem to be define in the database( WARNING : type of input 1 of step 13170 doesn't seem to be define in the database( DataTypes for each output/input checked ! List Step Type Loaded in datou : copy_chis, consolidate_hashtags_from_manual_portfolio, rle_unique_nms_with_priority, ventilate_hashtags_in_portfolio, final, blur_detection, brightness, send_mail_cod, split_time_score over limit max, limiting to limit_max 100 list_input_json : [] origin We have 1 , BFBFBFBFBFwe have missing 0 photos in the step downloads : photo missing : [] try to delete the photos missing in DB length of list_filenames : 5 ; length of list_pids : 5 ; length of list_args : 5 time to download the photos : 1.379509449005127 About to test input to load we should then remove the video here, and this would fix the bug of datou_current ! Calling datou_exec Inside datou_exec : verbose : 0 number of steps : 9 step1:copy_chis Fri May 9 14:22:42 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Begin step datou_step_copy_crop batch 1 Loaded 46 chid ids of type : 4855 batch 1 Loaded 46 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : copy_chis we use saveGeneral [1356331742, 1356331735, 1356331730, 1356331719, 1356331714] Looping around the photos to save general results len do output : 5 /1356331742Didn't retrieve data . /1356331735Didn't retrieve data . /1356331730Didn't retrieve data . /1356331719Didn't retrieve data . /1356331714Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331742', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331735', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331730', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331719', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331714', None, None, None, None, None, '2812240') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 10 time used for this insertion : 0.014520645141601562 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.08600521087646484 time spend to save output : 0.014889955520629883 total time spend for step 1 : 0.10089516639709473 step2:consolidate_hashtags_from_manual_portfolio Fri May 9 14:22:42 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step consolidate_hashtags_from_manual_portfolio Iterating over portfolio : 22713557 on est dans le IF portfolio mere 26T SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713557 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou')) To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713557 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou')) To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713557 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou')) To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=9745827 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou')) To do TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720636 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 141 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 47 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720637 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 136 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 58 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720638 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 97 chid ids of type : 4855 begin to find the sub_photo_id : batch 1 Loaded 46 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720639 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 97 chid ids of type : 4855 begin to find the sub_photo_id : batch 1 Loaded 47 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720640 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720645 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720647 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 39 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 15 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720648 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720649 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720650 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720651 AND mpp.hide_status=0 ORDER BY ph.size desc TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720652 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 97 chid ids of type : 4855 begin to find the sub_photo_id : batch 1 Loaded 48 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720654 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 116 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 57 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720655 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 151 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 76 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720657 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 126 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 70 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720658 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 160 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 124 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720659 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 141 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 125 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720660 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 116 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 100 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! TODO : # On doit donc construire les chi a partir des informations dans les photos filles query : SELECT ph.photo_id FROM MTRBack.photos ph, MTRUser.mtr_portfolio_photos mpp WHERE ph.photo_id=mpp.mtr_photo_id AND mpp.mtr_portfolio_id=22720661 AND mpp.hide_status=0 ORDER BY ph.size desc batch 1 Loaded 136 chid ids of type : 4855 begin to find the sub_photo_id : begin to find the sub_photo_id : begin to find the sub_photo_id : batch 1 Loaded 131 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! To test ! Use context local managing function ! Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : consolidate_hashtags_from_manual_portfolio we use saveGeneral [1356331742, 1356331735, 1356331730, 1356331719, 1356331714] Looping around the photos to save general results len do output : 6 /Tassign value string error string indices must be integers invalid literal for int() with base 10: 'T' begin to insert list_values into mtr_datou_result : length of list_values in save_final : 5 time used for this insertion : 0.011987447738647461 save_final save missing photos in datou_result : time spend for datou_step_exec : 3.2730422019958496 time spend to save output : 0.012115240097045898 total time spend for step 2 : 3.2851574420928955 step3:rle_unique_nms_with_priority Fri May 9 14:22:46 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array We expect there is only one output and this part is used while all output are not tuple or array VR 22-3-18 : For now we do not clean correctly the datou structure Begin step rle-unique-nms batch 1 Loaded 103 chid ids of type : 4857 seulement à utiliser dans la step consolidation batch 1 Loaded 22 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.029180288314819336 seulement à utiliser dans la step consolidation batch 1 Loaded 18 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.029198408126831055 seulement à utiliser dans la step consolidation batch 1 Loaded 91 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.03722643852233887 seulement à utiliser dans la step consolidation batch 1 Loaded 10 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.026767492294311523 seulement à utiliser dans la step consolidation batch 1 Loaded 13 chid ids of type : 4857 Number RLEs to save : 0 TO DO : save crop sub photo not yet done ! save time : 0.029144287109375 map_output_result : {1356331742: (0.3967166553738584, 'Should be the crop_list due to order', 0.841740924225121), 1356331735: (0.3967166553738584, 'Should be the crop_list due to order', 0.8596303656091026), 1356331730: (0.3967166553738584, 'Should be the crop_list due to order', 0.28221198703506833), 1356331719: (0.3967166553738584, 'Should be the crop_list due to order', 0.0), 1356331714: (0.3967166553738584, 'Should be the crop_list due to order', 0.0)} End step rle-unique-nms Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : rle_unique_nms_with_priority we use saveGeneral [1356331742, 1356331735, 1356331730, 1356331719, 1356331714] Looping around the photos to save general results len do output : 5 /1356331742.Didn't retrieve data . /1356331735.Didn't retrieve data . /1356331730.Didn't retrieve data . /1356331719.Didn't retrieve data . /1356331714.Didn't retrieve data . before output type Used above Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331742', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331735', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331730', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331719', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331714', None, None, None, None, None, '2812240') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 15 time used for this insertion : 0.01199793815612793 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.942495584487915 time spend to save output : 0.012378454208374023 total time spend for step 3 : 1.954874038696289 step4:ventilate_hashtags_in_portfolio Fri May 9 14:22:47 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure beginning of datou step ventilate_hashtags_in_portfolio : To implement ! Iterating over portfolio : 22713557 get user id for portfolio 22713557 SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713557 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1 To do To do SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713557 AND mptpi.`type`=4857 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1 To do To do ! Use context local managing function ! SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713557 AND mptpi.`type`=4856 AND mptpi.`hashtag_id` in (select hashtag_id FROM MTRBack.hashtags where hashtag in ('papier','carton','metal','pet_clair','pehd','flux_dev','film_dev_souple','ela','sac_om_plein','textiles','verre','organique','dasri','masque','encombrant','autre_non_emballage','environnement','mal_croppe','flou','pet_fonce','pet_opaque','barquette_opaque','film_plastique','autre_emballage','sac')) AND mptpi.`min_score`=0.1 To do lien utilise dans velours : https://www.fotonower.com/velours/22720636,22720637,22720638,22720639,22720640,22720645,22720647,22720648,22720649,22720650,22720651,22720652,22720654,22720655,22720657,22720658,22720659,22720660,22720661?tags=papier,carton,metal,pet_clair,pehd,ela,textiles,verre,organique,dasri,masque,encombrant,autre_non_emballage,environnement,flux_dev,mal_croppe,flou,film_dev_souple,sac_om_plein&datou_id_consolidate=4742&port_consolidate=22713557 Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : ventilate_hashtags_in_portfolio we use saveGeneral [1356331742, 1356331735, 1356331730, 1356331719, 1356331714] Looping around the photos to save general results len do output : 1 /22713557. before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331742', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331735', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331730', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331719', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331714', None, None, None, None, None, '2812240') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 6 time used for this insertion : 0.01279449462890625 save_final save missing photos in datou_result : time spend for datou_step_exec : 1.1582937240600586 time spend to save output : 0.013029336929321289 total time spend for step 4 : 1.1713230609893799 step5:final Fri May 9 14:22:49 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 2 VR 22-3-18 : For now we do not clean correctly the datou structure Beginning of datou step final ! Inside saveOutput : final : False verbose : 0 original output for save of step final : {1356331742: ('0.28083351449559335',), 1356331735: ('0.28083351449559335',), 1356331730: ('0.28083351449559335',), 1356331719: ('0.28083351449559335',), 1356331714: ('0.28083351449559335',)} new output for save of step final : {1356331742: ('0.28083351449559335',), 1356331735: ('0.28083351449559335',), 1356331730: ('0.28083351449559335',), 1356331719: ('0.28083351449559335',), 1356331714: ('0.28083351449559335',)} [1356331742, 1356331735, 1356331730, 1356331719, 1356331714] Looping around the photos to save general results len do output : 5 /1356331742.Didn't retrieve data . /1356331735.Didn't retrieve data . /1356331730.Didn't retrieve data . /1356331719.Didn't retrieve data . /1356331714.Didn't retrieve data . before output type Used above Used above Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331742', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331735', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331730', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331719', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331714', None, None, None, None, None, '2812240') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 15 time used for this insertion : 0.011967182159423828 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.03786802291870117 time spend to save output : 0.01226043701171875 total time spend for step 5 : 0.05012845993041992 step6:blur_detection Fri May 9 14:22:49 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step blur_detection toutes les photos sont déjà traitées, on saute les calculs Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 5 time used for this insertion : 0.009810447692871094 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 5 time used for this insertion : 0.008943319320678711 save missing photos in datou_result : time spend for datou_step_exec : 0.02263021469116211 time spend to save output : 0.023203372955322266 total time spend for step 6 : 0.045833587646484375 step7:brightness Fri May 9 14:22:49 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure inside step calcul brightness toutes les photos sont déjà traitées, on saute les calculs Inside saveOutput : final : False verbose : 0 begin to insert list_values into class_photo_scores : length of list_valuse in save_photo_hashtag_id_thcl_score : 5 time used for this insertion : 0.009828329086303711 begin to insert list_values into photo_hahstag_ids : length of list_valuse in save_photo_hashtag_id_type : 5 time used for this insertion : 0.009005308151245117 save missing photos in datou_result : time spend for datou_step_exec : 0.02574777603149414 time spend to save output : 0.023311376571655273 total time spend for step 7 : 0.049059152603149414 step8:send_mail_cod Fri May 9 14:22:49 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed complete output_args for input 0 complete output_args for input 1 complete output_args for input 2 Inconsistent number of input and output, step which parrallelize and manage error in input by avoiding sending an output for this data can't be used in tree dependencies of input and output complete output_args for input 3 We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! VR 22-3-18 : For now we do not clean correctly the datou structure dans la step send mail cod work_area: /home/admin in order to get the selector url, please entre the license of selector results_COD_P22713557_09-05-2025_14_22_49.pdf 22826004 change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .change filename to text .imagette228260041746793369 22826005 change filename to text .change filename to text .imagette228260051746793370 22826006 imagette228260061746793371 22826007 imagette228260071746793371 22826008 imagette228260081746793371 22826009 imagette228260091746793371 22826010 change filename to text .imagette228260101746793371 22826011 change filename to text .change filename to text .change filename to text .imagette228260111746793371 22826020 imagette228260201746793371 SELECT h.hashtag,pcr.value FROM MTRUser.portfolio_carac_ratio pcr, MTRBack.hashtags h where pcr.portfolio_id=22713557 and hashtag_type = 4857 and pcr.hashtag_id = h.hashtag_id; velour_link : https://www.fotonower.com/velours/22720636,22720637,22720638,22720639,22720640,22720645,22720647,22720648,22720649,22720650,22720651,22720652,22720654,22720655,22720657,22720658,22720659,22720660,22720661?tags=papier,carton,metal,pet_clair,pehd,ela,textiles,verre,organique,dasri,masque,encombrant,autre_non_emballage,environnement,flux_dev,mal_croppe,flou,film_dev_souple,sac_om_plein&datou_id_consolidate=4742&port_consolidate=22713557 your option no_mail is active, we will not send the real mail to your client args[1356331742] : ((1356331742, 988.8145408582575, 2107751945), (1356331742, -0.41952047213563204, 496442774), '0.28083351449559335') We are sending mail with results at kexin@fotonower.com args[1356331735] : ((1356331735, 5888.680595689192, 492609224), (1356331735, -0.2939822368151335, 496442774), '0.28083351449559335') We are sending mail with results at kexin@fotonower.com args[1356331730] : ((1356331730, 5737.02692753846, 492609224), (1356331730, -0.4541369534437301, 496442774), '0.28083351449559335') We are sending mail with results at kexin@fotonower.com args[1356331719] : ((1356331719, 891.1455274361674, 2107751945), (1356331719, -1.0193062047680794, 501862349), '0.28083351449559335') We are sending mail with results at kexin@fotonower.com args[1356331714] : ((1356331714, 114.20540143219384, 492688767), (1356331714, -0.10843558063315373, 496442774), '0.28083351449559335') We are sending mail with results at kexin@fotonower.com refus_total : 0.28083351449559335 2022-04-13 10:29:59 0 start upload file to ovh https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22713557_09-05-2025_14_22_49.pdf results_COD_P22713557_09-05-2025_14_22_49.pdf uploaded to url https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22713557_09-05-2025_14_22_49.pdf start insert file to database insert into MTRUser.mtr_files (mtd_id,mtr_portfolio_id,text,url,format,tags,file_size,value) values ('4742','22713557','results_COD_P22713557_09-05-2025_14_22_49.pdf','https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22713557_09-05-2025_14_22_49.pdf','pdf','','0.45','0.28083351449559335') Inside saveOutput : final : False verbose : 0 saveOutput not yet implemented for datou_step.type : send_mail_cod we use saveGeneral [1356331742, 1356331735, 1356331730, 1356331719, 1356331714] Looping around the photos to save general results len do output : 0 before output type Used above Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331742', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331735', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331730', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331719', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331714', None, None, None, None, None, '2812240') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 5 time used for this insertion : 0.011850833892822266 save_final save missing photos in datou_result : time spend for datou_step_exec : 4.3936357498168945 time spend to save output : 0.011985063552856445 total time spend for step 8 : 4.405620813369751 step9:split_time_score Fri May 9 14:22:53 2025 VR 17-11-17 : now, only for linear exec dependencies tree, some output goes to fill the input of the next VR 22-3-18 : now we test the dependencies tree, but keep two separate code for datou_prepare_output_input until the code is correctly tested, clean and works in both case VR 22-3-18 : but we use the first code for the first step id = -1, build in the code of datou_exec VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec Currently we do not manage missing dependencies information, that could maybe be correctly interpreted with default behavior Some of the step done at execution of the step could be done before when the tree of execution is build and the dependencies of different step analysed We should have FATAL ERROR but same_nb_input_output==True : this should be an optionnal input ! complete output_args for input 1 VR 22-3-18 : For now we do not clean correctly the datou structure begin split time score TODO : Insert select and so on Begin split_port_in_batch_balle thcls : [{'id': 861, 'mtr_user_id': 31, 'name': 'Rungis_class_dechets_1212', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'Rungis_Aluminium,Rungis_Carton,Rungis_Papier,Rungis_Plastique_clair,Rungis_Plastique_dur,Rungis_Plastique_fonce,Rungis_Tapis_vide,Rungis_Tetrapak', 'svm_portfolios_learning': '1160730,571842,571844,571839,571933,571840,571841,572307', 'photo_hashtag_type': 999, 'photo_desc_type': 3963, 'type_classification': 'caffe', 'hashtag_id_list': '2107751280,2107750907,2107750908,2107750909,2107750910,2107750911,2107750912,2107750913'}] thcls : [{'id': 758, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 856, 'photo_desc_type': 3853, 'type_classification': 'caffe', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}] (('54', 1), ('55', 1), ('56', 3)) ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223 {} 30042025 22713557 Nombre de photos uploadées : 5 / 23040 (0%) 30042025 22713557 Nombre de photos taguées (types de déchets): 0 / 5 (0%) 30042025 22713557 Nombre de photos taguées (volume) : 0 / 5 (0%) elapsed_time : load_data_split_time_score 4.0531158447265625e-06 elapsed_time : order_list_meta_photo_and_scores 1.1444091796875e-05 ????? elapsed_time : fill_and_build_computed_from_old_data 0.00041675567626953125 elapsed_time : insert_dashboard_record_day_entry 0.02328014373779297 We will return after consolidate but for now we need the day, how to get it, for now depending on the previous heavy steps Qualite : 0.4205276134767622 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22713556_09-05-2025_14_22_10.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22713556 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition ! WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13168 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 13171 blur_detection have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13172 brightness have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13173 send_mail_cod have less inputs used (4) than in the step definition (5) : maybe we manage optionnal inputs ! Step 13175 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 13169 have datatype=11 whereas input 0 of step 13167 have datatype=2 WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 3 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 1 of step 13167 doesn't seem to be define in the database( WARNING : output 0 of step 13174 have datatype=10 whereas input 3 of step 13173 have datatype=6 WARNING : type of input 1 of step 13174 doesn't seem to be define in the database( WARNING : output 1 of step 13167 have datatype=7 whereas input 1 of step 13174 have datatype=None WARNING : type of output 1 of step 13174 doesn't seem to be define in the database( WARNING : type of input 4 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13169 doesn't seem to be define in the database( WARNING : type of input 1 of step 13170 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713556 AND mptpi.`type`=4857 To do Qualite : 0.28083351449559335 find url: https://storage.sbg.cloud.ovh.net/v1/AUTH_3b171620e76e4af496c5fd050759c9f0/media.fotonower.com/results_COD_P22713557_09-05-2025_14_22_49.pdf select completion_json, dashboard_run_id from MTRPhoto.dashboard_results where mtr_portfolio_id = 22713557 order by id desc limit 1 # VR 17-11-17 : to create in DB ! Here we check the datou graph and we reorder steps ! Tree builded and cycle checked, now we need to re-order the steps ! We have currenlty an error because there is no dependence between the last step for the case tile - detect - glue We can either keep the depence of, it is better to keep an order compatible with the id of steps if we do not have sons, so a lexical order : (number_son, step_id) All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! All sons are already in current list ! DONE and to test : checkNoCycle ! Here we check the consistency of inputs/outputs number between the given ones and the db ! eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering ! WARNING : number of outputs for step 13169 copy_chis is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13170 consolidate_hashtags_from_manual_portfolio is not consistent : 2 used against 1 in the step definition ! WARNING : number of inputs for step 13167 rle_unique_nms_with_priority is not consistent : 3 used against 1 in the step definition ! WARNING : number of outputs for step 13167 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition ! WARNING : number of outputs for step 13174 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition ! Step 13168 final have less outputs used (1) than in the step definition (2) : some outputs may be not used ! Step 13171 blur_detection have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13172 brightness have less inputs used (0) than in the step definition (1) : maybe we manage optionnal inputs ! Step 13173 send_mail_cod have less inputs used (4) than in the step definition (5) : maybe we manage optionnal inputs ! Step 13175 split_time_score have less inputs used (1) than in the step definition (2) : maybe we manage optionnal inputs ! Number of inputs / outputs for each step checked ! Here we check the consistency of outputs/inputs types during steps connections eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : output 0 of step 13169 have datatype=11 whereas input 0 of step 13167 have datatype=2 WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 3 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13170 doesn't seem to be define in the database( WARNING : type of input 1 of step 13167 doesn't seem to be define in the database( WARNING : output 0 of step 13174 have datatype=10 whereas input 3 of step 13173 have datatype=6 WARNING : type of input 1 of step 13174 doesn't seem to be define in the database( WARNING : output 1 of step 13167 have datatype=7 whereas input 1 of step 13174 have datatype=None WARNING : type of output 1 of step 13174 doesn't seem to be define in the database( WARNING : type of input 4 of step 13168 doesn't seem to be define in the database( We ignore checkConsistencyTypeOutputInput for datou_step final ! WARNING : type of output 1 of step 13169 doesn't seem to be define in the database( WARNING : type of input 1 of step 13170 doesn't seem to be define in the database( DataTypes for each output/input checked ! TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ? SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=22713557 AND mptpi.`type`=4857 To do NUMBER BATCH : 0 # DISPLAY ALL COLLECTED DATA : {'30042025': {'nb_upload': 5, 'nb_taggue_class': 0, 'nb_taggue_densite': 0}} Inside saveOutput : final : True verbose : 0 saveOutput not yet implemented for datou_step.type : split_time_score we use saveGeneral [1356331742, 1356331735, 1356331730, 1356331719, 1356331714] Looping around the photos to save general results len do output : 1 /22713557Didn't retrieve data . before output type Here is an output not treated by saveGeneral : Managing all output in save final without adding information in the mtr_datou_result ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331742', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331735', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331730', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331719', None, None, None, None, None, '2812240') ('4742', None, None, None, None, None, None, None, '2812240') ('4742', '22713557', '1356331714', None, None, None, None, None, '2812240') begin to insert list_values into mtr_datou_result : length of list_values in save_final : 6 time used for this insertion : 0.01365804672241211 save_final save missing photos in datou_result : time spend for datou_step_exec : 0.24026131629943848 time spend to save output : 0.013758420944213867 total time spend for step 9 : 0.25401973724365234 caffe_path_current : About to save ! 2 After save, about to update current ! ret : 2 len(input) + len(total_photo_id_missing) : 5 set_done_treatment 3.16user 2.26system 0:17.00elapsed 31%CPU (0avgtext+0avgdata 220420maxresident)k 4056inputs+50264outputs (10major+222486minor)pagefaults 0swaps