python /home/admin/mtr/script_for_cron.py -j datou_current3 -m 8 -a ' -a 4323 -l 20 --fifo' -s datou_current_sts -M 0 -S 0 -U 95,95,120
import MySQLdb succeeded
Import error (python version)
['/Users/moilerat/Documents/Fotonower/install/caffe/distribute/python', '/home/admin/workarea/git/Velours/python/prod', '/home/admin/workarea/install/caffe_cuda8_python3/python', '/home/admin/workarea/install/darknet', '/home/admin/workarea/git/Velours/python', '/home/admin/workarea/install/caffe_frcnn_python3/py-faster-rcnn/caffe-fast-rcnn/python', '/home/admin/mtr/.credentials', '/home/admin/workarea/install/caffe/python', '/home/admin/workarea/install/caffe_frcnn/py-faster-rcnn/tools', '/home/admin/workarea/git/fotonowerpip', '/home/admin/workarea/install/segment-anything', '/home/admin/workarea/git/pyfvs', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/home/admin/.local/lib/python3.8/site-packages', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
process id : 4022016
load datou : 4323
# VR 17-11-17 : to create in DB !
Here we check the datou graph and we reorder steps !
Tree built and cycle checked; now we need to re-order the steps !
We currently have an error because there is no dependence between the last steps for the case tile - detect - glue.
We could keep the dependence order, but it is better to keep an order compatible with the step ids when a step has no sons, i.e. a lexical order : (number_son, step_id)
DONE and to test : checkNoCycle !
We are managing only one step, so we do not consider checkConsistencyNbInputNbOutput !
We are managing only one step, so we do not consider checkConsistencyTypeOutputInput !
Unexpected type for variable list_input_json
ERROR or WARNING : can't parse json string : Expecting value: line 1 column 1 (char 0)
Tried to parse : None
was removed, should we ? data given as text
was removed, should we ? [ptf_id0,ptf_id1...]
was removed, should we ?
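The reordering policy the banner above describes (re-order steps so that dependents come after their parents, falling back to a lexical order on (number_son, step_id) when dependence alone does not decide) can be sketched as a Kahn-style topological sort with a tie-breaking heap. This is an illustration only; `reorder_steps` and the `sons` adjacency dict are assumed names, not the pipeline's actual API.

```python
import heapq

def reorder_steps(step_ids, sons):
    """Order step ids so every step comes after the steps it depends on,
    breaking ties lexically on (number_of_sons, step_id) as the log
    describes. `sons` maps a step id to the list of its son step ids."""
    indegree = {s: 0 for s in step_ids}
    for parent in step_ids:
        for child in sons.get(parent, []):
            indegree[child] += 1
    # min-heap keyed by (number_son, step_id)
    ready = [(len(sons.get(s, [])), s) for s in step_ids if indegree[s] == 0]
    heapq.heapify(ready)
    order = []
    while ready:
        _, s = heapq.heappop(ready)
        order.append(s)
        for child in sons.get(s, []):
            indegree[child] -= 1
            if indegree[child] == 0:
                heapq.heappush(ready, (len(sons.get(child, [])), child))
    if len(order) != len(step_ids):
        raise ValueError("cycle detected")  # where checkNoCycle would fail
    return order
```

With no dependence information at all, the tie-break alone yields an order compatible with the step ids, which is the behaviour the comment asks for.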
load thcls
load pdts
Running datou job : batch_current
TODO datou_current to load; maybe to move outside batchDatouExec
no input labels
no input values
updating current state to 1
we have a portfolio with more photos than the limit : 40>20, please execute split_portfolio.py -i 24660084 -l 20
size over : we load up to the limit, remaining photos not treated
list_input_json: {}
Current got : datou_id : 4323, datou_cur_ids : ['3245455'] with mtr_portfolio_ids : ['24660084'] and first list_photo_ids : []
new path : /proc/4022016/
Inside batchDatouExec : verbose : 0
List Step Type Loaded in datou : split_time_score
list_input_json : {}
origin : We have 1 ; WARNING: data may be incomplete, need to offset and complete !
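The earlier "can't parse json string ... Tried to parse : None" warning followed by `list_input_json: {}` suggests the loader tolerates a NULL or malformed stored input by logging and falling back to an empty dict. A minimal sketch of that behaviour; the function name `parse_input_json` is hypothetical:

```python
import json

def parse_input_json(raw, default=None):
    """Parse a JSON string coming from the DB; on NULL or garbage input,
    log the failure (as the job log does) and return an empty dict."""
    if default is None:
        default = {}
    try:
        return json.loads(raw)
    except (TypeError, ValueError) as exc:
        print("ERROR or WARNING : can't parse json string :", exc)
        print("Tried to parse :", raw)
        return default
```

`json.loads(None)` raises `TypeError` and `json.loads("None")` raises `ValueError` with exactly the "Expecting value: line 1 column 1 (char 0)" message seen above, so both paths funnel into the default.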
we have 0 photos missing in the step downloads : photo missing : []
try to delete the photos missing in DB
time to download the photos : 0.030311107635498047
About to test input to load
Calling datou_exec
Inside datou_exec : verbose : 0
number of steps : 1
step1:split_time_score
Mon Jul 7 14:08:09 2025
VR 17-11-17 : for now, only for linear exec dependency trees : some output goes to fill the input of the next step
VR 22-3-18 : now we test the dependencies tree, but keep two separate code paths for datou_prepare_output_input until the code is correctly tested, clean and works in both cases
VR 22-3-18 : but we use the first code path for the first step id = -1, built in the code of datou_exec
VR 22-3-18 : we should manage here the case when we are at the first step instead of building this step before datou_exec
begin split time score 2022-04-13 10:29:59 0
TODO : Insert select and so on
Begin split_port_in_batch_balle
thcls : [{'id': 3847, 'mtr_user_id': 31, 'name': 'learn_MM_generique_050224', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': 'aluminium,ela,emr,film_pedb,flux_dev,jrm,pcm,pcnc,pehd_pp,pet_clair,refus,tapis_vide', 'svm_portfolios_learning': '13096157,13096155,13096163,13096159,13301956,13095886,13096162,13096160,13358264,13096158,5515868,13276803', 'photo_hashtag_type': 4932, 'photo_desc_type': 6032, 'type_classification': 'tf_classification2', 'hashtag_id_list': '493546845,492741797,616987804,2107760237,2107760238,495916461,560181804,1284539308,2107760239,2107755846,538914404,2107748999'}]
thcls : [{'id': 3513, 'mtr_user_id': 31, 'name': 'Rungis_amount_dechets_fall_2018_v2_tf', 'pb_hashtag_id': 0, 'live': b'\x00', 'list_hashtags': '05102018_Papier_non_papier_dense,05102018_Papier_non_papier_peu_dense,05102018_Papier_non_papier_presque_vide,05102018_Papier_non_papier_tres_dense,05102018_Papier_non_papier_tres_peu_dense', 'svm_portfolios_learning': '1108385,1108386,1108388,1108384,1108387', 'photo_hashtag_type': 4557, 'photo_desc_type': 5767, 'type_classification': 'tf_classification2', 'hashtag_id_list': '2107751013,2107751014,2107751015,2107751016,2107751017'}]
(('00', 4), ('05', 20), ('06', 16))
ERROR counted https://github.com/fotonower/Velours/issues/663#issuecomment-421136223
{1: 40}
07072025 24660084 Number of photos uploaded : 40 / 23040 (0%)
07072025 24660084 Number of photos tagged (waste types) : 40 / 40 (100%)
07072025 24660084 Number of photos tagged (volume) : 40 / 40 (100%)
elapsed_time : load_data_split_time_score 1.1920928955078125e-06
elapsed_time : order_list_meta_photo_and_scores 1.8835067749023438e-05
elapsed_time : fill_and_build_computed_from_old_data 0.0018067359924316406
elapsed_time : insert_dashboard_record_day_entry 0.02608489990234375
Creating list_photo_total
elapsed_time : select_descriptors 0.8409783840179443
07072025 24660084 Number of photos with descriptors (type 6032) : 12 / 12 (100%)
Missing descriptors for photos 0 and 1370966576
0:00:00|ON:Missing descriptors for photos 1370966576 and 1370966484
Missing descriptors for photos 1370966484 and 1370967442
Missing descriptors for photos 1370967442 and 1370967441
Missing descriptors for photos 1370967441 and 1370981970
Missing descriptors for photos 1370981970 and 1370981969
Missing descriptors for photos 1370981969 and 1370981968
Missing descriptors for photos 1370981968 and 1370981967
Missing descriptors for photos 1370981967 and 1370981966
Missing descriptors for photos 1370981966 and 1370981965
Missing descriptors for photos 1370982201 and 1370982200
Missing descriptors for photos 1370982200 and 1370982199
Missing descriptors for photos 1370982199 and 1370982198
Missing descriptors for photos 1370982198 and 1370982196
Missing descriptors for photos 1370985025 and 1370985220
Missing descriptors for photos 1370985220 and 1370985219
Missing descriptors for photos 1370985219 and 1370985217
Missing descriptors for photos 1370985217 and 1370985216
Missing descriptors for photos 1370985216 and 1370985215
Missing descriptors for photos 1370985215 and 1370985213
Missing descriptors for photos 1370985213 and 1371000696
Missing descriptors for photos 1371000696 and 1371000695
Missing descriptors for photos 1371000695 and 1371000694
Missing descriptors for photos 1371000694 and 1371000693
Missing descriptors for photos 1371000693 and 1371000692
Missing descriptors for photos 1371000692 and 1371000691
07072025 Removing 0 photos because of the 'same image' condition
Total on : 0
Total off : 0.0
list_time_off
Warning in study_and_display_distrib_list : min=max : 0.0 0.0
dist_desc
begin to insert list_values into photo_hashtag_ids : length of list_values in save_photo_hashtag_id_type : 40
time used for this insertion : 0.012631416320800781
photos_removed : len 0
elapsed_time : remove_photo_duplicate 0.04927945137023926
Creating list_photo_total
time_diff is bigger than the limit of interval, we ignore the result of this image in moving average [message repeated ~25 times]
elapsed_time : count_sum_diff_and_build_graph 0.004826545715332031
Total photos : 40
can't find max_score_info [repeated 10 times]
Change port : 0 hashtag : 616987804 photo_id =1370982133 : emr
can't find max_score_info [repeated 5 times]
Change port : 7 hashtag : 2107760238 photo_id =1370985030 : flux_dev
can't find max_score_info [repeated 13 times]
Total photos : 40
Number of lists : 3
counter photos in port : 12
hashtag : aluminium(493546845) : 0 photos in 0 portfolios !
hashtag : ela(492741797) : 0 photos in 0 portfolios !
hashtag : emr(616987804) : 7 photos in 1 portfolio !
hashtag : film_pedb(2107760237) : 0 photos in 0 portfolios !
hashtag : flux_dev(2107760238) : 5 photos in 1 portfolio !
hashtag : jrm(495916461) : 0 photos in 0 portfolios !
hashtag : pcm(560181804) : 0 photos in 0 portfolios !
hashtag : pcnc(1284539308) : 0 photos in 0 portfolios !
hashtag : pehd_pp(2107760239) : 0 photos in 0 portfolios !
hashtag : pet_clair(2107755846) : 0 photos in 0 portfolios !
hashtag : refus(538914404) : 0 photos in 0 portfolios !
hashtag : tapis_vide(2107748999) : 0 photos in 1 portfolio !
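The repeated "time_diff is bigger than the limit of interval, we ignore the result of this image in moving average" messages indicate the moving average skips a sample whenever the gap to the previous image exceeds some interval limit. A sketch under that assumption, using an exponential moving average; the function name, the `alpha` smoothing factor and the `(timestamp, value)` input shape are illustrative, not the pipeline's real signature:

```python
def moving_average(samples, interval_limit, alpha=0.5):
    """Exponential moving average over (timestamp, value) pairs that
    ignores a sample when the time gap to the previous sample exceeds
    interval_limit, mirroring the 'we ignore the result of this image
    in moving average' messages in the log."""
    avg = None
    prev_ts = None
    for ts, value in samples:
        if prev_ts is not None and ts - prev_ts > interval_limit:
            print("time_diff is bigger than the limit of interval, "
                  "we ignore the result of this image in moving average")
            prev_ts = ts  # the gap is consumed but the value is dropped
            continue
        avg = value if avg is None else alpha * value + (1 - alpha) * avg
        prev_ts = ts
    return avg
```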
elapsed_time : group_photo_by_moyenne_exp 0.0022733211517333984
elapsed_time : compute_and_correct_tag_with_moyenne_mobile 3.0994415283203125e-06
today str has no value, we define it as the date of the first image; todaystr_first : 07072025
attention : prev_timestamp is 0, we do nothing
********* BIG TIME 2560.001088142395 (49.999924182891846, 2, 0, 0, 0.79323477, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1.882099985909462, 0.0017, 0.0, 0.9, 0.041000041961669925, -0.051667048533757524) on 6 1370985030 2025-07-07 05:59:43.010292 id_data : 22
** BIG TIME 250.00017404556274 (90.99976706504822, 4, 0, 0, 0, 0, 0.85701793, 0, 0, 0, 0, 0, 0, 0, 2.155999855518341, 0.0027, 0.0, 1.9, 0.0499986400604248, -0.06666483680407206) on 10 1370985219 2025-07-07 06:07:04.010281 id_data : 29
* BIG TIME 1691.0001530647278 (90.99976706504822, 4, 0, 0, 0, 0, 0.85701793, 0, 0, 0, 0, 0, 0, 0, 2.155999855518341, 0.0027, 0.0, 1.9, 0.0499986400604248, -0.06666483680407206) on 10 1371000696 2025-07-07 06:35:54.010153 id_data : 34
***Count Time bigger than 30s : 15
#Number Photos for regression : {'07072025': {493546845: {2107751013: 0, 2107751014: 0, 2107751015: 0, 2107751016: 0, 2107751017: 0}, 492741797: {2107751013: 0, 2107751014: 0, 2107751015: 0, 2107751016: 0, 2107751017: 0}, 616987804: {2107751013: 0, 2107751014: 28.998618841171265, 2107751015: 21.0002121925354, 2107751016: 0, 2107751017: 0}, 2107760237: {2107751013: 0, 2107751014: 0, 2107751015: 0, 2107751016: 0, 2107751017: 0}, 2107760238: {2107751013: 0, 2107751014: 0, 2107751015: 49.998640060424805, 2107751016: 0, 2107751017: 0}, 495916461: {2107751013: 0, 2107751014: 0, 2107751015: 0, 2107751016: 0, 2107751017: 0}, 560181804: {2107751013: 0, 2107751014: 0, 2107751015: 0, 2107751016: 0, 2107751017: 0}, 1284539308: {2107751013: 0, 2107751014: 0, 2107751015: 0, 2107751016: 0, 2107751017: 0}, 2107760239: {2107751013: 0, 2107751014: 10.001110076904297, 2107751015: 0, 2107751016: 0, 2107751017: 0}, 2107755846: {2107751013: 0, 2107751014: 0, 2107751015: 0, 2107751016: 0, 2107751017: 0}, 538914404: {2107751013: 0, 2107751014: 0, 2107751015: 0, 2107751016: 0, 2107751017: 0}, 2107748999: {2107751013: 0, 2107751014: 0, 2107751015: 668.0004916191101, 2107751016: 0, 2107751017: 0}}}
07072025 : same data per waste hashtag × 05102018_Papier_non_papier_* density class (columns : dense, peu_dense, presque_vide, tres_dense, tres_peu_dense) :
aluminium    0   0                   0                   0   0
ela          0   0                   0                   0   0
emr          0   28.998618841171265  21.0002121925354    0   0
film_pedb    0   0                   0                   0   0
flux_dev     0   0                   49.998640060424805  0   0
jrm          0   0                   0                   0   0
pcm          0   0                   0                   0   0
pcnc         0   0                   0                   0   0
pehd_pp      0   10.001110076904297  0                   0   0
pet_clair    0   0                   0                   0   0
refus        0   0                   0                   0   0
tapis_vide   0   0                   668.0004916191101   0   0
#Number Photos for regression amount gros magasin papier (time_diff then nb_photo) :
We have not displayed the number of photos removed for one material since Rungis_Papier wasn't in the thcl used !
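The starred "BIG TIME" lines and "Count Time bigger than 30s : 15" above suggest a pass that flags inter-photo time gaps above a threshold and counts them. A minimal sketch of that counting; the function name and the star-scaling rule are illustrative assumptions, not the real code:

```python
def count_big_times(timestamps, limit=30.0):
    """Count consecutive gaps above `limit` seconds, printing a starred
    'BIG TIME' marker per flagged gap (star count here is an arbitrary
    illustration scaled by gap size)."""
    count = 0
    for prev, cur in zip(timestamps, timestamps[1:]):
        gap = cur - prev
        if gap > limit:
            count += 1
            stars = "*" * min(9, int(gap // 250) + 1)
            print(f"{stars} BIG TIME {gap}")
    print(f"Count Time bigger than 30s : {count}")
    return count
```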
07072025_time_diff_distrib
Number of amount portfolios for this type of waste : aluminium 0
Number of amount portfolios for this type of waste : ela 0
Number of amount portfolios for this type of waste : emr 5
https://marlene.fotonower.com/api/v1/secured/portfolio/new?name=07072025_emr_05102018_papier_non_papier_peu_dense&access_token=3e00c8f3e5af383c48e3c68940ba6fe7
Created to study and clean : 24680358 with name like 07072025_emr_05102018_papier_non_papier_peu_dense
https://marlene.fotonower.com/api/v1/secured/portfolio/new?name=07072025_emr_05102018_papier_non_papier_presque_vide&access_token=3e00c8f3e5af383c48e3c68940ba6fe7
Created to study and clean : 24680359 with name like 07072025_emr_05102018_papier_non_papier_presque_vide
Number of amount portfolios for this type of waste : film_pedb 0
Number of amount portfolios for this type of waste : flux_dev 4
Number of amount portfolios for this type of waste : jrm 0
Number of amount portfolios for this type of waste : pcm 0
Number of amount portfolios for this type of waste : pcnc 0
Number of amount portfolios for this type of waste : pehd_pp 1
https://marlene.fotonower.com/api/v1/secured/portfolio/new?name=07072025_pehd_pp_05102018_papier_non_papier_peu_dense&access_token=3e00c8f3e5af383c48e3c68940ba6fe7
Created to study and clean : 24680360 with name like 07072025_pehd_pp_05102018_papier_non_papier_peu_dense
Number of amount portfolios for this type of waste : pet_clair 0
Number of amount portfolios for this type of waste : refus 0
Number of amount portfolios for this type of waste : tapis_vide 20
https://marlene.fotonower.com/api/v1/secured/portfolio/new?name=07072025_tapis_vide_05102018_papier_non_papier_presque_vide&access_token=3e00c8f3e5af383c48e3c68940ba6fe7
Created to study and clean : 24680361 with name like 07072025_tapis_vide_05102018_papier_non_papier_presque_vide
NUMBER BATCH : 3
list_ponderation used : [0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001] , list_hashtag_class_create_as_list : ['pcnc', 'gm', 'emr', 'ela', 'pet_clair', 'film_pedb', 'pehd_pp', 'flux_dev_rigide', 'aluminium', 'tapis_vide', 'refus', 'gros_cartons', 'gm', 'flux_dev_rigide', 'gros_cartons']
We filter photos on hashtag condition ! [repeated twice]
result_one_balle_Type_emr:{'day': '07072025', 'map_nb_amount': {0: 5, 1: 2, 2: 0, 3: 0, 4: 0}, 'map_time_amount': {0: 38.99972891807556, 1: 21.0002121925354, 2: 0, 3: 0, 4: 0}, 'duration': 59.99994111061096, 'nb_balles_papier': 0.06099994111061097, 'begin_time_port': 'image_07072025_05_13_54_010295m0.jpg 0.001 for time 1, id_amount 1 this amount prod time diff : 0.001'}
Production hashtag (incorrect ponderation at 20-10-18) : 0.06099994111061097
We filter photos on hashtag condition ! / hashtag not found ! [the two messages repeat, interleaved, ~25 times]
result_one_balle_Type_flux_dev_rigide:{'day': '07072025', 'map_nb_amount': {0: 0, 1: 5, 2: 0, 3: 0, 4: 0}, 'map_time_amount': {0: 0, 1: 49.998640060424805, 2: 0, 3: 0, 4: 0}, 'duration': 49.998640060424805, 'nb_balles_papier': 0.0509986400604248, 'begin_time_port': 'image_07072025_05_59_43_010292m0.jpg 0.001 for time 1, id_amount 2 this amount prod time diff : 0.001'}
Production hashtag (incorrect ponderation at 20-10-18) : 0.0509986400604248
We filter photos on hashtag condition !
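The `result_one_balle_Type_*` entries aggregate, per amount id, a photo count (`map_nb_amount`) and an accumulated time (`map_time_amount`), with `duration` as the total. A hedged group-by sketch of that aggregation; the input shape `(amount_id, time_diff)` and the function name are assumptions made for illustration:

```python
def aggregate_amounts(photos, n_amounts=5):
    """Build map_nb_amount / map_time_amount as seen in the
    result_one_balle_Type_* log entries: per amount id, count the
    photos and sum their time diffs."""
    nb = {i: 0 for i in range(n_amounts)}
    time_amount = {i: 0 for i in range(n_amounts)}
    for amount_id, time_diff in photos:
        nb[amount_id] += 1
        time_amount[amount_id] += time_diff
    return {
        "map_nb_amount": nb,
        "map_time_amount": time_amount,
        "duration": sum(time_amount.values()),
    }
```

How `nb_balles_papier` is then derived from `duration` and `list_ponderation` is not fully shown in the log, so it is left out rather than guessed.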
We have rejected 0 photos because of the batch_size condition !
NUMBER BATCH list_of_portfolios_to_create : 2
list_same_port_ids : [24664763] ; found a same portfolio which already exists, 24664763 : we will use it
list_same_port_ids : [24664768] ; found a same portfolio which already exists, 24664768 : we will use it
Qualite : 0.04716335224363393
Here we check the datou graph and we reorder steps !
All sons are already in current list ! [repeated 12 times]
DONE and to test : checkNoCycle !
Here we check the consistency of the inputs/outputs number between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 11500 mask_detect is not consistent : 3 used against 2 in the step definition !
WARNING : number of outputs for step 11508 brightness is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11509 blur_detection is not consistent : 2 used against 1 in the step definition !
WARNING : number of inputs for step 11504 crop_condition is not consistent : 3 used against 2 in the step definition !
Step 11504 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used !
Step 11507 merge_mask_thcl_custom has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
WARNING : number of outputs for step 11507 merge_mask_thcl_custom is not consistent : 3 used against 2 in the step definition !
WARNING : number of inputs for step 11501 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 11501 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 11576 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 11576 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of outputs for step 11503 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 11502 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 11502 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 11511 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of outputs/inputs types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 11503 doesn't seem to be defined in the database
WARNING : type of input 3 of step 11502 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 2 of step 11500 doesn't seem to be defined in the database
WARNING : type of input 2 of step 11504 doesn't seem to be defined in the database
WARNING : output 1 of step 11500 has datatype=2 whereas input 1 of step 11507 has datatype=7
WARNING : type of output 2 of step 11507 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11501 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 11503 has datatype=10 whereas input 3 of step 11510 has datatype=6
WARNING : type of input 2 of step 11576 doesn't seem to be defined in the database
WARNING : output 1 of step 11501 has datatype=7 whereas input 2 of step 11576 has datatype=None
WARNING : type of output 3 of step 11576 doesn't seem to be defined in the database
WARNING : type of input 1 of step 11503 doesn't seem to be defined in the database
WARNING : output 0 of step 11503 has datatype=10 whereas input 0 of step 11582 has datatype=18
WARNING : type of input 5 of step 11510 doesn't seem to be defined in the database
WARNING : output 0 of step 11582 has datatype=11 whereas input 5 of step 11510 has datatype=None
WARNING : type of output 1 of step 11508 doesn't seem to be defined in the database
WARNING : type of input 3 of step 11504 doesn't seem to be defined in the database
WARNING : type of output 1 of step 11509 doesn't seem to be defined in the database
WARNING : type of input 3 of step 11504 doesn't seem to be defined in the database
WARNING : output 0 of step 11507 has datatype=1 whereas input 0 of step 11501 has datatype=2
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ?
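The checkConsistencyNbInputNbOutput messages above compare how many inputs/outputs are actually wired to a step against the step definition: strict WARNINGs when more are used than defined, softer notes when fewer. A sketch of that comparison; the function name and signature are assumptions modeled on the log text, not the pipeline's real API:

```python
def check_nb_consistency(step_id, step_name, used, defined, kind="inputs"):
    """Emit the WARNING / Step messages the graph check produces when
    the wired count differs from the step definition (sketch only)."""
    msgs = []
    if used > defined:
        msgs.append(
            f"WARNING : number of {kind} for step {step_id} {step_name} "
            f"is not consistent : {used} used against {defined} "
            f"in the step definition !")
    elif used < defined:
        reason = ("maybe we manage optional inputs !" if kind == "inputs"
                  else "some outputs may not be used !")
        msgs.append(
            f"Step {step_id} {step_name} has fewer {kind} used ({used}) "
            f"than in the step definition ({defined}) : {reason}")
    return msgs  # empty list when counts match
```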
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at, mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id=mptpi.hashtag_id AND mptpi.`mtr_portfolio_id_1`=24664763 AND mptpi.`type`=4207
To do
Qualite : 0.44130199977121976
Here we check the datou graph and we reorder steps !
All sons are already in current list ! [repeated 13 times]
DONE and to test : checkNoCycle !
Here we check the consistency of the inputs/outputs number between the given ones and the db !
eke 1-6-18 : checkConsistencyNbInputNbOutput should be processed after step reordering !
WARNING : number of outputs for step 13185 mask_detect is not consistent : 3 used against 2 in the step definition !
WARNING : number of outputs for step 13193 brightness is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13194 blur_detection is not consistent : 2 used against 1 in the step definition !
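The SELECT above interpolates the portfolio id and type directly into the SQL string. With any Python DB-API driver (MySQLdb included) those values can be bound as parameters instead, which avoids quoting mistakes and injection risks. Illustrated with sqlite3 so the sketch is self-contained and runnable; the reduced table schema is an assumption, only the names come from the log (MySQLdb would use `%s` placeholders where sqlite3 uses `?`):

```python
import sqlite3

# In-memory stand-in for MTRPhoto.mtr_port_to_port_ids, reduced to the
# columns the query filters on (illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mtr_port_to_port_ids "
             "(id INTEGER, mtr_portfolio_id_1 INTEGER, type INTEGER)")
conn.execute("INSERT INTO mtr_port_to_port_ids VALUES (1, 24664763, 4207)")
conn.execute("INSERT INTO mtr_port_to_port_ids VALUES (2, 24664763, 9999)")

# Bind the portfolio id and type as parameters instead of concatenating
# them into the SQL text.
rows = conn.execute(
    "SELECT id FROM mtr_port_to_port_ids "
    "WHERE mtr_portfolio_id_1 = ? AND type = ?",
    (24664763, 4207),
).fetchall()
print(rows)
```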
WARNING : number of inputs for step 13189 crop_condition is not consistent : 3 used against 2 in the step definition !
Step 13189 crop_condition has fewer outputs used (2) than in the step definition (3) : some outputs may not be used !
Step 13191 argmax has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
WARNING : number of outputs for step 13192 merge_mask_thcl_custom is not consistent : 3 used against 2 in the step definition !
WARNING : number of inputs for step 13186 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
WARNING : number of outputs for step 13186 rle_unique_nms_with_priority is not consistent : 2 used against 1 in the step definition !
Step 13197 crop_condition has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
WARNING : number of outputs for step 13197 crop_condition is not consistent : 4 used against 3 in the step definition !
WARNING : number of outputs for step 13188 ventilate_hashtags_in_portfolio is not consistent : 2 used against 1 in the step definition !
Step 13187 final has fewer inputs used (2) than in the step definition (3) : maybe we manage optional inputs !
Step 13187 final has fewer outputs used (1) than in the step definition (2) : some outputs may not be used !
Step 13196 split_time_score has fewer inputs used (1) than in the step definition (2) : maybe we manage optional inputs !
Number of inputs / outputs for each step checked !
Here we check the consistency of outputs/inputs types during step connections
eke 1-6-18 : checkConsistencyTypeOutputInput should be processed after checkConsistencyNbInputNbOutput !
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 1 of step 13188 doesn't seem to be defined in the database
WARNING : type of input 3 of step 13187 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : type of output 2 of step 13185 doesn't seem to be defined in the database
WARNING : type of input 2 of step 13189 doesn't seem to be defined in the database
WARNING : output 0 of step 13185 has datatype=16 whereas input 0 of step 13192 has datatype=1
WARNING : output 1 of step 13185 has datatype=2 whereas input 1 of step 13192 has datatype=7
WARNING : output 0 of step 13191 has datatype=6 whereas input 2 of step 13192 has datatype=5
WARNING : type of output 2 of step 13192 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13186 doesn't seem to be defined in the database
We ignore checkConsistencyTypeOutputInput for datou_step final !
WARNING : output 0 of step 13188 has datatype=10 whereas input 3 of step 13195 has datatype=6
WARNING : type of input 2 of step 13197 doesn't seem to be defined in the database
WARNING : output 1 of step 13186 has datatype=7 whereas input 2 of step 13197 has datatype=None
WARNING : type of output 3 of step 13197 doesn't seem to be defined in the database
WARNING : type of input 1 of step 13188 doesn't seem to be defined in the database
WARNING : output 0 of step 13188 has datatype=10 whereas input 0 of step 13198 has datatype=18
WARNING : type of input 5 of step 13195 doesn't seem to be defined in the database
WARNING : output 0 of step 13198 has datatype=11 whereas input 5 of step 13195 has datatype=None
WARNING : type of output 1 of step 13193 doesn't seem to be defined in the database
WARNING : type of input 3 of step 13189 doesn't seem to be defined in the database
WARNING : type of output 1 of step 13194 doesn't seem to be defined in the database
WARNING : type of input 3 of step 13189 doesn't seem to be defined in the database
WARNING : output 0 of step 13192 has datatype=1 whereas input 0 of step 13186 has datatype=2
DataTypes for each output/input checked !
TODO Duplicate data, are they consistent 3 ? Duplicate data, are they consistent 4 ?
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2, mptpi.type,
       mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id, mptpi.created_at,
       mptpi.updated_at, mptpi.last_updated_at_desc, mptpi.last_updated_at_asc,
       h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id = mptpi.hashtag_id
  AND mptpi.`mtr_portfolio_id_1` = 24664768
  AND mptpi.`type` = 4200
To do
elapsed_time : count_nb_balles_and_create_portfolio 19.195841312408447
# DISPLAY ALL COLLECTED DATA :
{'07072025': {'nb_upload': 40, 'nb_taggue_class': 40, 'nb_taggue_densite': 40,
              'nb_descriptors': 12, 'number_port': 3, 'count_photo_in_port': 12,
              'nb_port_per_class': {
                  'aluminium': {'nb_photos': 0, 'nb_portfolios': 0},
                  'ela': {'nb_photos': 0, 'nb_portfolios': 0},
                  'emr': {'nb_photos': 7, 'nb_portfolios': 1},
                  'film_pedb': {'nb_photos': 0, 'nb_portfolios': 0},
                  'flux_dev': {'nb_photos': 5, 'nb_portfolios': 1},
                  'jrm': {'nb_photos': 0, 'nb_portfolios': 0},
                  'pcm': {'nb_photos': 0, 'nb_portfolios': 0},
                  'pcnc': {'nb_photos': 0, 'nb_portfolios': 0},
                  'pehd_pp': {'nb_photos': 0, 'nb_portfolios': 0},
                  'pet_clair': {'nb_photos': 0, 'nb_portfolios': 0},
                  'refus': {'nb_photos': 0, 'nb_portfolios': 0},
                  'tapis_vide': {'nb_photos': 0, 'nb_portfolios': 1}}}}
Inside saveOutput : final : True verbose : 0
saveOutput not yet implemented for datou_step.type : split_time_score, we use saveGeneral
[1371000696, 1371000695, 1371000694, 1371000693, 1371000692, 1371000691, 1370985220, 1370985219, 1370985217, 1370985216, 1370985215, 1370985213, 1370985030, 1370985029, 1370985028, 1370985027, 1370985026, 1370985025, 1370982202, 1370982201]
Looping around the photos to save general results
len of output : 1
/24660084 : didn't retrieve data.
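The query above is logged with the portfolio id and type interpolated directly into the SQL string. A safer form passes them as parameters; the sketch below shows the same query with MySQLdb/PEP 249 `%s` placeholders (the `cursor.execute(sql, params)` call that would run it is assumed, not executed here, since no database connection is available):

```python
# Parameterized form of the mtr_port_to_port_ids lookup seen in the log.
# The %s markers follow the "format" paramstyle used by MySQLdb; the
# driver escapes the values, so they are never spliced into the SQL text.
SQL_PORT_TO_PORT = """\
SELECT mptpi.id, mptpi.mtr_portfolio_id_1, mptpi.mtr_portfolio_id_2,
       mptpi.type, mptpi.hashtag_id, mptpi.min_score, mptpi.mtr_user_id,
       mptpi.created_at, mptpi.updated_at,
       mptpi.last_updated_at_desc, mptpi.last_updated_at_asc, h.hashtag
FROM MTRPhoto.mtr_port_to_port_ids mptpi, MTRBack.hashtags h
WHERE h.hashtag_id = mptpi.hashtag_id
  AND mptpi.`mtr_portfolio_id_1` = %s
  AND mptpi.`type` = %s
"""

# Values taken from the logged query.
params = (24664768, 4200)
# cursor.execute(SQL_PORT_TO_PORT, params)  # with a real MySQLdb cursor
```

One placeholder per bound value keeps the statement cacheable and avoids quoting mistakes when ids come from job arguments.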
before output type
Here is an output not treated by saveGeneral :
Managing all output in save final without adding information in the mtr_datou_result
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1371000696', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1371000695', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1371000694', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1371000693', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1371000692', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1371000691', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985220', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985219', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985217', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985216', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985215', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985213', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985030', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985029', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985028', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985027', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985026', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370985025', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370982202', None, None, None, None, None, '3245455')
('4323', None, None, None, None, None, None, None, '3245455')
('4323', '24660084', '1370982201', None, None, None, None, None, '3245455')
begin to insert list_values into mtr_datou_result : length of list_values in save_final : 21
time used for this insertion : 0.025051593780517578
save_final
save missing photos in datou_result :
time spent for datou_step_exec : 20.191911220550537
time spent to save output : 0.025396108627319336
total time spent for step 1 : 20.217307329177856
caffe_path_current :
About to save ! 2
After save, about to update current ! ret : 2
len(input) + len(total_photo_id_missing) : 1
set_done_treatment
1.04user 0.89system 0:23.72elapsed 8%CPU (0avgtext+0avgdata 102108maxresident)k 0inputs+168outputs (10major+63061minor)pagefaults 0swaps
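The final save batches the whole `list_values` into `mtr_datou_result` in a single insertion (0.025 s in the log), rather than one INSERT per row. A sketch of that batched-insert pattern, using the stdlib `sqlite3` module as a stand-in for the real MySQL connection (the table layout and column names are assumptions inferred from the 9-field tuples above):

```python
import sqlite3
import time

# In-memory stand-in for the MySQL table the job actually writes to.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE mtr_datou_result ("
    "datou_id TEXT, mtr_portfolio_id TEXT, photo_id TEXT, "
    "c4 TEXT, c5 TEXT, c6 TEXT, c7 TEXT, c8 TEXT, datou_cur_id TEXT)"
)

# Rows shaped like the 9-field tuples printed by save_final.
list_values = [
    ("4323", "24660084", str(photo_id), None, None, None, None, None, "3245455")
    for photo_id in (1371000696, 1371000695, 1371000694)
]

t0 = time.time()
# One executemany round-trip instead of len(list_values) separate INSERTs.
conn.executemany(
    "INSERT INTO mtr_datou_result VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
    list_values,
)
conn.commit()
elapsed = time.time() - t0  # analogous to "time used for this insertion"
```

With MySQLdb the call is the same `cursor.executemany(sql, list_values)` shape, with `%s` placeholders instead of `?`; batching is what keeps the insertion time small relative to the 20 s step execution.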