Commit Graph

109 Commits

Author SHA1 Message Date
spbeach46  2f99c63165  fixed more dl_pictures bugs  2021-04-20 21:09:35 -07:00
spbeach46  5532a82d8d  debugging dl_pictures  2021-04-15 18:26:42 -07:00
spbeach46  34a6451400  mostly finished dl_pictures(). Needs testing  2021-04-13 10:10:24 -07:00
spbeach46  fb327a9375  changing custom dict to dl_pictures function and making temp_dict_pics.txt file with target_dirs.txt  2021-04-09 22:37:14 -07:00
spbeach46  ef237b8a1b  added dict_pics.txt updating for expanded_dfs method  2021-04-07 15:50:23 -07:00
spbeach46  622b824eaf  fixed dict_pics errors. Made more robust regex pattern  2021-04-05 14:38:07 -07:00
spbeach46  fcbe4e088c  added dict_pics output to curate.py and set destination var in ebay_api.py  2021-04-04 14:45:56 -07:00
spbeach46  b205b44b50  added function to create custom image url dictionary  2021-04-04 14:38:04 -07:00
spbeach46  35100b7952  fixed extract_contents for expanded_dropd df  2021-04-03 13:09:21 -07:00
spbeach46  167f1f29ec  added download fix  2021-04-02 23:42:31 -07:00
spbeach46  3c16301a96  Created curate file for running curate methods from ebay_api.py  2021-04-02 11:08:56 -07:00
spbeach46  6abf695908  added some exceptions but still likely incomplete  2021-02-14 01:34:56 -07:00
spbeach46  816fb797fa  added function to extract list values from cells  2021-02-08 20:16:58 -07:00
spbeach46  855e65af80  dropping cols from nvl_df and combining class_training with nvl_training  2021-02-07 12:25:37 -07:00
spbeach46  1087027812  corrected for SettingWithCopyWarning  2021-01-31 22:13:52 -07:00
spbeach46  fc0090ea3e  added config file import  2021-01-31 21:49:24 -07:00
spbeach46  627992036a  reverted back to creds in script. Change to config file method  2021-01-30 03:38:26 -07:00
spbeach46  4f46741a0f  added notes and changed security_appname to var  2021-01-30 03:08:10 -07:00
spbeach46  72db161858  created def for class training set and nvl training set. Still unfinished  2021-01-29 23:48:19 -07:00
spbeach46  18871a76d7  added nvl_dict function to create series based on nvl dict of raw data  2021-01-26 23:01:00 -07:00
spbeach46  d09090dc13  deleted curate defs that work from json directly instead of pandas df  2021-01-24 22:46:55 -07:00
spbeach46  92516ff4f0  adding gameplan for constructing two csv files from pandas after shopping api call; for main data and training set  2021-01-23 01:38:04 -07:00
spbeach46  fd6fd04ecd  attempting curate by starting with pandas df first  2021-01-22 23:21:56 -07:00
spbeach46  b4e1961ace  working on curate_data to remove formatting issues in csv  2020-12-29 01:55:21 -07:00
spbeach46  dcbdb7ced3  need to fix data.update in conky to be list of dicts not dict of dicts  2020-12-29 00:20:55 -07:00
spbeach46  e6cf1b6a5d  fixed repeats and multithreading. Needs refining  2020-12-27 21:13:12 -07:00
spbeach46  7af5811792  added lambda function to multithreader in FindingApi method  2020-12-25 12:15:20 -07:00
spbeach46  d9781579cd  wtf  2020-12-12 18:56:51 -07:00
spbeach46  bcb11de855  changed data var returned in shoppingapi to data['Item']. Vice versa in update_data func. This is so data.update is correct  2020-11-12 14:46:37 -07:00
spbeach46  97c5900a5b  added multithreading to both finding and shopping apis including conky() def  2020-11-12 13:22:51 -07:00
spbeach46  1134d3f155  adding multithreading to first call in main  2020-11-08 18:47:03 -07:00
spbeach46  5965f19d2a  adding functions to curateData class  2020-11-07 10:39:49 -07:00
spbeach46  cd21c98e54  removed UnboundLocalError for accessing training.csv during exception handling. Working on RuntimeError for mutable dictionary in curate_data loop  2020-11-05 15:32:41 -07:00
spbeach46  904014ea38  added conditional to check if item id is in both item_id_results_list and in local file, to eliminate variation listings including written data  2020-10-30 21:55:40 -07:00
spbeach46  47b77600a5  corrected update_df to iterate over items and nvl kv pairs, and began check for repeat results  2020-10-24 03:36:31 -07:00
spbeach46  b64e2f74c1  had to convert inputs to ints and correct FileNotFoundError in get_ids_from_cats function  2020-10-18 15:32:17 -07:00
spbeach46  26b425f31c  Changed to_csv mode to append to allow for writing if not present and appending if present  2020-10-18 13:56:16 -07:00
spbeach46  60b8f8979c  changed big_data names to training  2020-10-18 00:08:04 -07:00
spbeach46  7c338e69d5  added error handling for initial/empty csv file  2020-10-17 17:22:45 -07:00
spbeach46  ae5a4e92bb  added CurateData class for preprocessing pipeline functionality  2020-10-17 16:21:11 -07:00
spbeach46  58d8c8cda7  added get_data method call to get_ids_from_cats call  2020-10-12 18:42:57 -07:00
spbeach46  68f1341012  added get_data method for easier debugging. Added zeroth index term to ...['itemId']  2020-10-12 17:55:07 -07:00
spbeach46  12f4770eaf  corrected pageNumber and service params. Need to fix TypeError on lines 24-27  2020-10-12 11:48:15 -07:00
spbeach46  acb5955b97  comment before adding pageNumber and service to init  2020-10-12 00:53:29 -07:00
spbeach46  aa73991c3c  fixed classes, variables, methods to not show any terrible errors  2020-10-11 23:11:59 -07:00
spbeach46  47ad8639ff  changing defs to methods and variables to attributes. Trial and error. Code seems to be broken right now  2020-10-11 22:12:01 -07:00
spbeach46  9d02381140  completed conditional to check for itemId existence in csv data, and 20-itemId string list generator  2020-10-08 03:07:41 -07:00
spbeach46  873050f898  removed list(map(str, item_id_results)) because itemIds are already strings  2020-10-08 00:20:01 -07:00
spbeach46  3b0b96fd92  around line 37, creating itemId list to pipe into shopping api call/class  2020-10-07 18:53:19 -07:00
spbeach46  b5afa5cfc0  removed deleted stuff?  2020-10-04 23:48:52 -07:00
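Commits 26b425f31c and 7c338e69d5 above deal with writing training data in to_csv append mode and handling the initial/empty csv file. A minimal sketch of that pattern, assuming pandas; the function name and `training.csv` default are illustrative, not taken from the repo:

```python
import os
import pandas as pd

def append_training(df: pd.DataFrame, path: str = "training.csv") -> None:
    # Write the header only when the file does not exist yet; otherwise
    # append rows without repeating it. mode="a" creates the file if absent.
    write_header = not os.path.exists(path)
    df.to_csv(path, mode="a", header=write_header, index=False)
```

Repeated calls then grow one csv with a single header row, which matches the "writing if not present and appending if present" behavior the commit describes.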
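Commits 7af5811792 and 97c5900a5b mention adding a lambda to the multithreader for the finding and shopping API calls. A minimal sketch of that pattern with the stdlib `concurrent.futures`; `fetch_page` is a hypothetical stand-in for the real network call, not code from the repo:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_page(page_number):
    # Hypothetical stand-in for an I/O-bound FindingApi page request.
    return {"pageNumber": page_number,
            "itemIds": [f"{page_number}-{i}" for i in range(3)]}

def fetch_all_pages(page_numbers, workers=4):
    # Threads pay off here because each call waits on the network;
    # executor.map applies the lambda to each page and preserves order.
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(lambda p: fetch_page(p), page_numbers))
```

The lambda wrapper is what lets extra arguments be bound per call when the real API method takes more than the page number.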
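Commit 1087027812 corrects for pandas' `SettingWithCopyWarning`. The usual fix is to assign through `.loc` on the original frame, or to take an explicit `.copy()` of a slice before mutating it. A small illustration; the column names are made up for the example, not from the repo:

```python
import pandas as pd

df = pd.DataFrame({"price": [1.0, 25.0, 3.0], "cat": ["a", "b", "a"]})

# Anti-pattern: df[df["price"] < 10]["flag"] = True  (writes into a view)

# Fix 1: assign through .loc on the original frame.
df.loc[df["price"] < 10, "flag"] = True

# Fix 2: take an explicit copy before mutating a slice.
cheap = df[df["price"] < 10].copy()
cheap["flag"] = True
```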
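Commits fc0090ea3e and 627992036a toggle between credentials hardcoded in the script and a config-file import for `security_appname`. One common approach, sketched with the stdlib `configparser`; the section name, key name, and file path are assumptions, not taken from the repo:

```python
import configparser

def load_app_id(path):
    # Reading the app ID from an ini file keeps credentials out of the
    # script, so they never land in version control.
    cfg = configparser.ConfigParser()
    cfg.read(path)
    return cfg["ebay"]["SecurityAppName"]
```

The config file itself would then be listed in .gitignore so only the loader, not the secret, is committed.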