Build Dataset

flood_forecast.preprocessing.buil_dataset.build_weather_csv(json_full_path, asos_base_url, base_url_2, econet_data, visited_gages_path, start=0, end_index=100)
flood_forecast.preprocessing.buil_dataset.get_eco_netset(directory_path: str) → set

Econet data was supplied to us by the NC State Climate Office as a directory of CSV files named in the format LastName_First_station_id_Hourly.txt. This function simply constructs a set of station IDs based on the files present in that folder.
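A minimal sketch of how such a set could be built from the filenames. The parsing below is an assumption: it treats everything between the first-name token and the trailing `Hourly` marker as the station ID, which may differ from the actual implementation.

```python
import os

def get_eco_netset(directory_path: str) -> set:
    """Collect station IDs from files named LastName_First_station_id_Hourly.txt."""
    stations = set()
    for fname in os.listdir(directory_path):
        base, ext = os.path.splitext(fname)
        parts = base.split("_")
        # Expect at least: LastName, First, <station id tokens...>, Hourly
        if ext == ".txt" and len(parts) >= 4 and parts[-1] == "Hourly":
            # Assumption: the station ID is every token between the
            # first name and the trailing "Hourly" marker.
            stations.add("_".join(parts[2:-1]))
    return stations
```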
flood_forecast.preprocessing.buil_dataset.combine_data(flow_df: pandas.core.frame.DataFrame, precip_df: pandas.core.frame.DataFrame)
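The signature suggests joining a river-flow frame with a precipitation frame; a hypothetical sketch is below. The join column name `datetime` and the inner-join strategy are assumptions, not taken from the source.

```python
import pandas as pd

def combine_data(flow_df: pd.DataFrame, precip_df: pd.DataFrame) -> pd.DataFrame:
    # Assumption: both frames carry a shared "datetime" column to align on,
    # and only timestamps present in both sources are kept.
    combined = flow_df.merge(precip_df, on="datetime", how="inner")
    return combined.sort_values("datetime").reset_index(drop=True)
```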
flood_forecast.preprocessing.buil_dataset.create_usgs(meta_data_dir: str, precip_path: str, start: int, end: int)
flood_forecast.preprocessing.buil_dataset.get_data(file_path: str, gcp_service_key: Optional[str] = None) → str

Extracts the bucket name and storage object name from file_path.

Args:
    file_path (str): path to the data file; may be a gs:// URI.

Example:
    file_path = "gs://task_ts_data/2020-08-17/Afghanistan____.csv"
    bucket_name = "task_ts_data"
    object_name = "2020-08-17/Afghanistan____.csv"
    local_temp_filepath = "//data/2020-08-17/Afghanistan____.csv"

Returns:
    str: local file name
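The bucket/object split shown in the example can be sketched as follows. The helper name `split_gcs_path` is illustrative only, and the actual function's download step (which would use gcp_service_key) is omitted here.

```python
def split_gcs_path(file_path: str) -> tuple:
    """Split a gs:// URI into (bucket_name, object_name)."""
    if not file_path.startswith("gs://"):
        raise ValueError("expected a gs:// URI")
    # The bucket is everything up to the first "/" after the scheme;
    # the remainder is the object name within that bucket.
    bucket_name, _, object_name = file_path[len("gs://"):].partition("/")
    return bucket_name, object_name
```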