Looping through directory containing csv files to perform operation - python

Daily files are placed in a folder, with names in the format Items_20190102 and Items_20190102_orig.
I would like to loop over the files, pick up only the one without the '_orig' suffix, and aggregate it at item level using groupby.
I would like to tally the daily aggregates into a monthly running total so as to produce a monthly dashboard throughout the year.
So far I have been able to group the data at item level and compute the day's totals for three summary figures:
Uncalibrated spend
Calibrated spend
Total spend
import pandas as pd

l = ['Umbrella', 'Umbrella']
df = pd.DataFrame({'Date': ['10/2/2011', '11/2/2011', '12/2/2011', '13/2/2011'],
                   'Product': ['Umbrella', 'Umbrella', 'Badminton', 'Shuttle'],
                   'Last Price': [1200, 1500, 1600, 352],
                   'Updated Price': [1250, 1450, 1550, 400],
                   'Discount': [10, 10, 10, 10]})

Uncalibrated = df[df['Product'].isin(l)].groupby('Product')['Last Price'].sum()
Uncalibrated_abs = abs(Uncalibrated)
Uncalibrated_spend = Uncalibrated_abs.sum()

Calibrated = df[~df['Product'].isin(l)].groupby('Product')['Last Price'].tail(1)
Calibrated_abs = abs(Calibrated)
Calibrated_spend = Calibrated_abs.sum()

Total = df.groupby(['Product'])['Last Price'].sum()
Total_Spend = abs(Total).sum()
The result of this code will be three summarized statistics:
Uncalibrated spend
Calibrated spend
Total spend
Now, I would like to write a loop so that every day this code runs, it appends the three statistics generated from that day's file as a new column with the date as its header, and also keeps a running total for the month in another column.
Output
                     26/06/2019   27/06/2019   Cum Monthly Total
Uncalibrated spend   xxxxxx       xxxx         xxxx
Calibrated spend     xxxxxx       xxxx         xxxx
Total spend          xxxxxx       xxxx         xxxx
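Here is a minimal sketch of the daily driver loop, not a definitive implementation: the folder path, the assumption that the files are plain CSV, and the output file name monthly_dashboard.csv are all placeholders for your actual setup.

import glob
import os
import pandas as pd

l = ['Umbrella', 'Umbrella']

def summarise(daily_df):
    # Reuse the three summary figures from the question for one daily file
    uncalibrated = abs(daily_df[daily_df['Product'].isin(l)].groupby('Product')['Last Price'].sum()).sum()
    calibrated = abs(daily_df[~daily_df['Product'].isin(l)].groupby('Product')['Last Price'].tail(1)).sum()
    total = abs(daily_df.groupby('Product')['Last Price'].sum()).sum()
    return pd.Series({'Uncalibrated spend': uncalibrated,
                      'Calibrated spend': calibrated,
                      'Total spend': total})

folder = '/path/to/daily/files'   # assumption: adjust to your folder

columns = {}
for path in sorted(glob.glob(os.path.join(folder, 'Items_*'))):
    name = os.path.basename(path)
    if '_orig' in name:                                   # skip the *_orig copies
        continue
    date = name.split('_')[1].split('.')[0]               # e.g. '20190626'
    label = pd.to_datetime(date).strftime('%d/%m/%Y')     # column header, e.g. 26/06/2019
    columns[label] = summarise(pd.read_csv(path))         # assumes the files are plain CSV

dashboard = pd.DataFrame(columns)
dashboard['Cum Monthly Total'] = dashboard.sum(axis=1)    # running total across the daily columns
dashboard.to_csv('monthly_dashboard.csv')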

Related

Pandas - Fill in Missing Column Values Regression

I have a data frame 'df' that has missing column values. I want to fill in the missing/NaN values in the Avg Monthly Long Distance Charges column through prediction (regression) using the other column values. Then, replace the NaN values with the new values found.
Data frame: 'df'
Customer ID,Gender,Age,Married,Number of Dependents,City,Zip Code,Latitude,Longitude,Number of Referrals,Tenure in Months,Offer,Phone Service,Avg Monthly Long Distance Charges,Multiple Lines,Internet Service,Internet Type,Avg Monthly GB Download,Online Security,Online Backup,Device Protection Plan,Premium Tech Support,Streaming TV,Streaming Movies,Streaming Music,Unlimited Data,Contract,Paperless Billing,Payment Method,Monthly Charge,Total Charges,Total Refunds,Total Extra Data Charges,Total Long Distance Charges,Total Revenue,Customer Status,Churn Category,Churn Reason
0002-ORFBO,Female,37,Yes,0,Frazier Park,93225,34.827662,-118.999073,2,9,None,Yes,42.39,No,Yes,Cable,16,No,Yes,No,Yes,Yes,No,No,Yes,One Year,Yes,Credit Card,65.6,593.3,0,0,381.51,974.81,Stayed,,
0003-MKNFE,Male,46,No,0,Glendale,91206,34.162515,-118.203869,0,9,None,Yes,10.69,Yes,Yes,Cable,10,No,No,No,No,No,Yes,Yes,No,Month-to-Month,No,Credit Card,-4,542.4,38.33,10,96.21,610.28,Stayed,,
0004-TLHLJ,Male,50,No,0,Costa Mesa,92627,33.645672,-117.922613,0,4,Offer E,Yes,33.65,No,Yes,Fiber Optic,30,No,No,Yes,No,No,No,No,Yes,Month-to-Month,Yes,Bank Withdrawal,73.9,280.85,0,0,134.6,415.45,Churned,Competitor,Competitor had better devices
0011-IGKFF,Male,78,Yes,0,Martinez,94553,38.014457,-122.115432,1,13,Offer D,Yes,27.82,No,Yes,Fiber Optic,4,No,Yes,Yes,No,Yes,Yes,No,Yes,Month-to-Month,Yes,Bank Withdrawal,98,1237.85,0,0,361.66,1599.51,Churned,Dissatisfaction,Product dissatisfaction
0013-EXCHZ,Female,75,Yes,0,Camarillo,93010,34.227846,-119.079903,3,3,None,Yes,7.38,No,Yes,Fiber Optic,11,No,No,No,Yes,Yes,No,No,Yes,Month-to-Month,Yes,Credit Card,83.9,267.4,0,0,22.14,289.54,Churned,Dissatisfaction,Network reliability
0013-MHZWF,Female,23,No,3,Midpines,95345,37.581496,-119.972762,0,9,Offer E,Yes,16.77,No,Yes,Cable,73,No,No,No,Yes,Yes,Yes,Yes,Yes,Month-to-Month,Yes,Credit Card,69.4,571.45,0,0,150.93,722.38,Stayed,,
0013-SMEOE,Female,67,Yes,0,Lompoc,93437,34.757477,-120.550507,1,71,Offer A,Yes,9.96,No,Yes,Fiber Optic,14,Yes,Yes,Yes,Yes,Yes,Yes,Yes,Yes,Two Year,Yes,Bank Withdrawal,109.7,7904.25,0,0,707.16,8611.41,Stayed,,
0014-BMAQU,Male,52,Yes,0,Napa,94558,38.489789,-122.27011,8,63,Offer B,Yes,12.96,Yes,Yes,Fiber Optic,7,Yes,No,No,Yes,No,No,No,No,Two Year,Yes,Credit Card,84.65,5377.8,0,20,816.48,6214.28,Stayed,,
0015-UOCOJ,Female,68,No,0,Simi Valley,93063,34.296813,-118.685703,0,7,Offer E,Yes,10.53,No,Yes,DSL,21,Yes,No,No,No,No,No,No,Yes,Two Year,Yes,Bank Withdrawal,48.2,340.35,0,0,73.71,414.06,Stayed,,
0016-QLJIS,Female,43,Yes,1,Sheridan,95681,38.984756,-121.345074,3,65,None,Yes,28.46,Yes,Yes,Cable,14,Yes,Yes,Yes,Yes,Yes,Yes,Yes,Yes,Two Year,Yes,Credit Card,90.45,5957.9,0,0,1849.9,7807.8,Stayed,,
0017-DINOC,Male,47,No,0,Rancho Santa Fe,92091,32.99356,-117.207121,0,54,None,No,,,Yes,Cable,10,Yes,No,No,Yes,Yes,No,No,Yes,Two Year,No,Credit Card,45.2,2460.55,0,0,0,2460.55,Stayed,,
0017-IUDMW,Female,25,Yes,2,Sunnyvale,94086,37.378541,-122.020456,2,72,None,Yes,16.01,Yes,Yes,Fiber Optic,59,Yes,Yes,Yes,Yes,Yes,Yes,Yes,Yes,Two Year,Yes,Credit Card,116.8,8456.75,0,0,1152.72,9609.47,Stayed,,
0018-NYROU,Female,58,Yes,0,Antelope,95843,38.715498,-121.363411,0,5,None,Yes,18.65,No,Yes,Fiber Optic,10,No,No,No,No,No,No,No,Yes,Month-to-Month,Yes,Bank Withdrawal,68.95,351.5,0,0,93.25,444.75,Stayed,,
0019-EFAEP,Female,32,No,0,La Mesa,91942,32.782501,-117.01611,0,72,Offer A,Yes,2.25,Yes,Yes,Fiber Optic,16,Yes,Yes,Yes,No,Yes,No,No,Yes,Two Year,Yes,Bank Withdrawal,101.3,7261.25,0,0,162,7423.25,Stayed,,
0019-GFNTW,Female,39,No,0,Los Olivos,93441,34.70434,-120.02609,0,56,None,No,,,Yes,DSL,19,Yes,Yes,Yes,Yes,No,No,No,Yes,Two Year,No,Bank Withdrawal,45.05,2560.1,0,0,0,2560.1,Stayed,,
0020-INWCK,Female,58,Yes,2,Woodlake,93286,36.464635,-119.094348,9,71,Offer A,Yes,27.26,Yes,Yes,Fiber Optic,12,No,Yes,Yes,No,No,Yes,Yes,Yes,Two Year,Yes,Credit Card,95.75,6849.4,0,0,1935.46,8784.86,Stayed,,
0020-JDNXP,Female,52,Yes,1,Point Reyes Station,94956,38.060264,-122.830646,0,34,None,No,,,Yes,DSL,20,Yes,No,Yes,Yes,Yes,Yes,Yes,Yes,One Year,No,Credit Card,61.25,1993.2,0,0,0,1993.2,Stayed,,
0021-IKXGC,Female,72,No,0,San Marcos,92078,33.119028,-117.166036,0,1,Offer E,Yes,7.77,Yes,Yes,Fiber Optic,22,No,No,No,No,No,No,No,Yes,One Year,Yes,Bank Withdrawal,72.1,72.1,0,0,7.77,79.87,Joined,,
0022-TCJCI,Male,79,No,0,Daly City,94015,37.680844,-122.48131,0,45,None,Yes,10.67,No,Yes,DSL,17,Yes,No,Yes,No,No,Yes,No,Yes,One Year,No,Credit Card,62.7,2791.5,0,0,480.15,3271.65,Churned,Dissatisfaction,Limited range of services
My code:
import pandas as pd
import statsmodels.api as sm

# Let X = predictor variables and y = target variable
X = pd.DataFrame(df[['Monthly Charge', 'Total Charges', 'Total Long Distance Charges']])
y = pd.DataFrame(df[['Avg Monthly Long Distance Charges']])
# Add a constant variable to the predictor variables
X = sm.add_constant(X)
model01 = sm.OLS(y, X).fit()
df['Avg Monthly Long Distance Charges'].fillna(sm.OLS(y, X).fit())
My code output:
0 42.39
1 10.69
2 33.65
3 27.82
4 7.38
...
7038 46.68
7039 16.2
7040 18.62
7041 2.12
7042 <statsmodels.regression.linear_model.Regressio...
Name: Avg Monthly Long Distance Charges, Length: 7043, dtype: object
My code prints this, but the predicted values are not written back into the original data frame. How do I do this? Thanks.
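A minimal sketch of one way to write the predictions back into df, assuming the same column names as above, that the predictor columns themselves have no missing values, and that statsmodels is imported as sm:

import pandas as pd
import statsmodels.api as sm

X = sm.add_constant(df[['Monthly Charge', 'Total Charges', 'Total Long Distance Charges']])
y = df['Avg Monthly Long Distance Charges']

# Fit only on the rows where the target is known
mask = y.notna()
model = sm.OLS(y[mask], X[mask]).fit()

# Predict for the rows with missing values and write them back into df
df.loc[~mask, 'Avg Monthly Long Distance Charges'] = model.predict(X[~mask])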

Pandas - groupby and show aggregate on all "levels"

I am a Pandas newbie and I am trying to automate the processing of ticket data we get from our IT ticketing system. After experimenting I was able to get 80 percent of the way to the result I am looking for.
Currently I pull in the ticket data from a CSV into a "df" dataframe. I then want to summarize the data for the higher ups to review and get high level info like totals and average "age" of tickets (number of days between ticket creation date and current date).
Here's an example of the ticket data for the "df" dataframe (the raw CSV is included further below).
I then create "df2" dataframe to summarize df using:
df2 = df.groupby(["dept", "group", "assignee", "ticket_type"]).agg(task_count=('ticket_type', 'size'), mean_age_in_days=('age', 'mean'))
And here's what I am getting if I print out df2, which is very close to what I need.
As you can see we look at the count of tickets assigned to each staff member, separated by type (incident, request), and also look at the average "age" of each ticket type (incident, request) for each staff member.
The roadblock that I am hitting now, and have been pulling my hair out about, is that I need to show the aggregates (counts and average ages) at all 3 levels (sorry if I am using the wrong jargon). Basically I need to show the count and average age for all tickets associated with a group, then the same thing for tickets at the department ("Division") level, and lastly the grand total and grand average in green for all tickets in the entire organization (all tickets in all departments and groups).
Here's an example of the ideal result I am trying to get:
You will see in red I want the count of tickets and average age for tickets for a given group. Then, in blue I want the count and average age for all tickets on the dept/division level (all tickets for all groups belonging to a given dept./division). Lastly, I want the grand total and grand average for all tickets in the entire organization. In the end both the df2 (summary of ticket data) and df will be dumped to an Excel file on separate worksheets in the same workbook.
Please have mercy on me! Can someone show me how I could generate the desired "summary" with counts and average age at all levels (group, dept., and organization)? Thanks in advance for any assistance, I'd really, really appreciate it!
*Added a link to a CSV with the sample ticket data on GitHub. Also, here's the raw CSV text for the sample ticket data:
,number,created_on,dept,group,assignee,ticket_type,age
0,14500,2021-02-19 11:48:28,IT_Services_Division,Helpdesk,Jane Doe,Incident,361
1,16890,2021-04-20 10:51:49,IT_Services_Division,Helpdesk,Jane Doe,Incident,120
2,16891,2021-04-20 11:51:00,IT_Services_Division,Helpdesk,Tilly James,Request,120
3,15700,2021-06-09 09:05:28,IT_Services_Division,Systems,Steve Lee,Incident,252
4,16000,2021-08-12 09:32:39,IT_Services_Division,Systems,Linda Nguyen,Request,188
5,16100,2021-08-18 17:43:54,IT_Services_Division,TechSupport,Joseph Wills,Incident,181
6,19000,2021-01-17 15:01:50,IT_Services_Division,TechSupport,Bill Gonzales,Request,30
7,18990,2021-01-10 13:00:01,IT_Services_Division,TechSupport,Bill Gonzales,Request,37
8,18800,2021-12-03 21:13:12,Data_Division,DataGroup,Bob Simpson,Incident,74
9,16880,2021-10-18 11:56:03,Data_Division,DataGroup,Bob Simpson,Request,119
10,18000,2021-11-09 14:28:44,IT_Services_Division,Systems,Veronica Paulson,Incident,98
Here's a different approach which is easier, but results in a different structure:
agg_df = df.copy()
#Add dept-level info to the department
gb = agg_df.groupby('dept')
task_counts = gb['ticket_type'].transform('count').astype(str)
mean_ages = gb['age'].transform('mean').round(2).astype(str)
agg_df['dept'] += ' ['+task_counts+' tasks, avg age= '+mean_ages+']'
#Add group-level info to the group label
gb = agg_df.groupby(['dept','group'])
task_counts = gb['ticket_type'].transform('count').astype(str)
mean_ages = gb['age'].transform('mean').round(2).astype(str)
agg_df['group'] += ' ['+task_counts+' tasks, avg age= '+mean_ages+']'
#Add org-level info
agg_df['org'] = 'Org [{} tasks, avg age = {}]'.format(len(agg_df),agg_df['age'].mean().round(2))
agg_df = (
    agg_df.groupby(['org','dept','group','assignee','ticket_type']).agg(
        task_count=('ticket_type','count'),
        mean_ticket_age=('age','mean'))
)
agg_df
I couldn't think of a cleaner way to get the structure you want, so I had to manually loop through the different groupby levels, adding one row at a time:
multi_ind = pd.MultiIndex.from_tuples([], names=('dept','group','assignee','ticket_type'))
agg_df = pd.DataFrame(index=multi_ind, columns=['task_count','mean_age_in_days'])

data = lambda df: {'task_count': len(df), 'mean_age_in_days': df['age'].mean()}

for dept, dept_g in df.groupby('dept'):
    for group, group_g in dept_g.groupby('group'):
        for assignee, assignee_g in group_g.groupby('assignee'):
            for ticket_type, ticket_g in assignee_g.groupby('ticket_type'):
                #Add ticket totals
                agg_df.loc[(dept, group, assignee, ticket_type)] = data(ticket_g)

        #Add group totals
        agg_df.loc[(dept, group, assignee, 'Group Total/Avg')] = data(group_g)

    #Add dept totals
    agg_df.loc[(dept, group, assignee, 'Dept Total/Avg')] = data(dept_g)

#Add org totals
agg_df.loc[('', '', '', 'Org Total/Avg')] = data(df)

agg_df
Output
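Since the question also mentions dumping both the summary and the raw ticket data to separate worksheets in the same workbook, here is a minimal sketch of that last step; the file name summary.xlsx and the sheet names are placeholders, and it assumes an Excel engine such as openpyxl or xlsxwriter is installed:

import pandas as pd

with pd.ExcelWriter('summary.xlsx') as writer:
    agg_df.to_excel(writer, sheet_name='Summary')   # the aggregated view (df2 or agg_df)
    df.to_excel(writer, sheet_name='Raw Tickets')   # the original ticket data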

Post Processing Aggregates

I have a data analysis question where I need to group data by two keys: month and state.
The task is described as follows:
In any given month, the minimum monthly rainfall for each state is the lowest rainfall recording from any weather station in that state during that month.
For example, suppose NSW had recordings of 1, 10, 5, and 7 for January; then its minimum monthly rainfall for January would be 1, the lowest of those recordings.
If there are several correct answers, your program can output any of the correct answers.
On the given data, the program's output should look like this:
Month: 11
State: QLD
What I have tried is:
minimum_monthly_rainfall = {}
is_first_line = True
for row in open("climate_data_2017.csv"):
    if is_first_line:
        is_first_line = False
    else:
        values = row.split(",")
        month = values[0].split("-")[1]
        rainfall = float(values[6])
        state = values[1]
        if month in minimum_monthly_rainfall:
            minimum_monthly_rainfall[month].append(rainfall)
        else:
            minimum_monthly_rainfall[month] = [rainfall]
print(minimum_monthly_rainfall)
I am not sure what to do now. I have skipped the first line of the data file (shown below), and I have collected all the rainfall readings into their corresponding months. However, I am not sure how to also separate these by state.
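A minimal sketch of one way to track the minimum per (month, state) pair, building on the loop above; the use of the csv module and the final min() query are illustrative assumptions, since the full task statement is only partially quoted:

import csv

minimum_monthly_rainfall = {}   # keys: (month, state) -> lowest rainfall seen so far

with open("climate_data_2017.csv") as f:
    reader = csv.reader(f)
    next(reader)                # skip the header line
    for values in reader:
        month = values[0].split("-")[1]
        state = values[1]
        rainfall = float(values[6])
        key = (month, state)
        if key not in minimum_monthly_rainfall or rainfall < minimum_monthly_rainfall[key]:
            minimum_monthly_rainfall[key] = rainfall

# Example query (adjust to the actual task): the (month, state) with the overall lowest minimum
month, state = min(minimum_monthly_rainfall, key=minimum_monthly_rainfall.get)
print("Month:", month)
print("State:", state)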
The data file is:
Date,State,City,Station Code,Minimum temperature (C),Maximum temperature (C),Rainfall (mm),Evaporation (mm),Sunshine (hours),Direction of maximum wind gust,Speed of maximum wind gust (km/h),9am Temperature (C),9am relative humidity (%),3pm Temperature (C),3pm relative humidity (%)
2017-06-03,VIC,Melbourne,86338,9.9,14.3,0,,,SSW,19,10.8,76,13.1,73
2017-06-04,NSW,Sydney,66062,10.5,19.7,1.4,0.6,6.8,W,28,12.2,77,19,57
2017-06-04,QLD,Brisbane,40913,12.8,22.6,0.6,1.8,9.1,ENE,31,16.3,77,22.5,52
2017-06-04,VIC,Melbourne,86338,4.4,15.8,0,,,NE,13,5.7,100,15.5,53
2017-06-05,NSW,Sydney,66062,9.4,19.6,0.5,2.8,9.4,W,28,11.5,78,19.2,41
2017-06-05,QLD,Brisbane,40913,10.1,25.3,0.6,1.8,9.2,ENE,31,15,71,24.9,30
2017-06-05,VIC,Melbourne,86338,5.6,17.5,0,,,NW,28,8.7,82,16.8,57
2017-06-06,NSW,Sydney,66062,9.1,17.9,0.5,2.2,7.2,WSW,59,11.2,58,14.7,43
2017-06-06,QLD,Brisbane,40913,9.2,25.2,0.6,2.4,7.7,N,41,17.9,38,24.5,13
2017-06-06,VIC,Melbourne,86338,8.7,13.9,6.4,,,SSW,43,11.8,78,12.6,68
2017-06-07,NSW,Sydney,66062,8.8,15.5,54.6,2.6,0,SSW,56,13.3,82,14.2,85
2017-06-07,QLD,Brisbane,40913,5,22.8,0.6,4.4,9.1,SE,31,12.4,68,22.5,39
2017-06-07,VIC,Melbourne,86338,5.9,14.3,0,,,SE,28,8.3,91,14,54
2017-06-08,NSW,Sydney,66062,12.4,18.1,61,1.8,1.3,SE,44,13.3,91,16.9,84
2017-06-08,QLD,Brisbane,40913,9.9,22.5,0.6,2.2,9.2,ENE,48,13,75,22.1,25
2017-06-08,VIC,Melbourne,86338,3.6,15,0,,,SSW,30,6.6,86,14,56
2017-06-09,NSW,Sydney,66062,10.4,18.9,3.4,1.6,4.8,ESE,52,12.5,90,18.2,64
2017-06-09,QLD,Brisbane,40913,7.7,24.1,0.6,4.8,9.3,NE,37,15.8,27,23.8,14
2017-06-09,VIC,Melbourne,86338,6.6,15,1.8,,,S,22,11,98,13.9,66
2017-06-10,NSW,Sydney,66062,12.5,18.1,38.8,3.6,0.2,SSE,50,15.8,85,17.5,70
2017-06-10,QLD,Brisbane,40913,7.7,20.8,0.6,3.8,1.9,NNE,33,14.7,33,20.4,25
2017-06-10,VIC,Melbourne,86338,7,15.8,0,,,SSW,17,7.5,100,14,68
2017-06-11,NSW,Sydney,66062,12.5,17.2,2.8,1.6,0.2,WNW,20,13.1,90,16,85
2017-06-11,QLD,Brisbane,40913,11.4,23.5,0.6,3.2,4.5,ENE,41,13.6,33,23.4,14
2017-06-11,VIC,Melbourne,86338,3.4,15.5,0.2,,,SSW,22,6.1,96,13.6,67
2017-06-12,NSW,Sydney,66062,11.3,20.4,1.2,0.8,7.5,SSE,26,11.6,92,19.1,59
2017-06-12,QLD,Brisbane,40913,10.1,24,0.6,3.8,7.6,NNE,39,16.7,26,23.7,22
2017-06-12,VIC,Melbourne,86338,6,15.9,0,,,SSW,19,10,85,15,77
2017-06-13,NSW,Sydney,66062,9.5,18.7,0.5,2.2,9.1,SSE,37,11.2,82,18.2,62
2017-06-13,QLD,Brisbane,40913,10.8,24.8,0.6,3.4,5.6,NE,28,15,35,22.4,24
2017-06-13,VIC,Melbourne,86338,8.8,15.1,0,,,WSW,20,10.7,88,13.4,67
2017-06-14,NSW,Sydney,66062,11.1,17.8,2.6,1.8,2.9,WNW,24,13.3,82,17.4,78
2017-06-14,QLD,Brisbane,40913,12.1,20.4,14.6,3.4,4.3,E,37,14.2,83,19.4,63
2017-06-14,VIC,Melbourne,86338,4,17.4,0,,,N,28,7.8,97,16.7,56
2017-06-15,NSW,Sydney,66062,10.7,20.1,0.6,1.2,6.4,W,22,11.9,89,18.7,69
2017-06-15,QLD,Brisbane,40913,9.9,19.3,0.6,3.2,6.5,SSW,24,14.7,87,18.1,65
2017-06-15,VIC,Melbourne,86338,7.7,19,0,,,N,20,9.8,87,17.4,53
2017-06-16,NSW,Sydney,66062,11.9,17.3,0.5,1.6,0.2,WNW,26,13.3,79,17.2,69
2017-06-16,QLD,Brisbane,40913,6.6,20.8,0.6,1,8.3,S,22,12.1,85,20.2,43
2017-06-16,VIC,Melbourne,86338,6.6,17.1,0,,,NNE,15,8.4,86,15.9,65
2017-06-17,NSW,Sydney,66062,13.2,19.1,0.5,1,0.2,SSW,26,14.6,81,17.4,67
2017-06-17,QLD,Brisbane,40913,5,21.7,0.6,2.4,9,E,39,13.1,78,21.5,47
2017-06-17,VIC,Melbourne,86338,4.2,15.5,0,,,NE,15,5.4,95,14.9,66
2017-06-18,NSW,Sydney,66062,11.3,18,1.8,2,6.3,S,52,12.9,83,17.6,62
2017-07-02,NSW,Sydney,66062,5.4,17.6,0.5,2,9.5,W,28,7.6,78,16.1,44
2017-07-02,QLD,Brisbane,40913,3.9,15,20,1.6,4.2,S,35,7.3,90,14.2,46
2017-07-02,VIC,Melbourne,86338,0.8,14.3,0,,,N,41,3.8,75,13.9,36
2017-07-03,NSW,Sydney,66062,5.6,15.8,0.5,2.6,2.7,N,31,7.9,81,15.4,57
2017-07-03,QLD,Brisbane,40913,7,16.6,0.6,2,4.6,NNW,31,8.9,82,15.2,58
2017-07-03,VIC,Melbourne,86338,3.8,13.9,0,,,NNE,39,9,60,11.2,92
2017-07-04,NSW,Sydney,66062,7.9,21.8,0.2,0.6,9.3,,,13.2,76,21.5,23
2017-07-04,QLD,Brisbane,40913,5.5,14.4,2,1.4,1.2,N,19,8.3,94,13.9,73
2017-07-04,VIC,Melbourne,86338,9,16.2,3,,,N,33,11,78,15.6,54
2017-07-05,NSW,Sydney,66062,12.2,20.8,0.5,5.4,9.2,W,52,14.4,51,18.7,36
2017-07-05,QLD,Brisbane,40913,8,13.9,3.4,1.8,2.8,SW,59,11.9,92,10.7,83
2017-07-05,VIC,Melbourne,86338,10.1,15.8,0,,,SSW,24,12.4,75,13.4,65
2017-07-06,NSW,Sydney,66062,9.1,18.2,0.5,4.8,9.7,WSW,37,11.5,62,17.4,36
2017-07-06,QLD,Brisbane,40913,4.7,16,10.4,4.2,5.5,WNW,30,7.4,88,14.4,60
2017-07-06,VIC,Melbourne,86338,4.8,15,0,,,N,28,8.1,75,14.1,46
2017-07-07,NSW,Sydney,66062,6.6,19.1,0.5,3.6,9.4,WNW,35,8.8,73,18.6,24
2017-07-07,QLD,Brisbane,40913,7.4,17.6,9.8,1,6.9,SW,33,11.6,97,16.2,59
2017-07-07,VIC,Melbourne,86338,8,14.2,0,,,NNW,37,9,79,13.4,56
2017-07-08,NSW,Sydney,66062,7,18.1,0.5,3.2,9.3,W,37,9,66,17,36
2017-07-08,QLD,Brisbane,40913,3.2,17.2,0.6,1.4,9,S,28,10.2,83,16.5,41
2017-07-08,VIC,Melbourne,86338,5.3,13.5,0,,,NW,30,9,84,13.2,58
2017-07-09,NSW,Sydney,66062,7.4,18,0.5,2,9.5,W,33,9.2,69,17.6,33
2017-07-09,QLD,Brisbane,40913,7.1,17.8,0.6,2.4,7.4,NE,22,10.4,68,17.4,42
2017-07-09,VIC,Melbourne,86338,7.3,14.6,0.8,,,NNW,39,9.7,70,13.3,49
2017-07-10,NSW,Sydney,66062,7,17.3,0.5,3.6,9.4,W,39,9.4,67,16.7,36
2017-07-10,QLD,Brisbane,40913,5.9,17.9,0.6,2.4,4,NNE,39,12.5,52,17.6,40
2017-07-10,VIC,Melbourne,86338,7,16,0,,,W,28,9.2,74,15.2,48
2017-07-11,NSW,Sydney,66062,6.6,18.4,0.5,2.8,9.2,SSW,39,9,63,17.7,46
2017-07-11,QLD,Brisbane,40913,12.5,19.7,0.6,2.6,4.2,WNW,52,15.6,52,18.5,78
2017-07-11,VIC,Melbourne,86338,6.9,14.3,0.4,,,NW,19,9.8,79,13.8,57
2017-07-12,NSW,Sydney,66062,7.8,14.8,1.6,3.4,0.5,W,31,8.9,76,13,79
2017-07-12,QLD,Brisbane,40913,14.8,20.4,3.4,2.4,8,NW,61,18,69,19.2,59
2017-07-12,VIC,Melbourne,86338,3.2,15.1,0,,,N,30,4.8,89,14.7,49
2017-07-13,NSW,Sydney,66062,8,17.4,10.4,1,9.3,ENE,30,9.6,80,17.2,49
2017-07-13,QLD,Brisbane,40913,13.2,18.6,7.2,3.6,3,SW,33,15.1,96,18.1,60
2017-07-13,VIC,Melbourne,86338,4.8,13.3,0,,,N,56,10.9,56,13,54
2017-07-14,NSW,Sydney,66062,9.6,20.8,0.5,4,6.1,NNW,30,12.2,67,19.3,41
2017-07-14,QLD,Brisbane,40913,12.3,19.2,0.4,2.8,2.3,NE,24,14.6,76,18.1,62
2017-07-14,VIC,Melbourne,86338,9.9,15.9,2,,,N,50,10.5,76,15.2,41
2017-07-15,NSW,Sydney,66062,12.1,19.8,0.4,4,6.1,SW,30,13,71,17.9,39
2017-07-15,QLD,Brisbane,40913,11.4,19.9,0.6,2.6,1.3,WNW,50,14,96,16.9,82
2017-07-15,VIC,Melbourne,86338,4.5,12,0,,,NW,22,7.8,71,11.6,58
2017-07-16,NSW,Sydney,66062,7.7,18.4,0.5,1.8,9.4,WSW,31,9.4,61,16.8,32
2017-07-16,QLD,Brisbane,40913,13.7,19.2,4.6,2,3.6,SW,37,15.8,66,18.1,45
2017-07-16,VIC,Melbourne,86338,3,12.5,0,,,N,52,5.7,75,11.1,54
2017-07-17,NSW,Sydney,66062,7.9,19.3,0.5,4.2,6.5,W,24,9.8,75,17.3,44
2017-07-17,QLD,Brisbane,40913,8.7,19.2,0.6,3.4,9.1,NE,33,12.4,64,18.5,41
2017-07-17,VIC,Melbourne,86338,5.7,15.3,0,,,N,52,11.8,55,14.6,50
2017-07-18,NSW,Sydney,66062,9.1,22.9,0.5,1.2,9.8,NW,54,14.1,62,22.6,30
2017-07-18,QLD,Brisbane,40913,11.2,24.6,0.6,3.2,9.2,N,46,14.8,52,24.1,35
2017-07-18,VIC,Melbourne,86338,9.7,14.5,0.2,,,NNW,41,10.6,75,13.8,55
2017-07-19,NSW,Sydney,66062,10.6,17.5,0.5,6.6,9.1,WSW,57,12,53,16.9,33
2017-07-19,QLD,Brisbane,40913,13.2,18.7,8.4,3.8,3,N,61,14,96,14.1,89
2017-07-19,VIC,Melbourne,86338,8.9,13.2,5,,,SSW,37,10.8,99,11.6,81
2017-07-20,NSW,Sydney,66062,9.3,17.5,0.5,4.4,9.6,WSW,63,11.4,55,15.8,26
2017-07-20,QLD,Brisbane,40913,11.5,18.2,18.4,3.6,7.7,WSW,56,15.3,69,16.4,71
2017-07-20,VIC,Melbourne,86338,7.9,10.7,0.6,,,SSW,39,9.4,82,9.7,77
2017-07-21,NSW,Sydney,66062,7.5,18,0.5,4.2,9.2,W,46,9.6,53,17.4,37
2017-07-21,QLD,Brisbane,40913,10.6,20.1,5,0.6,7,NW,59,13.2,88,18.6,53
2017-07-21,VIC,Melbourne,86338,3.3,13.8,4.2,,,N,26,5.8,93,13.4,53
2017-07-22,NSW,Sydney,66062,6,19.1,0.5,3.6,9.8,W,31,7.9,64,18.7,27
2017-07-22,QLD,Brisbane,40913,10.7,19.3,6.1,,4.5,WSW,31,12.4,97,17.8,55
2017-07-22,VIC,Melbourne,86338,5.4,12.4,0,,,N,59,8.4,65,11.6,54
2017-07-23,NSW,Sydney,66062,6.5,22.1,0.5,3.6,9.9,NW,44,10.8,49,22.1,22
2017-07-23,QLD,Brisbane,40913,11.7,18.1,2.3,6.6,4.8,N,50,13.4,93,18,66
2017-07-23,VIC,Melbourne,86338,8.4,14.1,0,,,N,44,10.2,57,11.3,90
2017-07-24,NSW,Sydney,66062,10,21.3,0.5,7.2,9.6,WSW,31,12.7,51,20.9,24
2017-07-24,QLD,Brisbane,40913,10,18.6,16.8,4.2,3.9,W,39,12.4,95,17.1,50
2017-07-24,VIC,Melbourne,86338,8.1,16.1,7,,,WSW,28,9.9,77,15.2,53
2017-07-25,NSW,Sydney,66062,7.9,20.6,0.5,4.2,9.9,NW,30,11,63,19.7,32
2017-07-25,QLD,Brisbane,40913,10.4,19.6,3,0,3.4,NW,39,13,96,18.7,53
2017-07-25,VIC,Melbourne,86338,9.8,17.2,0,,,N,48,11.1,72,16.4,53
2017-07-26,NSW,Sydney,66062,11,21.3,0.5,2.8,9.9,W,70,18,39,20.2,27
2017-07-26,QLD,Brisbane,40913,11.7,19.2,0.4,2.2,3.1,SW,48,15.6,78,17.4,73
2017-07-26,VIC,Melbourne,86338,10.9,15,2.4,,,SW,37,12.5,70,12.7,71
2017-07-27,NSW,Sydney,66062,9.1,17.5,0.5,4.8,9.7,W,31,11.2,61,17.1,44
2017-07-27,QLD,Brisbane,40913,11.6,21.5,24,1.8,6.5,NNW,59,14.5,90,19.4,68
2017-07-27,VIC,Melbourne,86338,4.4,14.2,2,,,N,52,6.2,90,13.8,55
2017-07-28,NSW,Sydney,66062,11.2,19.1,0.5,4.2,8.3,WSW,67,13.1,51,18.9,25
2017-07-28,QLD,Brisbane,40913,14.5,17.7,10.8,3.4,,WNW,78,17,60,13.9,83
2017-07-28,VIC,Melbourne,86338,6.2,14.3,2.8,,,NNE,37,10.9,73,13.4,56
2017-07-29,NSW,Sydney,66062,8.2,21.2,0.5,4.6,9.9,W,31,10.7,52,20.9,23
2017-07-29,QLD,Brisbane,40913,9.7,15.1,11.2,6.8,5,WSW,72,12.1,77,12.3,86
2017-07-29,VIC,Melbourne,86338,9.7,17.9,0,,,N,81,12.7,51,16.9,41
2017-07-30,NSW,Sydney,66062,10.7,26.5,0.5,6,6.2,NNW,44,18.4,25,24.8,22
2017-07-30,QLD,Brisbane,40913,5.3,17.1,9.8,1.6,6.5,WSW,33,11.1,79,15.4,52
2017-07-30,VIC,Melbourne,86338,11.3,18.1,0,,,NW,31,13.7,64,17.5,34
2017-07-31,NSW,Sydney,66062,15.6,16.5,0.5,4.4,0,WSW,56,15.7,72,11.8,85
2017-07-31,QLD,Brisbane,40913,9.5,15.7,0.6,2.2,0.6,WSW,67,12.3,75,12.8,81
2017-07-31,VIC,Melbourne,86338,6.7,14.8,0,,,S,19,9,79,13.7,56
2017-08-01,NSW,Sydney,66062,8.6,18.1,6,2.4,9.8,SW,39,11.6,57,16.9,39
2017-08-01,QLD,Brisbane,40913,1.8,15.3,23.8,2.6,9,SSW,39,8.4,87,14.2,53
2017-08-01,VIC,Melbourne,86338,3.5,15.5,0,,,SSW,20,6.2,89,13,64
2017-08-02,NSW,Sydney,66062,7.2,18.2,0.5,3.2,8.8,S,41,9.6,62,16.8,51
2017-08-02,QLD,Brisbane,40913,1,14.9,0.6,0.6,9.7,E,26,8.7,81,14.3,47
2017-08-02,VIC,Melbourne,86338,3.4,12.9,0,,,NNE,13,5.6,100,12.8,61
2017-08-03,NSW,Sydney,66062,8.9,17.4,3,2.8,5.5,NNE,37,10.3,85,16.4,70
2017-08-03,QLD,Brisbane,40913,3.6,17.9,0.2,1.8,2.1,W,54,8.5,87,14.3,96
2017-08-03,VIC,Melbourne,86338,3.1,9,0,,,NNE,15,4.3,89,8.7,84
2017-08-04,NSW,Sydney,66062,10.3,17.3,13.8,1.8,10.2,W,50,12.8,60,16.6,41
2017-08-04,QLD,Brisbane,40913,7.9,15.5,1.6,2,1,WSW,30,10.3,77,14.3,64
2017-08-04,VIC,Melbourne,86338,4.3,12.1,6.6,,,N,50,8.9,87,11.5,66
2017-08-05,NSW,Sydney,66062,9.6,19.9,0.5,5.6,10.1,WNW,48,14,46,19.3,22
2017-08-05,QLD,Brisbane,40913,9.3,18.3,0.4,1.6,6.1,WSW,31,12.4,94,16.8,59
2017-08-05,VIC,Melbourne,86338,8.8,14.2,0,,,N,52,10.6,68,13.1,55
2017-08-06,NSW,Sydney,66062,12.1,21.3,0.5,6.8,9.9,NW,54,15,46,20.9,26
2017-08-06,QLD,Brisbane,40913,6.7,19.8,0.6,3.2,6.1,NW,37,11.8,81,19.4,41
2017-08-06,VIC,Melbourne,86338,9,13,0,,,NNE,54,10,76,11.7,54
2017-08-07,NSW,Sydney,66062,13.5,18.4,0.5,6.8,9.8,W,63,14.7,44,16.9,25
2017-08-07,QLD,Brisbane,40913,10.6,19.4,0.6,2.4,0.8,N,30,14.5,78,18.4,65
2017-08-07,VIC,Melbourne,86338,7.7,12.3,4.2,,,SW,41,9.4,70,11,79
2017-08-08,NSW,Sydney,66062,9.7,19.3,0.5,7.2,10.3,WSW,54,12.1,45,18.5,24
2017-08-08,QLD,Brisbane,40913,12.5,19.2,0.4,2,2.3,NW,57,15.9,82,14.8,93
2017-08-08,VIC,Melbourne,86338,7.9,15,2.2,,,NW,26,10.5,72,13.9,59
2017-08-09,NSW,Sydney,66062,9.2,20.9,0.5,5,10.4,W,33,12,53,20,32
2017-08-09,QLD,Brisbane,40913,8.5,14.7,43.6,4.4,3,SW,72,9.7,93,11.6,63
2017-08-09,VIC,Melbourne,86338,8.9,16.8,0,,,N,46,11.4,74,15.1,60
2017-08-10,NSW,Sydney,66062,9.2,24.2,0.5,4,9.4,NNW,31,11.8,64,23.5,24
2017-09-10,NSW,Sydney,66062,8,19.1,0.5,5,10.8,W,24,13.5,52,17.3,51
2017-09-10,QLD,Brisbane,40913,10.6,20.8,0.2,4.6,4.7,W,30,14.6,91,20.3,46
2017-09-10,VIC,Melbourne,86338,5.3,16.1,0.2,,,N,20,10.8,76,16,47
2017-09-11,NSW,Sydney,66062,8.7,22.7,0.5,3.8,10.1,NE,24,14.4,60,22.5,29
2017-09-11,QLD,Brisbane,40913,11.2,19.6,5.2,4.4,5.6,WSW,41,16.4,94,17.8,73
2017-09-11,VIC,Melbourne,86338,8.1,18.1,0,,,N,46,12.7,63,16.9,50
2017-09-12,NSW,Sydney,66062,11.3,27.2,0.5,5.8,2.3,E,26,18.2,43,23.4,34
2017-09-12,QLD,Brisbane,40913,13.2,19.4,1.4,4.2,7.9,SW,54,15.9,68,19,35
2017-09-12,VIC,Melbourne,86338,12.2,19.9,0.2,,,NNW,39,15.2,60,19.7,59
2017-09-13,NSW,Sydney,66062,18.2,33.8,0.5,6.2,9.6,NNW,70,24.9,19,32.5,10
2017-09-13,QLD,Brisbane,40913,7.7,21.6,0.6,6.4,11,E,43,13.3,50,21.6,27
2017-09-13,VIC,Melbourne,86338,13.3,16.4,2,,,WSW,50,14.9,62,11.7,69
2017-09-14,NSW,Sydney,66062,12,17.3,0.2,13,10.5,WSW,72,13.8,37,16.4,26
2017-09-14,QLD,Brisbane,40913,6.3,25.6,0.6,6,7,WSW,30,17.2,34,24.1,27
2017-09-14,VIC,Melbourne,86338,5.8,15.3,1.4,,,W,43,9.6,66,14.1,56
2017-09-15,NSW,Sydney,66062,10.5,22.8,0.5,7.8,10.6,W,52,16.6,37,22.2,24
2017-09-15,QLD,Brisbane,40913,14.5,27.8,0.6,4.6,8.5,ENE,37,17.6,43,27.6,26
2017-09-15,VIC,Melbourne,86338,9.6,16.7,0,,,WSW,35,13.5,58,14.8,82
2017-09-16,NSW,Sydney,66062,12.5,24.2,0.5,7,6.4,SSW,56,17.3,43,21.3,22
2017-09-16,QLD,Brisbane,40913,10.9,24.4,0.6,4.8,6.5,WSW,33,15,96,21.9,48
2017-09-16,VIC,Melbourne,86338,7.4,13.3,21.4,,,SSW,54,9.1,81,12.5,52
2017-09-17,NSW,Sydney,66062,8.1,18.6,0.5,4.6,10.8,ESE,28,13.6,48,16.4,37
2017-09-17,QLD,Brisbane,40913,14.6,21.7,0.4,4.4,9.6,WSW,33,18.5,72,21.1,47
2017-09-17,VIC,Melbourne,86338,4,18.1,0.2,,,N,43,10.5,58,17.8,35
2017-09-18,NSW,Sydney,66062,8.2,21,0.5,5.6,10.7,NE,39,14,55,20.2,55
2017-09-18,QLD,Brisbane,40913,7.6,26,0.6,4,10.4,E,37,15.8,71,25.7,34
2017-09-18,VIC,Melbourne,86338,10.5,22,0,,,NNW,63,15.7,32,20.9,30
2017-09-19,NSW,Sydney,66062,14,25.2,0.5,6.4,10.8,SSW,54,20.4,30,23.1,16
2017-09-19,QLD,Brisbane,40913,8.9,22.4,0.6,7,8.5,WSW,31,17.9,73,21,67
2017-09-19,VIC,Melbourne,86338,9,13.9,0.2,,,WNW,31,11.6,58,12.4,58
2017-09-20,NSW,Sydney,66062,10.7,19,0.5,9.2,7.7,WSW,33,14.6,44,17.7,56
2017-09-20,QLD,Brisbane,40913,14.7,21.4,5.4,4.6,6.9,W,48,18,57,20.6,52
2017-09-20,VIC,Melbourne,86338,6,19.2,0,,,N,35,11.6,63,18.6,42
2017-09-21,NSW,Sydney,66062,10.4,21.3,0.5,3.8,10.1,NE,30,15.6,65,21.2,49
2017-09-21,QLD,Brisbane,40913,16.8,21.5,5.8,5,3.4,NW,81,19.8,72,19,61
2017-09-21,VIC,Melbourne,86338,11.6,23.6,0,,,N,54,14.8,46,22.4,33
2017-09-22,NSW,Sydney,66062,12.8,27.7,0.5,7.2,10.5,NNE,35,20.1,35,25.8,26
2017-09-22,QLD,Brisbane,40913,13,19.2,7.4,5.2,1.9,W,89,14.5,88,17.3,58
2017-09-22,VIC,Melbourne,86338,12.8,25.6,0,,,N,33,14.9,71,24.8,32
2017-09-23,NSW,Sydney,66062,15.5,32.2,0.5,8.6,5.7,NNE,50,23,25,29.1,25
2017-09-23,QLD,Brisbane,40913,11.4,17.1,9.8,5.4,5.3,SW,52,13.4,72,16.1,48
2017-09-23,VIC,Melbourne,86338,14.8,30.6,0,,,N,61,21.1,30,30,24
2017-09-24,NSW,Sydney,66062,23,29.2,0.5,12,6.5,W,54,27.1,20,27.8,15
2017-09-24,QLD,Brisbane,40913,10.1,17.3,5.2,4.4,4.5,NNW,46,14.8,75,14,87
2017-09-24,VIC,Melbourne,86338,12.9,19.9,0.4,,,NNW,56,14.9,56,18.4,40
2017-09-25,NSW,Sydney,66062,16,26.7,0.5,10.8,10.8,WNW,57,22.3,21,23.3,19
2017-09-25,QLD,Brisbane,40913,9.3,17,10.4,3.2,4.6,NE,35,11.8,81,14.9,67
2017-09-25,VIC,Melbourne,86338,10.3,14.2,1.6,,,SSW,35,12.9,64,12.8,68
2017-09-26,NSW,Sydney,66062,12.3,22.2,0.5,9.6,11,WSW,41,19.5,29,19,47
2017-09-26,QLD,Brisbane,40913,6,18.2,3.4,4.4,9.3,WSW,41,13.1,69,17.5,40
2017-09-26,VIC,Melbourne,86338,5,15.1,0.4,,,S,24,11.3,64,14,46
2017-09-27,NSW,Sydney,66062,15.4,22.7,0.5,6.6,8.2,ENE,50,18.1,58,21.1,62
2017-09-27,QLD,Brisbane,40913,5,16.7,0.6,6.8,5.6,SW,41,13.1,58,14.7,56
2017-09-27,VIC,Melbourne,86338,9.6,24.6,0,,,NNW,50,12.5,64,24.5,30
2017-09-28,NSW,Sydney,66062,18,25.7,0.5,6.2,3.8,WNW,59,22.7,47,24.1,26
2017-09-28,QLD,Brisbane,40913,6.9,16.6,1,3.4,3.5,WSW,33,12.7,72,13.5,76
2017-09-28,VIC,Melbourne,86338,10.4,18.5,0,,,SW,35,14.1,66,16.8,45
2017-10-01,VIC,Melbourne,86338,9.7,17.2,0,,,S,24,12.4,65,16.4,58
2017-10-02,NSW,Sydney,66062,12,22.9,0.5,6.2,10.1,NE,44,16.7,55,21.3,60
2017-10-02,QLD,Brisbane,40913,7.7,24,0.6,4.8,11.3,WSW,39,19.6,43,22.1,45
2017-10-02,VIC,Melbourne,86338,9.5,16.4,0,,,SSW,24,12.1,75,15.2,58
2017-10-15,NSW,Sydney,66062,16,22.4,0.5,3.8,2.4,ESE,28,22.3,52,20.1,58
2017-10-15,QLD,Brisbane,40913,16.4,25.4,0.6,7.2,11.6,W,33,20.4,61,22.4,53
2017-10-15,VIC,Melbourne,86338,6.8,17.6,0,,,SSW,31,12.4,67,16.7,67
2017-10-16,NSW,Sydney,66062,15.9,23.3,0.5,5,10.4,E,30,22.4,55,22.3,50
2017-10-16,QLD,Brisbane,40913,15.3,19.9,3,5.8,6.5,NW,91,15.8,91,18.2,51
2017-10-16,VIC,Melbourne,86338,11.3,28.6,0,,,NNE,28,12.9,91,26.2,40
2017-10-17,NSW,Sydney,66062,16.9,23.5,0.5,7.6,9.9,ENE,39,21.2,57,22.9,47
2017-10-17,QLD,Brisbane,40913,13.9,20.1,7.6,5.4,5.8,WNW,52,17,71,19.8,50
2017-10-17,VIC,Melbourne,86338,12.8,30.2,0,,,N,37,21,50,30.1,24
2017-10-18,NSW,Sydney,66062,18.8,23.2,0.5,9,10.4,NNE,50,20.9,64,22.5,56
2017-10-18,QLD,Brisbane,40913,12,19.4,5,5,11.7,SW,44,16.7,46,18.2,37
2017-10-18,VIC,Melbourne,86338,14.5,30.9,0,,,NNW,48,21.7,43,29.9,31
2017-10-19,NSW,Sydney,66062,17.1,25.3,0.5,10,11.4,NNE,52,22.1,52,24.6,47
2017-10-19,QLD,Brisbane,40913,6.8,24.7,0.6,6.4,12.1,E,56,16.8,44,24.4,24
2017-10-19,VIC,Melbourne,86338,21.7,25.6,0,,,SW,43,23.8,45,23.5,71
2017-10-20,NSW,Sydney,66062,19,19.6,7.8,9.2,0,S,56,19.6,87,15.3,85
2017-10-20,QLD,Brisbane,40913,12.1,29.9,0.6,6,12.1,E,61,22.3,31,29.4,17
2017-10-20,VIC,Melbourne,86338,11.3,16.2,2.6,,,S,41,13.4,69,15.3,56
2017-10-21,NSW,Sydney,66062,13,20,16.4,4.4,11.3,SSW,50,16.8,52,18.9,53
2017-10-21,QLD,Brisbane,40913,15.9,29.5,0.6,9.2,4.9,ENE,65,23.2,32,28.2,26
2017-10-21,VIC,Melbourne,86338,11,15.4,0,,,S,28,11.8,71,14.8,63
2017-10-22,NSW,Sydney,66062,13.4,21.7,0.2,4.8,5.5,ESE,30,16,75,18.7,60
2017-10-22,QLD,Brisbane,40913,18.5,28.4,1.2,7.2,11.7,E,70,23.1,56,27.1,46
2017-10-22,VIC,Melbourne,86338,11.1,16,0,,,SSW,28,12.7,78,15.2,76
2017-10-23,NSW,Sydney,66062,12.7,23,3.4,3.6,12.1,S,31,18,60,22.1,43
2017-10-23,QLD,Brisbane,40913,14.3,23.6,0.6,5.6,3.8,WSW,37,18.2,80,22.4,57
2017-10-23,VIC,Melbourne,86338,11.9,19,0,,,SSW,31,13.6,72,17.3,61
2017-10-24,NSW,Sydney,66062,13.9,24.3,0.5,6.8,10.5,NE,39,19.9,57,22.6,55
2017-10-24,QLD,Brisbane,40913,14.3,22.6,0.6,6.2,11,W,37,18.2,63,20.3,40
2017-10-24,VIC,Melbourne,86338,12.1,20.5,0.2,,,SSW,26,15,74,17.4,61
2017-10-25,NSW,Sydney,66062,18,26.9,0.5,8,5,S,59,21,67,23.3,56
2017-10-25,QLD,Brisbane,40913,9.6,22.7,0.6,7,7.3,W,44,19,69,21.7,54
2017-10-25,VIC,Melbourne,86338,13.3,18.5,11.4,,,SSW,24,14.5,94,17.3,67
2017-10-26,NSW,Sydney,66062,17.2,24.6,0.4,7,6.5,WSW,63,19.7,73,22.6,63
2017-10-26,QLD,Brisbane,40913,13.3,20.9,1.2,4.6,5.3,WNW,67,17.6,77,16.4,66
2017-10-26,VIC,Melbourne,86338,12.9,16.8,0.2,,,SSW,39,13.8,83,15.5,73
2017-10-27,NSW,Sydney,66062,14.8,24.2,34.2,7.2,3.9,WSW,61,18.1,85,22.3,65
2017-10-27,QLD,Brisbane,40913,8.4,19.3,2,6.2,4.8,WSW,44,16.7,48,17.5,51
2017-10-27,VIC,Melbourne,86338,10.3,27.4,0,,,N,48,15.1,78,25.2,37
2017-10-28,NSW,Sydney,66062,17.5,25.1,0.2,3.2,7.7,ENE,35,21.6,66,22.3,65
2017-10-28,QLD,Brisbane,40913,14,20.5,1,4.2,6.2,WSW,52,18.4,54,19.8,54
2017-10-28,VIC,Melbourne,86338,15,21.4,0.4,,,N,39,15.4,72,19.7,46
2017-10-29,NSW,Sydney,66062,20.2,29.6,0.2,4.6,10.6,NNE,39,24.4,49,27.7,35
2017-10-29,QLD,Brisbane,40913,12.6,20.6,0.8,4.2,11.6,SW,54,16.7,46,19.9,41
2017-10-29,VIC,Melbourne,86338,11.5,28.6,0,,,N,61,19,63,27.4,25
2017-10-30,NSW,Sydney,66062,20.3,35.4,0.5,11.4,9.3,SSW,69,28.5,37,34.9,13
2017-10-30,QLD,Brisbane,40913,7.9,27.5,0.6,6.8,12.5,WSW,35,19.1,44,25.9,33
2017-10-30,VIC,Melbourne,86338,8.8,17,4.4,,,W,46,12.6,62,14.4,55
2017-10-31,NSW,Sydney,66062,13,20.5,0.5,12,10.1,SSE,57,15.8,43,19.7,36
2017-10-31,QLD,Brisbane,40913,8.7,29.3,0.6,10.4,12.2,SW,39,20.2,54,27.1,35
2017-10-31,VIC,Melbourne,86338,8.1,16.4,6,,,S,28,12,62,15.4,48
2017-11-01,NSW,Sydney,66062,13,22.2,0.5,9.6,4.2,SSW,54,15.4,39,21,32
2017-11-01,QLD,Brisbane,40913,15,30.5,0.6,10.4,10.4,SE,44,21.5,48,29,39
2017-11-01,VIC,Melbourne,86338,11.9,18,0.4,,,SSW,26,14.7,78,15.8,66
2017-11-02,NSW,Sydney,66062,15.3,22.6,0.5,5.6,9.9,ESE,31,20.1,56,21.8,46
2017-11-02,QLD,Brisbane,40913,14.7,27.6,0.6,10.2,11.9,E,46,19.4,51,27,35
2017-11-02,VIC,Melbourne,86338,11.3,17.4,1.4,,,SSE,26,12.7,79,15.9,56
2017-11-03,NSW,Sydney,66062,16.9,26.9,0.5,6.6,5.9,SSW,57,20.6,58,25.7,48
2017-11-03,QLD,Brisbane,40913,13.6,26.9,0.6,9.8,12.3,ESE,43,19.5,49,25.3,41
2017-11-03,VIC,Melbourne,86338,9.4,16.6,1.4,,,S,46,12,58,15.3,45
2017-11-04,NSW,Sydney,66062,16.2,17.1,1.6,7.4,0,SSE,39,17,73,15.8,78
2017-11-04,QLD,Brisbane,40913,14.8,30.7,0.6,8,12.6,E,69,23.1,43,30.3,20
2017-11-04,VIC,Melbourne,86338,8.3,15.8,0.4,,,SSW,35,10.2,72,14.5,47
2017-11-05,NSW,Sydney,66062,14.1,19.5,31.8,2.6,0,ENE,39,14.9,85,17.7,59
2017-11-05,QLD,Brisbane,40913,15.2,31.1,0.6,10.2,12.6,E,67,21.6,38,31,20
2017-11-05,VIC,Melbourne,86338,6.4,17.6,0,,,SSW,39,13.4,59,15.6,49
2017-11-06,NSW,Sydney,66062,14.8,26.3,4.4,1.8,5.7,SSW,67,19.4,81,23.6,57
2017-11-06,QLD,Brisbane,40913,14.9,32.7,0.6,10.2,12.5,E,67,23.5,33,32.7,11
2017-11-06,VIC,Melbourne,86338,10.9,16.1,0,,,S,43,13.8,66,14.8,60
2017-11-07,NSW,Sydney,66062,13.8,21.8,5.6,8.8,11.3,SSE,54,18.9,40,20,40
2017-11-07,QLD,Brisbane,40913,17.3,36.4,0.6,12,12.5,E,59,27.1,21,35.4,14
2017-11-07,VIC,Melbourne,86338,10,15.8,3.4,,,S,43,12.2,76,14.4,52
2017-11-08,NSW,Sydney,66062,14.6,21.7,0.5,9.6,10,SSE,44,16.8,54,20.5,44
2017-11-08,QLD,Brisbane,40913,20,35.6,0.6,11.2,10.8,ENE,37,28,30,33.1,24
2017-11-08,VIC,Melbourne,86338,12.1,16.9,0,,,S,30,13.9,59,15.6,64
2017-11-09,NSW,Sydney,66062,12.3,22.1,0.5,6.8,11.6,E,28,17.9,52,21.2,50
2017-11-09,QLD,Brisbane,40913,15.2,31.6,0.6,7.8,12.8,SW,37,26.9,42,29.7,31
2017-11-09,VIC,Melbourne,86338,8.7,21.8,0,,,SSW,28,14.6,68,18.1,66
2017-11-10,NSW,Sydney,66062,14.2,23.4,0.5,6.8,11.6,E,26,19.2,59,22.7,50
2017-11-10,QLD,Brisbane,40913,14.1,32,0.6,8.4,12.6,W,33,26.4,41,30.1,37
2017-11-10,VIC,Melbourne,86338,12.1,27.2,0,,,SE,20,15.7,78,22.8,63
2017-11-11,NSW,Sydney,66062,14.7,23.7,0.5,7.4,8.6,ENE,33,19.9,58,20.9,56
2017-11-11,QLD,Brisbane,40913,16.2,38,0.6,9,10,W,41,30.4,32,33.5,27
2017-11-11,VIC,Melbourne,86338,14.8,24.7,0,,,SSW,26,17.8,79,22,65
2017-11-12,NSW,Sydney,66062,17.1,22.5,0.5,7.4,10.7,E,31,19.7,49,21.7,47
2017-11-12,QLD,Brisbane,40913,22.4,32.1,0.6,11.8,11.3,S,39,28.7,39,28.8,46
2017-11-12,VIC,Melbourne,86338,13.2,23.3,0,,,SSW,28,16.1,86,21.6,69
2017-11-13,NSW,Sydney,66062,17.5,22.3,0.5,6.4,9.1,E,33,19.1,48,21.2,42
2017-11-13,QLD,Brisbane,40913,17.5,30.1,0.6,10.2,12.7,SW,50,25.2,65,27.9,45
2017-11-13,VIC,Melbourne,86338,13.7,33,0,,,NE,24,19.3,73,32,20
2017-11-14,NSW,Sydney,66062,16,22.9,0.5,7.6,8.9,ENE,37,20,53,22.3,46
2017-11-14,QLD,Brisbane,40913,14.5,29.7,0.6,11.8,12.7,SE,54,20.2,47,29.1,19
2017-11-14,VIC,Melbourne,86338,18.2,33.9,0,,,N,48,25.3,38,33.1,27

Python dataset calculations

I have a data set recording different weeks and the new cases of dengue for each week, and I am supposed to calculate the infection rate and recovery rate for each week. The infection rate is calculated by dividing the number of newly infected patients by the susceptible population for that week, while the recovery rate is calculated by dividing the number of newly recovered patients by the infected population for that week. The infection rate is relatively simple, but for the recovery rate I have to take into account that infected patients take exactly two weeks to recover, and I'm stuck. Any help would be appreciated.
t_pop = 4*10**6
s_pop = t_pop
i_pop = 0
r_pop = 0
weeks = 0

#Infection Rate
for index, row in data.iterrows():
    new_i = row['New Cases']
    s_pop -= new_i
    weeks += 1
    infection_rate = float(new_i)/float(s_pop)
    print('Week', weeks, ':', infection_rate)
*Note: t_pop refers to the total population, which we assume to be 4 million; s_pop refers to the population at risk of contracting dengue; and i_pop refers to the infected population.
You could create a dictionary to store the data for each week, and then use it to refer back to when you need to calculate the recovery rate. For example:
dengue_dict = {}
dengue_dict["Week 1"] = {"Infection Rate": infection_rate, "Recovery Rate": None}
I use None at first, because there's no recovery rate until at least two weeks have gone by. Later, you can either update those weeks or just add them right away. Here's an example for week 3:
recovery_rate = dengue_dict["Week 1"]["Infection Rate"]/infection_rate
And then update the entry in the dictionary:
dengue_dict["Week 3"]["Recovery Rate"] = recovery_rate

Combining a large amount of netCDF files

I have a large folder of netCDF (.nc) files, each with a similar name. The data files contain variables for time, longitude, latitude, and monthly precipitation. The goal is to get the average monthly precipitation over X years for each month, so in the end I would have 12 values representing the average monthly precipitation over those years for each lat and long. The files all cover the same location, across many years.
Each file starts with the same name and ends in a “date.sub.nc” for example:
'data1.somthing.somthing1.avg_2d_Ind_Nx.200109.SUB.nc'
'data1.somthing.somthing1.avg_2d_Ind_Nx.200509.SUB.nc'
'data2.somthing.somthing1.avg_2d_Ind_Nx.201104.SUB.nc'
'data2.somthing.somthing1.avg_2d_Ind_Nx.201004.SUB.nc'
'data2.somthing.somthing1.avg_2d_Ind_Nx.201003.SUB.nc'
'data2.somthing.somthing1.avg_2d_Ind_Nx.201103.SUB.nc'
'data1.somthing.somthing1.avg_2d_Ind_Nx.201203.SUB.nc'
The ending is YearMonth.SUB.nc
What I have so far is:
import netCDF4 as nc

array = []
f = nc.MFDataset('data*.nc')
precp = f.variables['prectot']
time = f.variables['time']
array = f.variables['time','longitude','latitude','prectot']
I get a KeyError: ('time', 'longitude', 'latitude', 'prectot'). Is there a way to combine all this data so I am able to manipulate it?
As @CharlieZender mentioned, ncra is the way to go here, and I'll provide some more details on integrating that function into a Python script. (PS - you can install NCO easily with Homebrew, e.g. http://alejandrosoto.net/blog/2014/01/22/setting-up-my-mac-for-scientific-research/)
import subprocess
import netCDF4
import glob
import numpy as np

for month in range(1, 13):
    # Gather all the files for this month
    month_files = glob.glob('/path/to/files/*{0:0>2d}.SUB.nc'.format(month))

    # Using NCO functions ---------------
    avg_file = './precip_avg_{0:0>2d}.nc'.format(month)

    # Concatenate the files using ncrcat
    subprocess.call(['ncrcat'] + month_files + ['-O', avg_file])

    # Take the time (record) average using ncra
    subprocess.call(['ncra', avg_file, '-O', avg_file])

    # Read in the monthly precip climatology file and do whatever now
    ncfile = netCDF4.Dataset(avg_file, 'r')
    pr = ncfile.variables['prectot'][:,:,:]
    # ...

    # Using only Python -------------
    # Initialize an array to store monthly-mean precip for all years;
    # let's presume we know the lat and lon dimensions (nlat, nlon)
    nyears = len(month_files)
    pr_arr = np.zeros([nyears, nlat, nlon], dtype='f4')

    # Populate pr_arr with each file's monthly-mean precip
    for idx, filename in enumerate(month_files):
        ncfile = netCDF4.Dataset(filename, 'r')
        pr = ncfile.variables['prectot'][:,:,:]
        pr_arr[idx,:,:] = np.mean(pr, axis=0)
        ncfile.close()

    # Take the average along all years for a monthly climatology
    pr_clim = np.mean(pr_arr, axis=0)  # 2D now [lat,lon]
NCO does this with
ncra *.01.SUB.nc pcp_avg_01.nc
ncra *.02.SUB.nc pcp_avg_02.nc
...
ncra *.12.SUB.nc pcp_avg_12.nc
ncrcat pcp_avg_??.nc pcp_avg.nc
Of course the first twelve commands can be done with a Bash loop, reducing the total number of lines to fewer than five. If you prefer to script with Python, you can check your answers against this. The ncra docs are here.
The command ymonmean calculates the mean of calendar months in CDO. Thus the task can be accomplished in two lines:
cdo mergetime data*.SUB.nc merged.nc # put files together into one series
cdo ymonmean merged.nc annual_cycle.nc # mean of all Jan,Feb etc.
CDO can also calculate the annual cycle of other statistics (ymonstd, ymonmax, etc.), and the time unit can be days or pentads as well as months (e.g. ydaymean).
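For completeness, here is a minimal Python sketch of the same monthly climatology using xarray; this is an addition not mentioned in the thread, it requires xarray with a netCDF backend such as netCDF4, and it assumes the files share coordinates and that the variable is named prectot:

import xarray as xr

# Lazily open and concatenate every file along the time dimension
ds = xr.open_mfdataset('data*.SUB.nc', combine='by_coords')

# Average all Januaries together, all Februaries together, etc.
monthly_clim = ds['prectot'].groupby('time.month').mean('time')

# monthly_clim now has a 'month' dimension of length 12 plus the spatial
# dimensions (their names depend on the files, e.g. latitude/longitude)
monthly_clim.to_netcdf('precip_monthly_climatology.nc')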
