I have a df that looks like this (shortened):
DateTime Value Date Time
0 2022-09-18 06:00:00 5.4 18/09/2022 06:00
1 2022-09-18 07:00:00 6.0 18/09/2022 07:00
2 2022-09-18 08:00:00 6.5 18/09/2022 08:00
3 2022-09-18 09:00:00 6.7 18/09/2022 09:00
8 2022-09-18 14:00:00 7.9 18/09/2022 14:00
9 2022-09-18 15:00:00 7.8 18/09/2022 15:00
10 2022-09-18 16:00:00 7.6 18/09/2022 16:00
11 2022-09-18 17:00:00 6.8 18/09/2022 17:00
12 2022-09-18 18:00:00 6.4 18/09/2022 18:00
13 2022-09-18 19:00:00 5.7 18/09/2022 19:00
14 2022-09-18 20:00:00 4.8 18/09/2022 20:00
15 2022-09-18 21:00:00 5.4 18/09/2022 21:00
16 2022-09-18 22:00:00 4.7 18/09/2022 22:00
17 2022-09-18 23:00:00 4.3 18/09/2022 23:00
18 2022-09-19 00:00:00 4.1 19/09/2022 00:00
19 2022-09-19 01:00:00 4.4 19/09/2022 01:00
22 2022-09-19 04:00:00 3.5 19/09/2022 04:00
23 2022-09-19 05:00:00 2.8 19/09/2022 05:00
24 2022-09-19 06:00:00 3.8 19/09/2022 06:00
I want to create a new column where I split the hours between day and night like this:
00:00 - 05:00 night,
06:00 - 18:00 day,
19:00 - 23:00 night
But apparently one can't use the same label twice? How can I solve this problem? Here is my code:
df['period'] = pd.cut(pd.to_datetime(df.DateTime).dt.hour,
bins=[0, 5, 17, 23],
labels=['night', 'morning', 'night'],
include_lowest=True)
It's returning
ValueError: labels must be unique if ordered=True; pass ordered=False for duplicate labels
If I understood correctly: if the time is between 00:00 - 05:00 or 19:00 - 23:00, you want your new column to say 'night', else 'day'. Well, here's that code:
df['day/night'] = df['Time'].apply(lambda x: 'night' if '00:00' <= x <= '05:00' or '19:00' <= x <= '23:00' else 'day')
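As a side note, a vectorized equivalent of the apply above (a minimal sketch, assuming the Time column holds zero-padded HH:MM strings as shown) could use Series.between with numpy.where:
import numpy as np

# night if the time string falls in 00:00-05:00 or 19:00-23:00, otherwise day
is_night = df['Time'].between('00:00', '05:00') | df['Time'].between('19:00', '23:00')
df['day/night'] = np.where(is_night, 'night', 'day')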
Alternatively, you can add the ordered=False parameter and keep your pd.cut approach.
input ->
df = pd.DataFrame(columns=['DateTime', 'Value', 'Date', 'Time'], data=[
['2022-09-18 06:00:00', 5.4, '18/09/2022', '06:00'],
['2022-09-18 07:00:00', 6.0, '18/09/2022', '07:00'],
['2022-09-18 08:00:00', 6.5, '18/09/2022', '08:00'],
['2022-09-18 09:00:00', 6.7, '18/09/2022', '09:00'],
['2022-09-18 14:00:00', 7.9, '18/09/2022', '14:00'],
['2022-09-18 15:00:00', 7.8, '18/09/2022', '15:00'],
['2022-09-18 16:00:00', 7.6, '18/09/2022', '16:00'],
['2022-09-18 17:00:00', 6.8, '18/09/2022', '17:00'],
['2022-09-18 18:00:00', 6.4, '18/09/2022', '18:00'],
['2022-09-18 19:00:00', 5.7, '18/09/2022', '19:00'],
['2022-09-18 20:00:00', 4.8, '18/09/2022', '20:00'],
['2022-09-18 21:00:00', 5.4, '18/09/2022', '21:00'],
['2022-09-18 22:00:00', 4.7, '18/09/2022', '22:00'],
['2022-09-18 23:00:00', 4.3, '18/09/2022', '23:00'],
['2022-09-19 00:00:00', 4.1, '19/09/2022', '00:00'],
['2022-09-19 01:00:00', 4.4, '19/09/2022', '01:00'],
['2022-09-19 04:00:00', 3.5, '19/09/2022', '04:00'],
['2022-09-19 05:00:00', 2.8, '19/09/2022', '05:00'],
['2022-09-19 06:00:00', 3.8, '19/09/2022', '06:00']])
output ->
DateTime Value Date Time day/night
0 2022-09-18 06:00:00 5.4 18/09/2022 06:00 day
1 2022-09-18 07:00:00 6.0 18/09/2022 07:00 day
2 2022-09-18 08:00:00 6.5 18/09/2022 08:00 day
3 2022-09-18 09:00:00 6.7 18/09/2022 09:00 day
4 2022-09-18 14:00:00 7.9 18/09/2022 14:00 day
5 2022-09-18 15:00:00 7.8 18/09/2022 15:00 day
6 2022-09-18 16:00:00 7.6 18/09/2022 16:00 day
7 2022-09-18 17:00:00 6.8 18/09/2022 17:00 day
8 2022-09-18 18:00:00 6.4 18/09/2022 18:00 day
9 2022-09-18 19:00:00 5.7 18/09/2022 19:00 night
10 2022-09-18 20:00:00 4.8 18/09/2022 20:00 night
11 2022-09-18 21:00:00 5.4 18/09/2022 21:00 night
12 2022-09-18 22:00:00 4.7 18/09/2022 22:00 night
13 2022-09-18 23:00:00 4.3 18/09/2022 23:00 night
14 2022-09-19 00:00:00 4.1 19/09/2022 00:00 night
15 2022-09-19 01:00:00 4.4 19/09/2022 01:00 night
16 2022-09-19 04:00:00 3.5 19/09/2022 04:00 night
17 2022-09-19 05:00:00 2.8 19/09/2022 05:00 night
18 2022-09-19 06:00:00 3.8 19/09/2022 06:00 day
You have two options.
Either you don't care about the order, in which case you can set ordered=False as a parameter of cut:
df['period'] = pd.cut(pd.to_datetime(df.DateTime).dt.hour,
bins=[0, 5, 17, 23],
labels=['night', 'morning', 'night'],
ordered=False,
include_lowest=True)
Or you do care about having night and morning ordered, in which case you can further convert to an ordered Categorical:
df['period'] = pd.Categorical(df['period'], categories=['night', 'morning'], ordered=True)
output:
DateTime Value Date Time period
0 2022-09-18 06:00:00 5.4 18/09/2022 06:00 morning
1 2022-09-18 07:00:00 6.0 18/09/2022 07:00 morning
2 2022-09-18 08:00:00 6.5 18/09/2022 08:00 morning
3 2022-09-18 09:00:00 6.7 18/09/2022 09:00 morning
8 2022-09-18 14:00:00 7.9 18/09/2022 14:00 morning
9 2022-09-18 15:00:00 7.8 18/09/2022 15:00 morning
10 2022-09-18 16:00:00 7.6 18/09/2022 16:00 morning
11 2022-09-18 17:00:00 6.8 18/09/2022 17:00 morning
12 2022-09-18 18:00:00 6.4 18/09/2022 18:00 night
13 2022-09-18 19:00:00 5.7 18/09/2022 19:00 night
14 2022-09-18 20:00:00 4.8 18/09/2022 20:00 night
15 2022-09-18 21:00:00 5.4 18/09/2022 21:00 night
16 2022-09-18 22:00:00 4.7 18/09/2022 22:00 night
17 2022-09-18 23:00:00 4.3 18/09/2022 23:00 night
18 2022-09-19 00:00:00 4.1 19/09/2022 00:00 night
19 2022-09-19 01:00:00 4.4 19/09/2022 01:00 night
22 2022-09-19 04:00:00 3.5 19/09/2022 04:00 night
23 2022-09-19 05:00:00 2.8 19/09/2022 05:00 night
24 2022-09-19 06:00:00 3.8 19/09/2022 06:00 morning
column:
df['period']
0 morning
1 morning
2 morning
...
23 night
24 morning
Name: period, dtype: category
Categories (2, object): ['morning', 'night']
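As a small usage note (not from the answer itself): once period is an ordered Categorical, sorting and comparisons follow the declared night < morning order:
df.sort_values('period')      # night rows first, then morning
df[df['period'] > 'night']    # comparisons use the category order -> morning rows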
Related
I have to compute simple statistics by grouping a dataframe by one column, for instance day_of_the_week, and by minute ranges, for instance 15 minutes, without keeping the dates. In other words, I need statistics on what is happening in each 15-minute interval across all Sundays, Mondays, etc., not split by date. The starting dataframe is something like this:
datetime high low Day_of_the_week HL_delta
2021-08-01 22:00:00 4403.00 4395.25 6.0 7.75
2021-08-01 22:15:00 4404.00 4401.00 6.0 3.00
2021-08-01 22:30:00 4409.00 4403.25 6.0 5.75
2021-08-01 22:45:00 4408.25 4406.25 6.0 2.00
2021-08-01 23:00:00 4408.25 4405.75 6.0 2.5
where datetime is the index of the dataframe and it is a DateTime type.
I need to calculate the mean and max value of HL_delta for each distinct day of the week, grouped by 15-minute ranges, over one year of data.
I have tried something like this
df_statistics['HL_mean'] = df_data_for_statistics.groupby([df_data_for_statistics['day_of_the_week'], pd.Grouper(freq=statistics_period_minutes_formatted,closed='left',label='left')]).agg({ "HL_delta": "mean"})
df_statistics['HL_max'] = df_data_for_statistics.groupby([df_data_for_statistics['day_of_the_week'], pd.Grouper(freq=statistics_period_minutes_formatted,closed='left',label='left')]).agg({ "HL_delta": "max"})
but what I get is not an aggregation over all the distinct weekdays of the year; the aggregation is applied to the 15-minute groups of each individual date, not to each Monday, Tuesday, Wednesday, .... The statistics should answer questions like: "what is the max value of HL_delta between 00:00 and 00:15 over all the Mondays of the year", "what is the max value of HL_delta between 00:16 and 00:30 over all the Mondays of the year", ..., "what is the max value of HL_delta between 00:00 and 00:15 over all the Fridays of the year", etc. Instead, what I get from this attempt is this:
high low day_of_the_week HL_delta
datetime
2021-08-01 22:00:00 4403.00 4395.25 6.0 7.75
2021-08-01 22:15:00 4404.00 4401.00 6.0 3.00
2021-08-01 22:30:00 4409.00 4403.25 6.0 5.75
2021-08-01 22:45:00 4408.25 4406.25 6.0 2.00
2021-08-01 23:00:00 4408.25 4405.75 6.0 2.50
... ... ... ... ...
2022-03-21 22:45:00 4453.50 4451.50 0.0 2.00
2022-03-21 23:00:00 4452.25 4449.00 0.0 3.25
2022-03-21 23:15:00 4451.50 4449.25 0.0 2.25
2022-03-21 23:30:00 4451.50 4448.50 0.0 3.00
2022-03-21 23:45:00 4449.75 4445.25 0.0 4.50
Any suggestions?
With the following toy dataframe:
import random
import pandas as pd
list_of_dates = pd.date_range(start="2021-1-1", end="2021-12-31", freq="T")
df = pd.DataFrame(
{
"datetime": list_of_dates,
"low": [random.randint(0, 4_999) for _ in range(len(list_of_dates))],
"high": [random.randint(5_000, 9_999) for _ in range(len(list_of_dates))],
}
)
df["HL_delta"] = df["high"] - df["low"]
print(df)
# Output
datetime low high HL_delta
0 2021-01-01 00:00:00 4325 5059 734
1 2021-01-01 00:01:00 917 7224 6307
2 2021-01-01 00:02:00 2956 7804 4848
3 2021-01-01 00:03:00 1329 8056 6727
4 2021-01-01 00:04:00 1721 9144 7423
...
Here is one way to do it:
# Setup
df["weekday"] = df["datetime"].dt.day_name()
df["datetime_minute"] = df["datetime"].dt.minute
intervals = {"0-15": [0, 15], "16-30": [16, 30], "31-45": [31, 45], "46-59": [46, 59]}
# Find intervals
df["interval"] = df.apply(
lambda x: next(
filter(
None,
[
key
if x["datetime_minute"] >= intervals[key][0]
and x["datetime_minute"] <= intervals[key][1]
else None
for key in intervals.keys()
],
)
),
axis=1,
)
# Get stats
new_df = (
df.drop(columns=["low", "high", "datetime", "datetime_minute"])
.groupby(["weekday", "interval"], sort=False)
.agg(["max", "mean"])
)
And so:
print(new_df)
# Output
HL_delta
max mean
weekday interval
Friday 0-15 9989 5011.461666
16-30 9948 5003.452724
31-45 9902 4969.810577
46-59 9926 5007.599073
Saturday 0-15 9950 5004.103966
16-30 9961 4984.479327
31-45 9954 5005.854647
46-59 9973 5011.797447
Sunday 0-15 9979 4994.012270
16-30 9950 4981.940438
31-45 9877 5009.572276
46-59 9930 5020.719609
Monday 0-15 9974 4963.538812
16-30 9918 4977.481090
31-45 9971 4977.858173
46-59 9958 4992.886733
Tuesday 0-15 9924 5014.045623
16-30 9966 4990.358547
31-45 9948 4993.595566
46-59 9948 5000.271120
Wednesday 0-15 9975 4998.320463
16-30 9976 4981.763889
31-45 9981 4981.806303
46-59 9995 5001.579670
Thursday 0-15 9958 5015.276643
16-30 9900 4996.797489
31-45 9949 4991.088034
46-59 9948 4980.678457
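As a side note, the interval column can also be built without apply, with pd.cut on the minute component (a vectorized sketch, assuming the same toy df and the same interval labels; it yields the same groups up to ordering):
df["weekday"] = df["datetime"].dt.day_name()
df["interval"] = pd.cut(
    df["datetime"].dt.minute,
    bins=[-1, 15, 30, 45, 59],   # (-1, 15] -> 0-15, (15, 30] -> 16-30, ...
    labels=["0-15", "16-30", "31-45", "46-59"],
)
new_df = (
    df.groupby(["weekday", "interval"], sort=False, observed=True)["HL_delta"]
    .agg(["max", "mean"])
)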
I have a df that looks like this (the actual df is much larger):
DateTime Value Date Time period DatePeriod
0 2022-09-18 06:00:00 5.4 18/09/2022 06:00 morning 18/09/2022-morning
1 2022-09-18 07:00:00 6.0 18/09/2022 07:00 morning 18/09/2022-morning
2 2022-09-18 08:00:00 6.5 18/09/2022 08:00 morning 18/09/2022-morning
3 2022-09-18 09:00:00 6.7 18/09/2022 09:00 morning 18/09/2022-morning
4 2022-09-18 10:00:00 6.9 18/09/2022 10:00 morning 18/09/2022-morning
11 2022-09-18 17:00:00 6.8 18/09/2022 17:00 morning 18/09/2022-morning
12 2022-09-18 18:00:00 6.4 18/09/2022 18:00 night 18/09/2022-night
13 2022-09-18 19:00:00 5.7 18/09/2022 19:00 night 18/09/2022-night
14 2022-09-18 20:00:00 4.8 18/09/2022 20:00 night 18/09/2022-night
15 2022-09-18 21:00:00 5.4 18/09/2022 21:00 night 18/09/2022-night
16 2022-09-18 22:00:00 4.7 18/09/2022 22:00 night 19/09/2022-night
21 2022-09-19 03:00:00 3.8 19/09/2022 03:00 night 19/09/2022-night
22 2022-09-19 04:00:00 3.5 19/09/2022 04:00 night 19/09/2022-night
23 2022-09-19 05:00:00 2.8 19/09/2022 05:00 night 19/09/2022-night
24 2022-09-19 06:00:00 3.8 19/09/2022 06:00 morning 19/09/2022-morning
I created a dictionary by grouping on DatePeriod and collecting the values into lists, like this:
result = df.groupby('DatePeriod')['Value'].apply(list).to_dict()
Output:
{'18/09/2022-morning': [5.4, 6.0, 6.5, 6.9, 7.9, 8.5, 7.5, 7.9, 7.8, 7.6, 6.8],
'18/09/2022-night': [6.4, 5.7, 4.8, 5.4, 4.7, 4.3],
'19/09/2022-morning': [3.8],
'19/09/2022-night': [4.1, 4.4, 4.3, 3.8, 3.5, 2.8]}
Is there any way I can get the exact same result, but with the DateTime as the key instead of DatePeriod in the result dictionary? I.e., I still want the grouping to be based on DatePeriod and the values to be lists of values; the only difference is that I want the full DateTime to be the key. It can be the first DateTime of each group, but not the DatePeriod! Example:
{'2022-09-18 06:00:00': [5.4, 6.0, 6.5, 6.9, 7.9, 8.5, 7.5, 7.9, 7.8, 7.6, 6.8],
'2022-09-18 18:00:00' : [6.4, 5.7, 4.8, 5.4, 4.7, 4.3],
'2022-09-19 06:00:00': [3.8],
'2022-09-19 03:00:00': [4.1, 4.4, 4.3, 3.8, 3.5, 2.8]}
Is there any easy way to do this?
Thanks in advance
IIUC you can use aggregation:
result = (df.groupby('DatePeriod')
.agg({"Value": list, "DateTime": "first"})
.set_index("DateTime")["Value"]
.to_dict())
print (result)
{'2022-05-12 06:00:00': [11.8], '2022-05-12 18:00:00': [12.5], '2022-05-13 06:00:00': [10.9], '2022-05-13 18:00:00': [13.5], '2022-05-14 06:00:00': [11.8]}
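If the rows within each DatePeriod are not guaranteed to be sorted by time, swapping "first" for "min" picks the earliest timestamp instead (a minor variation, assuming DateTime is a datetime dtype or an ISO-formatted string column):
result = (df.groupby('DatePeriod')
            .agg({"Value": list, "DateTime": "min"})
            .set_index("DateTime")["Value"]
            .to_dict())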
I have the following dataframe:
latitude longitude d1 d2 ar merge_time
0 15 10.0 12/1/1981 0:00 12/4/1981 3:00 2.317681391 1981-12-04 04:00:00
1 15 10.1 12/1/1981 0:00 12/1/1981 3:00 2.293604127 1981-12-01 04:00:00
2 15 10.2 12/1/1981 0:00 12/1/1981 2:00 2.264552161 1981-12-01 03:00:00
3 15 10.3 12/1/1981 0:00 12/4/1981 2:00 2.278556423 1981-12-04 03:00:00
4 15 10.1 12/1/1981 4:00 12/1/1981 22:00 2.168275766 1981-12-01 23:00:00
5 15 10.2 12/1/1981 3:00 12/1/1981 21:00 2.114636628 1981-12-01 22:00:00
6 15 10.4 12/1/1981 0:00 12/2/1981 17:00 1.384415903 1981-12-02 18:00:00
7 15 10.1 12/2/1981 8:00 12/2/1981 11:00 2.293604127 1981-12-01 12:00:00
I want to group and rearrange the above dataframe (the value of column ar) based on the following criteria:
1. The values of latitude and longitude are equal, and
2. within each group from 1, rows are merged when one row's merge_time equals the next row's d1 (i.e. the intervals are contiguous in time).
Here is desired output:
latitude longitude d1 d2 ar
15 10 12/1/1981 0:00 12/4/1981 3:00 2.317681391
15 10.1 12/1/1981 0:00 12/1/1981 22:00 4.461879893
15 10.2 12/1/1981 0:00 12/1/1981 21:00 4.379188789
15 10.3 12/1/1981 0:00 12/4/1981 2:00 2.278556423
15 10.4 12/1/1981 0:00 12/2/1981 17:00 1.384415903
15 10.1 12/2/1981 8:00 12/2/1981 11:00 2.293604127
How can I achieve this?
Any help is appreciated.
After clarifying your requirements in the comments, the approach is to:
group by location (longitude & latitude)
find rows within this grouping that are contiguous in time
group and aggregate these contiguous sections
import io
import pandas as pd
df = pd.read_csv(io.StringIO(""" latitude longitude d1 d2 ar merge_time
0 15 10.0 12/1/1981 0:00 12/4/1981 3:00 2.317681391 1981-12-04 04:00:00
1 15 10.1 12/1/1981 0:00 12/1/1981 3:00 2.293604127 1981-12-01 04:00:00
2 15 10.2 12/1/1981 0:00 12/1/1981 2:00 2.264552161 1981-12-01 03:00:00
3 15 10.3 12/1/1981 0:00 12/4/1981 2:00 2.278556423 1981-12-04 03:00:00
4 15 10.1 12/1/1981 4:00 12/1/1981 22:00 2.168275766 1981-12-01 23:00:00
5 15 10.2 12/1/1981 3:00 12/1/1981 21:00 2.114636628 1981-12-01 22:00:00
6 15 10.4 12/1/1981 0:00 12/2/1981 17:00 1.384415903 1981-12-02 18:00:00
7 15 10.1 12/2/1981 8:00 12/2/1981 11:00 2.293604127 1981-12-01 12:00:00"""), sep=r"\s\s+", engine="python")
df = df.assign(**{c:pd.to_datetime(df[c]) for c in ["d1","d2","merge_time"]})
df.groupby(["latitude", "longitude"]).apply(
lambda d: d.groupby(
(d["d1"] != (d["d2"].shift() + pd.Timedelta("1H"))).cumsum(), as_index=False
).agg({"d1": "min", "d2": "max", "ar": "sum"})
).droplevel(2,0).reset_index()
output
   latitude  longitude                   d1                   d2       ar
0        15       10.0  1981-12-01 00:00:00  1981-12-04 03:00:00  2.31768
1        15       10.1  1981-12-01 00:00:00  1981-12-01 22:00:00  4.46188
2        15       10.1  1981-12-02 08:00:00  1981-12-02 11:00:00  2.29360
3        15       10.2  1981-12-01 00:00:00  1981-12-01 21:00:00  4.37919
4        15       10.3  1981-12-01 00:00:00  1981-12-04 02:00:00  2.27856
5        15       10.4  1981-12-01 00:00:00  1981-12-02 17:00:00  1.38442
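The grouping key inside the inner groupby is a common run-detection idiom: flag the rows where contiguity with the previous row breaks, then cumsum the flags into group ids. A standalone toy illustration (not part of the answer above):
import pandas as pd

s = pd.Series([1, 2, 3, 7, 8, 20])
breaks = s.diff() != 1       # True at the start of each contiguous run
group_id = breaks.cumsum()   # group ids: 1, 1, 1, 2, 2, 3
print(group_id.tolist())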
I currently have the following dataframe (with seven days, one day displayed below). Hours run from 01:00 to 24:00. How do I convert the HourEnding column to datetime format and combine it with the date_time column (which is already in datetime format)?
HourEnding LMP date_time
0 01:00 165.27 2021-02-20
1 02:00 155.89 2021-02-20
2 03:00 154.50 2021-02-20
3 04:00 153.44 2021-02-20
4 05:00 210.15 2021-02-20
5 06:00 298.90 2021-02-20
6 07:00 152.71 2021-02-20
7 08:00 204.61 2021-02-20
8 09:00 155.77 2021-02-20
9 10:00 90.64 2021-02-20
10 11:00 57.17 2021-02-20
11 12:00 43.74 2021-02-20
12 13:00 33.42 2021-02-20
13 14:00 5.05 2021-02-20
14 15:00 1.43 2021-02-20
15 16:00 0.99 2021-02-20
16 17:00 0.94 2021-02-20
17 18:00 12.13 2021-02-20
18 19:00 18.90 2021-02-20
19 20:00 19.04 2021-02-20
20 21:00 16.42 2021-02-20
21 22:00 14.47 2021-02-20
22 23:00 44.55 2021-02-20
23 24:00 40.51 2021-02-20
So far I've tried
df['time'] = pd.to_datetime(df['HourEnding'])
but that seems to fail because of the 24:00.
Similarly
df['time'] = pd.to_timedelta('HourEnding', 'h', errors = 'coerce')
yields a column of NaTs.
As you mentioned in the comments, hour 24 corresponds to midnight of the same day. I would simply start by replacing "24:00" with "00:00":
df['HourEnding'] = df.HourEnding.str.replace('24:00', '00:00')
Then, convert date_time to string:
df['date_time'] = df.date_time.astype(str)
Create a new column that concatenates date_time and HourEnding:
df['date_and_hour'] = df.date_time + " " + df.HourEnding
df['date_and_hour'] = pd.to_datetime(df.date_and_hour)
Which gives you this :
>>> df
HourEnding LMP date_time date_and_hour
0 01:00 165.27 2021-02-20 2021-02-20 01:00:00
1 02:00 155.89 2021-02-20 2021-02-20 02:00:00
2 03:00 154.50 2021-02-20 2021-02-20 03:00:00
3 04:00 153.44 2021-02-20 2021-02-20 04:00:00
4 05:00 210.15 2021-02-20 2021-02-20 05:00:00
5 06:00 298.90 2021-02-20 2021-02-20 06:00:00
6 07:00 152.71 2021-02-20 2021-02-20 07:00:00
7 08:00 204.61 2021-02-20 2021-02-20 08:00:00
8 09:00 155.77 2021-02-20 2021-02-20 09:00:00
9 10:00 90.64 2021-02-20 2021-02-20 10:00:00
10 11:00 57.17 2021-02-20 2021-02-20 11:00:00
11 12:00 43.74 2021-02-20 2021-02-20 12:00:00
12 13:00 33.42 2021-02-20 2021-02-20 13:00:00
13 14:00 5.05 2021-02-20 2021-02-20 14:00:00
14 15:00 1.43 2021-02-20 2021-02-20 15:00:00
15 16:00 0.99 2021-02-20 2021-02-20 16:00:00
16 17:00 0.94 2021-02-20 2021-02-20 17:00:00
17 18:00 12.13 2021-02-20 2021-02-20 18:00:00
18 19:00 18.90 2021-02-20 2021-02-20 19:00:00
19 20:00 19.04 2021-02-20 2021-02-20 20:00:00
20 21:00 16.42 2021-02-20 2021-02-20 21:00:00
21 22:00 14.47 2021-02-20 2021-02-20 22:00:00
22 23:00 44.55 2021-02-20 2021-02-20 23:00:00
23 00:00 40.51 2021-02-20 2021-02-20 00:00:00
>>> df.dtypes
HourEnding object
LMP float64
date_time object
date_and_hour datetime64[ns]
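The same steps can also be condensed into a single assignment (a compact sketch of the approach above, assuming the same column names):
df['date_and_hour'] = pd.to_datetime(
    df['date_time'].astype(str) + ' ' + df['HourEnding'].str.replace('24:00', '00:00')
)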
Convert both columns to strings, then join them into a new 'datetime' column, and finally convert the 'datetime' column to datetime.
EDIT: To deal with the 1-24 hour problem, build a function to split the string, subtract 1 from the hour, and then re-join:
def subtract_hour(t):
t = t.split(':')
t[0] = str(int(t[0]) - 1)
if len(t[0]) < 2:
t[0] = '0' + t[0]
return ':'.join(t)
Then you can apply this to your hour column (e.g., df['hour'] = df['hour'].apply(subtract_hour)) and proceed with joining columns and then parsing using pd.to_datetime.
EDIT 2: You just want to change '24' to '00', my bad.
def mod_midnight(t):
t = t.split(':')
if t[0] == '24':
t[0] = '00'
return ':'.join(t)
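Applied the same way as before (a usage sketch, assuming HourEnding holds HH:MM strings and date_time is the date column):
df['HourEnding'] = df['HourEnding'].apply(mod_midnight)
df['datetime'] = pd.to_datetime(df['date_time'].astype(str) + ' ' + df['HourEnding'])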
I have a df
Time Samstag Sonntag Werktag
00:15:00 95.3 87.8 94.7
00:30:00 95.5 88.3 94.1
00:45:00 96.2 89.0 94.1
01:00:00 97.4 90.1 95.0
01:15:00 98.9 91.3 96.6
01:30:00 100.3 92.4 98.4
01:45:00 101.0 92.9 99.8
02:00:00 100.4 92.5 99.8
02:15:00 98.2 91.0 98.0
02:30:00 95.1 88.7 95.1
02:45:00 91.9 86.4 91.9
03:00:00 89.5 84.7 89.5
03:15:00 88.6 84.0 88.4
03:30:00 88.6 84.0 88.3
03:45:00 88.7 84.0 88.3
04:00:00 88.3 83.5 87.7
04:15:00 86.8 82.1 86.1
04:30:00 85.1 80.6 84.3
04:45:00 84.2 80.1 83.5
05:00:00 85.3 81.6 84.7
05:15:00 89.0 85.9 88.5
05:30:00 94.1 91.6 94.0
05:45:00 99.3 97.0 99.5
06:00:00 102.8 100.4 103.4
06:15:00 103.7 100.7 104.7
06:30:00 102.6 98.8 104.0
06:45:00 100.7 96.2 102.4
07:00:00 99.2 94.3 101.0
07:15:00 99.1 94.4 100.8
07:30:00 100.8 95.7 102.1
07:45:00 104.4 97.6 105.3
08:00:00 110.1 99.2 110.7
08:15:00 117.7 99.7 118.2
08:30:00 126.1 99.6 126.7
08:45:00 133.9 99.2 134.7
09:00:00 139.7 99.2 140.9
09:15:00 142.4 99.8 144.2
09:30:00 142.9 100.9 145.4
09:45:00 142.4 102.1 145.5
10:00:00 142.1 102.8 145.8
10:15:00 142.9 102.9 147.0
10:30:00 144.5 102.5 149.0
10:45:00 146.3 101.8 151.2
11:00:00 147.6 101.0 153.0
11:15:00 147.9 100.4 154.0
11:30:00 147.5 100.0 154.3
11:45:00 146.8 99.8 154.3
12:00:00 146.4 99.8 154.2
12:15:00 146.3 100.0 154.3
12:30:00 146.5 100.5 154.5
12:45:00 146.2 101.0 154.3
13:00:00 145.1 101.6 153.6
13:15:00 142.8 102.2 152.2
13:30:00 139.3 102.4 149.9
13:45:00 134.6 102.1 147.0
14:00:00 128.8 101.0 143.3
14:15:00 122.3 98.9 139.2
14:30:00 115.5 96.3 135.2
14:45:00 109.4 93.8 132.1
15:00:00 104.6 91.9 130.6
15:15:00 101.8 91.1 131.3
15:30:00 100.5 91.2 133.5
15:45:00 100.2 91.8 136.2
16:00:00 100.4 92.5 138.5
16:15:00 100.6 93.1 139.8
16:30:00 101.0 93.4 140.3
16:45:00 101.9 93.6 140.5
17:00:00 103.4 93.7 140.9
17:15:00 105.8 93.9 142.0
17:30:00 108.7 94.3 143.7
17:45:00 111.5 95.2 145.8
18:00:00 113.7 96.8 148.2
18:15:00 115.0 99.1 150.6
18:30:00 115.7 102.2 152.5
18:45:00 116.3 105.7 153.3
19:00:00 117.3 109.5 152.4
19:15:00 119.0 113.2 149.3
19:30:00 120.6 116.3 144.4
19:45:00 121.4 117.9 138.4
20:00:00 120.4 117.3 131.8
20:15:00 117.0 114.2 125.3
20:30:00 112.1 109.4 119.3
20:45:00 106.8 104.2 114.3
21:00:00 102.2 99.8 110.7
21:15:00 99.2 97.1 108.8
21:30:00 97.4 95.9 108.1
21:45:00 96.4 95.4 108.0
22:00:00 95.6 95.0 107.7
22:15:00 94.5 94.1 106.6
22:30:00 93.3 92.8 104.9
22:45:00 92.0 91.2 103.0
23:00:00 90.7 89.5 101.0
23:15:00 89.6 87.8 99.3
23:30:00 88.6 86.4 97.8
23:45:00 88.0 85.7 96.6
00:00:00 87.7 85.9 95.6
I did:
td = pd.to_timedelta(df['Time'].astype(str))
df1 = df.assign(Time=td.mask(td == pd.Timedelta(0),td + pd.Timedelta('1 days 00:00:00')), a=1)
df2 = pd.DataFrame({'dates': pd.date_range(
'01.01.2020', '31.12.2020'), 'a': 1})
df = df2.merge(df1, how='outer').drop('a', axis=1)
df['dates'] = df['dates'].add(df.pop('Time')).dt.strftime('%d.%m.%Y %H:%M')
df['dates'] = pd.to_datetime(df['dates'], dayfirst=True)
df['day'] = df['dates'].dt.day_name()
It gave the following output:
dates Samstag Sonntag Werktag day
2020-01-01 00:15:00 95.3 87.8 94.7 Wednesday
2020-01-01 00:30:00 95.5 88.3 94.1 Wednesday
2020-01-01 00:45:00 96.2 89.0 94.1 Wednesday
2020-01-01 01:00:00 97.4 90.1 95.0 Wednesday
2020-01-01 01:15:00 98.9 91.3 96.6 Wednesday
2020-01-01 01:30:00 100.3 92.4 98.4 Wednesday
2020-01-01 01:45:00 101.0 92.9 99.8 Wednesday
2020-01-01 02:00:00 100.4 92.5 99.8 Wednesday
2020-01-01 02:15:00 98.2 91.0 98.0 Wednesday
2020-01-01 02:30:00 95.1 88.7 95.1 Wednesday
2020-01-01 02:45:00 91.9 86.4 91.9 Wednesday
2020-01-01 03:00:00 89.5 84.7 89.5 Wednesday
2020-01-01 03:15:00 88.6 84.0 88.4 Wednesday
2020-01-01 03:30:00 88.6 84.0 88.3 Wednesday
2020-01-01 03:45:00 88.7 84.0 88.3 Wednesday
2020-01-01 04:00:00 88.3 83.5 87.7 Wednesday
2020-01-01 04:15:00 86.8 82.1 86.1 Wednesday
2020-01-01 04:30:00 85.1 80.6 84.3 Wednesday
2020-01-01 04:45:00 84.2 80.1 83.5 Wednesday
2020-01-01 05:00:00 85.3 81.6 84.7 Wednesday
2020-01-01 05:15:00 89.0 85.9 88.5 Wednesday
2020-01-01 05:30:00 94.1 91.6 94.0 Wednesday
2020-01-01 05:45:00 99.3 97.0 99.5 Wednesday
2020-01-01 06:00:00 102.8 100.4 103.4 Wednesday
2020-01-01 06:15:00 103.7 100.7 104.7 Wednesday
2020-01-01 06:30:00 102.6 98.8 104.0 Wednesday
2020-01-01 06:45:00 100.7 96.2 102.4 Wednesday
2020-01-01 07:00:00 99.2 94.3 101.0 Wednesday
2020-01-01 07:15:00 99.1 94.4 100.8 Wednesday
2020-01-01 07:30:00 100.8 95.7 102.1 Wednesday
2020-01-01 07:45:00 104.4 97.6 105.3 Wednesday
2020-01-01 08:00:00 110.1 99.2 110.7 Wednesday
2020-01-01 08:15:00 117.7 99.7 118.2 Wednesday
2020-01-01 08:30:00 126.1 99.6 126.7 Wednesday
2020-01-01 08:45:00 133.9 99.2 134.7 Wednesday
2020-01-01 09:00:00 139.7 99.2 140.9 Wednesday
2020-01-01 09:15:00 142.4 99.8 144.2 Wednesday
2020-01-01 09:30:00 142.9 100.9 145.4 Wednesday
2020-01-01 09:45:00 142.4 102.1 145.5 Wednesday
2020-01-01 10:00:00 142.1 102.8 145.8 Wednesday
2020-01-01 10:15:00 142.9 102.9 147.0 Wednesday
2020-01-01 10:30:00 144.5 102.5 149.0 Wednesday
2020-01-01 10:45:00 146.3 101.8 151.2 Wednesday
2020-01-01 11:00:00 147.6 101.0 153.0 Wednesday
2020-01-01 11:15:00 147.9 100.4 154.0 Wednesday
2020-01-01 11:30:00 147.5 100.0 154.3 Wednesday
2020-01-01 11:45:00 146.8 99.8 154.3 Wednesday
2020-01-01 12:00:00 146.4 99.8 154.2 Wednesday
2020-01-01 12:15:00 146.3 100.0 154.3 Wednesday
2020-01-01 12:30:00 146.5 100.5 154.5 Wednesday
2020-01-01 12:45:00 146.2 101.0 154.3 Wednesday
2020-01-01 13:00:00 145.1 101.6 153.6 Wednesday
2020-01-01 13:15:00 142.8 102.2 152.2 Wednesday
2020-01-01 13:30:00 139.3 102.4 149.9 Wednesday
2020-01-01 13:45:00 134.6 102.1 147.0 Wednesday
2020-01-01 14:00:00 128.8 101.0 143.3 Wednesday
2020-01-01 14:15:00 122.3 98.9 139.2 Wednesday
2020-01-01 14:30:00 115.5 96.3 135.2 Wednesday
2020-01-01 14:45:00 109.4 93.8 132.1 Wednesday
2020-01-01 15:00:00 104.6 91.9 130.6 Wednesday
2020-01-01 15:15:00 101.8 91.1 131.3 Wednesday
2020-01-01 15:30:00 100.5 91.2 133.5 Wednesday
2020-01-01 15:45:00 100.2 91.8 136.2 Wednesday
2020-01-01 16:00:00 100.4 92.5 138.5 Wednesday
2020-01-01 16:15:00 100.6 93.1 139.8 Wednesday
2020-01-01 16:30:00 101.0 93.4 140.3 Wednesday
2020-01-01 16:45:00 101.9 93.6 140.5 Wednesday
2020-01-01 17:00:00 103.4 93.7 140.9 Wednesday
2020-01-01 17:15:00 105.8 93.9 142.0 Wednesday
2020-01-01 17:30:00 108.7 94.3 143.7 Wednesday
2020-01-01 17:45:00 111.5 95.2 145.8 Wednesday
2020-01-01 18:00:00 113.7 96.8 148.2 Wednesday
2020-01-01 18:15:00 115.0 99.1 150.6 Wednesday
2020-01-01 18:30:00 115.7 102.2 152.5 Wednesday
2020-01-01 18:45:00 116.3 105.7 153.3 Wednesday
2020-01-01 19:00:00 117.3 109.5 152.4 Wednesday
2020-01-01 19:15:00 119.0 113.2 149.3 Wednesday
2020-01-01 19:30:00 120.6 116.3 144.4 Wednesday
2020-01-01 19:45:00 121.4 117.9 138.4 Wednesday
2020-01-01 20:00:00 120.4 117.3 131.8 Wednesday
2020-01-01 20:15:00 117.0 114.2 125.3 Wednesday
2020-01-01 20:30:00 112.1 109.4 119.3 Wednesday
2020-01-01 20:45:00 106.8 104.2 114.3 Wednesday
2020-01-01 21:00:00 102.2 99.8 110.7 Wednesday
2020-01-01 21:15:00 99.2 97.1 108.8 Wednesday
2020-01-01 21:30:00 97.4 95.9 108.1 Wednesday
2020-01-01 21:45:00 96.4 95.4 108.0 Wednesday
2020-01-01 22:00:00 95.6 95.0 107.7 Wednesday
2020-01-01 22:15:00 94.5 94.1 106.6 Wednesday
2020-01-01 22:30:00 93.3 92.8 104.9 Wednesday
2020-01-01 22:45:00 92.0 91.2 103.0 Wednesday
2020-01-01 23:00:00 90.7 89.5 101.0 Wednesday
2020-01-01 23:15:00 89.6 87.8 99.3 Wednesday
2020-01-01 23:30:00 88.6 86.4 97.8 Wednesday
2020-01-01 23:45:00 88.0 85.7 96.6 Wednesday
2020-01-02 00:00:00 87.7 85.9 95.6 Thursday
2020-01-02 00:15:00 95.3 87.8 94.7 Thursday
2020-01-02 00:30:00 95.5 88.3 94.1 Thursday
2020-01-02 00:45:00 96.2 89.0 94.1 Thursday
2020-01-02 01:00:00 97.4 90.1 95.0 Thursday
2020-01-02 01:15:00 98.9 91.3 96.6 Thursday
2020-01-02 01:30:00 100.3 92.4 98.4 Thursday
2020-01-02 01:45:00 101.0 92.9 99.8 Thursday
2020-01-02 02:00:00 100.4 92.5 99.8 Thursday
2020-01-02 02:15:00 98.2 91.0 98.0 Thursday
2020-01-02 02:30:00 95.1 88.7 95.1 Thursday
2020-01-02 02:45:00 91.9 86.4 91.9 Thursday
2020-01-02 03:00:00 89.5 84.7 89.5 Thursday
2020-01-02 03:15:00 88.6 84.0 88.4 Thursday
2020-01-02 03:30:00 88.6 84.0 88.3 Thursday
2020-01-02 03:45:00 88.7 84.0 88.3 Thursday
2020-01-02 04:00:00 88.3 83.5 87.7 Thursday
2020-01-02 04:15:00 86.8 82.1 86.1 Thursday
2020-01-02 04:30:00 85.1 80.6 84.3 Thursday
2020-01-02 04:45:00 84.2 80.1 83.5 Thursday
2020-01-02 05:00:00 85.3 81.6 84.7 Thursday
2020-01-02 05:15:00 89.0 85.9 88.5 Thursday
2020-01-02 05:30:00 94.1 91.6 94.0 Thursday
2020-01-02 05:45:00 99.3 97.0 99.5 Thursday
2020-01-02 06:00:00 102.8 100.4 103.4 Thursday
2020-01-02 06:15:00 103.7 100.7 104.7 Thursday
2020-01-02 06:30:00 102.6 98.8 104.0 Thursday
2020-01-02 06:45:00 100.7 96.2 102.4 Thursday
2020-01-02 07:00:00 99.2 94.3 101.0 Thursday
2020-01-02 07:15:00 99.1 94.4 100.8 Thursday
2020-01-02 07:30:00 100.8 95.7 102.1 Thursday
2020-01-02 07:45:00 104.4 97.6 105.3 Thursday
2020-01-02 08:00:00 110.1 99.2 110.7 Thursday
2020-01-02 08:15:00 117.7 99.7 118.2 Thursday
2020-01-02 08:30:00 126.1 99.6 126.7 Thursday
2020-01-02 08:45:00 133.9 99.2 134.7 Thursday
2020-01-02 09:00:00 139.7 99.2 140.9 Thursday
2020-01-02 09:15:00 142.4 99.8 144.2 Thursday
2020-01-02 09:30:00 142.9 100.9 145.4 Thursday
2020-01-02 09:45:00 142.4 102.1 145.5 Thursday
2020-01-02 10:00:00 142.1 102.8 145.8 Thursday
2020-01-02 10:15:00 142.9 102.9 147.0 Thursday
2020-01-02 10:30:00 144.5 102.5 149.0 Thursday
2020-01-02 10:45:00 146.3 101.8 151.2 Thursday
2020-01-02 11:00:00 147.6 101.0 153.0 Thursday
2020-01-02 11:15:00 147.9 100.4 154.0 Thursday
2020-01-02 11:30:00 147.5 100.0 154.3 Thursday
2020-01-02 11:45:00 146.8 99.8 154.3 Thursday
2020-01-02 12:00:00 146.4 99.8 154.2 Thursday
2020-01-02 12:15:00 146.3 100.0 154.3 Thursday
2020-01-02 12:30:00 146.5 100.5 154.5 Thursday
2020-01-02 12:45:00 146.2 101.0 154.3 Thursday
2020-01-02 13:00:00 145.1 101.6 153.6 Thursday
2020-01-02 13:15:00 142.8 102.2 152.2 Thursday
2020-01-02 13:30:00 139.3 102.4 149.9 Thursday
2020-01-02 13:45:00 134.6 102.1 147.0 Thursday
2020-01-02 14:00:00 128.8 101.0 143.3 Thursday
2020-01-02 14:15:00 122.3 98.9 139.2 Thursday
2020-01-02 14:30:00 115.5 96.3 135.2 Thursday
2020-01-02 14:45:00 109.4 93.8 132.1 Thursday
2020-01-02 15:00:00 104.6 91.9 130.6 Thursday
2020-01-02 15:15:00 101.8 91.1 131.3 Thursday
2020-01-02 15:30:00 100.5 91.2 133.5 Thursday
2020-01-02 15:45:00 100.2 91.8 136.2 Thursday
2020-01-02 16:00:00 100.4 92.5 138.5 Thursday
2020-01-02 16:15:00 100.6 93.1 139.8 Thursday
2020-01-02 16:30:00 101.0 93.4 140.3 Thursday
2020-01-02 16:45:00 101.9 93.6 140.5 Thursday
2020-01-02 17:00:00 103.4 93.7 140.9 Thursday
2020-01-02 17:15:00 105.8 93.9 142.0 Thursday
2020-01-02 17:30:00 108.7 94.3 143.7 Thursday
2020-01-02 17:45:00 111.5 95.2 145.8 Thursday
2020-01-02 18:00:00 113.7 96.8 148.2 Thursday
2020-01-02 18:15:00 115.0 99.1 150.6 Thursday
2020-01-02 18:30:00 115.7 102.2 152.5 Thursday
2020-01-02 18:45:00 116.3 105.7 153.3 Thursday
2020-01-02 19:00:00 117.3 109.5 152.4 Thursday
2020-01-02 19:15:00 119.0 113.2 149.3 Thursday
2020-01-02 19:30:00 120.6 116.3 144.4 Thursday
2020-01-02 19:45:00 121.4 117.9 138.4 Thursday
2020-01-02 20:00:00 120.4 117.3 131.8 Thursday
2020-01-02 20:15:00 117.0 114.2 125.3 Thursday
2020-01-02 20:30:00 112.1 109.4 119.3 Thursday
2020-01-02 20:45:00 106.8 104.2 114.3 Thursday
2020-01-02 21:00:00 102.2 99.8 110.7 Thursday
2020-01-02 21:15:00 99.2 97.1 108.8 Thursday
2020-01-02 21:30:00 97.4 95.9 108.1 Thursday
2020-01-02 21:45:00 96.4 95.4 108.0 Thursday
2020-01-02 22:00:00 95.6 95.0 107.7 Thursday
2020-01-02 22:15:00 94.5 94.1 106.6 Thursday
2020-01-02 22:30:00 93.3 92.8 104.9 Thursday
2020-01-02 22:45:00 92.0 91.2 103.0 Thursday
2020-01-02 23:00:00 90.7 89.5 101.0 Thursday
2020-01-02 23:15:00 89.6 87.8 99.3 Thursday
2020-01-02 23:30:00 88.6 86.4 97.8 Thursday
2020-01-02 23:45:00 88.0 85.7 96.6 Thursday
2020-01-03 00:00:00 87.7 85.9 95.6 Friday
2020-01-03 00:15:00 95.3 87.8 94.7 Friday
2020-01-03 00:30:00 95.5 88.3 94.1 Friday
2020-01-03 00:45:00 96.2 89.0 94.1 Friday
What I would like to do is to change the value of day at 2020-01-02 00:00:00 from Thursday to Wednesday, and similarly the value of day at 2020-01-03 00:00:00 from Friday to Thursday and so on.
In other words: the day value at 00:00:00 should match the previous day, and from 00:15:00 onwards a new day should begin.
Expected output
dates Samstag Sonntag Werktag day
2020-01-01 00:15:00 95.3 87.8 94.7 Wednesday
2020-01-01 00:30:00 95.5 88.3 94.1 Wednesday
2020-01-01 00:45:00 96.2 89.0 94.1 Wednesday
2020-01-01 01:00:00 97.4 90.1 95.0 Wednesday
2020-01-01 01:15:00 98.9 91.3 96.6 Wednesday
2020-01-01 01:30:00 100.3 92.4 98.4 Wednesday
2020-01-01 01:45:00 101.0 92.9 99.8 Wednesday
2020-01-01 02:00:00 100.4 92.5 99.8 Wednesday
2020-01-01 02:15:00 98.2 91.0 98.0 Wednesday
2020-01-01 02:30:00 95.1 88.7 95.1 Wednesday
2020-01-01 02:45:00 91.9 86.4 91.9 Wednesday
2020-01-01 03:00:00 89.5 84.7 89.5 Wednesday
2020-01-01 03:15:00 88.6 84.0 88.4 Wednesday
2020-01-01 03:30:00 88.6 84.0 88.3 Wednesday
2020-01-01 03:45:00 88.7 84.0 88.3 Wednesday
2020-01-01 04:00:00 88.3 83.5 87.7 Wednesday
2020-01-01 04:15:00 86.8 82.1 86.1 Wednesday
2020-01-01 04:30:00 85.1 80.6 84.3 Wednesday
2020-01-01 04:45:00 84.2 80.1 83.5 Wednesday
2020-01-01 05:00:00 85.3 81.6 84.7 Wednesday
2020-01-01 05:15:00 89.0 85.9 88.5 Wednesday
2020-01-01 05:30:00 94.1 91.6 94.0 Wednesday
2020-01-01 05:45:00 99.3 97.0 99.5 Wednesday
2020-01-01 06:00:00 102.8 100.4 103.4 Wednesday
2020-01-01 06:15:00 103.7 100.7 104.7 Wednesday
2020-01-01 06:30:00 102.6 98.8 104.0 Wednesday
2020-01-01 06:45:00 100.7 96.2 102.4 Wednesday
2020-01-01 07:00:00 99.2 94.3 101.0 Wednesday
2020-01-01 07:15:00 99.1 94.4 100.8 Wednesday
2020-01-01 07:30:00 100.8 95.7 102.1 Wednesday
2020-01-01 07:45:00 104.4 97.6 105.3 Wednesday
2020-01-01 08:00:00 110.1 99.2 110.7 Wednesday
2020-01-01 08:15:00 117.7 99.7 118.2 Wednesday
2020-01-01 08:30:00 126.1 99.6 126.7 Wednesday
2020-01-01 08:45:00 133.9 99.2 134.7 Wednesday
2020-01-01 09:00:00 139.7 99.2 140.9 Wednesday
2020-01-01 09:15:00 142.4 99.8 144.2 Wednesday
2020-01-01 09:30:00 142.9 100.9 145.4 Wednesday
2020-01-01 09:45:00 142.4 102.1 145.5 Wednesday
2020-01-01 10:00:00 142.1 102.8 145.8 Wednesday
2020-01-01 10:15:00 142.9 102.9 147.0 Wednesday
2020-01-01 10:30:00 144.5 102.5 149.0 Wednesday
2020-01-01 10:45:00 146.3 101.8 151.2 Wednesday
2020-01-01 11:00:00 147.6 101.0 153.0 Wednesday
2020-01-01 11:15:00 147.9 100.4 154.0 Wednesday
2020-01-01 11:30:00 147.5 100.0 154.3 Wednesday
2020-01-01 11:45:00 146.8 99.8 154.3 Wednesday
2020-01-01 12:00:00 146.4 99.8 154.2 Wednesday
2020-01-01 12:15:00 146.3 100.0 154.3 Wednesday
2020-01-01 12:30:00 146.5 100.5 154.5 Wednesday
2020-01-01 12:45:00 146.2 101.0 154.3 Wednesday
2020-01-01 13:00:00 145.1 101.6 153.6 Wednesday
2020-01-01 13:15:00 142.8 102.2 152.2 Wednesday
2020-01-01 13:30:00 139.3 102.4 149.9 Wednesday
2020-01-01 13:45:00 134.6 102.1 147.0 Wednesday
2020-01-01 14:00:00 128.8 101.0 143.3 Wednesday
2020-01-01 14:15:00 122.3 98.9 139.2 Wednesday
2020-01-01 14:30:00 115.5 96.3 135.2 Wednesday
2020-01-01 14:45:00 109.4 93.8 132.1 Wednesday
2020-01-01 15:00:00 104.6 91.9 130.6 Wednesday
2020-01-01 15:15:00 101.8 91.1 131.3 Wednesday
2020-01-01 15:30:00 100.5 91.2 133.5 Wednesday
2020-01-01 15:45:00 100.2 91.8 136.2 Wednesday
2020-01-01 16:00:00 100.4 92.5 138.5 Wednesday
2020-01-01 16:15:00 100.6 93.1 139.8 Wednesday
2020-01-01 16:30:00 101.0 93.4 140.3 Wednesday
2020-01-01 16:45:00 101.9 93.6 140.5 Wednesday
2020-01-01 17:00:00 103.4 93.7 140.9 Wednesday
2020-01-01 17:15:00 105.8 93.9 142.0 Wednesday
2020-01-01 17:30:00 108.7 94.3 143.7 Wednesday
2020-01-01 17:45:00 111.5 95.2 145.8 Wednesday
2020-01-01 18:00:00 113.7 96.8 148.2 Wednesday
2020-01-01 18:15:00 115.0 99.1 150.6 Wednesday
2020-01-01 18:30:00 115.7 102.2 152.5 Wednesday
2020-01-01 18:45:00 116.3 105.7 153.3 Wednesday
2020-01-01 19:00:00 117.3 109.5 152.4 Wednesday
2020-01-01 19:15:00 119.0 113.2 149.3 Wednesday
2020-01-01 19:30:00 120.6 116.3 144.4 Wednesday
2020-01-01 19:45:00 121.4 117.9 138.4 Wednesday
2020-01-01 20:00:00 120.4 117.3 131.8 Wednesday
2020-01-01 20:15:00 117.0 114.2 125.3 Wednesday
2020-01-01 20:30:00 112.1 109.4 119.3 Wednesday
2020-01-01 20:45:00 106.8 104.2 114.3 Wednesday
2020-01-01 21:00:00 102.2 99.8 110.7 Wednesday
2020-01-01 21:15:00 99.2 97.1 108.8 Wednesday
2020-01-01 21:30:00 97.4 95.9 108.1 Wednesday
2020-01-01 21:45:00 96.4 95.4 108.0 Wednesday
2020-01-01 22:00:00 95.6 95.0 107.7 Wednesday
2020-01-01 22:15:00 94.5 94.1 106.6 Wednesday
2020-01-01 22:30:00 93.3 92.8 104.9 Wednesday
2020-01-01 22:45:00 92.0 91.2 103.0 Wednesday
2020-01-01 23:00:00 90.7 89.5 101.0 Wednesday
2020-01-01 23:15:00 89.6 87.8 99.3 Wednesday
2020-01-01 23:30:00 88.6 86.4 97.8 Wednesday
2020-01-01 23:45:00 88.0 85.7 96.6 Wednesday
2020-01-02 00:00:00 87.7 85.9 95.6 Wednesday
2020-01-02 00:15:00 95.3 87.8 94.7 Thursday
2020-01-02 00:30:00 95.5 88.3 94.1 Thursday
2020-01-02 00:45:00 96.2 89.0 94.1 Thursday
2020-01-02 01:00:00 97.4 90.1 95.0 Thursday
2020-01-02 01:15:00 98.9 91.3 96.6 Thursday
2020-01-02 01:30:00 100.3 92.4 98.4 Thursday
2020-01-02 01:45:00 101.0 92.9 99.8 Thursday
2020-01-02 02:00:00 100.4 92.5 99.8 Thursday
2020-01-02 02:15:00 98.2 91.0 98.0 Thursday
2020-01-02 02:30:00 95.1 88.7 95.1 Thursday
2020-01-02 02:45:00 91.9 86.4 91.9 Thursday
2020-01-02 03:00:00 89.5 84.7 89.5 Thursday
2020-01-02 03:15:00 88.6 84.0 88.4 Thursday
2020-01-02 03:30:00 88.6 84.0 88.3 Thursday
2020-01-02 03:45:00 88.7 84.0 88.3 Thursday
2020-01-02 04:00:00 88.3 83.5 87.7 Thursday
2020-01-02 04:15:00 86.8 82.1 86.1 Thursday
2020-01-02 04:30:00 85.1 80.6 84.3 Thursday
2020-01-02 04:45:00 84.2 80.1 83.5 Thursday
2020-01-02 05:00:00 85.3 81.6 84.7 Thursday
2020-01-02 05:15:00 89.0 85.9 88.5 Thursday
2020-01-02 05:30:00 94.1 91.6 94.0 Thursday
2020-01-02 05:45:00 99.3 97.0 99.5 Thursday
2020-01-02 06:00:00 102.8 100.4 103.4 Thursday
2020-01-02 06:15:00 103.7 100.7 104.7 Thursday
2020-01-02 06:30:00 102.6 98.8 104.0 Thursday
2020-01-02 06:45:00 100.7 96.2 102.4 Thursday
2020-01-02 07:00:00 99.2 94.3 101.0 Thursday
2020-01-02 07:15:00 99.1 94.4 100.8 Thursday
2020-01-02 07:30:00 100.8 95.7 102.1 Thursday
2020-01-02 07:45:00 104.4 97.6 105.3 Thursday
2020-01-02 08:00:00 110.1 99.2 110.7 Thursday
2020-01-02 08:15:00 117.7 99.7 118.2 Thursday
2020-01-02 08:30:00 126.1 99.6 126.7 Thursday
2020-01-02 08:45:00 133.9 99.2 134.7 Thursday
2020-01-02 09:00:00 139.7 99.2 140.9 Thursday
2020-01-02 09:15:00 142.4 99.8 144.2 Thursday
2020-01-02 09:30:00 142.9 100.9 145.4 Thursday
2020-01-02 09:45:00 142.4 102.1 145.5 Thursday
2020-01-02 10:00:00 142.1 102.8 145.8 Thursday
2020-01-02 10:15:00 142.9 102.9 147.0 Thursday
2020-01-02 10:30:00 144.5 102.5 149.0 Thursday
2020-01-02 10:45:00 146.3 101.8 151.2 Thursday
2020-01-02 11:00:00 147.6 101.0 153.0 Thursday
2020-01-02 11:15:00 147.9 100.4 154.0 Thursday
2020-01-02 11:30:00 147.5 100.0 154.3 Thursday
2020-01-02 11:45:00 146.8 99.8 154.3 Thursday
2020-01-02 12:00:00 146.4 99.8 154.2 Thursday
2020-01-02 12:15:00 146.3 100.0 154.3 Thursday
2020-01-02 12:30:00 146.5 100.5 154.5 Thursday
2020-01-02 12:45:00 146.2 101.0 154.3 Thursday
2020-01-02 13:00:00 145.1 101.6 153.6 Thursday
2020-01-02 13:15:00 142.8 102.2 152.2 Thursday
2020-01-02 13:30:00 139.3 102.4 149.9 Thursday
2020-01-02 13:45:00 134.6 102.1 147.0 Thursday
2020-01-02 14:00:00 128.8 101.0 143.3 Thursday
2020-01-02 14:15:00 122.3 98.9 139.2 Thursday
2020-01-02 14:30:00 115.5 96.3 135.2 Thursday
2020-01-02 14:45:00 109.4 93.8 132.1 Thursday
2020-01-02 15:00:00 104.6 91.9 130.6 Thursday
2020-01-02 15:15:00 101.8 91.1 131.3 Thursday
2020-01-02 15:30:00 100.5 91.2 133.5 Thursday
2020-01-02 15:45:00 100.2 91.8 136.2 Thursday
2020-01-02 16:00:00 100.4 92.5 138.5 Thursday
2020-01-02 16:15:00 100.6 93.1 139.8 Thursday
2020-01-02 16:30:00 101.0 93.4 140.3 Thursday
2020-01-02 16:45:00 101.9 93.6 140.5 Thursday
2020-01-02 17:00:00 103.4 93.7 140.9 Thursday
2020-01-02 17:15:00 105.8 93.9 142.0 Thursday
2020-01-02 17:30:00 108.7 94.3 143.7 Thursday
2020-01-02 17:45:00 111.5 95.2 145.8 Thursday
2020-01-02 18:00:00 113.7 96.8 148.2 Thursday
2020-01-02 18:15:00 115.0 99.1 150.6 Thursday
2020-01-02 18:30:00 115.7 102.2 152.5 Thursday
2020-01-02 18:45:00 116.3 105.7 153.3 Thursday
2020-01-02 19:00:00 117.3 109.5 152.4 Thursday
2020-01-02 19:15:00 119.0 113.2 149.3 Thursday
2020-01-02 19:30:00 120.6 116.3 144.4 Thursday
2020-01-02 19:45:00 121.4 117.9 138.4 Thursday
2020-01-02 20:00:00 120.4 117.3 131.8 Thursday
2020-01-02 20:15:00 117.0 114.2 125.3 Thursday
2020-01-02 20:30:00 112.1 109.4 119.3 Thursday
2020-01-02 20:45:00 106.8 104.2 114.3 Thursday
2020-01-02 21:00:00 102.2 99.8 110.7 Thursday
2020-01-02 21:15:00 99.2 97.1 108.8 Thursday
2020-01-02 21:30:00 97.4 95.9 108.1 Thursday
2020-01-02 21:45:00 96.4 95.4 108.0 Thursday
2020-01-02 22:00:00 95.6 95.0 107.7 Thursday
2020-01-02 22:15:00 94.5 94.1 106.6 Thursday
2020-01-02 22:30:00 93.3 92.8 104.9 Thursday
2020-01-02 22:45:00 92.0 91.2 103.0 Thursday
2020-01-02 23:00:00 90.7 89.5 101.0 Thursday
2020-01-02 23:15:00 89.6 87.8 99.3 Thursday
2020-01-02 23:30:00 88.6 86.4 97.8 Thursday
2020-01-02 23:45:00 88.0 85.7 96.6 Thursday
2020-01-03 00:00:00 87.7 85.9 95.6 Thursday
2020-01-03 00:15:00 95.3 87.8 94.7 Friday
2020-01-03 00:30:00 95.5 88.3 94.1 Friday
2020-01-03 00:45:00 96.2 89.0 94.1 Friday
How can this be done?
Edit 1
import pandas as pd
df = pd.DataFrame({ 'dates': ['2020-01-01 22:15:00',
'2020-01-01 22:35:00',
'2020-01-01 22:45:00',
'2020-01-01 23:00:00',
'2020-01-01 23:15:00',
'2020-01-01 23:30:00',
'2020-01-01 23:45:00',
'2020-01-02 00:00:00',
'2020-01-02 22:15:00',
'2020-01-02 22:35:00',
'2020-01-02 22:45:00',
'2020-01-02 23:00:00',
'2020-01-02 23:15:00',
'2020-01-02 23:30:00',
'2020-01-02 23:45:00',
'2020-01-03 00:00:00'],
'expected_output':['Wednesday',
'Wednesday',
'Wednesday',
'Wednesday',
'Wednesday',
'Wednesday',
'Wednesday',
'Wednesday',
'Thursday',
'Thursday',
'Thursday',
'Thursday',
'Thursday','Thursday','Thursday','Thursday']})
Just check the hour and minute of each Timestamp using apply.
# df = pd.DataFrame({'dates': ['2020-01-01 22:15:00', .....]}, )
# convert str date into Timestamp
df['dates'] = pd.to_datetime(df['dates'])
def calculate_day(x):
# get previous day
if x.hour == 0 and x.minute < 15:
return (x - pd.DateOffset(days=1)).day_name()
return x.day_name()
df['day'] = df['dates'].apply(calculate_day)
print(df)
# dates day
#0 2020-01-01 22:15:00 Wednesday
#...
FYI: weekday_name is deprecated. Use day_name().
Hope this helps.
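For larger frames, a vectorized alternative to apply (a sketch, assuming the timestamps sit on the 15-minute grid shown above) shifts every timestamp back by 15 minutes before taking the day name, so 00:00 belongs to the previous day and 00:15 starts the new one:
df['day'] = (df['dates'] - pd.Timedelta(minutes=15)).dt.day_name()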