Convert Unix timestamp value to human readable in Python

I have a Unix timestamp whose value is 1502878840. This Unix timestamp can be converted to a human-readable form such as Aug 16, 2017 10:20:40.
I have the 2 following pieces of Python code to convert 1502878840 to Aug 16, 2017 10:20:40. Both of them give the same result (Aug 16, 2017 10:20:40).
First method
from datetime import datetime
utc = datetime.fromtimestamp(1502878840)
Second method
from datetime import datetime, timedelta
utc = datetime(1970, 1, 1) + timedelta(seconds=1502878840)
Could anyone answer the 2 following questions?
1. The results of the 2 methods are the same. But from the point of view of the Python logic, is there any case that may cause a difference in the result?
I ask this question because I see that most Python code uses the first method.
2. As I read here, Unix time will have a problem on 19 January 2038 03:14:08 GMT.
I ran a timestamp whose date is after 19 Jan 2038 (2148632440 - Feb 01, 2038 10:20:40). The results are as follows:
First method: ValueError: timestamp out of range for platform time_t
Second method: 2038-02-01 10:20:40
Question is: Can I use the second method to overcome the "Year 2038 problem"?

Quoting the documentation:
fromtimestamp() may raise OverflowError, if the timestamp is out of the range of values supported by the platform C localtime() or gmtime() functions, and OSError on localtime() or gmtime() failure. It’s common for this to be restricted to years in 1970 through 2038. Note that on non-POSIX systems that include leap seconds in their notion of a timestamp, leap seconds are ignored by fromtimestamp(), and then it’s possible to have two timestamps differing by a second that yield identical datetime objects. See also utcfromtimestamp().
The second solution solves your problem:
utc = datetime(1970, 1, 1) + timedelta(seconds=1502878840)
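As a quick check, here is a minimal sketch (my own addition, not part of the quoted answer) that applies the epoch-plus-timedelta approach to the post-2038 value from the question. Note that fromtimestamp() converts to local time, while the timedelta arithmetic below yields a naive UTC datetime, so the two methods can differ by your UTC offset:
from datetime import datetime, timedelta

EPOCH = datetime(1970, 1, 1)  # Unix epoch as a naive UTC datetime

def from_unix(seconds):
    # Pure datetime arithmetic: not limited by the platform's time_t range
    return EPOCH + timedelta(seconds=seconds)

print(from_unix(1502878840))  # 2017-08-16 10:20:40 (UTC)
print(from_unix(2148632440))  # 2038-02-01 10:20:40, past the 32-bit rollover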

Related

What is the format for UNIX timestamp of '253402128000000'?

I'm trying to convert a whole column of timestamp values in UNIX format, but I get some values that don't look like a normal timestamp: 253402128000000
From what I know, a timestamp should look like: 1495245009655
I've tried milliseconds, nanoseconds and other configurations for Pandas to_datetime, but I haven't been able to find a setting that converts the format.
EDIT
My data looks like the following, and the ValidEndDateTime seems way off.
"ValidStartDateTime": "/Date(1495245009655)/",
"ValidEndDateTime": "/Date(253402128000000)/",
SOLUTION
I've accepted the answer below because I can see the date is a "never-end" date: all the values in my dataset that can't be converted are set to the same value, 253402128000000.
Thank you for the answers!
From a comment of yours:
The data I get looks like this: "ValidStartDateTime": "/Date(1495245009655)/", "ValidEndDateTime": "/Date(253402128000000)/",
The numbers appear to be UNIX timestamps in milliseconds and the big "End" one seems to mean "never end", note the special date:
1495245009655 = Sat May 20 2017 01:50:09
253402128000000 = Thu Dec 30 9999 00:00:00
Converted with https://currentmillis.com/
If the value were microseconds (1/1,000,000 second) since the epoch, dividing 253402128000000 by 1,000,000 gives 253402128 seconds, which is Wed Jan 11 1978 21:28:48 GMT+0000, i.e. approximately 44 years ago.
I used this website as reference: https://www.unixtimestamp.com/
Use pd.to_datetime:
>>> import pandas as pd
>>> pd.to_datetime(1495245009655, unit='ms')
Timestamp('2017-05-20 01:50:09.655000')
>>> pd.to_datetime(253402128000000 / 100, unit='ms')
Timestamp('2050-04-19 22:48:00')
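Building on that, here is a small sketch (my own, not from the accepted answer) showing how the "/Date(...)/" strings from the question could be parsed with pandas, treating 253402128000000 as a "never ends" sentinel, since pandas cannot represent year 9999 anyway:
import pandas as pd

NEVER_ENDS = 253402128000000  # sentinel: 9999-12-30 in milliseconds

df = pd.DataFrame({"ValidEndDateTime": ["/Date(1495245009655)/", "/Date(253402128000000)/"]})

# Pull the millisecond value out of the "/Date(...)/" wrapper
ms = df["ValidEndDateTime"].str.extract(r"\((\d+)\)", expand=False).astype("int64")

# Convert normal values; the sentinel becomes NaT
df["ValidEndDateTime"] = pd.to_datetime(ms.where(ms != NEVER_ENDS), unit="ms")
print(df)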

Python Arrow Not Formatting Date Properly

I am using the Python Arrow package for formatting a date that I've got.
My input is:
datetime=2017-10-01T00:10:00Z and timezone=US/Pacific
My expected output is:
Sun 10/01 # 6:10pm
I've tried a host of different date time conversions but I always end up with Sun 10/01 # 12:10AM which is not time zone dependent.
If I try and say:
x = arrow.Arrow.strptime('2017-10-01T00:10:00Z',
                         '%Y-%m-%dT%H:%M:%SZ',
                         tzinfo=tz.gettz('US/Pacific'))
x is equal to:
<Arrow [2017-10-01T00:10:00-07:00]>
I then say:
x.strftime('%a %m/%d # %I:%M%p')
and it outputs
Sun 10/01 # 12:10AM
The Arrow object knows about the timezone as evidenced by the -7:00 but does not format the date accordingly.
Any ideas?
I think that there are a couple of misunderstandings in this question.
Convert to a timezone
I can see no way that the timestamp
2017-10-01T00:10:00Z and timezone=US/Pacific
can become
Sun 10/01 # 6:10pm
There are several problems here.
The Z at the end of the timestamp is a timezone and means Zulu aka GMT, so the timestamp already has a timezone.
If we ignore problem #1, then for that timestamp 10 minutes after midnight (minus the Z) to become 6:10 pm the same day would require a timezone that was +18. This timezone does not exist.
US/Pacific is -7/-8 depending on the time of the year. If we accept the Z as the timezone and want to convert to US/Pacific, then the time should be 9/30 at 5:10pm.
What does -7:00 mean?
So I am going to guess that what you intend is that the timestamp is in fact Zulu, and you want to display that timestamp as US/Pacific. If this is true then you need to do:
from dateutil import tz
import arrow
x = arrow.Arrow.strptime(
    '2017-10-01T00:10:00Z',
    '%Y-%m-%dT%H:%M:%SZ').to(tz.gettz('US/Pacific'))
print(x)
print(x.strftime('%a %m/%d # %I:%M%p'))
This results in:
2017-09-30T17:10:00-07:00
Sat 09/30 # 05:10PM
You will note, as you observed earlier, that these show the same wall-clock time. The difference is that the first also carries -07:00. This does not indicate, as your question implied, that 7 hours still need to be removed to show the time in that timezone; it indicates that the timestamp has already had 7 hours removed from Zulu.
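For comparison, here is a minimal sketch (my addition, not from the answer) that does the same conversion with only the standard library's zoneinfo, which may help confirm the behaviour:
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Parse as UTC (the trailing Z), then convert to US/Pacific for display
utc_dt = datetime.strptime('2017-10-01T00:10:00Z', '%Y-%m-%dT%H:%M:%SZ').replace(tzinfo=timezone.utc)
pacific_dt = utc_dt.astimezone(ZoneInfo('US/Pacific'))
print(pacific_dt.strftime('%a %m/%d # %I:%M%p'))  # Sat 09/30 # 05:10PM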

Improving the result of conversion of datenum to datetime

I have to convert a MATLAB datenum to a Python datetime (e.g. 2010-11-04 00:03:50.209589).
The datenum is represented in milliseconds, and the date must fall between 2010-11-04 00:00:00 and 2011-06-11 00:00:00.
My code is as follows:
import datetime
matlab_datenum = 6.365057116950260162e+10
python_datetime = datetime.datetime.fromtimestamp(matlab_datenum / 1e3)
print(python_datetime)
The result is: 1972-01-07 16:42:51.169503
The result is wrong because the date must be between 2010-11-04 and 2011-06-11.
Do you have any idea how to correct the result?
Thank you for your help
The datenum page in the Matlab documentation states:
The datenum function creates a numeric array that represents each point in time as the number of days from January 0, 0000.
Python's datetime module page states the following for fromtimestamp:
Return the local date corresponding to the POSIX timestamp
which is 00:00:00 1 January 1970
The two functions are counting from different start points and using different units (days and seconds), hence the discrepancy between your two dates.
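For reference, the usual datenum-to-datetime conversion looks roughly like the sketch below (my own, assuming the value is a plain datenum in days; if your number really is in milliseconds or seconds, divide by 86400000 or 86400 first to get days):
from datetime import datetime, timedelta

def datenum_to_datetime(datenum):
    # MATLAB counts days from 'Jan 0, 0000'; Python ordinals start at Jan 1 of year 1,
    # so subtract the 366 days of MATLAB's year 0.
    whole_days = int(datenum)
    frac_days = datenum - whole_days
    return datetime.fromordinal(whole_days) + timedelta(days=frac_days) - timedelta(days=366)

print(datenum_to_datetime(734446.5))  # 2010-11-04 12:00:00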

epoch conversion in python

While processing logfiles, I need to compare epoc and human-readable timestamps.
import time
epoc = time.strftime("%d.%m.%Y %H:%M - %Z", time.localtime(1358252743927))
print epoc
and
t1=time.gmtime(1358252743927)
print t1
both return something like
26.04.45011 22:52 - CEST
Whereas converting 1358252743927 using this site returns
GMT: Tue, 15 Jan 2013 12:25:43 GMT
Your time zone: 1/15/2013 1:25:43 PM GMT+1
Which is the correct time - but somehow python can't handle this timestamp.
Does anyone have an idea how to convert the timestamp to get the latter result?
It looks like the timestamp you have there includes milliseconds, which the gmtime function cannot handle. The site you mentioned can: if you remove the last three digits of that huge number, the site will still give you the same result, because it does not believe that you really want the year 45011.
So just divide the number by 1000 before passing it (if you are sure you always get that high a resolution), and you are fine:
t1 = time.gmtime(1358252743.927)
print t1
gives:
time.struct_time(tm_year=2013, tm_mon=1, tm_mday=15, tm_hour=12, tm_min=25, tm_sec=43, tm_wday=1, tm_yday=15, tm_isdst=0)
which seems fine.
Python handles epoch time in seconds, and I suspect, looking at how big your timestamp is, that it is in milliseconds instead.
If you drop the last 3 digits, you will get the expected time:
>>> value = 1358252743927
>>> import time
>>> time.strftime("%d.%m.%Y %H:%M - %Z", time.localtime(value / 1000))
'15.01.2013 13:25 - CET'
(minus the timezone issues known with Python).
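If the logs mix second- and millisecond-resolution epochs, a small helper along these lines (my own sketch, using a simple magnitude heuristic) can normalise them before formatting:
import time

def epoch_to_string(value, fmt="%d.%m.%Y %H:%M:%S - %Z"):
    # Heuristic: epoch values this large are almost certainly in milliseconds
    if value > 1e11:
        value = value / 1000.0
    return time.strftime(fmt, time.localtime(value))

print(epoch_to_string(1358252743927))  # '15.01.2013 13:25:43 - CET' in a CET locale
print(epoch_to_string(1358252743))     # same instant, given in seconds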

How to convert the integer date format into YYYYMMDD?

Python and Matlab quite often have integer date representations as follows:
733828.0
733829.0
733832.0
733833.0
733834.0
733835.0
733836.0
733839.0
733840.0
733841.0
These numbers correspond to some dates this year. Do you guys know which function can convert them back to YYYYMMDD format?
Thanks a million!
The datetime.datetime class can help you here. The following works if those values are treated as integer days (you don't specify what they are).
>>> from datetime import datetime
>>> dt = datetime.fromordinal(733828)
>>> dt
datetime.datetime(2010, 2, 25, 0, 0)
>>> dt.strftime('%Y%m%d')
'20100225'
You show the values as floats, and the above doesn't take floats. If you can give more detail about what the data is (and where it comes from) it will be possible to give a more complete answer.
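Since the question's values are floats, here is a tiny sketch (my addition) that coerces them to integers first, still treating them as Python ordinals:
from datetime import datetime

values = [733828.0, 733829.0, 733832.0]
print([datetime.fromordinal(int(v)).strftime('%Y%m%d') for v in values])
# ['20100225', '20100226', '20100301']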
Since Python example was already demonstrated, here is the matlab one:
>> datestr(733828, 'yyyymmdd')
ans =
20090224
Also, note that while they look similar, these are actually different things in Matlab and Python:
Matlab
A serial date number represents the whole and fractional number of days
from a specific date and time, where datenum('Jan-1-0000 00:00:00') returns
the number 1. (The year 0000 is merely a reference point and is not intended
to be interpreted as a real year in time.)
Python, datetime.date.fromordinal
Return the date corresponding to the proleptic Gregorian ordinal, where January 1 of year 1 has ordinal 1.
So they would differ by 366 days, which is apparently the length of the year 0.
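To line the two systems up, that 366-day offset can be applied directly; a short sketch (my addition) of converting a Matlab serial date number in Python:
from datetime import datetime

def matlab_datenum_to_yyyymmdd(datenum):
    # Shift by the 366 days of Matlab's year 0 before using Python ordinals
    # (the fractional part of the day is ignored here)
    return datetime.fromordinal(int(datenum) - 366).strftime('%Y%m%d')

print(matlab_datenum_to_yyyymmdd(733828.0))  # '20090224', matching the Matlab datestr output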
Dates like 733828.0 are Rata Die dates, counted from January 1, 1 A.D. (with a decimal fraction of days). They may be in UTC or in your local timezone.
Julian Dates, used mostly by astronomers, count the days (and decimal fraction of days) since January 1, 4713 BC, Greenwich noon. The Julian date is frequently confused with the ordinal date, which is the day count from January 1 of the current year (Feb 2 = ordinal day 33).
So datetime is calling these things ordinal dates, but I think this naming only makes sense locally, in the world of Python.
Is 733828.0 a timestamp? If so, you can do the following:
import datetime as dt
dt.date.fromtimestamp(733828.0).strftime('%Y%m%d')
PS
I think Peter Hansen is right :)
I am not a native English speaker. Just trying to help. I don't quite know the difference between a timestamp and an ordinal :(
