Most reliable way to generate a timestamp with Python

I am wondering what the most reliable way to generate a timestamp is using Python. I want this value to be put into a MySQL database, and for other programming languages and programs to be able to parse this information and use it.
I imagine it is either datetime, or the time module, but I can't figure out which I'd use in this circumstance, nor the method.

import datetime
print(datetime.datetime.now().strftime("%Y-%m-%d-%H%M"))
This returns a string in the format you want; customize it by looking at the strftime() format codes. The format above, for example, is the one I used for a log filename.

For a database, your best bet is to store it in the database-native format, assuming its precision matches your needs. For a SQL database, the DATETIME type is appropriate.
EDIT: Or TIMESTAMP.

If it's just a simple timestamp that needs to be read by multiple programs, but which doesn't need to "mean" anything in SQL, and you don't care about different timezones for different users or anything like that, then seconds since the Unix epoch (the start of 1970) is a simple, common standard, and is returned by time.time().
Python actually returns a float (at least on Linux), but if you only need accuracy to the second, store it as an integer.
If you want something more meaningful in SQL, then use a SQL type like DATETIME or TIMESTAMP. That lets you do more "meaningful" queries (like querying for a particular day) more easily (you can do them with seconds since the epoch too, but it requires messing around with conversions), but it also gets more complicated with timezones and converting between formats in different languages.
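To illustrate the epoch-seconds approach described above, a minimal standard-library sketch:

```python
import time
from datetime import datetime, timezone

# Seconds since the Unix epoch, as a float (sub-second precision).
now = time.time()

# If second-level accuracy is enough, store it as an integer.
ts = int(now)

# Any language can convert epoch seconds back to a calendar date;
# in Python, interpret it explicitly as UTC to avoid ambiguity.
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(ts, dt.isoformat())
```

Because the stored value is just an integer, MySQL, JavaScript, and anything else can parse it without caring about string formats.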


Specify a default rendering method for a certain type in Jinja2

In Jinja2, how would you specify a default rendering method for a certain type?
In particular, datetime?
I found it quite annoying when rendering datetime values from Django. They look like 2022-11-04T00:00:00.987654+00:00. What is that T for, and why is there a plus sign followed by 00:00? My users, who have lived on small islands their entire lives, wouldn't understand.
Aside from the formatting problem, Django gives me UTC time objects. Always UTC, even though TIME_ZONE in its settings module has been set to a different value.
I know I can use a filter like me.time_of_death|format_datetime. However, putting it after every single datetime field sounds insane to me, and I don't want to be woken up at midnight because a datetime without that filter was released the previous day.
Is it possible to make it default?
You can use dateparse:
from django.utils import dateparse
Then, before you pass the time to the template, you can use the following to convert it into something more understandable to your fellow islanders:
readable_time = dateparse.parse_datetime(CONFUSING_TIME_STRING)
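To actually make a format the default rather than repeating a filter everywhere, one option (a sketch, not from the answer above) is to write the formatting helper once and attach it to the Jinja2 environment, either as a named filter or via the environment's finalize hook, which is called on every output expression. The helper itself needs only the standard library:

```python
from datetime import datetime

def format_datetime(value):
    """Render datetimes in a friendly format; pass other values through untouched."""
    if isinstance(value, datetime):
        return value.strftime("%Y-%m-%d %H:%M")
    return value

# With Jinja2 installed, this could be applied everywhere via the
# environment's finalize hook (hypothetical wiring, not tested here):
#   env = jinja2.Environment(finalize=format_datetime)
# or registered once as a named filter:
#   env.filters["format_datetime"] = format_datetime

print(format_datetime(datetime(2022, 11, 4, 0, 0)))  # 2022-11-04 00:00
```

The finalize route makes the formatting the default for every rendered value, so a forgotten filter can't slip through.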

How to dynamically store and execute equations and functions

During my current project, I have been receiving data from a set of long-range sensors, which send data as a series of bytes. Because there are multiple types of sensors, the byte structures and the data they contain differ, hence the need to make the functionality more dynamic to avoid having to hard-code every single setup in the future (which is not practical).
The server will be using Django, which I believe is irrelevant to the issue at hand, but I mention it just in case it has something that can be used.
The bytes data I am receiving looks like this:
b'B\x10Vu\x87%\x00x\r\x0f\x04\x01\x00\x00\x00\x00\x1e\x00\x00\x00\x00ad;l'
And my current process looks like this:
Take the first bytes to get the deviceID (deviceID = val[0:6].hex())
Look up the format string to use with struct.unpack() (here: >BBHBBhBHhHL, after removing the first bytes for the ID).
Now, the issue is the next step. Much of the data needs different forms of pre-processing. E.g. some values need to be run through a join statement (e.g. ".".join(str(values[2]))), others need simple mathematical changes (-113 + 2 * values[4]), and others need a simple logic check (values[7] == 0x80) to return a boolean value.
My question is: what's the best way to code those methods? I would really like to avoid hardcoding them, but it almost seems like the best idea. Another idea I saw was to store the functionality as a string and execute it, such as seen here, but I've been reading that it's a very bad idea and that it also slows down execution. The last idea I had was to hardcode only some general functions and use something similar to here, but that doesn't solve the issue of having to hard-code every new sensor type, which is not realistic in a live installation. Are there any better methods to achieve the same thing?
I have also looked here, with the idea that some functionality could be expressed as an equation, but I don't see that as a possibility for every occurrence, especially when any string manipulation is needed at all.
Additionally, is there a possibility of using some maths to apply basic string manipulation? I can maybe hard-code one string manipulation, but to be honest this whole thing has been bugging me...
Finally, if I do go with storing functions as strings and executing them, is there a way to add some "security" to avoid malicious exploitation? Such a method is awfully insecure, to say the least.
However, after almost a week total of searching, I am so far unable to find a better solution than storing functions as strings and running eval on them, despite not liking that option. If anyone finds a better option, I would be extremely grateful for any tips or ideas.
Addendum: minimal code that can be used to showcase and test different methods:
import struct

def decode(data):  # renamed from "input" to avoid shadowing the builtin
    val = bytearray(data)
    deviceID = val[0:6].hex()
    del val[0:6]
    print(deviceID)
    values = list(struct.unpack('>BBHBBhBHhHL', val))
    print(values)
    # Now what?

decode(b'B\x10Vu\x87%\x00x\r\x0f\x04\x01\x00\x00\x00\x00\x1e\x00\x00\x00\x00ad;l')
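One way to avoid both hard-coding every sensor and eval() is a data-driven dispatch table: keep a small whitelist of named operations in code, and store per-sensor configuration (struct format plus a list of field/operation/argument rows) as data, e.g. in the database. New sensors then only need new config rows, not new code, and nothing in the config can execute arbitrary code. A sketch under those assumptions (the operation names, field names, and config layout are hypothetical):

```python
import struct

# Whitelisted operations; only these can ever run, unlike eval().
OPERATIONS = {
    "scale":  lambda v, a, b: a + b * v,          # e.g. -113 + 2 * v
    "equals": lambda v, expected: v == expected,  # boolean flag check
    "dotted": lambda v: ".".join(str(v)),         # string manipulation
    "raw":    lambda v: v,
}

# Per-sensor-type configuration; could be a database row instead of code.
SENSOR_CONFIG = {
    "fmt": ">BBHBBhBHhHL",
    "fields": [
        ("version",  2, "dotted", ()),
        ("rssi_dbm", 4, "scale",  (-113, 2)),
        ("flag",     7, "equals", (0x80,)),
    ],
}

def decode(payload, config):
    val = bytearray(payload)
    device_id = val[0:6].hex()
    values = struct.unpack(config["fmt"], val[6:])
    result = {"device_id": device_id}
    for name, index, op, args in config["fields"]:
        result[name] = OPERATIONS[op](values[index], *args)
    return result

print(decode(
    b'B\x10Vu\x87%\x00x\r\x0f\x04\x01\x00\x00\x00\x00\x1e\x00\x00\x00\x00ad;l',
    SENSOR_CONFIG,
))
```

Because OPERATIONS is a fixed dictionary, a malicious or corrupted config row can at worst raise a KeyError, never run attacker-supplied code.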

Forcing datetime unaware timestamps in/out of DB queries (Javascript, Flask, Postgres, stack)

I'm working on a webapp that has a Javascript front-end that talks JSON with a python (flask) middle and a postgres backend. I'd like to safely represent timestamps without loss, ambiguity or bugs. Python itself doesn't generate any timestamps, but it translates them between the client and the DB, which is where bugs can happen.
JavaScript's lack of a long integer type means that a sensible count-of-units-since-epoch-everywhere is lossy when stored as a JavaScript number, and so ISO datetime strings are the least-unnatural format between the client and server. For example, from Python, we can generate:
>>> datetime.fromtimestamp(time.time(), pytz.utc).isoformat()
'2018-02-08T05:42:48.866188+00:00'
This can be unambiguously interpreted across the whole stack without loss of precision, whether or not the timezone offset is non-zero (as it may be, coming from the client).
However, between Python and the database, things get a little tricky. I'm concerned with preventing timezone-unaware datetimes creeping into Python-land and then into the database. For example, a client in China may send JSON to the Flask server in California with the string '2018-02-07T21:46:33.250477', which we may parse as a timezone-unaware ISO datetime. Because this is an ambiguous time, the ONLY sensible thing to do is to reject it as an error. But where? I could manually write validation for each field received by Python, but with a large data model, it's easy to miss a field.
As I understand it, at the DB-schema level it doesn't matter too much whether columns are declared timestamp or timestamptz, provided the queries always (ALWAYS) come with TZ information; they're unambiguously converted to UTC for both types. However, as far as I know, it isn't possible for Postgres to prevent you from putting a timezone-less datetime or time string into timestamptz columns.
Two possibilities come to mind:
Could the flask JSON parser reliably detect iso dates without timezones and reject them?
I could leave the timestamps as strings all the way to the psycopg2 cursor.execute. Can psycopg2's SQL formatter reject timestamp strings that don't have a timezone?
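One central place to enforce this in Python (a sketch, not an answer from the thread) is a single helper that every inbound timestamp passes through: parse the ISO string and reject it if tzinfo is missing. With Flask you could call it from a custom JSON decoder or a deserialization layer so individual fields can't be missed:

```python
from datetime import datetime, timezone

def parse_aware(iso_string):
    """Parse an ISO-8601 string, rejecting timezone-unaware values."""
    dt = datetime.fromisoformat(iso_string)
    if dt.tzinfo is None or dt.utcoffset() is None:
        raise ValueError(f"timezone-unaware timestamp rejected: {iso_string!r}")
    return dt.astimezone(timezone.utc)  # normalize to UTC before the DB

print(parse_aware("2018-02-08T05:42:48.866188+00:00"))

try:
    parse_aware("2018-02-07T21:46:33.250477")  # no offset: ambiguous, rejected
except ValueError as e:
    print(e)
```

Normalizing to UTC at this single choke point means the strings handed to psycopg2 (or the datetime objects it adapts) are always unambiguous, regardless of how the column is declared.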

Easiest way to store a single timestamp in appengine

I am running a Python script on Google's App Engine. The script is very basic. Every time it runs, I need it to update a timestamp SOMEWHERE so I can record and keep track of when the script last ran. This will allow me to do logic based on when the script last ran, etc. At the end of the script, I'll update the timestamp to the current time.
Using Google's NDB seems to be overkill for this, but it also seems to be the only way to store ANY data in App Engine. Is there a better/easier way to do what I want?
It really is very simple, and not overkill. Anything else will be overkill in App Engine: trying to use the low-level API, GCS, or some other storage service will all require more work, introduce complexity and potential unreliability, and be slower. In addition, any mechanism that doesn't store and retrieve datetime objects as datetime objects (i.e. a text file) means you will need to parse the string as a date, creating even more work.
Define it
class TimeStamp(ndb.Model):
    timestamp = ndb.DateTimeProperty(auto_now=True)
Create it.
TimeStamp(id="TIMESTAMP").put()
Update it without reading (auto_now refreshes the value on every put).
TimeStamp(id="TIMESTAMP").put()
Read it, then update.
ts = TimeStamp.get_by_id("TIMESTAMP")
ts.put()
Another way to solve this, which I found, is to use memcache. It's super easy. Though it should probably be noted that memcache can be cleared at any time, so NDB is probably a better solution.
Set the timestamp:
memcache.set("timestamp", current_timestamp)
Then, to read the timestamp:
memcache.get("timestamp")
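Either store makes the "has it been long enough since the last run?" logic from the question straightforward. A storage-agnostic sketch of that logic (the load/save helpers are hypothetical stand-ins for the NDB or memcache calls above):

```python
from datetime import datetime, timedelta, timezone

_store = {}  # stand-in for NDB/memcache; swap in real get/put calls

def load_last_run():
    return _store.get("timestamp")

def save_last_run(ts):
    _store["timestamp"] = ts

def should_run(min_interval=timedelta(minutes=10), now=None):
    """True if the script never ran, or ran longer ago than min_interval."""
    now = now or datetime.now(timezone.utc)
    last = load_last_run()
    return last is None or (now - last) >= min_interval

if should_run():
    # ... do the actual work ...
    save_last_run(datetime.now(timezone.utc))
```

With NDB, load_last_run would be TimeStamp.get_by_id("TIMESTAMP") and save_last_run a put(); with memcache, the get/set calls shown above.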

How to filter over dates using a string?

Datatable on the page can be filtered by user input. Entered string is received in ajax query and I'm trying to filter objects by this string. String fields are filtered fine, but when I try to filter over a date:
querySet.filter(datefield__icontains=searchString)
It leads to MySQL exception:
Warning: Incorrect datetime value: '%%' for column '' at row 1
Is there a way to filter over date fields using a string?
You need to create a datetime object from the string and then filter using that object. You will probably want to filter objects that have a date greater than or less than the datetime object. For this you can use filter(datefield__gt=datetime_obj) or filter(datefield__lt=datetime_obj).
I'm not sure what you would expect icontains to do when filtering over date data, so you may want to rethink that or clarify what you are expecting.
Overall you are trying to query based on the datefield. You have two choices: either convert your string into a date (or datetime) object, or format your string as "yyyy-mm-dd".
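A sketch of that conversion (standard library only; the Django filter call is shown in a comment because it assumes a model that isn't in the question):

```python
from datetime import datetime, timedelta

search_string = "2014-05-04"  # hypothetical user input

# Convert the string into a date object...
day = datetime.strptime(search_string, "%Y-%m-%d").date()

# ...then filter on a half-open range covering that whole day, e.g.:
#   querySet.filter(datefield__gte=start, datefield__lt=end)
start, end = day, day + timedelta(days=1)
print(start, end)
```

strptime raises ValueError on input that doesn't match the format, which doubles as validation before the query ever reaches MySQL.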
-- edit --
Since it looks like you're really trying to use strings to search for dates here is some sort of clarification.
What you're looking to do is going to be beyond hard; I'd go with borderline impossible. Django is translating your queries into SQL and querying your database. Just because the ORM is there doesn't mean it can do things that aren't possible in SQL (in fact the ORM greatly limits what you can do). However, you have several choices:
Write Python to parse the strings and come up with date objects to match them. This may be quite difficult. You will then want to use Q objects from django.db.models to construct a complex OR query. This will likely be slow in the database for any reasonable number of records.
Figure out what the SQL would look like to generate what you're trying to do. This isn't going to be easy IMO, as there is so much variance, and the underlying representation of dates in your database isn't going to be a string, so you're really going to be in for it. This is essentially doing the same thing as #1, except by hand. Once you've constructed the SQL, run a raw query.
Convert to using django-haystack to use a full text search engine. This is a lot of infrastructure and has a lot of potential negatives from having to rewrite your code to anticipate search engine results. Depending on the backend you choose to plug into haystack it may be smart enough to be able to understand the strings entered and correctly search...or you may have to manually generate a ton of string representations to be in your search index so that they can be properly queried. Overall this type of thing is what full text search engines excel at because they understand language. Databases do not, they understand data and they don't get to be fuzzy.
More or less what you're asking for is quite hard. The most viable option in my opinion is to convert to using haystack. I can't emphasize enough how many drawbacks there are to that. You're dealing with search engine results instead of model instances. In some circumstances that isn't an issue, but depending on the requirements, that could be quite painful.
