Is converting JSON to a dict with eval a good choice? - python

I am getting a JSON object from a remote server and converting it to a Python object like this:
a = eval(response)
Is this stupid in any way, or do I have a better option?

Using eval is not a good way to process JSON:
JSON isn't even valid Python, because of true, false, and null.
eval will execute arbitrary Python code, so you are at the mercy of malicious injection of code.
Use the json module available in the standard library instead:
import json
data = json.loads("[1, 2, 3]")
If you're using a version of Python older than 2.6, you'll need to install the module yourself: it's called simplejson and is available on PyPI.
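To make the point about true, false, and null concrete, here is a minimal sketch (the payload string is made up for illustration) of json.loads handling the JSON literals that would trip up eval:
import json

# true and null are not Python names, so eval() would raise a NameError on
# this payload, but the json module maps them to True and None.
payload = '{"active": true, "count": null, "tags": ["a", "b"]}'
data = json.loads(payload)
print(data)  # {'active': True, 'count': None, 'tags': ['a', 'b']}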

Yes, very. Use a json decoder instead:
>>> from simplejson import loads
>>> loads(response)

Related

How to unpickle pickle extension file

I have downloaded a pickle file:
foo.pickle.gz.pickle
The page I downloaded this file from describes decompressing it to .pickle. I searched for information about Python's pickle module; there are many pages describing how to use it within Python, but nothing about doing it system-wide. How can I decompress or unzip it? I am using Ubuntu 16.04.
Thanks in advance!
pickle is the name of Python's object serialisation module, so you have to 'unpickle' it with a Python script. The basic syntax is:
import pickle
with open('filename', 'rb') as pickled_one:
    data = pickle.load(pickled_one)
More details are available in the official Python documentation.
I do have to warn you about this, from that same page:
The pickle module is not secure against erroneous or maliciously constructed data. Never unpickle data received from an untrusted or unauthenticated source.
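Since the downloaded file still has .gz in its name, it may also need decompressing. A minimal sketch, assuming the compressed payload is itself a plain pickle (gzip.open hands pickle.load a readable binary file object):
import gzip
import pickle

# Decompress and unpickle in one pass; gzip.open returns a binary file
# object that pickle.load can read from directly.
with gzip.open('foo.pickle.gz.pickle', 'rb') as f:
    data = pickle.load(f)

print(type(data))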
A pickled object can only be deserialized in Python; you can't use non-Python environments to deserialize it. Please see the official page.
If there are multiple pickled objects in the file, the answers above will only unpickle the first one. Use:
import pickle

pickle_list = []
with open(file_name, 'rb') as pickle_file:
    while True:
        try:
            pickle_list.append(pickle.load(pickle_file))
        except EOFError:
            break

YAML vs Python configuration/parameter files (but perhaps also vs JSON vs XML)

I see Python used to do a fair amount of code generation for C/C++ header and source files. Usually, the input files which store parameters are in JSON or YAML format, although most of what I see is YAML. However, why not just use Python files directly? Why use YAML at all in this case?
That also got me thinking: since Python is a scripting language, its files, when containing only data and data structures, could literally be used the same way as XML, JSON, YAML, etc. Do people do this? Is there a good use case for it?
What if I want to import a configuration file into a C or C++ program? What about into a Python program? In the Python case it seems to me there is no sense in using YAML at all, as you can just store your configuration parameters and variables in pure Python files. In the C or C++ case, it seems to me you could still store your data in Python files and then just have a Python script import that and auto-generate header and source files for you as part of the build process. Again, perhaps there's no need for YAML or JSON in this case at all either.
Thoughts?
Here's an example of storing some nested key/value hash table pairs in a YAML file:
my_params.yml:
---
dict_key1:
  dict_key2:
    dict_key3a: my string message
    dict_key3b: another string message
And the same exact thing in a pure Python file:
my_params.py
data = {
    "dict_key1": {
        "dict_key2": {
            "dict_key3a": "my string message",
            "dict_key3b": "another string message",
        }
    }
}
And to read in both the YAML and Python data and print it out:
import_config_file.py:
import yaml # Module for reading in YAML files
import json # Module for pretty-printing Python dictionary types
# See: https://stackoverflow.com/a/34306670/4561887
# 1) import .yml file
with open("my_params.yml", "r") as f:
data_yml = yaml.load(f)
# 2) import .py file
from my_params import data as data_py
# OR: Alternative method of doing the above:
# import my_params
# data_py = my_params.data
# 3) print them out
print("data_yml = ")
print(json.dumps(data_yml, indent=4))
print("\ndata_py = ")
print(json.dumps(data_py, indent=4))
Reference for using json.dumps: https://stackoverflow.com/a/34306670/4561887
SAMPLE OUTPUT of running python3 import_config_file.py:
data_yml =
{
    "dict_key1": {
        "dict_key2": {
            "dict_key3a": "my string message",
            "dict_key3b": "another string message"
        }
    }
}

data_py =
{
    "dict_key1": {
        "dict_key2": {
            "dict_key3a": "my string message",
            "dict_key3b": "another string message"
        }
    }
}
Yes, people do this, and have been doing it for years.
But many make the mistake you describe and make it unsafe by using import my_params. That is the equivalent of loading YAML with YAML(typ='unsafe') in ruamel.yaml (or with yaml.load() in PyYAML, which is unsafe).
What you should do is use the ast module that comes with Python to parse your "data" structure, which makes such an import safe. My package pon has code to update these kinds of structures, and in each of my __init__.py files there is such a piece of data, named _package_data, that is read by some code (the function literal_eval) in the package's setup.py. The ast-based code in setup.py is around 100 lines.
The advantages of doing this in a structured way are the same as with YAML: you can programmatically update the data structure (version numbers!), although I consider PON (Python Object Notation) less readable than YAML and slightly less easy to update manually.
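As a rough illustration of the ast-based approach described above (this reuses the my_params.py file shown earlier; the parsing loop is just a sketch, not the actual code from pon or any setup.py):
import ast

# Parse my_params.py without executing it (unlike `import my_params`),
# then evaluate only the literal assigned to the name `data`.
with open("my_params.py") as f:
    tree = ast.parse(f.read())

data_py = None
for node in tree.body:
    if (isinstance(node, ast.Assign)
            and isinstance(node.targets[0], ast.Name)
            and node.targets[0].id == "data"):
        data_py = ast.literal_eval(node.value)

print(data_py)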

Access a JSON in Python

I'm new to Python as well as jQuery.
I have the following JSON in JS:
var intervalos = {"variables": {
    "nombreVariables": nombreVariables,
    "extremoInferior": fromarray,
    "extremoSuperior": toarray,
    "step": steparray,
    "random": randomarray
}}; //intervalos
In Python I did:
parsed_input = json.loads(self.intervalos)
How can I access the structure? (I know that this is a list/dictionary.) Something like this:
intervalos['variables']['nombreVariables'][i];
Presumably, the JS is remote. Assuming it's accessible at http://example.com/json, one would import it into Python as follows:
import requests
parsed_input = requests.get('http://example.com/json').json()
for i in xrange(0, len(parsed_input['variables']['nombreVariables'])):
    print parsed_input['variables']['nombreVariables'][i]
You'll need requests to make this work in Python 2. With Python 3.x, the syntax will be slightly different.
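For reference, a minimal sketch of the Python 3 equivalent (the URL is still just a placeholder):
import requests

# Python 3: range() replaces xrange(), and print is a function.
parsed_input = requests.get('http://example.com/json').json()
for nombre in parsed_input['variables']['nombreVariables']:
    print(nombre)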

python ultrajson: how to use?

I've just installed ultrajson (ujson) to see if I can't get the json decoding to go faster (string to object). However, I'm not seeing any examples of how to use it.
With the regular json module it's just:
import json
my_object = json.loads(my_string)
Change the import statement to import ujson as json.
Then you can leave the other parts of your program as they are.
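A minimal sketch, assuming ujson is installed, with a fallback to the standard library so the rest of the code is untouched either way:
try:
    import ujson as json  # drop-in replacement for loads/dumps
except ImportError:
    import json  # fall back to the standard library

my_string = '{"id": 1, "values": [1, 2, 3]}'
my_object = json.loads(my_string)  # same call as before
print(json.dumps(my_object))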

How to save a Python in-memory dictionary to a file as Python source code?

How to get a Python source-code representation of an in-memory Python dictionary?
I decided to ask this question after reading Thomas Kluyver's comment on Rob Galanakis' blog post titled Why bother with python and config files? In his comment Thomas states
But if you want any way to change settings inside the application (like a preferences dialog), there’s no good way to automatically write a correct Python file.
Assuming it uses only "basic" Python types, you can write out the repr() of the structure, and then use ast.literal_eval() to read it back in after.
As the article says, you're better off using JSON/YAML or other formats, but if you seriously wanted to use a Python dict and are only using basic Python types...
Writing out (using pformat rather than repr() to make it more human-readable):
from pprint import pformat  # instead of using repr()

d = dict(enumerate('abcdefghijklmnopqrstuvwxyz'))
with open('somefile.py', 'w') as f:
    f.write(pformat(d))
Reading back:
from ast import literal_eval
d = literal_eval(open('somefile.py').read())
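A quick round trip to check the two snippets against each other (same names as above):
from pprint import pformat
from ast import literal_eval

d = dict(enumerate('abcdefghijklmnopqrstuvwxyz'))
with open('somefile.py', 'w') as f:
    f.write(pformat(d))

with open('somefile.py') as f:
    assert literal_eval(f.read()) == d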
