I have a project with 10 different Python files. It has classes and functions - pretty much the lot.
I want to share specific data representing the project's settings between all the project files.
I came up with creating a settings.py file:
settings = {}
settings['max_bitrate'] = 160000
settings['dl_dir'] = r"C:\Downloads"
and then I import that module from every file.
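For example, every other module then does something like this (file name other_file.py is just illustrative):
# other_file.py
import settings

print(settings.settings['max_bitrate'])   # -> 160000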
Is there a more suitable way to do it?
I'm probably a little old-school in this regard, but in my latest project, I created a config file in /etc, then created a config module that uses ConfigParser to read it in and make it available, and import that config module wherever I need to read settings.
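A rough sketch of that approach (all paths and names here are just examples):
# config.py -- read the file once at import time, share it everywhere
from configparser import ConfigParser

_parser = ConfigParser()
_parser.read('/etc/myapp/myapp.conf')

def get(section, option):
    return _parser.get(section, option)
Every other module then just does import config and calls config.get(...).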
Your method sounds good to me, and it has the advantage that you can easily change the implementation of the settings module, for example to use configuration files or the Windows registry, or to provide a read-only API.
I am creating a piece of code that will have multiple files that need to reference a single config. The reason is that our IT department will use puppet to manage this config file in case any changes are required in the future; we don't want to do a release just to change the configuration. I've seen a few projects that keep multiple configs in different places, but I really do not like this idea and I'd prefer to have a single source. My thought was to create a specific config.py file that can be called from anywhere in my code and that asks the user to input the location of the file.
Is this a good way or is there a better way to do this?
import configparser

class Config(object):
    def __init__(self, conf):
        self._cfg = configparser.ConfigParser()
        self._cfg.read(conf)

    def get_conf_value(self, section):
        # sections() is a method, so it has to be called;
        # the parser itself supports mapping access by section name
        if section not in self._cfg.sections():
            return None
        return self._cfg[section]
If so, and I have a Main.py file, what's the best way to have the scheduler pass the config location and then reference it across all of the files in my Python package?
You can create a config.json file in JSON format and read its contents at startup using the json.load function from the standard library's json module.
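For example (file name and key are just illustrative):
import json

with open('config.json') as f:
    config = json.load(f)   # config is now a plain dict

max_bitrate = config['max_bitrate']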
I am developing a program that has a settings window in which I can change various parameters. What is the best way to read/save them in some kind of config file? I know that some software and games use .ini files or a similar system. How can I achieve this in Python?
The Python standard library includes the ConfigParser module (configparser in Python 3), which handles ini-style configuration files for you. It's more than adequate for most uses.
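A minimal save/load round trip could look like this (section and option names are made up):
from configparser import ConfigParser

config = ConfigParser()
config['display'] = {'fullscreen': 'yes', 'volume': '80'}   # save the settings...
with open('settings.ini', 'w') as f:
    config.write(f)

config = ConfigParser()                                     # ...and read them back
config.read('settings.ini')
fullscreen = config.getboolean('display', 'fullscreen')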
Another popular option for configuration files is JSON - it's a simple notation which has good support from a wide range of languages.
Python has the json module in the standard library, which makes it very easy.
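A sketch of the same round trip with json (file name is arbitrary):
import json

settings = {'fullscreen': True, 'volume': 80}
with open('settings.json', 'w') as f:
    json.dump(settings, f, indent=2)    # save

with open('settings.json') as f:
    settings = json.load(f)             # load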
Since you used the term config file in your question, the previous answers concentrated on plain text files, which can also be edited with a standard text editor. Depending on the kind of settings you store, this might not be desired, since at the very least it requires strict plausibility checks after reading the config file back. So I'll add the proposal of the shelve module, which is a straightforward way to make information persistent in files.
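A quick sketch (file name is arbitrary; values can be any picklable Python object):
import shelve

with shelve.open('settings.db') as db:   # creates the file if it doesn't exist
    db['max_bitrate'] = 160000

with shelve.open('settings.db') as db:
    print(db['max_bitrate'])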
Good day.
I set up a separate project from the main one called myproject-celery. It is a buildout-based project which contains the async part of my project. For convenience I want to have a file that contains this machine's configuration. I know that celery provides a Python config file, but I do not like that configuration style.
Let's say I have a configuration in a YAML config file named myproject.yaml
What I want to achieve:
./bin/celery worker --config /absolute/path/to/project/myproject.yaml --app myproject.celery
The real problem is that I want to specify the file's location, because it can change. I tried writing a custom loader class, but I failed, because I do not even know why and when the many custom methods of this class are called (the only doc I found is http://docs.celeryproject.org/en/latest/reference/celery.loaders.base.html?highlight=loader#id1, and it's no help to me). I tried to do something at import time for the app module, but I cannot pass the file path to that module's code... The only solution I came up with was a custom ENV param that contains the path, but I do not see why it can't be a launch param like in most apps I use (referring to pyramid, with its paster serve myproject.ini).
So the question:
What do I have to do to set up the config from a file that I could specify by an absolute path?
EDIT:
The question was not answered, so I posted an issue on celery's GitHub. I will wait for a response.
https://github.com/celery/celery/issues/1100
Looking at celery.loaders.base, it seems the method you want to override is read_configuration:
import yaml  # assumes PyYAML is available
from celery.datastructures import DictAttr
from celery.loaders.base import BaseLoader

class YAMLLoader(BaseLoader):
    def read_configuration(self):
        # load the YAML file and return a DictAttr instance
        # (assuming DictAttr wraps a plain dict)
        with open('/absolute/path/to/project/myproject.yaml') as f:
            return DictAttr(yaml.safe_load(f))
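If I remember correctly, celery also lets you select a custom loader via the CELERY_LOADER environment variable, so the worker could then be launched like this (the module path depends on where you put the class):
CELERY_LOADER=myproject.loaders.YAMLLoader ./bin/celery worker --app myproject.celery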
What would be a neat way to share configuration parameters/settings/constants between various projects in Python?
Using a DB seems like overkill. Using a file raises the question of which project should host the file in its source control...
I'm open for suggestions :)
UPDATE:
For clarification - assume the various projects are deployed differently on different systems. In some cases in different directories, in other cases some of the projects are there and some are not.
I find that in many cases, using a configuration file is really worth the (minor) hassle. The builtin ConfigParser module is very handy, especially the fact that it's really easy to parse multiple files and let the module merge them together, with values in files parsed later overriding values from files parsed earlier. This allows for easy use of a global file (e.g. /etc/yoursoftware/main.ini) and a per-user file (~/.yoursoftware/main.ini).
Each of your projects would then open the config file and read values from it.
Here's a small code example:
basefile.ini:
[sect1]
value1=1
value2=2
overridingfile.ini:
[sect1]
value1=3
configread.py:
#!/usr/bin/env python
from configparser import ConfigParser

config = ConfigParser()
config.read(["basefile.ini", "overridingfile.ini"])
print(config.get("sect1", "value1"))
print(config.get("sect1", "value2"))
Running this would print out:
3
2
Why don't you just have a file named constants.py that contains lines like CONSTANT = value?
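e.g. (values borrowed from the first example above):
# constants.py
MAX_BITRATE = 160000
DL_DIR = r"C:\Downloads"

# any other file
from constants import MAX_BITRATE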
Create a Python package and import it in the various projects...
Why is a database overkill? You're describing sharing data across different projects located on different physical systems, with different paths to each project's directory. Oh, and sometimes the projects just aren't there. I can't imagine a better means of communicating the data. It only has to be a single table; that's hardly overkill if it provides the consistent access you need across platforms, computers, and even networks.
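As a sketch, even SQLite would do if every project can reach one file path (path, table, and key names are made up; a networked database just swaps the driver):
import sqlite3

conn = sqlite3.connect('/shared/settings.db')
conn.execute('CREATE TABLE IF NOT EXISTS settings (key TEXT PRIMARY KEY, value TEXT)')
conn.execute("INSERT OR REPLACE INTO settings VALUES ('max_bitrate', '160000')")
conn.commit()

row = conn.execute("SELECT value FROM settings WHERE key = 'max_bitrate'").fetchone()
print(row[0])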
I'm a Java developer new to Python. In Java, you can access all classes in the same directory without having to import them.
I am trying to achieve the same behavior in Python. Is this possible?
I've tried various solutions, for example importing everything into one file which I then import everywhere. That works, but I have to type myClass = rootFolder.folder2.folder3.MyClass() each time I want to access a foreign class.
Could you show me an example of how a Python architecture spanning several directories works? Do you really have to import all the classes you need in each file?
Imagine that I'm writing a web framework. Will the users of the framework have to import everything they need in their files?
Put everything into a folder (the name doesn't matter), and make sure that folder has a file named __init__.py (the file can be empty, though any names you want the star import to pick up must be defined or imported there).
Then you can add the following line to the top of your code:
from myfolder import *
That should give you access to everything the package exposes, without needing to give the full prefix each time.
You can also have multiple depths of folders like this:
from folder1.folder2 import *
Let me know if this is what you were looking for.
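For the "no prefix" part, the usual trick is to re-export names in the package's __init__.py (file and class names here are examples):
# myfolder/__init__.py
from myfolder.my_module import MyClass   # lift the class to the package level

# client code
from myfolder import MyClass             # instead of myfolder.my_module.MyClass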