Passing a configparser.ConfigParser() object via __init__? (Python)

I'm currently working on a project for DNS enumeration, which sends requests to various APIs.
Some of these APIs require an API key, which I provide in a config.ini file. In my current setup I use configparser to read the different values into an object, so I can access them when needed. Now, as I move to a class structure, I would like to read in the config file once in the __init__ of a parent class, so every tool that needs an API key can inherit from that class.
Right now the setup looks something like this:
class Source:
    def __init__(self):
        config = configparser.ConfigParser()
        config.read('./config.ini')
        self.config = config


class BinaryEdge(Source):
    def __init__(self):
        super().__init__()

    def query(self, domain, dnsprobe):
        api_key = self.config['BINARYEDGE']['API-KEY']
        url = 'https://api.binaryedge.io/v2/query/domains/subdomain/' + domain
        fqdns = []
        ...
In my understanding, if I instantiate a new BinaryEdge instance, for example like this:
if __name__ == "__main__":
    BinaryEdge = BinaryEdge()
    print(BinaryEdge.query("heise.de", False))
It should read the config file into an object and make it available on the newly created instance, so I can access it via self.config, something like this:
def query(self, domain, dnsprobe):
    api_key = self.config['BINARYEDGE']['API-KEY']
    url = 'https://api.binaryedge.io/v2/query/domains/subdomain/' + domain
    fqdns = []
    ...
But when I'm debugging this setup, the config object stays at its defaults (and is therefore empty), which obviously leads straight into a KeyError:
File "/usr/lib64/python3.9/configparser.py", line 960, in __getitem__
raise KeyError(key)
KeyError: 'BINARYEDGE'
As I'm not as good at Python programming as I would like to be, I'm struggling to solve this error on my own and would be thankful for any helpful input.

I figured it out myself after getting input from @Jakub Szlaur:
My file path pointed to the wrong folder, therefore the config.ini file was never found.
After changing:
config.read('./config.ini')
to
config.read('$HOME/$PROJECT_PATH/config.ini')
it worked as expected.
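To make the lookup independent of the current working directory, another option (just a sketch of an alternative, not the code I ended up with) is to resolve the path relative to the module file itself:

import configparser
from pathlib import Path

# config.ini is assumed to sit next to this source file
CONFIG_PATH = Path(__file__).resolve().parent / 'config.ini'

class Source:
    def __init__(self):
        self.config = self.readconfig(CONFIG_PATH)

    def readconfig(self, filename):
        config = configparser.ConfigParser()
        config.read(filename)
        return config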
I also changed the Source class according to the comments, for better code style:
class Source:
    def __init__(self):
        self.config = self.readconfig('../config.ini')

    def readconfig(self, filename):
        config = configparser.ConfigParser()
        config.read(filename)
        return config
Thanks for the help! ;-)

The code looks like it should work (I can't find any errors).
Try checking your config file to see whether there really is such a key.
But about the code itself, there are a couple of things that I would recommend changing.
First, although reading the config is part of the initialisation of Source, it would be better to put it in a separate method and then call that method:
class Source:
    def __init__(self):
        self.config = self.readconfig("./config.ini")

    def readconfig(self, filename):
        config = configparser.ConfigParser()
        config.read(filename)
        return config
Never do this: BinaryEdge = BinaryEdge(). After that assignment, the name BinaryEdge refers to the instance instead of the class, so trying to create another BinaryEdge instance would call the variable instead. Name it something different.
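For example (the lowercase name is just an illustration, any name that doesn't shadow the class works):

if __name__ == "__main__":
    binary_edge = BinaryEdge()   # the class BinaryEdge stays usable
    print(binary_edge.query("heise.de", False))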

Related

In Python, how can I add another module to an already existing variable that contains one?

Is it possible to use a variable as a container for a Python module, and then add another one to the same variable?
So for example, if I would have a Python file called general_config.py containing a general config of some kind:
class GeneralConfig:
    attribute1 = "Some attribute"
    attribute2 = "Some other attribute"
And if I wanted to import this Python module as a variable containing a general config, I would do:
import general_config.py as Config
Then I can access its attributes by doing:
generalParameter = Config.GeneralConfig.attribute1
But what if I want to add some specific parameters to my config (say from specific_config.py), while keeping the general one as part of the entire config? So it would do something like that:
if someSpecificCondition:
    Config += import specific_config.py
else:
    Config += import other_config.py
While keeping the Config in the original scope? Thanks in advance.
If you want your general config to inherit your other configs for whatever reason, you could do something like this. But Tom's answer makes more sense, since there's no runtime class creation.
class BaseConfig:
    att = "hello world"


def inherit_configs(some_condition):
    if some_condition:
        from config1 import Config1

        class Config(BaseConfig, Config1):
            pass

        return Config
    else:
        from config2 import Config2

        class Config(BaseConfig, Config2):
            pass

        return Config


config = inherit_configs(some_condition)()
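An alternative that avoids runtime class creation (a sketch only; the module names specific_config and other_config are taken from the question and assumed to exist) is to pick the extra module conditionally and keep it under one alias next to the general config:

# loader sketch -- someSpecificCondition comes from the surrounding code
import general_config

if someSpecificCondition:
    import specific_config as extra_config
else:
    import other_config as extra_config

general_parameter = general_config.GeneralConfig.attribute1
# specific parameters are then read through the same alias,
# e.g. extra_config.SpecificConfig.some_attribute (hypothetical names)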

Read-only configuration variables in an ini file

I have a program in Python that has an ini file with the configuration variables for the program. Some variables are ReadWrite and others ReadOnly.
My config file is something like this:
[AuthCtrlr]
enable = True
nbrOfMssgs = 10
where I want the variable enable to be ReadWrite and nbrOfMssgs ReadOnly.
In my python code, I can change my variables with:
parser.set('AuthCtrlr', 'enable', False)
with the configparser module.
Is there a way to write the code so that if I try to change the variable nbrOfMssgs, it prints something like "This variable is ReadOnly" and doesn't change the value of the variable?
import configparser


class wrappedParser(configparser.ConfigParser):
    def __init__(self):
        super().__init__()
        self.readOnlySettings = []

    def set(self, category, setting, value):
        if setting in self.readOnlySettings:
            raise PermissionError(f"{setting} setting is read-only")
        else:
            return super().set(category, setting, value)

    def makeReadOnly(self, setting):
        self.readOnlySettings.append(setting)


config = wrappedParser()
config['bitbucket.org'] = {}
config.set('bitbucket.org', 'User', 'hg')
config.makeReadOnly('User')

try:
    config.set('bitbucket.org', 'User', 'pg')
except PermissionError as err:
    print(err)

print(config.get('bitbucket.org', 'User'))
Make a child class inheriting from ConfigParser that reimplements the set operation so that it raises an error if the setting you are trying to change is one of the read-only ones.
Still, this does not in any way improve the security of your code. It is only good for preventing you from accidentally changing the settings, which raises the question of why that would happen in the first place.
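If the same option name can occur in more than one section, a variation of the same idea (a sketch, not part of the answer above) is to track (section, option) pairs instead of bare option names:

import configparser


class SectionAwareParser(configparser.ConfigParser):
    def __init__(self):
        super().__init__()
        self.read_only = set()   # holds (section, option) tuples

    def set(self, section, option, value=None):
        # refuse writes to options that were frozen for this section
        if (section, option) in self.read_only:
            raise PermissionError(f"{section}/{option} is read-only")
        return super().set(section, option, value)

    def make_read_only(self, section, option):
        self.read_only.add((section, option))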

Dynamic loading of classes

My goal is to dynamically load my different subclasses and execute them. To call my script, I am using this:
python Plugin.py Nmap execute google.com
Or
python Plugin.py Dig execute google.com
Here is the code.
Parent Class: Plugin.py
class Plugin(object):
    def __init__(self):
        self.sName = ''

    def execPlugin(self):
        return 'something'


def main():
    # get the name of the plugin
    sPlugin = sys.argv[1]
    # create the object
    mMod = __import__(sPlugin, fromlist=[sPlugin])
    mInstance = getattr(mMod, sPlugin)
    oPlugin = mInstance()
    print oPlugin
    print type(oPlugin)
    if (sys.argv[2] == 'execute'):
        # then execute it
        return oPlugin.execPlugin(sys.argv[3])


if __name__ == '__main__':
    main()
Sub Class located in Nmap/Nmap.py
class Nmap(Plugin):
    def __init__(self):
        self.sName = 'Nmap'

    def execPlugin(self):
        return 'something else'
Sub Class located in Dig/Dig.py
class Dig(Plugin):
    def __init__(self):
        self.sName = 'Dig'

    def execPlugin(self):
        return 'yeahhh'
My problem is located in
oPlugin = mInstance()
With the following error
TypeError: 'module' object is not callable
I tried so many things but nothing worked. How can I solve my problem?
You have a structure like:
Plugin.py
/Nmap
    __init__.py
    Nmap.py    # class Nmap(Plugin): ...
In Plugin.py, when you do mMod = __import__(sPlugin, fromlist=[sPlugin]) where sPlugin == 'Nmap', this makes mMod refer to the directory /Nmap, not the file Nmap.py (note that both files and directories can be modules in Python). Hence mInstance = getattr(mMod, sPlugin) makes mInstance the file Nmap.py rather than the class Nmap.
There are two ways to fix this, either:
Use the __init__.py in /Nmap to bring the class "up" one level, i.e. include from Nmap import Nmap; or
Add an extra level of getattr into Plugin.py.
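For example, the two fixes could look like this (a sketch assuming the layout above, not the asker's actual files):

# Option 1 -- Nmap/__init__.py re-exports the class so that
# getattr(package, 'Nmap') already yields the class:
from Nmap import Nmap          # (Python 3: from .Nmap import Nmap)

# Option 2 -- in Plugin.py, reach one level deeper instead:
mMod = __import__(sPlugin, fromlist=[sPlugin])   # the Nmap package
mSubMod = getattr(mMod, sPlugin)                 # the Nmap/Nmap.py module
mClass = getattr(mSubMod, sPlugin)               # the Nmap class
oPlugin = mClass()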
Additionally, you should follow the style guide's naming conventions, which might have helped you track the issue down faster.

Access instance in other modules

I have a class instance I want to access in other modules. This class loads config values using configparser and updates the instance's __dict__ attribute, as per this post.
I want to access this instance in other modules. The instance is only created in the main.py file, where it has access to the required parameters, which come via command-line arguments.
I have three files: main.py, config.py and file.py. I don't know the best way to access the instance in file.py; I only have access to it in main.py and not in the other modules.
I've looked at the following answers, here and here but they don't fully answer my scenario.
# config.py
class Configuration():
    def __init__(self, *import_sections):
        # use configParser, get config for relevant sections, update self.__dict__


# main.py
from config import Configuration

conf = Configuration('general', 'dev')
# other lines of code use conf instance ... e.g. config.log_path in log setup


# file.py
# I want to use the config instance like this:
class File():
    def __init__(self, conf.feed_path):
        # other code here...
Options considered:
Initialise Configuration in config.py module
In config.py after class definition I could add:
conf = Configuration('general', 'dev')
and in file.py and main.py:
from config import conf
but the 'general' and 'dev' values are only known in main.py, so this doesn't look like it will work.
Make Configuration class a function
I could make it a function and create a module-level dictionary and import data into other modules:
# config.py
conf = {}

def set_config(*import_section):
    # use configParser, update conf dictionary
    conf.update(...)
This would mean referring to it as config.conf['log_path'] for example. I'd prefer conf.log_path as it's used multiple times.
Pass via other instances
I could pass the conf instance as a parameter via other class instances from main.py, even if the intermediate instances don't use it. Seems very messy.
Other options?
Can I use Configuration as an instance somehow?
By changing your Configuration class into a Borg, you are guaranteed to get a common state from wherever you want. You can either provide initialization through a specific __init__:
# config.py
class Configuration:
    __shared_state = {}

    def __init__(self, *import_sections):
        self.__dict__ = self.__shared_state
        if not import_sections:  # we are not initializing this time
            return
        # your old code verbatim
Initialization is done as usual with c = config.Configuration('general', 'dev'), and any call to conf = config.Configuration() will get the state that c created.
Or you can provide a separate initialization method to avoid tampering with the shared state in __init__:
# config.py
class Configuration:
    __shared_state = {}

    def __init__(self):
        self.__dict__ = self.__shared_state

    def load(self, *import_sections):   # 'import' is a reserved word, so the method needs another name
        # your old __init__
That way there is only one meaning for the __init__ method, which is cleaner.
In both cases, you can get the shared state, once initialized, from anywhere in your code by using config.Configuration().
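A minimal usage sketch of the first variant (feed_path is just a hypothetical attribute that the loaded sections are assumed to provide):

# main.py
import config

conf = config.Configuration('general', 'dev')   # loads the sections once

# file.py
import config

conf = config.Configuration()   # no arguments: only picks up the shared state
feed = conf.feed_path           # attribute set during the first initialization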

Best way to mix and match components in a python app

I have a component that uses a simple pub/sub module I wrote as a message queue. I would like to try out other implementations like RabbitMQ. However, I want to make this backend change configurable so I can switch between my implementation and 3rd party modules for cleanliness and testing.
The obvious answer seems to be to:
Read a config file
Create a modifiable settings object/dict
Modify the target component to lazily load the specified implementation.
Something like:
# component.py
from test.queues import Queue


class Component:
    def __init__(self, Queue=Queue):
        self.queue = Queue()

    def publish(self, message):
        self.queue.publish(message)


# queues.py
import test.settings as settings


def Queue(*args, **kwargs):
    klass = settings.get('queue')
    return klass(*args, **kwargs)
I'm not sure if __init__ should take in the Queue class; I figured it would help in easily specifying the queue used while testing.
Another thought I had was something like http://www.voidspace.org.uk/python/mock/patch.html, though that seems like it would get messy. The upside would be that I wouldn't have to modify the code to support swapping the component.
Any other ideas or anecdotes would be appreciated.
EDIT: Fixed indent.
One thing I've done before is to create a common class that each specific implementation inherits from. Then there's a spec that can easily be followed, and each implementation can avoid repeating certain code they'll all share.
This is a bad example, but you can see how you could make the saver object use any of the classes specified and the rest of your code wouldn't care.
class SaverTemplate(object):
    def __init__(self, name, obj):
        self.name = name
        self.obj = obj

    def save(self):
        raise NotImplementedError


import json

class JsonSaver(SaverTemplate):
    def save(self):
        file = open(self.name + '.json', 'wb')
        json.dump(self.obj, file)
        file.close()


import cPickle

class PickleSaver(SaverTemplate):
    def save(self):
        file = open(self.name + '.pickle', 'wb')
        cPickle.dump(self.obj, file, protocol=cPickle.HIGHEST_PROTOCOL)
        file.close()


import yaml

class YamlSaver(SaverTemplate):
    def save(self):
        file = open(self.name + '.yaml', 'wb')
        yaml.dump(self.obj, file)
        file.close()


saver = PickleSaver('whatever', foo)
saver.save()
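Tying this back to the question, the concrete saver could then be picked from the settings object, much like the queues.py idea above (the settings lookup here is hypothetical):

# a mapping from configured backend name to implementation class
SAVER_BACKENDS = {
    'json': JsonSaver,
    'pickle': PickleSaver,
    'yaml': YamlSaver,
}

def make_saver(settings, name, obj):
    # look up the configured backend and instantiate it
    saver_class = SAVER_BACKENDS[settings.get('saver', 'json')]
    return saver_class(name, obj)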
