Autocomplete for dynamically populated dictionary containing classes?

I have a GUI, and through it I load some data. When a file is loaded, its filename is used as an identifier, which populates the GUI and also a dictionary underneath, to keep track of the current state for each file.
However, with this approach I can't get any autocomplete from the MetaData class, e.g. when I want to access data.container[GUIcurrentFile].one_of_many_attributes.
Is there a way around this? Perhaps keeping files in memory in an entirely different fashion? What do people normally do in this scenario? I'm not too familiar with GUI development.
class Data:
    def __init__(self):
        self.container = dict()

    def load(self, name):
        self.container[name] = MetaData()

class MetaData:
    def __init__(self):
        self.one_of_many_attributes = None
# This is instantiated in the main part of the GUI, e.g. self.data = Data()
data = Data()
## Series of events happening through the GUI
# Grab loaded file through a GUI
GUIcurrentFile = "file1"
data.load(GUIcurrentFile)
GUIcurrentFile = "file2"
data.load(GUIcurrentFile)
# Each file has separate attributes
data.container[GUIcurrentFile].one_of_many_attributes = "foo"
# File is removed from GUI, and can easily be removed from dictionary internally
data.container.pop(GUIcurrentFile)

Okay, so type hinting finally clicked for me. I hope the original title makes sense in relation to this answer; otherwise, feel free to edit it.
Defining MetaData first, it's very straightforward to add type hinting for PyCharm if a method is implemented to return an object of type MetaData.
class MetaData:
    def __init__(self):
        self.foo = None
        self.really_long_name = None

class Data:
    def __init__(self):
        self.container = dict()

    def load(self, name):
        self.container[name] = MetaData()

    def get(self, name) -> MetaData:  # specify what the dict lookup returns
        return self.container[name]
data = Data()
data.load("file1")
data.get("file1").foo

Related

Extending subclass instances from superclass instances in Python

Is there any way to instantiate a subclass object as an extension of a superclass object, in such a way that the subclass retains the arguments passed to the superclass?
I'm working on a simple melody generator to get back into programming. The idea was that a Project can contain an arbitrary number of Instruments, which can have any number of Sequences.
Every subordinate object would retain all of the information of the superior objects (e.g. every instrument shares the project's port device, and so forth).
I figured I could do something like this:
import rtmidi

class Project:
    def __init__(self, p_port=None):
        self.port = p_port
        # Getter / Setter removed for brevity

class Instrument(Project):
    def __init__(self, p_channel=1):
        super().__init__()  # parent initializer runs, but without the port
        self.channel = p_channel
        # Getter / Setter removed for brevity

def port_setup():
    midi_out = rtmidi.MidiOut()
    selected_port = midi_out.open_port(2)
    return selected_port

if __name__ == '__main__':
    port = port_setup()
    project = Project(p_port=port)
    project.inst1 = Instrument()
    print(project.port, project.inst1.port)
The expectation was that the new instrument would extend the created Project and inherit the port passed to its parent.
However, that doesn't work; the project and instrument return different objects, so there seems to be no relation between the objects at all. A quick Google search also doesn't turn up any information, which I assume means I'm really missing something.
Is there a proper way to set up nested structures like this?
Your relationship is that each Project has many Instruments. An Instrument is not a Project.
One first step could be to tell each Instrument which project it belongs to:
import rtmidi

class Project:
    def __init__(self, p_port=None):
        self.port = p_port
        # Getter / Setter removed for brevity

class Instrument:
    def __init__(self, project, p_channel=1):
        self.project = project
        self.channel = p_channel
        # Getter / Setter removed for brevity

def port_setup():
    midi_out = rtmidi.MidiOut()
    selected_port = midi_out.open_port(2)
    return selected_port

if __name__ == '__main__':
    port = port_setup()
    project = Project(p_port=port)
    project.inst1 = Instrument(project)
    print(project.port, project.inst1.project.port)
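If writing inst.project.port everywhere feels verbose, a delegating property keeps the port defined in exactly one place; a small sketch of that composition pattern (my addition, not part of the original answer):

class Instrument:
    def __init__(self, project, p_channel=1):
        self.project = project
        self.channel = p_channel

    @property
    def port(self):
        # delegate to the owning project, so every instrument
        # always sees the project's current port
        return self.project.port

With this, project.inst1.port reads the way the question originally expected, but the value still lives on the Project.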

Python Execute class function without instantiating

I am building a software and am using one class to only store data:
class Data():
    data = [1,2,3]
Now the data in this class can be accessed and changed from other classes without instantiating the Data class, which is exactly what I need.
In order to update the software properly, I have to call functions in other classes whenever the data changes. I looked at the observer pattern in Python but could not get it to work without making data an attribute of the class that's only available when instantiated. In other words, all the observer pattern implementations I found required:
class Data():
    def __init__(self):
        self.data = [1,2,3]
Obviously, if Data is my publisher/observable it needs to be instantiated once to get the functionality (as far as I understand) but I am looking for an implementation like:
class Data():
    data = [1,2,3]

    def __init__(self):
        self.subscribers = {}

    def register(self, who, callback):
        self.subscribers[who] = callback

    def dispatch(self):
        for subscriber, callback in self.subscribers.items():
            callback()
For the sake of the example, let's use this other class as the Subscriber/Observer, which can also change the data with another function. As this will be the main class handling the software, this is where I instantiate Data to get the observer behavior. It is important, however, that I not have to instantiate Data just to get the data, as the data will be changed from a lot of other classes:
class B():
    def __init__(self):
        self.data = Data()
        self.data.register(self, self.print_something)

    def print_something(self):
        print("Notification Received")

    def change_data(self):
        Data.data.append(100)
My question now is, how to automatically send the notification from the Publisher/Observable whenever data gets changed in any way?
I am running python 3.8 on Windows 10.
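One sketch of the kind of wiring the question seems to be after (my illustration, not an accepted answer): keep both the data and the subscriber registry at class level and route every mutation through a classmethod, so nothing has to be instantiated and every change can notify. Caveats: direct writes to Data.data still bypass the notification, and the append helper is a name I made up:

class Data:
    data = [1, 2, 3]
    subscribers = {}

    @classmethod
    def register(cls, who, callback):
        cls.subscribers[who] = callback

    @classmethod
    def dispatch(cls):
        for subscriber, callback in cls.subscribers.items():
            callback()

    @classmethod
    def append(cls, value):
        # every change funnels through here, so every change notifies
        cls.data.append(value)
        cls.dispatch()

class B:
    def __init__(self):
        Data.register(self, self.print_something)

    def print_something(self):
        print("Notification Received")

b = B()
Data.append(100)  # prints "Notification Received"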

Share plugin resources with implemented permission rules

I have multiple scripts that export the same interface, and they're executed using execfile() in an insulated scope.
The thing is, I want them to share some resources so that each new script doesn't have to load them again from the start, losing startup speed and using an unnecessary amount of RAM.
The scripts are in reality much better encapsulated and guarded from malicious plug-ins than presented in the example below, and that's where the problems begin for me.
The thing is, I want the script that creates a resource to be able to fill it with data, remove data, or remove the resource, and of course access its data.
But other scripts shouldn't be able to change another script's resource, just read it. I want to be sure that newly installed plug-ins cannot interfere with already loaded and running ones via abuse of shared resources.
Example:
class SharedResources:
    # Here should be a shared resource manager that I tried to write
    # but got stuck. That's why I ask this long and convoluted question!
    # Some beginning:
    def __init__ (self, owner):
        self.owner = owner

    def __call__ (self):
        # Here we should return some object that will do
        # required stuff. Read more for details.
        pass

class plugin (dict):
    def __init__ (self, filename):
        dict.__init__(self)
        # Here some checks and filling with secure versions of __builtins__ etc.
        # ...
        self["__name__"] = "__main__"
        self["__file__"] = filename
        # Add a shared resources manager to this plugin
        self["SharedResources"] = SharedResources(filename)
        # And then:
        execfile(filename, self, self)

    # Expose the plug-in interface to the outside world:
    def __getattr__ (self, a):
        return self[a]

    def __setattr__ (self, a, v):
        self[a] = v

    def __delattr__ (self, a):
        del self[a]
    # Note: I didn't use self.__dict__ because this makes encapsulation easier.
    # In future I won't use the object itself at all but a separate dict. For now let it be.
----------------------------------------
# An example of two scripts that would use a shared resource and be run with
# plugins["name"] = plugin("<filename>"):
# The presented code is the same in both scripts; what comes after will differ.
def loadSomeResource ():
    # Do it here...
    return loadedresource

# Load this resource if it isn't already in shared resources; if it isn't there, add it:
shr = SharedResources() # This would be an instance allowing access to shared resources
if not shr.has_key("Default Resources"):
    shr.create("Default Resources")
if not shr["Default Resources"].has_key("SomeResource"):
    shr["Default Resources"].add("SomeResource", loadSomeResource())
resource = shr["Default Resources"]["SomeResource"]
# And then we use the resource variable normally; it can be any object.
# Here I used the category "Default Resources" to add and/or retrieve a resource named "SomeResource".
# I want more categories so that plugins that deal with audio aren't mixed with
# plug-ins that deal with video, for instance. But this is not strictly needed.
# Here comes code specific to each plug-in that will use the shared resource
# named "SomeResource" from the category "Default Resources".
...
# And the end of the plugin script!
----------------------------------------
# And then, in the main program, we load the plug-ins:
import os

plugins = {} # Here we store all loaded plugins
for x in os.listdir("plugins"):
    plugins[x] = plugin(x)
Let's say that our two scripts are stored in the plugins directory and both use some WAVE files loaded into memory.
The plugin that loads first will load the WAVE and put it into RAM.
The other plugin will be able to access the already loaded WAVE but not replace or delete it, thus messing with the other plugin.
Now, I want each resource to have an owner, some id or filename of the plugin script, and for the resource to be writable only by its owner.
No tweaking or workaround should enable the other plugin to modify the first one's resource.
I almost did it and then got stuck, and my head is spinning with concepts that, when implemented, do the thing, but only partially.
This eats at me, so I cannot concentrate any more. Any suggestion is more than welcome!
Adding:
This is what I use now without any safety included:
# Dict that will hold a category of resources (should implement some security):
class ResourceCategory (dict):
    def __getattr__ (self, i): return self[i]
    def __setattr__ (self, i, v): self[i] = v
    def __delattr__ (self, i): del self[i]

SharedResources = {} # Resource pool

class ResourceManager:
    def __init__ (self, owner):
        self.owner = owner

    def add (self, category, name, value):
        if not SharedResources.has_key(category):
            SharedResources[category] = ResourceCategory()
        SharedResources[category][name] = value

    def get (self, category, name):
        return SharedResources[category][name]

    def rem (self, category, name=None):
        if name==None: del SharedResources[category]
        else: del SharedResources[category][name]

    def __call__ (self, category):
        if not SharedResources.has_key(category):
            SharedResources[category] = ResourceCategory()
        return SharedResources[category]

    __getattr__ = __getitem__ = __call__
    # When securing, this must not be left as it is; it is insecure and can provide a way back to the SharedResources pool:
    has_category = has_key = SharedResources.has_key
Now a plugin capsule:
class plugin(dict):
    def __init__ (self, path, owner):
        dict.__init__(self)
        self["__name__"] = "__main__"
        # etc. etc.
        # And when adding the resource manager to the plugin, register it with this plugin as the owner
        self["SharedResources"] = ResourceManager(owner)
        # ...
        execfile(path, self, self)
        # ...
Example of a plugin script:
#-----------------------------------
# Get a category we want (using __call__()). Note: if a category doesn't exist, it is created automatically.
AudioResource = SharedResources("Audio")
# Use an MP3 resource (let's say a bytestring):
if not AudioResource.has_key("Beep"):
    f = open("./sounds/beep.mp3", "rb")
    AudioResource.Beep = f.read()
    f.close()
# Take a reference out for fast access and a nicer look:
beep = AudioResource.Beep
# BTW, immutables don't propagate as references by themselves, do they? A copy will be
# returned, so RAM usage will increase instead. Immutables should be wrapped in a composite data type.
This works perfectly but, as I said, messing with resources is far too easy here.
I would like an instance of ResourceManager() to be in charge of deciding whom to return which version of the stored data to.
So, my general approach would be this.
Have a central shared resource pool. Access through this pool would be read-only for everybody. Wrap all data in the shared pool so that no one "playing by the rules" can edit anything in it.
Each agent (plugin) maintains knowledge of what it "owns" at the time it loads it. It keeps a read/write reference for itself, and registers a reference to the resource to the centralized read-only pool.
When a plugin is loaded, it gets a reference to the central, read-only pool that it can register new resources with.
So, only addressing the issue of python native data structures (and not instances of custom classes), a fairly locked down system of read-only implementations is as follows. Note that the tricks that are used to lock them down are the same tricks that someone could use to get around the locks, so the sandboxing is very weak if someone with a little python knowledge is actively trying to break it.
import collections as _col
import sys

if sys.version_info >= (3, 0):
    immutable_scalar_types = (bytes, complex, float, int, str)
else:
    immutable_scalar_types = (basestring, complex, float, int, long)

# calling this will circumvent any control an object has on its own attribute lookup
getattribute = object.__getattribute__

# types that will be safe to return without wrapping them in a proxy
immutable_safe = immutable_scalar_types

def add_immutable_safe(cls):
    # decorator for adding a new class to the immutable_safe collection
    # Note: only ImmutableProxyContainer uses it in this initial
    # implementation
    global immutable_safe
    immutable_safe += (cls,)
    return cls

def get_proxied(proxy):
    # circumvent normal object attribute lookup
    return getattribute(proxy, "_proxied")

def set_proxied(proxy, proxied):
    # circumvent normal object attribute setting
    object.__setattr__(proxy, "_proxied", proxied)

def immutable_proxy_for(value):
    # Proxy for known container types, reject all others
    if isinstance(value, _col.Sequence):
        return ImmutableProxySequence(value)
    elif isinstance(value, _col.Mapping):
        return ImmutableProxyMapping(value)
    elif isinstance(value, _col.Set):
        return ImmutableProxySet(value)
    else:
        raise NotImplementedError(
            "Return type {} from an ImmutableProxyContainer not supported".format(
                type(value)))

@add_immutable_safe
class ImmutableProxyContainer(object):
    # the only names that are allowed to be looked up on an instance through
    # normal attribute lookup
    _allowed_getattr_fields = ()

    def __init__(self, proxied):
        set_proxied(self, proxied)

    def __setattr__(self, name, value):
        # never allow attribute setting through normal mechanism
        raise AttributeError(
            "Cannot set attributes on an ImmutableProxyContainer")

    def __getattribute__(self, name):
        # enforce attribute lookup policy
        allowed_fields = getattribute(self, "_allowed_getattr_fields")
        if name in allowed_fields:
            return getattribute(self, name)
        raise AttributeError(
            "Cannot get attribute {} on an ImmutableProxyContainer".format(name))

    def __repr__(self):
        proxied = get_proxied(self)
        return "{}({})".format(type(self).__name__, repr(proxied))

    def __len__(self):
        # works for all currently supported subclasses
        return len(get_proxied(self))

    def __hash__(self):
        # will error out if proxied object is unhashable
        proxied = getattribute(self, "_proxied")
        return hash(proxied)

    def __eq__(self, other):
        proxied = get_proxied(self)
        if isinstance(other, ImmutableProxyContainer):
            other = get_proxied(other)
        return proxied == other

class ImmutableProxySequence(ImmutableProxyContainer, _col.Sequence):
    _allowed_getattr_fields = ("count", "index")

    def __getitem__(self, index):
        proxied = get_proxied(self)
        value = proxied[index]
        if isinstance(value, immutable_safe):
            return value
        return immutable_proxy_for(value)

class ImmutableProxyMapping(ImmutableProxyContainer, _col.Mapping):
    _allowed_getattr_fields = ("get", "keys", "values", "items")

    def __getitem__(self, key):
        proxied = get_proxied(self)
        value = proxied[key]
        if isinstance(value, immutable_safe):
            return value
        return immutable_proxy_for(value)

    def __iter__(self):
        proxied = get_proxied(self)
        for key in proxied:
            if not isinstance(key, immutable_scalar_types):
                # If mutable keys are used, returning them could be dangerous.
                # If owner never puts a mutable key in, then integrity should
                # be okay. tuples and frozensets should be okay as keys, but
                # are not supported in this implementation for simplicity.
                raise NotImplementedError(
                    "keys of type {} not supported in "
                    "ImmutableProxyMapping".format(type(key)))
            yield key

class ImmutableProxySet(ImmutableProxyContainer, _col.Set):
    _allowed_getattr_fields = ("isdisjoint", "_from_iterable")

    def __contains__(self, value):
        return value in get_proxied(self)

    def __iter__(self):
        proxied = get_proxied(self)
        for value in proxied:
            if isinstance(value, immutable_safe):
                yield value
            else:
                yield immutable_proxy_for(value)

    @classmethod
    def _from_iterable(cls, it):
        return set(it)
NOTE: this is only tested on Python 3.4, but I tried to write it to be compatible with both Python 2 and 3.
Make the root of the shared resources a dictionary. Give an ImmutableProxyMapping of that dictionary to the plugins.
private_shared_root = {}
public_shared_root = ImmutableProxyMapping(private_shared_root)
Create an API where the plugins can register new resources to the public_shared_root, probably on a first-come-first-served basis (if it's already there, you can't register it). Pre-populate private_shared_root with any containers you know you're going to need, or any data you want to share with all plugins but you know you want to be read-only.
It might be convenient if the convention for the keys in the shared root mapping were all strings, like file-system paths (/home/dalen/local/python) or dotted paths like python library objects (os.path.expanduser). That way collision detection is immediate and trivial/obvious if plugins try to add the same resource to the pool.
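A tiny sketch of what that registration API could look like, building on the proxies above (register_resource is a hypothetical name for the first-come-first-served rule suggested here, not code from the question):

def register_resource(path, resource):
    # hypothetical helper: string keys, first come, first served
    if not isinstance(path, str):
        raise TypeError("resource keys should be strings")
    if path in private_shared_root:
        raise KeyError("resource {!r} is already registered".format(path))
    private_shared_root[path] = resource

register_resource("audio/beep", [1, 2, 3]) # the owner keeps the writable list
beep = public_shared_root["audio/beep"]    # other plugins get a read-only proxy
beep[0]          # reading works
beep.append(4)   # AttributeError: blocked by the proxy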

How to get object id from Kivy language to Python code

I created a widget in the Kivy language; it has some properties, some of which I use inside the widget's Python code. Since I'm using the "id" keyword inside the .kv file, they all have ids. I need to use the same ids inside the Python code. Here is the workaround:
CcaSwitch:
    id: out0_bit3
    name: "out0_bit3"
I'm using self.name in place of self.id in the Python code. How can I achieve the same goal without a "duplicate" entry for each widget?
Thanks.
Edit
I'm using the "name" variable inside Python code like this:
class CcaSwitch(BoxLayout):
    name = StringProperty()

    def __init__(self, *args, **kwargs):
        # ...
        Clock.schedule_interval(self.refresh, 5)

    def refresh(self, dt):
        self.comm.publish(get_bool_status(self.name), self.routing_key)
So, I need a string, which identifies this widget instance itself, in a DRY way.
I would like to achieve something like that:
In .kv file:
CcaSwitch:
    id: out0_bit3
In Python I'd like to use it like:
class CcaSwitch(BoxLayout):
    def __init__(self, *args, **kwargs):
        # ...
        self.name = self.id.__name__
        Clock.schedule_interval(self.refresh, 5)

    def refresh(self, dt):
        self.comm.publish(get_bool_status(self.name), self.routing_key)
You don't say why you need to use these ids in Python, but you can access a widget that has an id through its parent's ids dictionary (this is a special dictionary that can be queried using dot notation; e.g. if the dict variable is x and it has the element 'apple' in it, you can get its value with x.apple).
So for your example, if you have in .kv:
<ExampleWidget>:
    CcaSwitch:
        id: out0_bit3
        state: 'ON'
you can do in .py:
w = ExampleWidget()
print(w.ids.out0_bit3.state)
w.ids.out0_bit3 in this case returns a reference to that CcaSwitch widget.
EDIT
As I said in the comments, there's only one way to get the id value as a string - using its parent's ids dictionary. So the following will only work if the CcaSwitch widget has a parent widget (i.e. it's not the main widget):
class CcaSwitch(Widget):
    name = ''

    def __init__(self, *args, **kwargs):
        super(CcaSwitch, self).__init__(**kwargs)
        Clock.schedule_once(self.load_name)
        Clock.schedule_interval(self.refresh, 5)

    def load_name(self, *l):
        for id_str, widget in self.parent.ids.iteritems():
            if widget.__self__ is self:
                self.name = id_str
                return

    def refresh(self, dt):
        self.comm.publish(get_bool_status(self.name), self.routing_key)
What it does is go through the ids dict of its parent and extract its own id string. You cannot call self.load_name directly from __init__, because the widget might not have its parent yet, etc. So we schedule it for the next frame.
If the widget has no parent, you're out of luck, and you either have to do it like you did before, or, if the name doesn't have to be human-readable or the same across different runs, you can set self.name = str(self) in __init__.

Methods on descriptors

I'm trying to implement a wrapper around a redis database that does some bookkeeping, and I thought about using descriptors. I have an object with a bunch of fields: frames, failures, etc., and I need to be able to get, set, and increment the field as needed. I've tried to implement an Int-Like descriptor:
class IntType(object):
    def __get__(self,instance,owner):
        # issue a GET database command
        return db.get(instance.name)

    def __set__(self,instance,val):
        # issue a SET database command
        db.set(instance.name,val)

    def increment(self,instance,count):
        # issue an INCRBY database command
        db.hincrby(instance.name,count)

class Stream:
    _prefix = 'stream'
    frames = IntType()
    failures = IntType()
    uuid = StringType()

s = Stream()
s.frames.increment(1) # 'float' object has no attribute 'increment'
It seems like I can't access the increment() method in my descriptor. And I can't have increment defined in the object that __get__ returns: that would require an additional db query when all I want to do is increment! I also don't want increment() on the Stream class, as later on, when I want to have additional fields like strings or sets in Stream, I'd need to type-check the heck out of everything.
Does this work?
class Stream:
    _prefix = 'stream'

    def __init__(self):
        self.frames = IntType()
        self.failures = IntType()
        self.uuid = StringType()
Why not define the magic method __iadd__ as well as __get__ and __set__? This will allow you to do normal addition with assignment (+=) on the class. It will also mean you can treat the increment separately from the get function and thereby minimise the database accesses.
So change:
def increment(self,instance,count):
    # issue an INCRBY database command
    db.hincrby(instance.name,count)
to:
def __iadd__(self,other):
    # your code goes here
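For completeness, here is one way those pieces could fit together (my sketch, not the answerer's; it assumes the db handle and instance.name from the question). The subtlety is that s.frames += 1 first calls __get__ and then writes the result back through __set__, so the descriptor has to return itself from __get__ and recognise the write-back:

class IntType(object):
    def __get__(self, instance, owner):
        # return the descriptor itself so += dispatches to __iadd__;
        # remembering the owning instance makes the descriptor stateful,
        # so this is a sketch, not thread-safe
        self.instance = instance
        return self

    def __iadd__(self, count):
        # a single INCRBY, no GET beforehand
        db.hincrby(self.instance.name, count)
        return self

    def __set__(self, instance, val):
        if val is self:
            # write-back half of `s.frames += 1`; INCRBY already ran
            return
        # issue a SET database command
        db.set(instance.name, val)

s = Stream()
s.frames += 1  # one INCRBY database command

Note that plain reads of s.frames would now return the descriptor rather than the value, so a real implementation would combine this with a value wrapper like the one in the next answer.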
Try this:
class IntType(object):
    def __get__(self,instance,owner):
        # the inner class closes over `instance` from __get__
        class IntValue():
            def increment(self,count):
                # issue an INCRBY database command
                db.hincrby(instance.name,count)

            def getValue(self):
                # issue a GET database command
                return db.get(instance.name)
        return IntValue()

    def __set__(self,instance,val):
        # issue a SET database command
        db.set(instance.name,val)
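With the closure over instance in place, the call pattern from the question then works as intended (still assuming the db handle and a name attribute on Stream instances, as in the question):

s = Stream()
s.frames.increment(1)         # one INCRBY, no prior GET
frames = s.frames.getValue()  # an explicit GET only when you ask for it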
