How do I implement a class browser in wxPython? Should I scan the whole code, or is there a function for this in wxPython?
Your question isn't entirely clear about what you want, but I'll make some assumptions and show you how to do one of the possible interpretations of what you're asking.
I'll assume you have a string with the contents of a Python script, or a fragment from your cut-and-paste repository, or whatever, and you just want to know the top-level classes defined in that string of source code.
You probably don't want to execute that code. For one thing, who knows what arbitrary strange code can do to your environment? For another, if you're building a class browser, you probably want it to work on code that depends on other code you may not have access to, so you can't execute it.
So, you want to parse it. The easiest way to do that is to get Python to do it for you, using the ast module:
import ast

with open('mymodule.py') as f:
    mycode = f.read()
myast = ast.parse(mycode)
for thing in myast.body:
    if isinstance(thing, ast.ClassDef):
        print('class {}({})'.format(thing.name,
              ', '.join(base.id for base in thing.bases)))
        for subthing in thing.body:
            if isinstance(subthing, ast.FunctionDef):
                print('  def {}'.format(subthing.name))
When I run this against, say, the ast.py from Python 3.3's stdlib, I get this:
class NodeVisitor(object)
def visit
def generic_visit
class NodeTransformer(NodeVisitor)
def generic_visit
If that's not what you wanted, you'll have to explain what you do want. If, for example, you want all class definitions, even local ones within functions and methods… well, the names of those two classes just dumped out above should help.
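For the nested case, a sketch along the lines those class names hint at (the file name here is just for illustration) might be a starting point:

import ast

class ClassCollector(ast.NodeVisitor):
    """Collects every ClassDef in the tree, including nested and local ones."""
    def __init__(self):
        self.classes = []

    def visit_ClassDef(self, node):
        self.classes.append(node.name)
        # Keep walking so classes defined inside this one are found too.
        self.generic_visit(node)

with open('mymodule.py') as f:
    tree = ast.parse(f.read())

collector = ClassCollector()
collector.visit(tree)
print(collector.classes)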
Python's dir() is nice, don't get me wrong. But I'd like a list that actually tells me what kind of things the objects are: methods, variables, things that are inherited, etc.
As best I can tell, dir always returns a simple list of strings with no indication as to what the objects are. I checked the documentation for dir() and I don't see any way of getting better information.
Are there any other packages or tools for doing this?
I'd like a list that actually tells me what kind of things the objects are: methods, variables, things that are inherited, etc.
The pyclbr built-in module might be useful for you if you are interested in the classes in a certain *.py file. Let the content of somefile.py be
class MyClass():
    def parent_method(self):
        return None

class MyChildClass(MyClass):
    def child_method(self):
        return None
then
import pyclbr
objects = pyclbr.readmodule("somefile")
print(objects['MyChildClass'].super)
print(objects['MyChildClass'].methods)
output
['MyClass']
{'child_method': 6}
Explanation: pyclbr does not execute the code, but extracts information from the Python source. In the example above, .super tells us that MyChildClass is a child of MyClass, and .methods tells us that MyChildClass defines the method child_method on line 6 of the file.
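If, beyond classes, you want a rough "what kind of thing is each name" listing closer to what the question asks for, a small sketch using pyclbr.readmodule_ex (which also reports top-level functions) might look like this; treat it as an illustration rather than a full browser:

import pyclbr

# Describe every top-level class and function that pyclbr finds in somefile.py.
for name, obj in pyclbr.readmodule_ex("somefile").items():
    if isinstance(obj, pyclbr.Class):
        bases = [getattr(base, 'name', base) for base in obj.super]
        print("class {} (bases: {}, methods: {})".format(
            name, bases, sorted(obj.methods)))
    elif isinstance(obj, pyclbr.Function):
        print("function {} (line {})".format(name, obj.lineno))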
I have an application I'm working on in Python 2.7 which has several classes that need to interact with each other before returning everything back to the main program for output.
A brief example of the code would be:
class foo_network():
"""Handles all network functionality"""
def net_connect(self, network):
"""Connects to the network destination"""
pass
class foo_fileparsing():
"""Handles all sanitation, formatting, and parsing on received file"""
def file_check(self, file):
"""checks file for potential problems"""
pass
Currently I have a main file/function which instantiates all the classes and then handles passing data back and forth, as necessary, between them and their methods. However this seems a bit clumsy.
As such I'm wondering two things:
What would be the most 'Pythonic' way to handle this?
What is the best way to handle this for performance and memory usage?
I'm wondering if I should just instantiate objects of one class inside another (from the example, say, creating a foo_fileparsing object within the foo_network class if that is the only class which will be calling it), rather than my current approach of returning everything to the main function and passing it between objects that way.
Unfortunately I can't find a PEP or other resource that seems to address this type of situation.
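For reference, the composition version I'm describing would look roughly like this (the wiring is hypothetical, not my actual code):

class foo_fileparsing():
    """Handles all sanitation, formatting, and parsing on received file"""
    def file_check(self, file):
        pass

class foo_network():
    """Handles all network functionality"""
    def __init__(self):
        # foo_network owns the parser it needs, instead of main() passing one in.
        self.parser = foo_fileparsing()

    def net_connect(self, network):
        data = None  # placeholder for whatever the network call returns
        return self.parser.file_check(data)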
You can use modules, and have every class in its own module.
Then you can use import to pull in only the particular names you need from that module.
All you need to do for that is create a directory with the same name as your class and put an __init__.py file in that directory, which tells Python to treat that directory as a package.
Then, for example, the foo_network folder contains a file named foo_network.py and a file __init__.py, and in foo_network.py the code is
class foo_network():
"""Handles all network functionality"""
def net_connect(self, network):
"""Connects to the network destination"""
pass
and in any other file you can simply use
from foo_network.foo_network import foo_network
It will only import that particular class. This way your code will not look messy and you will be importing only what is required.
You can also do
from foo_network import *
to import everything at once.
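As a rough sketch of the layout described above (all file and host names are only illustrative):

foo_network/
    __init__.py       # marks the directory as a package
    foo_network.py    # contains the foo_network class from the question

If __init__.py re-exports the class, callers get an even shorter import:

# foo_network/__init__.py
from .foo_network import foo_network

# elsewhere in the application
from foo_network import foo_network
net = foo_network()
net.net_connect('some.host')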
Introduction
Pydev is a great Eclipse plugin that lets us write Python code easily.
It can even give autocompletion suggestion when I do this:
from package.module import Random_Class
x = Random_Class()
x. # the autocompletion will be popped up,
# showing every method & attribute inside Random_Class
That is great!!!
The Problem (And My Question)
However, when I don't use an explicit import and use __import__ instead, for example, I don't get the same autocompletion.
import sys

import_location = ".".join(('package', 'module'))
__import__(import_location, globals(), locals(), ['*'])
My_Class = getattr(sys.modules[import_location], 'Random_Class')
x = My_Class()
x. # I expect autocompletion, but nothing happened
Question: is there any way (in pydev or any IDE) to make the second one also
show autocompletion?
Why do I need to do this?
Well, I'm making a simple MVC framework, and I want to provide something like load_model, load_controller, and load_view that still works with autocompletion (or at least can be made to work).
So, instead of leaving users to do this (although I don't forbid them from doing so):
from applications.the_application.models.the_model import The_Model
x = The_Model()
x. # autocompletion works
I want to let users do this:
x = load_model('the_application', 'the_model')()
x. # autocompletion still works
The "applications" part is actually configurable by another script, and I don't want users to change all of their importing model/controller part everytime they change the configuration. Plus, I think load_model, load_controller, and load_view make MVC pattern shown more obvious.
Unexpected Answer
I know some tricks, such as doing this (as people do with web2py):
import sys

import_location = ".".join(('package', 'module'))
__import__(import_location, globals(), locals(), ['*'])
My_Class = getattr(sys.modules[import_location], 'Random_Class')
x = My_Class()
if 0:
    from package.module import Random_Class
    x = Random_Class()
x. # Now autocompletion is going to work
and I don't want to do this, since it will only add unnecessary extra work.
I don't want any "don't try to be clever" comments; I have had enough of them.
I don't want any "dynamic import is evil" comments; I'm not a purist.
I don't want any "just use Django, or Pylons, or whatever" comments; such comments are unrelated to my question.
I have done this before. This may be slightly different from your intended method, so let me know if it doesn't apply.
I dynamically import different modules that all subclass a master class, using similar code to your example. Because the subclassing module already imports the master, I don't need to import it in the main module.
To get highlighting, the solution was to import the master class into the main module first, even though it wasn't used directly. In my case it was a good fallback if the particular subclass didn't exist, but that's an implementation detail.
This only works if your classes all inherit from one parent.
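In code, the idea is roughly this (the plugins package, MasterPlugin, and load_plugin are made-up names, not the poster's actual ones):

import importlib

from plugins.master import MasterPlugin  # explicit import so the IDE can see the base class

def load_plugin(name):
    """Dynamically load a plugin class, falling back to the base class."""
    try:
        module = importlib.import_module('plugins.' + name)
        return getattr(module, name.title())
    except (ImportError, AttributeError):
        return MasterPlugin

x = load_plugin('example')()
# PyDev can now at least offer MasterPlugin's methods when completing on x.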
Not really an answer to my own question. However, I can change the approach: instead of providing load_model(), I can use a relative import. Something like this:
from ..models.my_model import Model_Class as Great_Model
m = Great_Model()
I am doing dynamic class generation that could be statically determined at "compile" time. The simple case that I have right now looks more or less like this:
class Base(object):
    def __init__(self, *args, **kwargs):
        self.do_something()

def ClassFactory(*args):
    some_pre_processing()
    class GenericChild(Base):
        def __init__(self, **kwargs):
            self.some_processing()
            super(GenericChild, self).__init__(*args, **kwargs)
    return GenericChild

Child1 = ClassFactory(1, 'Child_setting_value1')
Child2 = ClassFactory(2, 'Child_setting_value2')
Child3 = ClassFactory(3, 'Child_setting_value3')
On import, the Python interpreter seems to compile the file to bytecode and then execute it (thus generating Child1, Child2, and Child3) once per Python instance.
Is there a way to tell Python to compile the file, execute it once to unpack the Child classes, then compile that into the pyc file, so that the unpacking only happens once (even across successive executions of the Python script)?
I have other use cases that are more complicated and expansive, so simply getting rid of the factory by hand-writing the Child classes is not really an option. Also, I would like to avoid an extra preprocessor step if possible (like using the C-style macros with the C preprocessor).
No, you'd have to generate Python code instead, in which those classes are 'baked' into the source.
Use some form of string templating to generate the Python source code, save it to .py files, then byte-compile those.
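A very small sketch of that idea (the template, the base_module import, and the output file name are only illustrative):

import py_compile

TEMPLATE = """\
class Child{num}(Base):
    setting = {setting!r}
"""

# Generate the source once, then byte-compile it so later runs just import it.
with open('generated_children.py', 'w') as f:
    f.write('from base_module import Base\n\n')
    for num, setting in enumerate(['Child_setting_value1',
                                   'Child_setting_value2',
                                   'Child_setting_value3'], start=1):
        f.write(TEMPLATE.format(num=num, setting=setting))

py_compile.compile('generated_children.py')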
However, the class generation happens only once on startup. Is it really that great a cost to generate these?
If there's no real need to have the child classes separate and you just want a 'standard configuration' for those particular sets of objects, you could just make your factory a class with the configuration stored in it. Each instance will be able to spit out GenericChildren with the appropriate configuration, completely bypassing the runtime generation of classes (and the debugging headaches associated with it).
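For instance, a configured factory along those lines, built on the Base class from the question (number and setting are made-up attribute names), could look like:

class ChildFactory(object):
    """Holds the per-child configuration once; no new classes are generated."""
    def __init__(self, number, setting):
        self.number = number
        self.setting = setting

    def make(self, **kwargs):
        # Build an ordinary Base instance and attach the stored configuration.
        obj = Base(**kwargs)
        obj.number = self.number
        obj.setting = self.setting
        return obj

child1_factory = ChildFactory(1, 'Child_setting_value1')
obj = child1_factory.make()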
I'm writing a plugin system for my program and I can't get past one thing:
class ThingLoader(object):
    '''
    Loader class
    '''
    def loadPlugins(self):
        '''
        Get all the plugins from the plugins folder
        '''
        from diones.thingpad.plugin.IntrospectionHelper import loadClasses
        classList = loadClasses('./plugins', IPlugin)  # gets a list of plugin classes
        self.plugins = {}  # dictionary to be filled with tuples of objects
                           # and their states: activated, deactivated
        classList[0](self)  # runs nicely
        foo = classList[1]
        print foo  # prints <class 'TestPlugin.TestPlugin'>
        foo(self)  # raises an exception
The test plugin looks like this:
import diones.thingpad.plugin.IPlugin as plugin
class TestPlugin(plugin.IPlugin):
'''
classdocs
'''
def __init__(self, loader):
self.name='Test Plugin'
super(TestPlugin, self).__init__(loader)
Now the IPlugin looks like this:
class IPlugin(object):
'''
classdocs
'''
name=''
def __init__(self, loader):
self.loader=loader
def activate(self):
pass
All the IPlugin classes work flawlessly by themselves, but when they are called by ThingLoader the program gets an exception:
File "./plugins\TestPlugin.py", line 13, in __init__
super(TestPlugin, self).__init__(loader) NameError:
global name 'super' is not defined
I looked all around and I simply don't know what is going on.
‘super’ is a builtin. Unless you went out of your way to delete builtins, you shouldn't ever see “global name 'super' is not defined”.
I'm looking at your user web link where there is a dump of IntrospectionHelper. It's very hard to read without the indentation, but it looks like you may be doing exactly that:
built_in_list = ['__builtins__', '__doc__', '__file__', '__name__']
for i in built_in_list:
if i in module.__dict__:
del module.__dict__[i]
That's the original module dict you're changing there, not an informational copy you are about to return! Delete these members from a live module and you can expect much more than ‘super’ to break.
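If the goal was only a 'cleaned' view of a module's contents, filtering a copy avoids that breakage entirely (a sketch, not the original helper):

def public_members(module):
    """Return a filtered *copy* of a module's namespace; the module itself is
    left untouched, so builtins such as super keep working."""
    hidden = ('__builtins__', '__doc__', '__file__', '__name__')
    return dict((name, value) for name, value in module.__dict__.items()
                if name not in hidden)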
It's very hard to keep track of what that module is doing, but my reaction is there is far too much magic in it. The average Python program should never need to be messing around with the import system, sys.path, and monkey-patching __magic__ module members. A little bit of magic can be a neat trick, but this is extremely fragile. Just off the top of my head from browsing it, the code could be broken by things like:
name clashes with top-level modules
any use of new-style classes
modules supplied only as compiled bytecode
zipimporter
From the incredibly round-about functions like getClassDefinitions, extractModuleNames and isFromBase, it looks to me like you still have quite a bit to learn about the basics of how Python works. (Clues: getattr, module.__name__ and issubclass, respectively.)
In this case now is not the time to be diving into import magic! It's hard. Instead, do things The Normal Python Way. It may be a little more typing to say at the bottom of a package's mypackage/__init__.py:
from mypackage import fooplugin, barplugin, bazplugin
plugins= [fooplugin.FooPlugin, barplugin.BarPlugin, bazplugin.BazPlugin]
but it'll work and be understood everywhere without relying on a nest of complex, fragile magic.
Incidentally, unless you are planning on some in-depth multiple inheritance work (and again, now may not be the time for that), you probably don't even need to use super(). The usual “IPlugin.__init__(self, ...)” method of calling a known superclass is the straightforward thing to do; super() is not always “the newer, better way of doing things” and there are things you should understand about it before you go charging into using it.
Unless you're running a version of Python earlier than 2.2 (pretty unlikely), super() is definitely a built-in function (available in every scope, and without importing anything).
May be worth checking your version of Python (just start up the interactive prompt by typing python at the command line).
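A quick way to check both from a script or the interactive prompt (nothing here depends on the poster's setup):

import sys
print(sys.version)
print(super)   # prints <type 'super'> on Python 2, <class 'super'> on Python 3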