I'm interested in wrapping pep8 so I can monkey-patch it before use. What is the "right" way to wrap a module?
If my module is named pep8 and lives in my path somewhere before the real pep8, any "import pep8" in my module will just import itself. I don't know in advance where the real pep8 will live, since this needs to be generalized for multiple systems. I can't remove the path where my pep8 wrapper lives from sys.path, because that too will be different depending on the system where it's executed.
I don't want to have to rename my pep8, because I'd like for the pep8 command to work without modification.
My pep8 is a directory containing a __init__.py with the following contents:
from pep8 import *
MAX_LINE_LENGTH = 119
In Python 2.5+, you can make absolute imports the default with from __future__ import absolute_import.
To monkey-patch a Python module this way, use relative imports within your project to reach your overriding module.
For this example, I will assume you are distributing a library. Other project layouts require a little finessing, since the __main__ Python file cannot use relative imports.
myproject/__init__.py:
from . import pep8 # optional "as pep8"
# The rest of your code, using this pep8 module.
myproject/pep8/__init__.py:
from __future__ import absolute_import
from pep8 import *
MAX_LINE_LENGTH = 119
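Flattened into one runnable file, the same star-import-then-override pattern looks like this. As a sketch only: textwrap stands in for pep8 here (an assumption, since pep8 may not be installed), and the width=119 override mirrors the MAX_LINE_LENGTH override above:

```python
# Sketch of the wrap-and-override pattern, using the stdlib textwrap
# as a stand-in for the real module being wrapped.
from __future__ import absolute_import  # Python 2: skip any local shadow

import functools
import textwrap as _real

from textwrap import *  # re-export everything from the real module

@functools.wraps(_real.fill)
def fill(text, width=119, **kwargs):  # override: patched default width
    return _real.fill(text, width=width, **kwargs)

print(fill("word " * 50))  # wraps at 119 columns instead of the default 70
```

Callers that import this wrapper get the real module's API, except for the names deliberately redefined after the star import.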
I realize this is an old question, but it still comes up in Google searches. For cases where this is actually desired (e.g. wrapping a protected library), I suggest the wrapt package.
I actually use this for instances where I have a model that is part of a core set but can be extended by other applications (such as front-ends like flask apps). The core model is protected but can be extended by other developers.
https://pypi.python.org/pypi/wrapt
So I have thought about it this way for a Python 2.7 project. It would be composed of two independent parts requiring a common class (module) file in a third package:
SomeRootFolder/Package1Folder/manyPythonModuleFiles.py
SomeRootFolder/Package2Folder/manyPythonModuleFiles.py
SomeRootFolder/SharedPackageFolder/OneCommonClassNeedsToBeShared.py
What I want to do is to import the common class in the shared package from both packages. The first two packages do not need to interact with each other, but both need that one class. The Python programs might be run from a console opened inside the two package folders themselves, such as:
cd Package1Folder
python SomeMainFile.py
If it is easier, the Python call could be like python Package1Folder/SomeMainFile.py but I need to plan this.
Could you provide how I could do the relative imports from package 1 or 2 for a file in the third shared package? Do I need an __init__.py file in the SomeRootFolder folder? I am always confused by relative imports and those import standards and syntax that are different between Python 2 and 3. Also could you validate to me that this is an acceptable way to proceed? Any other ideas?
Thanks all!
If you want to use relative imports, you need an __init__.py in the SharedPackageFolder folder (and, for the .. form below to resolve, __init__.py files in SomeRootFolder and the package folders too, so everything forms one package). Then you can import OneCommonClassNeedsToBeShared.py like this:
from ..SharedPackageFolder import OneCommonClassNeedsToBeShared
See more details in the Rationale for Relative Imports section of PEP 328:
With the shift to absolute imports, the question arose whether
relative imports should be allowed at all. Several use cases were
presented, the most important of which is being able to rearrange the
structure of large packages without having to edit sub-packages. In
addition, a module inside a package can't easily import itself without
relative imports.
You can also use absolute imports. Relative imports are no longer strongly discouraged, but using absolute_import is strongly suggested in some cases.
You need to make sure SomeRootFolder is on your PYTHONPATH, or mark that folder as a sources root; this makes it much easier to import packages or scripts in a large project, though you should still watch out for name clashes when using absolute imports.
from SharedPackageFolder import OneCommonClassNeedsToBeShared
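To make the absolute-import route concrete, here is a self-contained sketch that builds the question's layout in a temporary directory and puts the root on sys.path before importing; the Shared class name and its contents are invented for illustration:

```python
# Build the layout from the question in a temp dir, then import the
# shared module absolutely once the root folder is on sys.path.
import os
import sys
import tempfile

root = tempfile.mkdtemp()  # plays the role of SomeRootFolder
shared = os.path.join(root, "SharedPackageFolder")
os.makedirs(shared)
open(os.path.join(shared, "__init__.py"), "w").close()
with open(os.path.join(shared, "OneCommonClassNeedsToBeShared.py"), "w") as f:
    f.write("class Shared(object):\n    answer = 42\n")  # invented class

sys.path.insert(0, root)  # what PYTHONPATH=SomeRootFolder would achieve

from SharedPackageFolder import OneCommonClassNeedsToBeShared
print(OneCommonClassNeedsToBeShared.Shared.answer)  # prints 42
```

Setting PYTHONPATH (or installing the project) achieves the same effect as the sys.path.insert line without editing code.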
Absolute imports. From PEP 8:
Relative imports for intra-package imports are highly discouraged. Always use the absolute package path for all imports. Even now that
PEP 328 [7] is fully implemented in Python 2.5, its style of explicit
relative imports is actively discouraged; absolute imports are more
portable and usually more readable.
By the way, relative imports in Python 3 might raise SystemError; have a look at the question Relative imports in Python 3.
@vaultah offers some solutions there; they might be helpful.
Hope this helps.
I am writing a compiler in Python, using the PLY (Python Lex-Yacc) library to 'compile' the compiler. The compiler has to go through a lot of rules (the
number of just the core rules is eventually going to be a little less than a hundred, and they can be extended). So to keep the different types of rules separate, I made many Python modules in a single modules directory.
To include all the rules, I don't need the modules themselves bound in the current namespace, but I do need the rules (implemented as Python functions) present in it. Once they simply exist there, the compiler's input will be properly tokenized, parsed, etc.
Here's what I've read about and tried:
using __import__, getattr, and sys.modules (very raw and in general not preferred)
the importlib library (how do I get everything inside the module?)
a lot of fiddling with __init__.py and just trying to from modules import * which will import everything in the modules as well
But none of these seem entirely satisfactory to me. I can't do precisely what I want to do with any of them. So my question is: how can I import some of the attributes of a Python module in a subdirectory into the running namespace of a top-level module?
Thanks for your attention!
You want to use an existing plugin library like stevedore. It will give you the tools to enumerate files that can be imported, and tools to import those modules.
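If you'd rather stay in the stdlib, the core of what a plugin library does can be sketched with pkgutil and importlib: enumerate the submodules of a package, import each one, and copy the rule functions into a target namespace. The package, file, and rule names below are invented for illustration; the p_ prefix follows PLY's naming convention for parser rules:

```python
# Stdlib sketch of plugin-style loading: import every module in a
# package and pull its rule functions into a chosen namespace.
import importlib
import os
import pkgutil
import sys
import tempfile
import types

def load_rules(package_name, namespace, prefix="p_"):
    """Import each submodule of package_name and copy functions whose
    names start with prefix (PLY-style rules) into namespace."""
    package = importlib.import_module(package_name)
    for _finder, name, _ispkg in pkgutil.iter_modules(package.__path__):
        module = importlib.import_module(package_name + "." + name)
        for attr, value in vars(module).items():
            if attr.startswith(prefix) and isinstance(value, types.FunctionType):
                namespace[attr] = value

# Demo with a throwaway package built on the fly (names are invented):
root = tempfile.mkdtemp()
pkg_dir = os.path.join(root, "rules")
os.makedirs(pkg_dir)
open(os.path.join(pkg_dir, "__init__.py"), "w").close()
with open(os.path.join(pkg_dir, "expressions.py"), "w") as f:
    f.write("def p_expression(p):\n    'expression : NUMBER'\n    p[0] = p[1]\n")

sys.path.insert(0, root)
load_rules("rules", globals())
print("p_expression" in globals())  # True: the rule is now in this namespace
```

PLY scans the caller's namespace for p_/t_ functions, so after load_rules runs, yacc.yacc() in the same module would see every rule from every submodule.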
I have this package:
mypackage/
__init__.py
a.py
b.py
And I want to import things from module a to module b, does it make sense to write in module b
from mypackage.a import *
or should I just use
from a import *
Both options will work, I'm just wondering which is better (the 2nd makes sense because it's in the same level but I'm considering the 1st to avoid collisions, for example if the system is running from a folder that contains a file named a.py).
You can safely use number 2 because there shouldn't be any collisions: you'll always be importing a module from the same package as the current one. Note, however, that if your module has the same name as one of the standard library modules, it will be imported instead of the standard one. From the documentation:
When a module named spam is imported, the interpreter first searches
for a built-in module with that name. If not found, it then searches
for a file named spam.py in a list of directories given by the
variable sys.path. sys.path is initialized from these locations:
the directory containing the input script (or the current directory).
PYTHONPATH (a list of directory names, with the same syntax as the
shell variable PATH).
the installation-dependent default.
After initialization, Python programs can modify sys.path. The
directory containing the script being run is placed at the beginning
of the search path, ahead of the standard library path. This means
that scripts in that directory will be loaded instead of modules of
the same name in the library directory. This is an error unless the
replacement is intended. See section Standard Modules for more
information.
The option from mypackage.a import * can be used for consistency reasons all over the project. In some modules you will have to do absolute imports anyway. Thus you won't have to think whether the module is in the same package or not and simply use a uniform style in the entire project. Additionally this approach is more reliable and predictable.
Python style guidelines don't recommend using relative imports:
Relative imports for intra-package imports are highly discouraged.
Always use the absolute package path for all imports. Even now that
PEP 328 is fully implemented in Python 2.5, its style of explicit
relative imports is actively discouraged; absolute imports are more
portable and usually more readable.
Since Python 2.5 a new syntax for intra-package relative imports has been available: use . to refer to the current package and .. to refer to the package one level above.
from . import echo
from .. import formats
from ..filters import equalizer
You should use from mypackage.a import things, you, want.
There are two issues here, the main one is relative vs absolute imports, the semantics of which changed in Python 3, and can optionally be used in Python 2.6 and 2.7 using a __future__ import. By using mypackage.a you guarantee that you will get the code you actually want, and it will work reliably on future versions of Python.
The second thing is that you should avoid import *, as it can potentially mask other code. What if the a.py file gained a function called sum? It would silently override the builtin one. This is especially bad when importing your own code in other modules, as you may well have reused variable or function names.
Therefore, you should only ever import the specific functions you need. Using pyflakes on your sourcecode will then warn you when you have potential conflicts.
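The masking hazard is easy to demonstrate in a few lines; here a local definition plays the role of a name pulled in by from a import *:

```python
# A later definition (or star-import) silently shadows the builtin.
print(sum([1, 2, 3]))  # the builtin: prints 6

def sum(iterable):  # pretend this arrived via "from a import *"
    return "oops, shadowed"

print(sum([1, 2, 3]))  # prints: oops, shadowed
```

Nothing warns you at import time; the builtin is simply no longer reachable by its bare name in this module, which is exactly why explicit imports plus a linter like pyflakes are recommended.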
Suppose I have a package that contains modules:
SWS/
__init__.py
foo.py
bar.py
time.py
and the modules need to refer to functions contained in one another. It seems like I run into problems with my time.py module since there is a standard module that goes by the same name.
For instance, in the case that my foo.py module requires both my SWS.time and the standard Python time module, I run into trouble since the interpreter will look inside the package and find my time.py module before it comes across the standard time module.
Is there any way around this? Is this a no-no situation and should modules names not be reused?
Any solutions and opinions on package philosophy would be useful here.
Reusing names of standard functions/classes/modules/packages is never a good idea. Try to avoid it as much as possible. However there are clean workarounds to your situation.
The behaviour you see, your SWS.time being imported instead of the stdlib time, is due to the semantics of import in old Python versions (2.x). To fix it, add:
from __future__ import absolute_import
at the very top of the file. This changes the semantics of import to those of Python 3.x, which are much more sensible. With absolute imports in effect, the statement:
import time
Will only refer to a top-level module. So the interpreter will not consider your SWS.time module when executing that import inside the package, but it will only use the standard library one.
If a module inside your package needs to import SWS.time you have the choice of:
Using an explicit relative import:
from . import time
Using an absolute import:
import SWS.time as time
So, your foo.py would be something like:
from __future__ import absolute_import
import time
from . import time as SWS_time
It depends on what version of Python you're using. If your targeted Python version is 2.4 or older (in 2015, I sure hope not), then yes it would be bad practice as there is no way (without hacks) to differentiate the two modules.
However, in Python 2.5+, I think that reusing standard lib module names within a package namespace is perfectly fine; in fact, that is the spirit of PEP328.
As Python's library expands, more and more existing package internal modules suddenly shadow standard library modules by accident. It's a particularly difficult problem inside packages because there's no way to specify which module is meant. To resolve the ambiguity, it is proposed that foo will always be a module or package reachable from sys.path. This is called an absolute import.
The python-dev community chose absolute imports as the default because they're the more common use case and because absolute imports can provide all the functionality of relative (intra-package) imports -- albeit at the cost of difficulty when renaming package pieces higher up in the hierarchy or when moving one package inside another.
Because this represents a change in semantics, absolute imports will be optional in Python 2.5 and 2.6 through the use of from __future__ import absolute_import
SWS.time is clearly not the same thing as time and as a reader of the code, I would expect SWS.time to not only use time, but to extend it in some way.
So, if SWS.foo needs to import SWS.time, then it should use the absolute path:
# in SWS.foo
# I would suggest renaming *within*
# modules that use SWS.time so that
# readers of your code aren't confused
# with which time module you're using
from SWS import time as sws_time
Or, it should use an explicit relative import as in Bakuriu's answer:
# in SWS.foo
from . import time as sws_time
In the case that you need to import the standard lib time module within the SWS.time module, you will first need to import the future feature (only for Python 2.5+; Python 3+ does this by default):
# inside of SWS.time
from __future__ import absolute_import
import time
time.sleep(28800) # time for bed
Note: from __future__ import absolute_import only affects import statements within the module in which the future feature is imported; it will not affect any other module (which would be detrimental if another module depended on relative imports).
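You can check this behaviour with a self-contained experiment (Python 3, where absolute imports are the default): build an SWS package containing a time.py, have a module inside it do a bare import time, and confirm it gets the stdlib module. The file contents below are invented for the demo:

```python
# Demonstrate that, under absolute imports, a bare "import time" inside
# a package reaches the stdlib, not a sibling time.py.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
pkg = os.path.join(root, "SWS")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "time.py"), "w") as f:
    f.write("MARKER = 'package-local time'\n")
with open(os.path.join(pkg, "foo.py"), "w") as f:
    # bare "import time" executed inside the package
    f.write("import time\nGOT_STDLIB = hasattr(time, 'sleep')\n")

sys.path.insert(0, root)
from SWS import foo
print(foo.GOT_STDLIB)  # True: foo imported the stdlib time, not SWS/time.py
```

Under Python 2 without the future import, the same layout would bind SWS/time.py instead, which is the ambiguity the question is about.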
As others have said, this is generally a bad idea.
That being said, if you're looking for potential workarounds, or a better understanding of the problem, I suggest you read the following SO questions:
Importing from builtin library when module with same name exists
How to access a standard-library module in Python when there is a local module with the same name?
Yeah, there's really no good way around it. Try not to name your modules after standard modules. If you really want to call your module time, I'd recommend using _time.py instead. Even if there were a way to do it, it would make your code hard to read and confusing wherever the two time modules meet.
I'm developing a Python application; it has all its code in one package and runs inside this of course. The application's Python package is of no interest from the interpreter to the user, it's simply a GUI application.
The question is, which style is preferred when importing modules inside the application package
from application import settings, utils
or
from . import settings, utils
That is, I can either spell out the package name (here 'application') or refer to the current package with ".".
This is a Free software package so the possibility exists that someone wants to make a fork of my application and change its name. In that case, alternative 1 is a slight nuisance. Still, I use style 1 all the time (although early code uses style 2 in some places), since style 1 looks much better.
Are there any arguments for my style (1) that I have missed? Or is it stupid not to go with style 2?
The Python Style Guide recommends explicitly against relative imports (the . style):
Relative imports for intra-package imports are highly discouraged.
Always use the absolute package path for all imports.
Even now that PEP 328 [7] is fully implemented in Python 2.5,
its style of explicit relative imports is actively discouraged;
absolute imports are more portable and usually more readable.
I tend to agree. Relative imports mean the same module is imported in different ways in different files, and requires that I remember what I'm looking at when reading and writing. Not really worth it, and a rename can be done with sed.
Besides the issue of renaming, the only problem with absolute imports is that import foo might mean the top-level module foo or a submodule foo beneath the current module. If this is a problem, you can use from __future__ import absolute_import; this is standard in Python 3.