Python - Should I alias imports with underscores?

This is a conceptual question rather than an actual problem; I wanted to ask the great big Internet crowd for feedback.
We all know imported modules end up in the namespace of the importing module:
# Module a:
import b
__all__ = ['f']
f = lambda: None
That allows you to do this:
import a
a.b # <- Valid attribute
Sometimes that's great, but most imports are side effects of the feature your module provides. In the example above I don't mean to expose b as a valid interface for callers of a.
To counteract that we could do:
import b as _b
This marks the import as private. But I can't find that practice described anywhere, nor does PEP 8 talk about using aliasing to mark imports as private, so I take it it's not common practice. From a certain angle, though, I'd say it's definitely semantically clearer, because it cleans up the exposed bits of your module, leaving only the interfaces you actually mean to expose. Working in an IDE with autocomplete, it also makes the suggested list much slimmer.
My question boils down to this: have you seen that pattern in use? Does it have a name? What arguments would go against using it?
I have not had success using the __all__ functionality to hide the b import. I'm using PyCharm and do not see the autocomplete list change.
E.g. from some module I can do:
import a
And the autocomplete box shows both b and f.
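To make the setup concrete, here is a minimal sketch of the modules I am describing (the file names are just the a and b from above, the client file is invented for the example):
# b.py -- an implementation detail of a
def helper():
    return 42

# a.py
import b

__all__ = ['f']
f = lambda: None

# client.py
import a

print(a.f)       # the intended public interface
print(a.b)       # still works: __all__ does not hide attributes

from a import *  # the wildcard form does honour __all__: only f is bound here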

While Martijn Pieters says that no one actually uses underscore-hiding module imports, that's not exactly true. Traces of this technique can easily be seen in Python's standard library itself (see a related question). Let's check it:
$ git clone --depth 1 git@github.com:python/cpython.git
$ cd cpython/Lib
$ find . -iname '*.py' | xargs grep 'as \+_' | wc -l
183
$ find . -iname '*.py' | xargs grep '^import' | wc -l
4578
So about 4% of all imports are aliased to underscore-prefixed names: not a majority, but far from "no one". There are also some examples in the numpy and matplotlib packages.
For me, this import-underscoring is the only right way to import a module without exposing it publicly. Unfortunately, it rather spoils the look of the code, so many developers avoid it. But it has some advantages over the __all__ approach:
A library user can tell whether a name is private without consulting the documentation, just by looking at the name. Looking at __all__ alone is not enough to tell private from public, since some public names may not be listed there.
There is no need to maintain a refactoring-unfriendly list of entity names.
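For illustration, a minimal sketch of a module written in this style (the module and function names are invented for the example); autocomplete on the module then only suggests the names meant to be public:
# mymodule.py -- imports aliased to private names
import json as _json
import collections as _collections

def load_counts(path):
    """Public API: read a JSON list from *path* and count its items."""
    with open(path) as fh:
        data = _json.load(fh)
    return _collections.Counter(data)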
In conclusion, both _name and __all__ are just plain evil, but what actually needs fixing is Python's module system, designed under the "simple is better than complex" mantra. Compare it, for example, with the way modules behave in Haskell.
UPD:
It looks like PEP 8 has already answered this question in its "Public and Internal Interfaces" section:
Even with __all__ set appropriately, internal interfaces (packages, modules, classes, functions, attributes or other names) should still be prefixed with a single leading underscore.

No one uses that pattern, and it is not named.
That's because the proper method to use is to explicitly mark your exported names with the __all__ variable. IDEs will honour this variable, as do tools like help().
Quoting the import statement documentation:
The public names defined by a module are determined by checking the module’s namespace for a variable named __all__; if defined, it must be a sequence of strings which are names defined or imported by that module. The names given in __all__ are all considered public and are required to exist. If __all__ is not defined, the set of public names includes all names found in the module’s namespace which do not begin with an underscore character ('_'). __all__ should contain the entire public API. It is intended to avoid accidentally exporting items that are not part of the API (such as library modules which were imported and used within the module).
(Emphasis mine).
Also see Can someone explain __all__ in Python?


Private functions in python

Is it possible to avoid importing a file with from file import function?
Someone told me I would need to put an underscore as a prefix, like _function, but it isn't working.
I'm using Python 2.6 because of legacy code.
There are ways you can prevent the import, but they're generally hacks and you want to avoid them. The normal method is to just use the underscore:
def _function():
pass
Then, when you import,
from my_module import *
You'll notice that _function is not imported because it begins with an underscore. However, you can always do this:
# In Python culture, this is considered rude
from my_module import _function
But you're not supposed to do that. Just don't do that, and you'll be fine. Python's attitude is that we're all adults. There are a lot of other things you're not supposed to do which are far worse, like
import my_module
# Remove the definition for _function!
del my_module._function
There is no privacy in Python. There are only conventions governing what external code should consider publicly accessible and usable.
Importing a module for the first time triggers the creation of a module object and the execution of all top-level code in the module. The module object holds the global namespace that results from running that code.
Because Python is dynamic you can always introspect the module namespace; you can see all names defined, all objects those names reference, and you can access and alter everything. It doesn't matter here if those names start with underscores or not.
So the only reason you use a leading _ underscore for a name, is to document that the name is internal to the implementation of the module, and that external code should not rely on that name existing in a future version. The from module import * syntax will ignore such names for that reason alone. But you can't prevent a determined programmer from accessing such a name anyway. They simply do so at their own risk, it is not your responsibility to keep them from that access.
If you have functions or other objects that are only needed to initialise the module, you are of course free to delete those names at the end.
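A small sketch of both points (the module and helper names are invented): underscored names remain reachable through introspection, and initialisation-only helpers can simply be deleted at the end of the module:
# settings.py -- hypothetical module
def _build_defaults():
    # only needed while the module is being initialised
    return {'timeout': 30, 'retries': 3}

DEFAULTS = _build_defaults()

del _build_defaults   # no longer needed, so drop it from the namespace

# elsewhere:
#   import settings
#   vars(settings)                 # DEFAULTS is there, _build_defaults is gone
#   settings.DEFAULTS['timeout']   # nothing stops callers from poking at it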

Is there a point to setting __all__ and then using leading underscores anyway?

I've been reading through the source for the cpython HTTP package for fun and profit, and noticed that in server.py they have the __all__ variable set but also use a leading underscore for the function _quote_html(html).
Isn't this redundant? Don't both serve to limit what's imported by from HTTP import *?
Why do they do both?
Aside from the "private-by-convention" functions with _leading_underscores, there are:
Quite a few imported names;
Four class names;
Three function names without leading underscores;
Two string "constants"; and
One local variable (nobody).
If __all__ wasn't defined to cover only the classes, all of these would also be added to your namespace by a wildcard from server import *.
Yes, you could just use one method or the other, but I think the leading underscore is a stronger sign than the exclusion from __all__; the latter says "you probably won't need this often", the former says "keep out unless you know what you're doing". They both have their place.
__all__ indeed serves as a limit when doing from HTTP import *; prefixing _ to the name of a function or method is a convention for informing the user that that item should be considered private and thus used at his/her own risk.
This is mostly a documentation thing, in a similar vein to comments. A leading underscore is a clearer indication to a person reading the code that particular functions or variables aren't part of the public API than having that person check each name against __all__. PEP8 explicitly recommends using both conventions in this way:
To better support introspection, modules should explicitly declare
the names in their public API using the __all__ attribute. Setting
__all__ to an empty list indicates that the module has no public API.
Even with __all__ set appropriately, internal interfaces (packages,
modules, classes, functions, attributes or other names) should still
be prefixed with a single leading underscore.
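As a hedged sketch (not the actual server.py code, just an illustration in the same spirit) of what using both conventions together looks like:
# tinyserver.py -- hypothetical module
__all__ = ['EchoHandler']          # wildcard imports only get the class

import html as _html

def _quote_body(text):
    # visibly private at every call site, and excluded from __all__
    return _html.escape(text, quote=False)

class EchoHandler:
    def render(self, text):
        return '<pre>%s</pre>' % _quote_body(text)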

Python: 'Private' module in a package

I have a package mypack with modules mod_a and mod_b in it. I intend the package itself and mod_a to be imported freely:
import mypack
import mypack.mod_a
However, I'd like to keep mod_b for the exclusive use of mypack. That's because it exists merely to organize mypack's internal code.
My first question is, is it an accepted practice in Python programming to have 'private' modules like this?
If yes, my second question is, what is the best way to convey this intention to the client? Do I prefix the name with an underscore (i.e. _mod_b)? Or would it be a good idea to declare a sub-package private and place all such modules there?
I prefix private modules with an underscore to communicate the intent to the user. In your case, this would be mypack._mod_b
This is in the same spirit as (though not completely analogous to) the PEP 8 recommendation to name C extension modules with a leading underscore when they are wrapped by a Python module; e.g., _socket and socket.
The solution I've settled on is to create a sub-package 'private' and place all the modules I wish to hide in there. This way they stay stowed away, leaving mypack's module list cleaner and easier to parse.
To me, this doesn't look unpythonic either.
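A sketch of that arrangement (only mypack and mod_a come from the question; the sub-package is called _private here and the rest is invented):
# Layout:
#   mypack/
#       __init__.py
#       mod_a.py          # public
#       _private/
#           __init__.py
#           mod_b.py      # internal helpers only
#
# mypack/__init__.py
from . import mod_a                       # the public sub-module
from ._private import mod_b as _mod_b     # internal, kept out of the way

__all__ = ['mod_a']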
While there is no explicit private keyword, the convention is to give private functions a single leading underscore; a double leading underscore on a class attribute additionally invokes name mangling, which makes the name harder to reach from outside the class. See the following from PEP 8:
- _single_leading_underscore: weak "internal use" indicator. E.g. "from M
import *" does not import objects whose name starts with an underscore.
- single_trailing_underscore_: used by convention to avoid conflicts with
Python keyword, e.g.
Tkinter.Toplevel(master, class_='ClassName')
- __double_leading_underscore: when naming a class attribute, invokes name
mangling (inside class FooBar, __boo becomes _FooBar__boo; see below).
- __double_leading_and_trailing_underscore__: "magic" objects or
attributes that live in user-controlled namespaces. E.g. __init__,
__import__ or __file__. Never invent such names; only use them
as documented.
To make an entire module private, don't import it in the package's __init__.py file.
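For completeness, a tiny demonstration of the name mangling the quoted section mentions (the class and attribute names are made up); note that it renames rather than truly hides the attribute:
class FooBar:
    def __init__(self):
        self.__boo = 1           # stored as _FooBar__boo by name mangling

f = FooBar()
# f.__boo                        # AttributeError: no attribute '__boo'
print(f._FooBar__boo)            # 1 -- still reachable under the mangled name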
One thing to be aware of in this scenario is indirect imports. If, inside mypack, you write:
from mypack._mod_b import foo
foo()
Then a user can
from mypack import foo
foo()
and be none the wiser. I recommend importing as
from mypack import _mod_b
_mod_b.foo()
then a user will immediately see a red flag when they try to
from mypack import _mod_b
As for the actual directory structure, you could even extend Jeremy's answer into a _package_of_this_kind sub-package, where anything inside can carry whatever 'access modifiers' you like; users will know there be dragons.
Python doesn't strictly know or support "private" or "protected" methods or classes. There's a convention that methods prefixed with a single underscore aren't part of an official API, but I wouldn't do this on classes or files - it's ugly.
If someone really needs to subclass or access mod_b, why prevent them from doing so? You can always present a preferred API in your documentation, and document in your module that mod_b should not be accessed directly and that mypack should be used instead.

Python imports: Will changing a variable in "child" change variable in "parent"/other children?

Suppose you have 3 modules, a.py, b.py, and c.py:
a.py:
v1 = 1
v2 = 2
etc.
b.py:
from a import *
c.py:
from a import *
v1 = 0
Will c.py change v1 in a.py and b.py? If not, is there a way to do it?
All that a statement like:
v1 = 0
can do is bind the name v1 to the object 0 in the current module's namespace. It can't affect a different module.
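A short sketch of that distinction (reusing a.py from the question): rebinding the bare name only changes the current module, while assigning through the module object changes module a itself:
# c.py
import a
from a import *   # copies a's current binding of v1 into c's namespace

v1 = 0            # rebinds c's own name only; a.v1 is untouched
print(a.v1)       # still 1

a.v1 = 0          # rebinds the name inside module a itself
print(a.v1)       # now 0; any code that reads a.v1 sees the change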
If I'm using unfamiliar terms there, and I guess I probably am, I strongly recommend you read Fredrik Lundh's excellent article Python Objects: Reset your brain.
The from ... import * form is basically intended for handy interactive use at the interpreter prompt: you'd be well advised to never use it in other situations, as it will give you nothing but problems.
In fact, the in-house style guide at my employer goes further, recommending always importing a module, never contents from within a module (importing a module from within a package is OK and in fact recommended). As a result, in our codebase, references to imported things are always qualified names (themod.thething) and never bare names (which always refer to builtins, globals of the same module, or locals); this makes the code much clearer and more readable and avoids all kinds of subtle anomalies.
Of course, if a module's name is too long, an as clause in the import, to give it a shorter and handier alias for the purposes of the importing module, is fine. But, with your one-letter module names, that won't be needed;-).
So, if you follow the guideline and always import the module (and not things from inside it), c.v1 will always be referring to the same thing as a.v1 and b.v1, both for getting AND setting: here's one potential subtle anomaly avoided right off the bat!-)
Remember the very last bit of the Zen of Python (do import this at the interpreter prompt to see it all):
Namespaces are one honking great idea -- let's do more of those!
Importing the whole module (not bits and pieces from within it) preserves its integrity as a namespace, as does always referring to things inside the imported module by qualified (dotted) names. It's one honking great idea: do more of that!-)
Yes, you just need to access it correctly (and don't use import *, it's evil)
c.py:
import a
print a.v1 # prints 1
a.v1 = 0
print a.v1 # prints 0

Python includes, module scope issue

I'm working on my first significant Python project and I'm having trouble with scope issues and executing code in included files. Previously my experience is with PHP.
What I would like to do is have one single file that sets up a number of configuration variables, which would then be used throughout the code. Also, I want to make certain functions and classes available globally. For example, the main file would include a single other file, and that file would load a bunch of commonly used functions (each in its own file) and a configuration file. Within those loaded files, I also want to be able to access the functions and configuration variables. What I don't want to do, is to have to put the entire routine at the beginning of each (included) file to include all of the rest. Also, these included files are in various sub-directories, which is making it much harder to import them (especially if I have to re-import in every single file).
Anyway I'm looking for general advice on the best way to structure the code to achieve what I want.
Thanks!
In python, it is a common practice to have a bunch of modules that implement various functions and then have one single module that is the point-of-access to all the functions. This is basically the facade pattern.
An example: say you're writing a package foo, which includes the bar, baz, and moo modules.
~/project/foo
~/project/foo/__init__.py
~/project/foo/bar.py
~/project/foo/baz.py
~/project/foo/moo.py
~/project/foo/config.py
What you would usually do is write __init__.py like this:
from foo.bar import func1, func2
from foo.baz import func3, constant1
from foo.moo import func1 as moofunc1
from foo.config import *
Now, when you want to use the functions you just do
import foo
foo.func1()
print foo.constant1
# assuming config defines a config1 variable
print foo.config1
If you wanted, you could arrange your code so that you only need to write
import foo
At the top of every module, and then access everything through foo (which you should probably name "globals" or something to that effect). If you don't like namespaces, you could even do
from foo import *
and have everything as global, but this is really not recommended. Remember: namespaces are one honking great idea!
This is a two-step process:
In your module globals.py import the items from wherever.
In all of your other modules, do "from globals import *"
This brings all of those names into the current module's namespace.
Now, having told you how to do this, let me suggest that you don't. First of all, you are loading up the local namespace with a bunch of "magically defined" entities. This violates precept 2 of the Zen of Python, "Explicit is better than implicit." Instead of "from foo import *", try using "import foo" and then saying "foo.some_value". If you want to use the shorter names, use "from foo import mumble, snort". Either of these methods directly exposes the actual use of the module foo.py. Using the globals.py method is just a little too magic. The primary exception to this is in an __init__.py where you are hiding some internal aspects of a package.
Globals are also semi-evil in that it can be very difficult to figure out who is modifying (or corrupting) them. If you have well-defined routines for getting/setting globals, then debugging them can be much simpler.
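A hedged sketch of that suggestion (module and function names are invented): routing reads and writes through one small module gives you a single place to add logging or breakpoints while debugging:
# config.py -- hypothetical shared-settings module
_settings = {'debug': False, 'db_url': 'sqlite:///app.db'}

def get_option(name):
    """Read a configuration value."""
    return _settings[name]

def set_option(name, value):
    """Change a configuration value; the obvious spot for logging/asserts."""
    _settings[name] = value

# any other module:
#   import config
#   config.set_option('debug', True)
#   if config.get_option('debug'): ...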
I know that PHP has this "everything is one, big, happy namespace" concept, but it's really just an artifact of poor language design.
As far as I know, program-wide global variables/functions/classes/etc. do not exist in Python; everything is "confined" to some module (namespace). So if you want some functions or classes to be used in many parts of your code, one solution is to create modules like "globFunCl" (defining/importing from elsewhere everything you want to be "global") and "config" (containing configuration variables) and import those everywhere you need them. If you don't like the idea of using nested namespaces you can use:
from globFunCl import *
This way you'll "hide" namespaces (making names look like "globals").
I'm not sure what you mean by not wanting to "put the entire routine at the beginning of each (included) file to include all of the rest"; I'm afraid you can't really escape that. Check out Python packages though, they should make it easier for you.
This depends a bit on how you want to package things up. You can either think in terms of files or modules. The latter is "more pythonic", and enables you to decide exactly which items (and they can be anything with a name: classes, functions, variables, etc.) you want to make visible.
The basic rule is that for any file or module you import, anything directly in its namespace can be accessed. So if myfile.py contains definitions def myfun(...): and class myclass(...) as well as myvar = ... then you can access them from another file by
import myfile
y = myfile.myfun(...)
x = myfile.myvar
or
from myfile import myfun, myvar, myclass
Crucially, anything at the top level of myfile is accessible, including imports. So if myfile contains from foo import bar, then myfile.bar is also available.
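A tiny sketch of that last point (file and function names are illustrative): top-level imports ride along as attributes of the module, unless you alias them away:
# myfile.py
import math                          # plain import: visible as myfile.math
from os.path import join as _join    # aliased, so clearly internal

def data_path(name):
    return _join('/tmp/data', name)

# elsewhere:
#   import myfile
#   myfile.math.pi          # works: top-level imports are module attributes
#   myfile.data_path('x')   # the intended interface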
