I have a number of functions that I have written myself over time, and I have placed each of these functions in its own module (as I thought this was best practice).
I would like to take the next step and organise these modules so that I can import them in fewer lines of code. However, I'm finding that I need to use a lot of lines of 'repeated' code to import the functions I want to have access to.
my_functions
-src
--__init__.py
--func1.py
--func2.py
--func3.py
I have followed this tutorial to build this collection of modules into a package, thinking that I could use something like
import my_functions as mf
but I have to import each module
import func1 as fu1
import func2 as fu2
...
a = fu1.func1()
My question is: what do I have to do to be able to import my functions as a package, the same way that I would import something like pandas?
import my_functions as mf
a = mf.func1(arg)
I'm finding it difficult to find either a tutorial or a clear, simple example of how to structure this, so any guidance at all would be useful. If it's just not doable without building something as complex as pandas or numpy, that's ok too; I just thought I'd give it one last shot.
Create an __init__.py file in both the my_functions directory and the src directory.
Now put the following in the my_functions __init__.py file. There is no need to edit the __init__.py file inside src.
from .src.func1 import func1
from .src.func2 import func2
__all__ = [
'func1',
'func2'
]
Now you can use my_functions like below
import my_functions
my_functions.func1()
__all__ is consulted when you write from my_functions import *; it restricts that import to the names you have listed, while the from ... import lines above are what let you access func1 and func2 with dot notation.
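For example, with the __init__.py above, a star import only exposes the names listed in __all__ (a minimal sketch):
from my_functions import *   # only 'func1' and 'func2' are exported

a = func1()
b = func2()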
To run a Python script from the project root, call it as a module using dot notation, as below:
python3 -m <some_directory>.<file>
Let's say I have a structure like this:
tests/
    __init__.py
    test_functions_a
functions/
    __init__.py
    functions_a
    functions_b
I want to test a function from functions_a, but inside functions_a I am importing a function from functions_b.
When I am trying:
from functions.functions_a import function_aa
I am getting an error, because inside functions_a I have a line:
from functions_b import function_bb
and not:
from functions.functions_b import function_bb
How can I solve this?
Any good practices are welcome, as I have no experience in structuring projects.
According to Google Python Style Guide, you should:
Use import statements for packages and modules only, not for
individual classes or functions. Note that there is an explicit
exemption for imports from the typing module.
You should also:
Import each module using the full pathname location of the module.
If you follow those two conventions, you will probably avoid, in the future, situations like the one you just described.
Now, here's what your code will probably look like if you follow those tips:
Module functions.functions_a:
from functions import functions_b as funcs_b

def function_aa():
    print("AA")

def function_aa_bb():
    function_aa()
    funcs_b.function_bb()
Module functions.functions_b:
def function_bb():
    print("BB")
And, finally, test_functions_a.py:
from functions import functions_a as funcs_a

if __name__ == "__main__":
    funcs_a.function_aa()
    funcs_a.function_aa_bb()
Output:
AA
AA
BB
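Assuming tests and functions sit side by side under the same project root, you would typically run the test from that root so that the functions package is importable, for example:
python3 -m tests.test_functions_a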
You cannot directly import the function; instead, you could import File 1 into some other file and then call the function from that particular file.
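For example (hypothetical file and function names, just to illustrate the idea):
# file2.py
import file1

file1.some_function()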
I simply want to take all my .py files from a single folder (I don't care about the sub-folders for now) and put them into a single module.
The use case I'm having here is that I'm writing some pretty standard object-oriented code and I'm using a single file for every class, and I don't want to have to write from myClass import myClass for every class in my __init__.py. I can't use Python 3, so I'm still working with imp and reload and such.
At the moment I'm using
# this is __init__.py
import pkgutil
for loader, name, is_pkg in pkgutil.walk_packages(__path__):
    if not is_pkg:
        __import__(__name__ + "." + name)
and it doesn't seem to work: it includes the packages, but it includes them as modules, so that I have to write MyClass.MyClass for a class that is defined in a file with its own name. That's silly and I don't like it.
I've been searching forever and I'm just getting more confused about how complicated this seemingly standard use case seems to be. Do Python devs just write everything into a single file? Or do they always have tons of imports?
Is this something that should be approached in an entirely different way?
What you really want to do
To do the job you need to bind your class names to the namespace of your __init__.py script.
After this step you will be able to just from YourPackageName import * and use your classes directly. Like this:
import YourPackageName
c = YourPackageName.MyClass()
or
from YourPackageName import *
c = MyClass()
Ways to achieve this
You have multiple ways to import modules dynamically: __import__(), __all__.
But.
The only way to bind names into the namespace of the current module is to use a from myClass import myClass statement. A static statement.
In other words, the content of each of your __init__.py scripts should look like this:
#!/usr/bin/env python
# coding=utf-8
from .MySubPackage import *
from .MyAnotherSubPackage import *
from .my_pretty_class import myPrettyClass
from .my_another_class import myAnotherClass
...
And you should know that even for a dynamic __all__:
It is up to the package author to keep this list up-to-date when a new version of the package is released.
(https://docs.python.org/2/tutorial/modules.html#importing-from-a-package)
So, to give clear answers to your questions:
Do python devs just write everything into a single file?
No, they don't.
Or do they always have tons of imports?
Almost. But definitely not tons. You need to import each of your modules just once (into the appropriate __init__.py script), and then just import the whole package or sub-package at once.
Example
Let's assume that there is the following package structure:
MyPackage
|---MySubPackage
| |---__init__.py
| |---pretty_class_1.py
| |---pretty_class_2.py
|---__init__.py
|---sleepy_class_1.py
|---sleepy_class_2.py
Content of the MyPackage/MySubPackage/__init__.py:
#!/usr/bin/env python
# coding=utf-8
from .pretty_class_1 import PrettyClass1
from .pretty_class_2 import PrettyClass2
Content of the MyPackage/__init__.py:
#!/usr/bin/env python
# coding=utf-8
from .MySubPackage import *
from .sleepy_class_1 import SleepyClass1
from .sleepy_class_2 import SleepyClass2
As a result, we are now able to write the following code in our application:
import MyPackage
p = MyPackage.PrettyClass1()
s = MyPackage.SleepyClass2()
or
from MyPackage import *
p = PrettyClass1()
s = SleepyClass2()
I've run through many posts about this, but it still doesn't seem to work. The deal is pretty clear-cut. I've got the following hierarchy.
main.py
DirA/
    __init__.py
    hello.py
DirB/
    __init__.py
    foo.py
    bla.py
    lol.py
The __init__.py at DirA is empty. The respective one at DirB just contains the foo module:
__all__ = ["foo"]
The main.py has the following code
import DirA
import DirB
hey() #Def written at hello.py
foolish1() #Def written at foo.py
foolish2() #Def written at foo.py
Long story short, I got NameError: name 'foo' is not defined. Any ideas? Thanks in advance.
You only get what you import. Therefore, in your main, you only get DirA and DirB. You would use them in one of these ways:
import DirA
DirA.something_in_init_py()
# Importing hello:
import DirA.hello
DirA.hello.something_in_hello_py()
# Using a named import:
from DirA.hello import something_in_hello_py
something_in_hello_py()
And in DirB, just make the __init__.py empty as well. The only use of __all__ is for when you want to import *, which you don't want because, as they say, explicit is better than implicit.
But in case you are curious, it would work this way:
from DirB import *
something_in_dirb()
By default the import * will import everything it can find that does not start with an underscore. Specifying an __all__ restricts what is imported to the names defined in __all__. See this question for more details.
Edit: about __init__.py.
The __init__.py is not really connected to the importing stuff. It is just a special file with the following properties:
Its existence means the directory is a python package, with several modules in it. If it does not exist, python will refuse to import anything from the directory.
It will always be loaded before loading anything else in the directory.
Its content will be available as the package itself.
Just try it: put this in DirA/__init__.py:
foo = 42
Now, in your main:
from DirA import foo
print(foo) # 42
It can be useful, because you can import some of your submodules in the __init__.py to hide the inner structure of your package. Suppose you build an application with classes Author, Book and Review. To make it easier to read, you give each class its own file in a package. Now in your main, you have to import the full path:
from myapp.author import Author
from myapp.book import Book
from myapp.review import Review
Clearly not optimal. Now suppose you put those exact lines above in your __init__.py; you can then simplify your main like this:
from myapp import Author, Book, Review
Python will load the __init__.py, which will in turn load all submodules and import the classes, making them available on the package. Now your main does not need to know where the classes are actually implemented.
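For reference, a minimal sketch of what that myapp/__init__.py would contain (assuming the submodules are named author, book and review, as above):
# myapp/__init__.py
from myapp.author import Author
from myapp.book import Book
from myapp.review import Review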
Have you tried something like this:
One way
from DirA import hello
Another way
from DirA.hello import hey
If those don't work, then append a new system path.
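For example, a sketch of that last option (the exact path is an assumption; it should be the directory that contains DirA):
import sys
sys.path.append("/path/to/project_root")   # directory containing DirA

from DirA.hello import hey
hey()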
You need to import the function itself:
How to call a function from another file in Python?
In your case:
from DirA import foolish1, foolish2
I'm trying to do the following in Python 2.6.
my_module.py:-
from another_module import another_factory

def my_factory(name):
    pass
another_module.py:-
from my_module import my_factory

def another_factory(name):
    pass
Both modules in the same folder.
It gives me the error:
Error: cannot import name my_factory
As seen from the comments, you are trying to do a circular import, which is impossible.
If in your module A you try to import something from module B, and when loading module B (to satisfy this dependency) you try to import something from module A, you are back where you started and you have a circular import: A needs B and B needs A! It is somewhat like saying that A needs A, which is quite illogical.
For instance:
# moduleA
from moduleB import functionB
...
So the interpreter tries to load the moduleB, which looks like the following:
# moduleB
from moduleA import functionA
...
And it goes back to moduleA, which tries again to import B, and so on. Therefore Python just raises the error and stops the insanity for the greater good.
Dependencies don't work like this. Define what module needs the other one, and just do a simple import. In your example, it seems that another_module needs my_module, so change my_module and eliminate the dependency on another_module.
If both modules actually need each other, it is a clear sign that they belong to the same logical concept, and should be merged.
PS: in some cases, to avoid huge files, you can split a logical unit in two, and to avoid the circular dependencies you write your imports inside the functions (which are not executed at load time), so that there is no cycle. This is, however, generally something to avoid.
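For instance, a sketch of that deferred-import workaround applied to the modules from the question (generally a last resort):
# my_module.py
def my_factory(name):
    # imported at call time rather than load time, so the cycle is broken
    from another_module import another_factory
    return another_factory(name)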
The real question is... do you consider each file a module, or are they part of a package?
Trying to import modules outside a package is sometimes painful. You should rather build a package by simply creating an empty __init__.py module in the directory. So, if you have
__init__.py
my_module.py
another_module.py
If you have the following function in my_module.py,
def my_factory(x):
    return x * x
you should be able to access the my_factory() function from another_module.py by writing this:
from my_module import my_factory
But, if you don't have the __init__.py file/module, the import machinery will be (somehow) lost and will only use sys.path for searching for other modules. You may then add the following lines (before the import) in the another_module.py file:
import os, sys
sys.path.append(os.path.dirname(os.path.expanduser('.')))
You may also use the various packages available to help with importing modules, like imp or import_file (see the documentation). Or you can decide to use load_source (also see the doc: https://docs.python.org/2/library/imp.html).
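For instance, a minimal sketch using imp.load_source (Python 2; the path here is just an assumption):
import imp

my_module = imp.load_source('my_module', '/path/to/my_module.py')
print(my_module.my_factory(3))   # 9, using the my_factory defined above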
I have been creating a few modules organized by purpose, and each module contains a number of functions. I would like to bundle these individual modules into a larger "package" that other users can import from a shared location.
Currently, I have all of my modules in one folder called python_modules, and I have appended this path to sys.path so I can easily import my individual modules as needed.
However, I would like to instead import a single package, that contains all of my modules, so I don't have to import each one individually. I know that I could put all my modules into one file, but that doesn't seem like a good way to organize my processes.
Currently, I have the following files in my python_modules folder:
__init__.py
load_data_functions.py
parse_data_functions.py
network_functions.py
counting_function.py
math_functions.py
...
...
other_functions.py
The __init__.py file is empty and does not have anything inside of it. The other modules all have various functions inside of them, and some are dependent on others. For example, network_functions.py relies on load_data_functions.py and parse_data_functions.py.
As I said, I want to package all of these modules into a larger package that I can share with others, and so we don't have to import each module independently.
A package is just a bunch of modules. You already have an __init__.py file, so python_modules is already a package. You should simply be able to import python_modules and then call functions from each individual module as load_data_functions.some_function(), parse_data_functions.some_other_function().
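For example (a sketch; with the empty __init__.py you import the submodules explicitly, and some_function is just a placeholder name):
import python_modules.load_data_functions as load_data_functions

load_data_functions.some_function()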
assuming "I would like to instead import a single package" means you want to be able to something like:
import python_modules
port = python_modules.network_functions.get_port()
your __init__.py file should look like:
from . import load_data_functions
from . import parse_data_functions
from . import network_functions
...
if you'd like to be able to do:
import python_modules
port = python_modules.get_port()
your __init__.py file should look like:
from .load_data_functions import *
from .parse_data_functions import *
from .network_functions import *
...
As for modules within the package referring to each other, you can use the same idea; e.g. at the top of network_functions.py you'd want to put from . import load_data_functions. You should structure things so as to avoid circular imports.
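For example, a sketch of what the top of network_functions.py could look like under that layout (load_data and parse_port are hypothetical helpers):
# python_modules/network_functions.py
from . import load_data_functions
from . import parse_data_functions

def get_port():
    raw = load_data_functions.load_data()        # hypothetical helper
    return parse_data_functions.parse_port(raw)  # hypothetical helper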