I am trying to write a library (sort of) for my project. I would like to keep it as modularized as possible. I have a main.py in the root directory and two packages required for running it, which I am calling package1 and package2. The main module in each package inherits from its own base module. The overall tree is as follows:
.
├── main.py
├── package1
│   ├── base_package1.py
│   ├── __init__.py
│   └── main_package1.py
└── package2
    ├── base_package2.py
    ├── __init__.py
    └── main_package2.py
In the file for each package, I am also writing the code to test it. But to test package2, I also need package1. My question is about how to import package1 into package2. main.py is currently empty. I will share the contents of the remaining files here.
# package1/__init__.py
from .main_package1 import *

# package1/base_package1.py
class BasePackage1:
    def __init__(self):
        print("init base package 1")

# package1/main_package1.py
from base_package1 import BasePackage1

class MainPackage1(BasePackage1):
    def __init__(self):
        super().__init__()
        print("init main package 1")

if __name__ == "__main__":
    # Test MainPackage1
    pkg = MainPackage1()
# package2/base_package2.py
class BasePackage2:
    def __init__(self):
        print("init base package 2")

# package2/main_package2.py
from base_package2 import BasePackage2

class MainPackage2(BasePackage2):
    def __init__(self):
        super().__init__()
        print("init main package 2")

if __name__ == "__main__":
    # Option 1
    import sys
    sys.path.append("../")
    from package1 import MainPackage1
    # Error:
    # ModuleNotFoundError: No module named 'base_package1'

    # Option 2
    from ..package1 import MainPackage1
    # Error:
    # ImportError: attempted relative import with no known parent package
What is the correct way to do this? Is there a standard way to do this?
Update: I have found a solution that works for now and posted it as an answer below. If you have a better solution, please post it.
In package2/base_package2.py, if you want to reference package1.base_package1, you can import it directly, since your reference point is the directory where main.py lives. Absolute imports (those that don't start with a dot) are resolved from your working directory, so you can import package1 from anywhere in package2 and it will work the same way.
The reason why you get

ImportError: attempted relative import with no known parent package

is that you are running package2/base_package2.py as a main file, so it runs as its own top-level script rather than as part of a package. However, you have provided three files whose paths are given as package2/base_package2.py; I'm assuming that the ones with __main__ blocks are actually main_package1 and main_package2, respectively.
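For example, a minimal sketch of what this suggests (assuming everything is launched from the project root, e.g. via main.py or python -m package2.main_package2):

# package1/main_package1.py
from package1.base_package1 import BasePackage1  # absolute import, resolved from the project root

# package2/main_package2.py
from package2.base_package2 import BasePackage2
from package1.main_package1 import MainPackage1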
After trying a few other things, I found a solution that seems to work. I will post it here but will not select it as the accepted answer, in case someone comes up with a better and more elegant solution.
In package2/main_package2.py, I did the following import:
import sys
sys.path.append("../package1")
from main_package1 import MainPackage1
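One caveat (my own note, not part of the original fix): "../package1" is resolved against the current working directory, so this only works when the script is launched from inside package2. Anchoring the path on __file__ instead makes it independent of the launch directory:

# package2/main_package2.py
import os
import sys

# Build the path to package1 relative to this file, not the working directory
_here = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.join(_here, "..", "package1"))

from main_package1 import MainPackage1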
I have a Python package mypackage which contains a bunch of modules / classes with this directory structure:
├── interface.py
└── mypackage/
    ├── __init__.py
    ├── classA.py
    ├── classB.py
    └── ...
The current use case is to use interface.py with a bunch of argparse flags:
python interface.py --foo --bar
Inside interface.py it instantiates a bunch of the classes from mypackage and runs their methods. Something like:
from classA import ClassA

def interfaceMethod(foo, bar):
    a = ClassA(foo, ...)
    print(a.classMethod(bar, ...))

if args.foo:
    interfaceMethod(args.foo, args.bar)
This works well when getting non-python / programmers to use my code. But I'd like to also be able to import my package within their Python code and run the same methods in interface.py. Something like:
import mypackage

print(mypackage.interfaceMethod(foo, bar))
Question
Is there a standard/best way to do this?
Note: I don't think users need to see my class structure, so I'd rather there be one user-facing class that implements all of the methods in interface.py.
Solution 1 (I don't think this is the preferred solution):
Add methods from interface.py into __init__.py:
# in __init__.py
from .classA import ClassA

def interfaceMethod():
    a = ClassA(foo)
    print(a.classMethod(bar))
Then users can do the following in their own code (it would look very similar in interface.py as well):
import mypackage
mypackage.interfaceMethod()
Solution 2:
Create a mypackage class:
class MyPackage:
    def __init__(self):
        self.classA = ClassA(...)
        self.classB = ClassB(...)

    def interfaceMethod(self):
        a = self.classA
If I create this class should I worry about the package and class having the same name? Do I change the hierarchy of the package structure to reflect that MyPackage is the forward facing class?
A good way would be to use a setup.py with console_scripts.
Put your interface.py inside your package and add this to your setup.py:
setup(
    # other arguments here...
    entry_points={
        'console_scripts': [
            'script_name = my_package.interface:interfaceMethod',
        ],
    },
)
Change your interface.py to:
import argparse
from classA import ClassA

def interfaceMethod(foo, bar):
    a = ClassA(foo, ...)
    print(a.classMethod(bar, ...))

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--foo')
    parser.add_argument('--bar')
    args = parser.parse_args()
    interfaceMethod(args.foo, args.bar)
Once you install it with python setup.py install, you can call your program
from the command line:
script_name --foo --bar
For details see the full documentation.
You can still import it with:
from mypackage import interface
interface.interfaceMethod()
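Put together, a complete minimal setup.py could look like this (a sketch; the name and version are placeholders, and note that console_scripts targets are called with no arguments, so the target should parse sys.argv itself):

from setuptools import setup, find_packages

setup(
    name='my_package',
    version='0.1.0',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            'script_name = my_package.interface:interfaceMethod',
        ],
    },
)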
I am trying to run unit tests with unittest.TestLoader and unittest.TextTestRunner, but I get a ModuleNotFoundError every time I run the 'main' test file (here: test_all.py). I have the following file structure:
src/
├── test_all.py
├── dir1/
│   ├── __init__.py
│   ├── module1.py
│   ├── submodule1.py
│   └── test_module1.py
├── dir2/
│   ├── __init__.py
│   ├── module2.py
│   ├── submodule2.py
│   └── test_module2.py
└── dir3/
    └── ...
The test_all.py file looks like this:
# test_all.py
import os
import unittest
loader = unittest.TestLoader()
suite = loader.discover(os.getcwd())
runner = unittest.TextTestRunner()
runner.run(suite)
And finally, the structure of the single test cases looks like this:
# test_module1.py
import unittest
from module1 import Module1

class Module1TestCase(unittest.TestCase):
    def setUp(self):
        # do something
        pass

    def test_something(self):
        # test test
        pass

    def tearDown(self):
        # do something
        pass

if __name__ == '__main__':
    unittest.main()
So, running test_all.py always results in a ModuleNotFoundError pointing at the from module1 import Module1 inside the TestCase test_module1.py (and the same in the following TestCases). As far as I can tell there are no circular dependencies. Maybe adding the current path to the PYTHONPATH would work, but it really makes no sense to me: on the one hand I run test_all.py as main in the current directory, and on the other hand unittest.TestLoader.discover() already takes the current path.
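For illustration, this is the kind of change I mean (untested):

# test_all.py -- put every test directory on sys.path before discovery,
# so that `from module1 import Module1` can be resolved
import os
import sys
import unittest

for entry in os.listdir(os.getcwd()):
    if os.path.isdir(entry):
        sys.path.insert(0, os.path.abspath(entry))

loader = unittest.TestLoader()
suite = loader.discover(os.getcwd())
runner = unittest.TextTestRunner()
runner.run(suite)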
PS: I know that putting all the TestCases in one folder is way better. But first I want to figure out why this is not working. Thanks!
I have one large click application that I've developed, but navigating through the different commands/subcommands is getting rough. How do I organize my commands into separate files? Is it possible to organize commands and their subcommands into separate classes?
Here's an example of how I would like to separate it:
init.py

import click

@click.group()
@click.version_option()
def cli():
    pass  # Entry Point
command_cloudflare.py
@cli.group()
@click.pass_context
def cloudflare(ctx):
    pass

@cloudflare.group('zone')
def cloudflare_zone():
    pass

@cloudflare_zone.command('add')
@click.option('--jumpstart', '-j', default=True)
@click.option('--organization', '-o', default='')
@click.argument('url')
@click.pass_obj
@__cf_error_handler
def cloudflare_zone_add(ctx, url, jumpstart, organization):
    pass

@cloudflare.group('record')
def cloudflare_record():
    pass

@cloudflare_record.command('add')
@click.option('--ttl', '-t')
@click.argument('domain')
@click.argument('name')
@click.argument('type')
@click.argument('content')
@click.pass_obj
@__cf_error_handler
def cloudflare_record_add(ctx, domain, name, type, content, ttl):
    pass

@cloudflare_record.command('edit')
@click.option('--ttl', '-t')
@click.argument('domain')
@click.argument('name')
@click.argument('type')
@click.argument('content')
@click.pass_obj
@__cf_error_handler
def cloudflare_record_edit(ctx, domain, name, type, content, ttl):
    pass
command_uptimerobot.py
@cli.group()
@click.pass_context
def uptimerobot(ctx):
    pass

@uptimerobot.command('add')
@click.option('--alert', '-a', default=True)
@click.argument('name')
@click.argument('url')
@click.pass_obj
def uptimerobot_add(ctx, name, url, alert):
    pass

@uptimerobot.command('delete')
@click.argument('names', nargs=-1, required=True)
@click.pass_obj
def uptimerobot_delete(ctx, names):
    pass
The downside of using CommandCollection for this is that it merges your commands and works only with command groups. The better alternative, in my opinion, is to use add_command to achieve the same result.
I have a project with the following tree:
cli/
├── __init__.py
├── cli.py
├── group1/
│   ├── __init__.py
│   └── commands.py
└── group2/
    ├── __init__.py
    └── commands.py
Each subcommand has its own module, which makes it incredibly easy to manage even complex implementations with many more helper classes and files. In each module, the commands.py file contains the @click decorators. Example group2/commands.py:
import click

@click.command()
def version():
    """Display the current version."""
    click.echo(_read_version())
If necessary, you could easily create more classes in the module, and import and use them here, thus giving your CLI the full power of Python's classes and modules.
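For instance, the _read_version() helper used above could live in the same module; a sketch (reading a VERSION file shipped next to the module is my assumption, not something specified above):

from pathlib import Path

def _read_version():
    # Read the version string from a VERSION file next to this module
    return Path(__file__).with_name('VERSION').read_text().strip()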
My cli.py is the entry point for the whole CLI:
import click

from .group1 import commands as group1
from .group2 import commands as group2

@click.group()
def entry_point():
    pass

entry_point.add_command(group1.command_group)
entry_point.add_command(group2.version)
With this setup, it is very easy to separate your commands by concerns, and also build additional functionality around them that they might need. It has served me very well so far...
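For reference, the command_group that cli.py imports from group1 above could itself be an ordinary click group; a sketch (the hello subcommand is made up for illustration):

# group1/commands.py
import click

@click.group(name='group1')
def command_group():
    """Commands that belong to group1."""

@command_group.command()
def hello():
    click.echo('hello from group1')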
Reference:
http://click.pocoo.org/6/quickstart/#nesting-commands
Suppose your project have the following structure:
project/
├── __init__.py
├── init.py
└── commands/
    ├── __init__.py
    └── cloudflare.py
Groups are nothing more than multiple commands, and groups can be nested. You can separate your groups into modules, import them in your init.py file, and add them to the cli group using add_command.
Here is an init.py example:
import click

from .commands.cloudflare import cloudflare

@click.group()
def cli():
    pass

cli.add_command(cloudflare)
You have to import the cloudflare group which lives inside the cloudflare.py file. Your commands/cloudflare.py would look like this:
import click

@click.group()
def cloudflare():
    pass

@cloudflare.command()
def zone():
    click.echo('This is the zone subcommand of the cloudflare command')
Then you can run the cloudflare command like this:
$ python init.py cloudflare zone
This information is not very explicit in the documentation, but if you look at the source code, which is very well commented, you can see how groups can be nested.
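For example, nesting one group inside another looks like this (a minimal sketch; the dns group is made up for illustration):

import click

@click.group()
def cloudflare():
    pass

@cloudflare.group()
def dns():
    """A group attached to another group."""

@dns.command()
def add():
    click.echo('cloudflare dns add')

After cli.add_command(cloudflare), this would be invoked as python init.py cloudflare dns add.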
It took me a while to figure this out, but I figured I'd put this here to remind myself when I forget how to do it again. I think part of the problem is that the add_command function is mentioned on Click's GitHub page but not on the main examples page.
First, let's create an initial Python file called root.py:
import click

from cli_compile import cli_compile
from cli_tools import cli_tools

@click.group()
def main():
    """Demo"""

if __name__ == '__main__':
    main.add_command(cli_tools)
    main.add_command(cli_compile)
    main()
Next, let's put some tool commands in a file called cli_tools.py:
import click

# Command Group
@click.group(name='tools')
def cli_tools():
    """Tool related commands"""
    pass

@cli_tools.command(name='install', help='test install')
@click.option('--test1', default='1', help='test option')
def install_cmd(test1):
    click.echo('Hello world')

@cli_tools.command(name='search', help='test search')
@click.option('--test1', default='1', help='test option')
def search_cmd(test1):
    click.echo('Hello world')

if __name__ == '__main__':
    cli_tools()
Next, let's put some compile commands in a file called cli_compile.py:
import click

@click.group(name='compile')
def cli_compile():
    """Commands related to compiling"""
    pass

@cli_compile.command(name='install2', help='test install')
def install2_cmd():
    click.echo('Hello world')

@cli_compile.command(name='search2', help='test search')
def search2_cmd():
    click.echo('Hello world')

if __name__ == '__main__':
    cli_compile()
Running root.py should now give us:
Usage: root.py [OPTIONS] COMMAND [ARGS]...
Demo
Options:
--help Show this message and exit.
Commands:
compile Commands related to compiling
tools Tool related commands
running "root.py compile" should give us
Usage: root.py compile [OPTIONS] COMMAND [ARGS]...
Commands related to compiling
Options:
--help Show this message and exit.
Commands:
install2 test install
search2 test search
You'll also notice that you can run cli_tools.py or cli_compile.py directly as well, since I included a __main__ block in each.
I was looking for something like this myself. In your case it is simple because you have groups in each of the files; you can solve this problem as explained in the documentation:
In the init.py file:
import click

from command_cloudflare import cloudflare
from command_uptimerobot import uptimerobot

cli = click.CommandCollection(sources=[cloudflare, uptimerobot])

if __name__ == '__main__':
    cli()
The best part of this solution is that it is totally compliant with PEP 8 and other linters, because you don't need to import anything you wouldn't use and you don't need to import * from anywhere.
Edit: I just realized that my answer/comment is little more than a rehash of what Click's official docs offer in the "Custom Multi Commands" section: https://click.palletsprojects.com/en/7.x/commands/#custom-multi-commands
Just to add to the excellent accepted answer by @jdno, I came up with a helper function that auto-imports and auto-adds subcommand modules, which vastly cut down on the boilerplate in my cli.py.
My project structure is this:
projectroot/
├── __init__.py
└── console/
    ├── cli.py
    └── subcommands/
        ├── bar.py
        ├── foo.py
        └── hello.py
Each subcommand file looks something like this:
import click

@click.command()
def foo():
    """foo this is for foos!"""
    click.secho("FOO", fg="red", bg="white")
(for now, I just have one subcommand per file)
In cli.py, I've written an add_subcommands() function that loops through every filepath globbed by "subcommands/*.py", then imports each module and adds its commands.
Here's what the body of the cli.py script is simplified to:
import click
import importlib
from pathlib import Path
import re

@click.group()
def entry_point():
    """whats up, this is the main function"""
    pass

def main():
    add_subcommands()
    entry_point()

if __name__ == '__main__':
    main()
And this is what the add_subcommands() function looks like:
SUBCOMMAND_DIR = Path("projectroot/console/subcommands")

def add_subcommands(maincommand=entry_point):
    for modpath in SUBCOMMAND_DIR.glob('*.py'):
        # turn "projectroot/console/subcommands/foo.py" into "projectroot.console.subcommands.foo"
        modname = re.sub(r'/', '.', str(modpath)).rpartition('.py')[0]
        mod = importlib.import_module(modname)
        # filter out any things that aren't a click Command
        for attr in dir(mod):
            foo = getattr(mod, attr)
            if callable(foo) and type(foo) is click.core.Command:
                maincommand.add_command(foo)
I don't know how robust this is if I were to design a command that had several levels of nesting and context switching. But it seems to work all right for now :)
I'm not a click expert, but it should work to just import your files into the main one. I would move all commands into separate files and have one main file importing the others. That way it is easier to control the exact order, in case it is important for you. So your main file would just look like:
import commands_main
import commands_cloudflare
import commands_uptimerobot
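For this to work, each imported file has to attach its commands to a group defined in a module both sides can import; a sketch (module and command names are made up for illustration):

# commands_main.py
import click

@click.group()
def cli():
    pass

# commands_cloudflare.py -- registers itself on the shared group at import time
import click
from commands_main import cli

@cli.command()
def cloudflare():
    click.echo('cloudflare command')

After the imports, the main file would then invoke commands_main.cli().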
When you want your users to pip install "your_module" and then use its commands, you can add them in the setup.py entry_points as a list:
entry_points={
    'console_scripts': [
        'command_1 = src.cli:function_command_1',
        'command_2 = src.cli:function_command_2',
    ],
},
Each command is bound to a function in the cli file.
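A matching src/cli.py would then expose one function per command; a sketch (using click here is my assumption, plain functions work too, and the bodies are placeholders):

# src/cli.py
import click

@click.command()
def function_command_1():
    click.echo('command_1 invoked')

@click.command()
def function_command_2():
    click.echo('command_2 invoked')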
I have a module I need to test that calls a function on import, but I cannot call this function for various reasons. So I am mocking this function, but even setting up the mock imports the module.
For example I am testing mod1.py that looks like this:
import os

def bar():
    return 'foo'

def dont_call():
    os.listdir("C:\\tmp")

dont_call()
And my test looks something like this:
import mock

@mock.patch("mod1.dont_call")
def test_mod1(mock_dont_call):
    import mod1
    assert mod1.bar() == 'foo'

if __name__ == "__main__":
    test_mod1()
The problem is os.listdir is called.
I cannot change mod1 so what can I do?
I am using python2.7.
To put this in context I am testing a module that opens a database connection on import which I do not agree with but I can see the reasoning behind it. Unfortunately I cannot access this database on my QA machine.
If you want code to not be executed on import, put it inside the following condition.
In mod1.py, do the following:

if __name__ == "__main__":
    dont_call()

This is because, by default, when you import a Python module, all the code in it gets executed. By adding the above condition, you are explicitly stating that dont_call() is to be called only when the file is run as a script, and not when it is imported by other modules.
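Applied to the example above, mod1.py would become (a sketch):

import os

def bar():
    return 'foo'

def dont_call():
    os.listdir("C:\\tmp")

if __name__ == "__main__":
    # Runs only when mod1.py is executed directly, never on import
    dont_call()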
The workaround I found was to mock what dont_call was calling, giving me something like this:
import mock

@mock.patch("os.listdir")
def test_mod1(mock_listdir):
    import mod1
    assert mod1.bar() == 'foo'

if __name__ == "__main__":
    test_mod1()
Check your directory:

$ tree .
test_shot/
├── mod1.py
├── __pycache__
│ └── mod1.cpython-310.pyc
└── test.py
The code below works fine for me.
mod1.py
import os

def bar():
    return 'foo'

def dont_call():
    os.listdir(".")

def call_this():
    print('called this')

call_this()
dont_call()
test.py
import mock

@mock.patch("mod1.dont_call")
def test_mod1(mock_dont_call):
    import mod1
    assert mod1.bar() == 'foo'

if __name__ == "__main__":
    test_mod1()
Here is the output:

$ cd test_shot
$ python3 test.py
called this