I'm using Visual Studio Code and I want to use the Testing panel for my unittest tests. It works for some tests but not all: when I import certain local classes, VS Code doesn't recognize the test. If I remove the import, the test is recognized.
Here is one of the classes I'm importing, reduced to a minimum:
class ConnectionAWS(InterfaceConnection):
    def _connection(self):
        pass

    def putFileInS3(self, file):
        pass
Here is the test:
import unittest
from repertoire.ConnectionAWS import ConnectionAWS

class TestConnectionAWS(unittest.TestCase):
    def test_connection_success(self):
        self.assertFalse(False)

    def test_connection_failure(self):
        self.assertFalse(False)
I didn't forget the __init__.py files.
Here is the directory tree:
project
    repertoire
        ConnexionAWS.py
    utils
        Log.py
    src
        __init__.py
    test
        __init__.py
        repertoire
            __init__.py
            test_connexionAWS.py
        utils
            __init__.py
            test_log.py
test_log.py is recognized even though it imports Log.py.
Can someone explain to me where I'm going wrong?
When having a project tree like this one
/
└── weird_bug
    ├── main.py
    └── model
        ├── directions.py
        └── weirdo.py
and with this code in the respective files:
main.py:
from model.directions import Direction
from model.weirdo import Weirdo

def print_direction():
    player = Weirdo(Direction.up)
    player.print_direction()

if __name__ == '__main__':
    print_direction()
directions.py:
from enum import Enum

class Direction(Enum):
    up, down = range(2)
and finally weirdo.py:
from directions import Direction

class Weirdo:
    def __init__(self, direction):
        self.direction = direction

    def print_direction(self):
        if self.direction == Direction.up:
            print(self.direction)
I am receiving this error when I am calling main.py:
ModuleNotFoundError: No module named 'directions'
The error is triggered by weirdo.py. I have been trying different imports for literally hours and was wondering if someone knows what is happening and can help me understand. Thanks!
When running main.py as the entry point, the absolute module path to directions is model.directions, and that is what needs to be used in the import in weirdo.py.
If you want to avoid spelling out the full path, you can use a relative import instead: from .directions import Direction in weirdo.py.
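For illustration, the top of weirdo.py could then start with either form (a minimal sketch of the two options above):

# weirdo.py -- either import works when main.py is the entry point
from model.directions import Direction    # absolute import from the project root
# from .directions import Direction       # or: relative import within the model package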
I have a Python package mypackage which contains a bunch of modules / classes with this directory structure:
├── interface.py
├── mypackage/
│   ├── __init__.py
│   ├── classA.py
│   ├── classB.py
│   └── ...
The current use case is to use interface.py with a bunch of argparse flags:
python interface.py --foo --bar
Inside interface.py it instantiates a bunch of the classes from mypackage and runs their methods. Something like:
from classA import ClassA

def interfaceMethod(foo, bar):
    a = ClassA(foo, ...)
    print(a.classMethod(bar, ...))

if args.foo: interfaceMethod(args.foo, args.bar)
This works well for getting non-Python programmers to use my code. But I'd also like to be able to import my package within their Python code and run the same methods as in interface.py. Something like:
import mypackage

print(mypackage.interfaceMethod(foo, bar))
Question
Is there a standard/best way to do this?
Note: I don't think users need to see my class structure so I'd rather there be one user facing class which implements all of the methods in interface.py
Solution 1 (I don't think this is the preferred solution):
Add methods from interface.py into __init__.py:
# in __init__.py
from .classA import ClassA

def interfaceMethod(foo, bar):
    a = ClassA(foo)
    print(a.classMethod(bar))
Then users can do the following in their own code (it would look very similar in interface.py as well):
import mypackage
mypackage.interfaceMethod()
Solution 2:
Create a mypackage class:
class MyPackage():
    def __init__(self):
        self.classA = ClassA(...)
        self.classB = ClassB(...)

    def interfaceMethod(self):
        a = self.classA
If I create this class should I worry about the package and class having the same name? Do I change the hierarchy of the package structure to reflect that MyPackage is the forward facing class?
A good way would be to use a setup.py with console_scripts entry points.
Put your interface.py inside your package and add this to your setup.py:
setup(
    # other arguments here...
    entry_points={
        'console_scripts': [
            'script_name = mypackage.interface:interfaceMethod',
        ],
    }
)
Change your interface.py to:

from .classA import ClassA   # relative import, since interface.py now lives inside the package

def interfaceMethod(foo, bar):
    a = ClassA(foo, ...)
    print(a.classMethod(bar, ...))

if __name__ == '__main__':
    interfaceMethod(args.foo, args.bar)   # args parsed with argparse (see sketch below)
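The __main__ block above assumes args has already been parsed; a minimal argparse sketch for that part (flag names taken from the question, everything else illustrative) could be:

import argparse

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--foo')
    parser.add_argument('--bar')
    args = parser.parse_args()
    interfaceMethod(args.foo, args.bar)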
Once you install the package with python setup.py install, you can call your program
from the command line:
script_name --foo --bar
For details see the full documentation.
You can still import it with:
from mypackage import interface
interface.interfaceMethod()
Suppose I have a project organized as follows:
ProjectRoot/
    __init__.py
    test.py
    A/
        __init__.py
        a_test.py
    B/
        __init__.py
        b_test.py
And suppose that a_test depends on b_test. So the source code is relatively simple:
#
# a_test.py
#
from B.b_test import b_test_class

class a_test_class:
    def func(self):
        print("a_test_class")
        b_instance = b_test_class()
        b_instance.func()

if __name__ == "__main__":
    a_instance = a_test_class()
    a_instance.func()
#
# b_test.py
#
class b_test_class:
    def func(self):
        print("b_test_class")
#
# test.py
#
from A.a_test import a_test_class

if __name__ == "__main__":
    a_instance = a_test_class()
    a_instance.func()
As long as I launch the test.py script, everything works as intended: Python loads all the modules without any trouble and executes them. Now the question: how do I launch a_test.py without going through test.py? Basically, what I want to achieve is to cd into ProjectRoot/A and execute a_test.py. This results in ImportError: No module named 'B'.
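For what it's worth (not part of the original post): the import fails because ProjectRoot drops off sys.path when you cd into A/ and run the file directly. One workaround is to run it as a module from the project root, python -m A.a_test, so the root stays importable; another is a small path tweak at the top of a_test.py, sketched here:

# a_test.py (sketch): make ProjectRoot importable when this file is run directly
import os
import sys
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), ".."))

from B.b_test import b_test_class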
Currently I've been able to create a project with the following structure:
ProjectRoot/
    customLibModuleA/
        ...
    customLibModuleB/
        ...
    mainApp.py
And what I want to be able to create is the following:
ProjectRoot/
    customLibModuleA/   # custom protocol implementation
        ...
    customLibModuleB/   # custom logging functions
        ...
    application1/       # server
        ...
    application2/       # client
        ...
How am I expected to manage complex projects? Any good references to project structuring manuals and style guides are welcome.
Here's my temporary solution, since no one has provided a Pythonic approach.
The folder structure looks like this:
ProjectRoot/
    __init__.py
    customLibModuleA/   # custom protocol implementation
        __init__.py
        ...
    customLibModuleB/   # custom logging functions
        __init__.py
        ...
    application1/       # server
        __init__.py
        __path_setup__.py
        server.py
        ...
    application2/       # client
        __init__.py
        __path_setup__.py
        client.py
        ...
The content of __path_setup__.py is:
import os
import sys

sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), ".."))
Application scripts have some setup code preceding imports (server.py):
#
# setting up the environment
#
if __name__ == "__main__":
    exec(open("./__path_setup__.py").read())

#
# system and library imports
#
import customLibModuleA.whatever
...
A high-quality Pythonic solution to this problem is still welcome.
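For comparison, a more conventional (if heavier) alternative is to package the project and install it in editable mode, so the library modules are importable without any path setup. A minimal sketch, with an assumed distribution name:

# setup.py at ProjectRoot (sketch)
from setuptools import setup, find_packages

setup(
    name='myproject',            # hypothetical name
    version='0.1',
    packages=find_packages(),    # finds customLibModuleA, customLibModuleB, ...
)

After pip install -e ., server.py and client.py can import customLibModuleA.whatever directly, with no __path_setup__.py needed.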
I have one large click application that I've developed, but navigating through the different commands/subcommands is getting rough. How do I organize my commands into separate files? Is it possible to organize commands and their subcommands into separate classes?
Here's an example of how I would like to separate it:
init
import click

@click.group()
@click.version_option()
def cli():
    pass  # Entry Point
command_cloudflare.py
@cli.group()
@click.pass_context
def cloudflare(ctx):
    pass

@cloudflare.group('zone')
def cloudflare_zone():
    pass

@cloudflare_zone.command('add')
@click.option('--jumpstart', '-j', default=True)
@click.option('--organization', '-o', default='')
@click.argument('url')
@click.pass_obj
@__cf_error_handler
def cloudflare_zone_add(ctx, url, jumpstart, organization):
    pass

@cloudflare.group('record')
def cloudflare_record():
    pass

@cloudflare_record.command('add')
@click.option('--ttl', '-t')
@click.argument('domain')
@click.argument('name')
@click.argument('type')
@click.argument('content')
@click.pass_obj
@__cf_error_handler
def cloudflare_record_add(ctx, domain, name, type, content, ttl):
    pass

@cloudflare_record.command('edit')
@click.option('--ttl', '-t')
@click.argument('domain')
@click.argument('name')
@click.argument('type')
@click.argument('content')
@click.pass_obj
@__cf_error_handler
def cloudflare_record_edit(ctx, domain):
    pass
command_uptimerobot.py
@cli.group()
@click.pass_context
def uptimerobot(ctx):
    pass

@uptimerobot.command('add')
@click.option('--alert', '-a', default=True)
@click.argument('name')
@click.argument('url')
@click.pass_obj
def uptimerobot_add(ctx, name, url, alert):
    pass

@uptimerobot.command('delete')
@click.argument('names', nargs=-1, required=True)
@click.pass_obj
def uptimerobot_delete(ctx, names):
    pass
The downside of using CommandCollection for this is that it merges your commands and only works with command groups. The better alternative, IMHO, is to use add_command to achieve the same result.
I have a project with the following tree:
cli/
├── __init__.py
├── cli.py
├── group1
│   ├── __init__.py
│   └── commands.py
└── group2
    ├── __init__.py
    └── commands.py
Each subcommand has its own module, which makes it incredibly easy to manage even complex implementations with many more helper classes and files. In each module, the commands.py file contains the @click annotations. Example group2/commands.py:
import click

@click.command()
def version():
    """Display the current version."""
    click.echo(_read_version())   # _read_version() is a helper defined elsewhere in the module
If necessary, you could easily create more classes in the module, and import and use them here, thus giving your CLI the full power of Python's classes and modules.
My cli.py is the entry point for the whole CLI:
import click

from .group1 import commands as group1
from .group2 import commands as group2

@click.group()
def entry_point():
    pass

entry_point.add_command(group1.command_group)
entry_point.add_command(group2.version)
With this setup, it is very easy to separate your commands by concerns, and also build additional functionality around them that they might need. It has served me very well so far...
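For context, group1/commands.py would then define the command_group object that cli.py registers above; a minimal sketch (the hello command name is made up):

import click

@click.group(name='group1')
def command_group():
    """Commands in group 1."""

@command_group.command()
def hello():
    """A command inside group 1."""
    click.echo('hello from group1')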
Reference:
http://click.pocoo.org/6/quickstart/#nesting-commands
Suppose your project has the following structure:
project/
├── __init__.py
├── init.py
└── commands
    ├── __init__.py
    └── cloudflare.py
Groups are nothing more than multiple commands, and groups can be nested. You can separate your groups into modules, import them in your init.py file, and add them to the cli group using add_command.
Here is an init.py example:
import click

from .commands.cloudflare import cloudflare

@click.group()
def cli():
    pass

cli.add_command(cloudflare)
You have to import the cloudflare group which lives inside the cloudflare.py file. Your commands/cloudflare.py would look like this:
import click

@click.group()
def cloudflare():
    pass

@cloudflare.command()
def zone():
    click.echo('This is the zone subcommand of the cloudflare command')
Then you can run the cloudflare command like this:
$ python init.py cloudflare zone
This information is not very explicit in the documentation, but if you look at the source code, which is very well commented, you can see how groups can be nested.
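To illustrate the nesting mentioned above, a group can be attached to the cloudflare group the same way a command is; a rough sketch (the zone names mirror the question):

import click

@click.group()
def cloudflare():
    pass

@cloudflare.group('zone')          # a group nested inside the cloudflare group
def zone():
    pass

@zone.command('add')
def zone_add():
    click.echo('cloudflare zone add')

With the init.py above, running python init.py cloudflare zone add would then reach zone_add.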
It took me a while to figure this out, but I figured I'd put it here to remind myself when I forget how to do it again.
I think part of the problem is that the add_command function is mentioned on Click's GitHub page but not on the main examples page.
First, let's create an initial Python file called root.py:
import click

from cli_compile import cli_compile
from cli_tools import cli_tools

@click.group()
def main():
    """Demo"""

if __name__ == '__main__':
    main.add_command(cli_tools)
    main.add_command(cli_compile)
    main()
Next, let's put some tool commands in a file called cli_tools.py:
import click

# Command Group
@click.group(name='tools')
def cli_tools():
    """Tool related commands"""
    pass

@cli_tools.command(name='install', help='test install')
@click.option('--test1', default='1', help='test option')
def install_cmd(test1):
    click.echo('Hello world')

@cli_tools.command(name='search', help='test search')
@click.option('--test1', default='1', help='test option')
def search_cmd(test1):
    click.echo('Hello world')

if __name__ == '__main__':
    cli_tools()
Next, let's put some compile commands in a file called cli_compile.py:
import click

@click.group(name='compile')
def cli_compile():
    """Commands related to compiling"""
    pass

@cli_compile.command(name='install2', help='test install')
def install2_cmd():
    click.echo('Hello world')

@cli_compile.command(name='search2', help='test search')
def search2_cmd():
    click.echo('Hello world')

if __name__ == '__main__':
    cli_compile()
Running root.py should now give us:
Usage: root.py [OPTIONS] COMMAND [ARGS]...

  Demo

Options:
  --help  Show this message and exit.

Commands:
  compile  Commands related to compiling
  tools    Tool related commands
running "root.py compile" should give us
Usage: root.py compile [OPTIONS] COMMAND [ARGS]...

  Commands related to compiling

Options:
  --help  Show this message and exit.

Commands:
  install2  test install
  search2   test search
You'll also notice that you can run cli_tools.py or cli_compile.py directly as well, since I included a __main__ block in each.
I was looking for something like this myself. In your case it is simple, because each of your files has a group, so you can solve this problem as explained in the documentation:
In the init.py file:
import click

from command_cloudflare import cloudflare
from command_uptimerobot import uptimerobot

cli = click.CommandCollection(sources=[cloudflare, uptimerobot])

if __name__ == '__main__':
    cli()
The best part of this solution is that it is totally compliant with PEP 8 and other linters, because you don't need to import anything you don't use and you don't need to import * from anywhere.
Edit: I just realized that my answer/comment is little more than a rehash of what Click's official docs offer in the "Custom Multi Commands" section: https://click.palletsprojects.com/en/7.x/commands/#custom-multi-commands
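For completeness, the "Custom Multi Commands" pattern those docs describe loads each command lazily from a folder of modules instead of importing them up front; roughly (a sketch adapted from the docs, folder name assumed):

import click
import os

PLUGIN_FOLDER = os.path.join(os.path.dirname(__file__), 'commands')

class MyCLI(click.MultiCommand):
    def list_commands(self, ctx):
        # every .py file in the folder is exposed as a command
        return sorted(
            f[:-3] for f in os.listdir(PLUGIN_FOLDER)
            if f.endswith('.py') and f != '__init__.py'
        )

    def get_command(self, ctx, name):
        ns = {}
        fn = os.path.join(PLUGIN_FOLDER, name + '.py')
        with open(fn) as f:
            code = compile(f.read(), fn, 'exec')
            eval(code, ns, ns)
        return ns['cli']   # each module is expected to define a `cli` command

cli = MyCLI(help='Commands are loaded dynamically from the commands/ folder.')

if __name__ == '__main__':
    cli()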
Just to add to the excellent accepted answer by @jdno, I came up with a helper function that auto-imports and auto-adds subcommand modules, which vastly cut down on the boilerplate in my cli.py.
My project structure is this:
projectroot/
├── __init__.py
└── console/
    ├── cli.py
    └── subcommands/
        ├── bar.py
        ├── foo.py
        └── hello.py
Each subcommand file looks something like this:
import click

@click.command()
def foo():
    """foo this is for foos!"""
    click.secho("FOO", fg="red", bg="white")
(for now, I just have one subcommand per file)
In cli.py, I've written an add_subcommands() function that loops through every file path globbed by "subcommands/*.py" and then does the import and add_command.
Here's what the body of the cli.py script is simplified to:
import click
import importlib
from pathlib import Path
import re

@click.group()
def entry_point():
    """whats up, this is the main function"""
    pass

def main():
    add_subcommands()
    entry_point()

if __name__ == '__main__':
    main()
And this is what the add_subcommands() function looks like:
SUBCOMMAND_DIR = Path("projectroot/console/subcommands")

def add_subcommands(maincommand=entry_point):
    for modpath in SUBCOMMAND_DIR.glob('*.py'):
        modname = re.sub('/', '.', str(modpath)).rpartition('.py')[0]
        mod = importlib.import_module(modname)
        # filter out any things that aren't a click Command
        for attr in dir(mod):
            foo = getattr(mod, attr)
            if callable(foo) and type(foo) is click.core.Command:
                maincommand.add_command(foo)
I don't know how robust this is if I were to design a command that had several levels of nesting and context switching. But it seems to work all right for now :)
I'm not a click expert, but it should work by just importing your files into the main one. I would move all commands into separate files and have one main file import the other ones. That way it is easier to control the exact order, in case it is important for you. So your main file would just look like:
import commands_main
import commands_cloudflare
import commands_uptimerobot
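A hedged sketch of how that can fit together (file and command names assumed, not from the answer): one module owns the group, the command modules import it and attach their commands, and importing those modules in the main file is what registers everything:

# commands_main.py (sketch)
import click

@click.group()
def cli():
    pass


# commands_cloudflare.py (sketch)
import click
from commands_main import cli

@cli.command()
def cloudflare():
    click.echo('cloudflare command')


# main file (sketch): importing the modules runs their decorators
import commands_main
import commands_cloudflare

if __name__ == '__main__':
    commands_main.cli()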
When you want your users to pip install your_module and then use its commands, you can add them to setup.py entry_points as a list:
entry_points={
    'console_scripts': [
        'command_1 = src.cli:function_command_1',
        'command_2 = src.cli:function_command_2',
    ],
},
Each command is bound to a function in the cli file.
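For reference, the two targets in that list would be plain callables (or click commands) defined in src/cli.py; a minimal sketch using the names from the snippet above:

# src/cli.py (sketch)
import click

@click.command()
def function_command_1():
    """Entry point installed as command_1."""
    click.echo('running command_1')

@click.command()
def function_command_2():
    """Entry point installed as command_2."""
    click.echo('running command_2')

After the package is installed, command_1 and command_2 become available on the PATH.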