Testing with configuration files - python

My Google-fu has failed me by giving me results I don't understand, so I'm asking here.
I'm working on a Python project that currently has a configuration file, which is also a .py file, holding various Python objects to load when everything starts. I'm trying to get some practice unit testing with pytest and I don't know exactly how to go about this issue. My guess is that I should make a dedicated testing config file that doesn't change, but I have no clue how to tell my code when to use the actual config file and when to use the testing config file. My current setup just uses import config and reads values from it. I would greatly appreciate some help here!

An approach I've used to solve this problem is manipulating the path when running tests so that an additional package (/tests/resources) shadows the normal resources package (/main/resources). The resources package holds assets such as configuration files and makes them available for loading at runtime.
Note: This structure is inspired by a pattern that I've brought from Java/Maven, so I won't claim it's Pythonic, and I don't like the path manipulation where the tests are concerned (I know a lot of others won't, either, so beware!). In fact, I found this question while looking for a better way to do this.
To accomplish this you first have to set up a folder to serve as your 'resources' package (See the docs). Once that's done, you create a similar version for your tests, in the tests folder. The directory structure will look something like this:
project
|-main
| |-resources
| | |-__init__.py
| | \-application.yml
| \-[...other source modules/packages...]
|-tests
| |-resources
| | |-__init__.py
| | \-application.yml
| \-[...other modules/packages with tests...]
\-run_tests.py
Note that main and tests are NOT packages. This contradicts a lot of conventional guidance on Python project structure. Instead, they're both added to the path, with tests inserted into the path in front of main, by the run_tests.py script using code like this (some bits borrowed from this post):
import os.path
import sys

import pytest

ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
# 'tests' ends up in front of 'main' on sys.path, so tests/resources shadows main/resources.
sys.path.insert(0, os.path.join(ROOT_DIR, 'main'))
sys.path.insert(0, os.path.join(ROOT_DIR, 'tests'))

# Add any required args for testing.
args = []
pytest.main(args)
I'll typically have additional options I want to feed pytest, and that's what args is for, but it's not relevant to this answer, so I left it empty here.
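For completeness, here is a rough sketch (not part of the original setup above) of how code under main might load the file through the resources package; with the path order set up by run_tests.py, resources resolves to tests/resources during a test run and to main/resources otherwise:
import pkgutil

# Returns the bytes of application.yml from whichever resources package is found
# first on sys.path; parse it with your YAML library of choice (e.g. yaml.safe_load).
raw_config = pkgutil.get_data('resources', 'application.yml')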
I know this is a late answer, but I hope this helps someone. Also, I know it's a controversial approach, but it's the most satisfying one I've found so far; that may be because I'm experienced in Java as well as Python. Others may prefer trying to import tests.resources and falling back to main.resources if that fails.
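A minimal sketch of that fallback (hedged; it assumes a variant of the layout where main and tests are themselves importable packages, which is not the structure shown above):
try:
    from tests import resources
except ImportError:
    from main import resources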
I hope anyone with other approaches to this will share in comments, or post additional answers.

Figured out something based on this.
I ended up creating a testing config file in my folder named "tests" and having this code at the top of every file that used it:
import sys

if "pytest" in sys.modules:
    import tests.testing_config as config
else:
    import config
I don't think it's quite optimal, since I feel something like this should live in the testing code, and it makes the new config file a dependency that should never be changed lest you break everything, but I guess it works for now if, like me, you have no clue how these cracked testing libraries work.
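If you would rather keep that decision in the test code (a hedged sketch, not something from the answer above), a conftest.py at the project root can alias the test config before any application module imports config, since pytest imports conftest.py first:
# conftest.py (sketch) -- register the testing config under the name 'config'
# so that 'import config' in application code resolves to it during test runs.
import sys
import tests.testing_config as testing_config

sys.modules["config"] = testing_config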


Unused import statement: classic problem while importing own module

OK, prerequisites:
It's my first ever Python project. I used to do some scripting but never anything bigger.
So I'm at the very beginning of the learning curve. It's like when you can't kill an ant on the Temple level in Fallout 2. On later levels, I was really good at Fallout 2 :)
Problem:
I can't figure out how to import a module written by me and placed in a different folder
Context:
The project I intend to create is meant to do a lot of unit conversions. So I decided to store all data in the DB in the same unit system and handle conversions to the user's preferred units at the code level.
I decided to store tests in a different folder. To write the very first one (testing the above-mentioned module) I need to import the module, but here is where the story begins. I know it's a classic problem, but I'm completely confused by imports.
Toolkit:
PyCharm Pro (PyCharm 2021.3.1)
Python 3.7 interpreter
macOS 10.15, Intel
Set up:
Settings screenshot provided
Project structure. Folders are marked as Source & Test
I need to import from conversions.py to test_conversions.py
PYTHONPATH settings like this
What do I, for the sake of God, need:
With all the above, how do I import conversions.py into test_conversions.py or any other place in my project? I've read a number of articles and they're getting me nowhere (contradictory, 2.x-related, etc.). I feel like I need some more foundational info, but I also need a clear walkthrough and a code snippet to import the bloody file. I really appreciate any kind of advice.
Imports are a bit tricky. The issue you have is where your Python is looking for packages. I would discourage you from adding a specific project to your PYTHONPATH, though you could do that locally in your file.
A much easier way is just to launch your test file from the top directory.
In this case your import will just be import conversion.conversion
And then you can launch your test from the root folder with python -m tests.conversion.
In PyCharm you can use the interface to deal with that; see this link. But I like python -m because it works from anywhere, not only inside PyCharm.
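A minimal sketch of what that might look like (hedged; the package layout and the function name are illustrative, not taken from the question):
# tests/conversion.py -- assumes a top-level 'conversion' package containing conversion.py
import conversion.conversion as conv

def test_km_to_mm():
    # hypothetical function, shown only to illustrate the import style
    assert conv.km_to_mm(1) == 1000000

if __name__ == "__main__":
    # lets 'python -m tests.conversion' run the check directly from the root folder
    test_km_to_mm()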
Make a class inside conversion.py, then you can import it from test_conversion.py.
conversion.py
class convert():
    def km_to_mm(self, input):
        output = input * 1000000
        return output
then import it in test_conversion.py
from conversion import convert

input = 0.001  # specify your input value
converted = convert().km_to_mm(input)
converted will have the value 1000.
Make sure the files are in the same folder; otherwise you should include the folder name in the import, like:
import foldername.conversion
from foldername.conversion import convert
I really appreciate all of you who tried to help. I got the problem solved in a very ridiculous manner; the notes below are for anyone who might face the same issue in the future.
So, in case you see the following:
You use PyCharm (no idea how other IDEs behave)
You created a module & want to import it into other files of your project
You type import module_name
While you type it, the line looks active and autocomplete even suggests your module name, but as soon as you finish typing, the import line turns grey, PyCharm shows a warning saying Unused import statement, and the yellow bulb next to the import line suggests you delete it
--> This does not mean you are unable to import your module; it means you've already done it and can now call anything from your module in the code below.
This taught me to spend more time reading the docs before jumping into anything new, and to think harder about UX in anything I do.

How to structure a python package made of multiple sub-projects?

what is the correct way to structure a python package for multiple functionalities?
I have a case where my project can be broken into 3 completely separate, logical chunks that are entirely decoupled. As such, I have broken the sub-projects into respective folders in my Python package project. This has led to a file structure like this:
foobartoo
|_setup.py
|_foobartoo
  |_foo
  | |_foo.py
  |
  |_bar
  | |_bar.py
  |
  |_too
    |_too.py
This method keeps things nicely together but separate, but after installing I have noticed a definite problem. The only way to access one of the files is to do
from foobartoo.foo.foo import <method/class>
or
import foobartoo.foo.foo as <something>
<something>.<method/class>
This seems extremely impractical.
I can see two alternatives:
scrapping the folder system and having foo.py, bar.py and too.py in the same directory under foobartoo.
This seems bad because it will be impossible to know which file/code belongs to which of the projects
breaking the single package into multiple packages.
This seems ok, but it would be better if the 3 things were together as they are intended to be used together.
I have looked at numpy and its source code, and somehow they seem to have a lot of their functionality behind many folders, yet they don't need to reference any of the folders. This seems ideal, just being able to do something like
import foobartoo
foobartoo.<classname/methodname>()
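For reference, numpy appears to achieve this by re-exporting names from the package's __init__.py. A hedged sketch of that pattern (the names are made up, and it assumes each sub-folder is turned into a package with its own __init__.py):
# foobartoo/__init__.py
from .foo.foo import foo_function   # hypothetical name
from .bar.bar import BarClass       # hypothetical name
from .too.too import too_helper     # hypothetical name

__all__ = ["foo_function", "BarClass", "too_helper"]
With that in place, callers can simply do import foobartoo and then call foobartoo.foo_function() without spelling out the deep module paths.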
You can add additional paths in your 'main' script so Python will automatically search those directories as well, like:
import sys

sys.path.append(example_sub_dir)   # example_sub_dir is the path of the sub-directory
import sub_script                  # a module that lives in that sub-directory
sys.path.remove(example_sub_dir)
Note that you can add many directories for sys to search, but take care: import time will rise accordingly.

relative import from utility script in subdirectory

From reading over other answers, it seems that my layout is possibly "un-Pythonic," although I'm really not quite sure. If so, that would be helpful to know, along with a suggestion for a better layout.
Here is my script layout:
/
  __init__.py
  main_prog.py
  utilities.py
  /support_scripts
    support_utility1.py
    support_utility2.py
    ...
The support utilities contain functionality related to main_prog.py, but they are best placed in their own scripts. Since there are many of them, I have moved them into their own directory. But they use some of the same functionality from utilities.py.
When I try to import using from .. import utilities I get the error message "ValueError: attempted relative import beyond top-level package"
Now my first question would simply be: Is it considered bad to place additional scripts in subdirectories like this? Knowing a general principle like that would go a long way toward solving my problems. And of course, if you have any specific suggestions, those will help too.
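For context, a hedged sketch of one layout in which that relative import is legal; it assumes the whole tree is itself a package (called myproject here as a made-up name) and that the scripts are run as modules from the directory above it:
myproject/
  __init__.py
  main_prog.py
  utilities.py
  support_scripts/
    __init__.py
    support_utility1.py   # can now use: from .. import utilities

# run from the directory that contains myproject/:
#   python -m myproject.support_scripts.support_utility1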

Import from parent directory for a test sub-directory without using packaging, Python 2.7

TL;DR
For a fixed and unchangeable non-package directory structure like this:
some_dir/
  mod.py
  test/
    test_mod.py
    example_data.txt
what is a non-package way to enable test_mod.py to import from mod.py?
I am restricted to using Python 2.7 in this case.
I want to write a few tests for the functions in mod.py. I want to create a new directory test that sits alongside mod.py and inside it there is one test file test/test_mod.py which should import the functions from mod.py and test them.
Because of the well-known limitations of relative imports, which rely on package-based naming, you can't do this in the straightforward way. Yet all advice on the topic suggests building the script as a package and then using relative imports, which is impossible for my use case.
In my case, it is not allowable for mod.py to be built as a package and I cannot require users of mod.py to install it. They are instead free to merely check the file out from version control and just begin using it however they wish, and I am not able to change that circumstance.
Given this, what is a way to provide a simple, straightforward test directory?
Note: not just a test file that sits alongside mod.py, but an actual test directory since there will be other assets like test data that come with it, and the sub-directory organization is critical.
I apologize if this is a duplicate, but out of the dozen or so permutations of this question I've seen in my research before posting, I haven't seen a single one that addresses how to do this. They all say to use packaging, which is not a permissible option for my case.
Based on @mgilson's comment, I added a file import_helper.py to the test directory.
some_dir/
  mod.py
  test/
    test_mod.py
    import_helper.py
    example_data.txt
Here is the content of import_helper.py:
import sys as _sys
import os.path as _ospath
import inspect as _inspect
from contextlib import contextmanager as _contextmanager

@_contextmanager
def enable_parent_import():
    path_appended = False
    try:
        current_file = _inspect.getfile(_inspect.currentframe())
        current_directory = _ospath.dirname(_ospath.abspath(current_file))
        parent_directory = _ospath.dirname(current_directory)
        _sys.path.insert(0, parent_directory)
        path_appended = True
        yield
    finally:
        if path_appended:
            _sys.path.pop(0)
and then in the import section of test_mod.py, prior to an attempt to import mod.py, I have added:
import unittest

from import_helper import enable_parent_import

with enable_parent_import():
    from mod import some_mod_function_to_test
It is unfortunate to need to manually mangle PYTHONPATH, but writing it as a context manager helps a little, and restores sys.path back to its original state prior to the parent directory modification.
In order for this solution to scale across multiple instances of this problem (say tomorrow I am asked to write a widget.py module for some unrelated tasks and it also cannot be distributed as a package), I have to replicate my helper function and ensure a copy of it is distributed with any tests, or I have to write that small utility as a package, ensure it gets globally installed across my user base, and then maintain it going forward.
When you manage a lot of Python code internal to a company, one dysfunctional code-distribution mode that often occurs is that "installing" some Python code becomes equivalent to checking out the new version from version control.
Since the code is often extremely localized and specific to a small set of tasks for a small subset of a larger team, maintaining the overhead for sharing code via packaging (even if it is a better idea in general) will simply never happen.
As a result, I feel the use case I describe above is extremely common for real-world Python, and it would be nice if some import tools added this functionality for modifying PYTHONPATH, with some sensible default choices (like adding the parent directory) being very easy.
That way you could rely on this at least being part of the standard library, and not needing to roll your own code and ensure it's either shipped with your tests or installed across your user base.

Python imports-Need someone to check this please

OK, yes, it is a very silly question, but I am getting a little confused.
I have a file structure which looks like this:-
-Mainapplication
  -models.py
-Helpingmodules
  -Folder1
    -module1.py
Now I have to import models into module1. So in module1.py I just did:-
from Mainapplication import models
Now this does work fine, but I get a feeling that it might be wrong. Can someone please let me know if this is the correct way.
There's nothing wrong with the import, but if the names of your packages are accurate, this looks like a design flaw: you're hurting code reusability. I'd expect a package of "helping modules" to be independent of the application they're helping (although in this case the package name is so vague that I could be way off about its purpose).
There's nothing wrong with your import.
You could say:
import Mainapplication.models
but then you'd have to reference models with its package prefix every time you used it, e.g.:
Mainapplication.models.foo("bar")
The way you've done it allows you to use the following form, which is usually preferable:
models.foo("bar")
For the full story you can read the documentation.
