Python behave fixture on feature level not loaded

This is somewhat related to this question, but I have some further problems in this minimal example below.
For a feature test I prepared a fixture which backs up a file which shall be modified during the test run (e.g. a line is appended). After the test run this fixture restores the original file.
Project Files:
└───features
    │   environment.py
    │   modify_file.feature
    │
    └───steps
            file_ops.py
#!/usr/bin/env python
# FILE: features/environment.py
import logging
from pathlib import Path

from behave import fixture
from behave.runner import Context

logger = logging.getLogger(__name__)

@fixture
def backup_file(context: Context):
    """
    A file will be modified during the feature test.
    This fixture shall back up the file before the feature test
    and restore the backup after the test.
    """
    file = Path.home() / "important.txt"
    backup_suffix = ".backup"
    file.touch()
    file.replace(file.with_suffix(backup_suffix))
    logger.info("File backed up")
    yield
    file.with_suffix(backup_suffix).replace(file)
    logger.info("File restored")
# FILE: features/modify_file.feature
@fixture.backup.file
Feature: Modify file

  @wip
  Scenario: Append a line to a file
    Given the file exists
    When I append a line to the file
    Then the line appears at the end of the file
#!/usr/bin/env python
# FILE: features/steps/file_ops.py
import logging
from pathlib import Path

from behave import given
from behave import when
from behave import then
from behave.runner import Context

logger = logging.getLogger(__name__)

file = Path.home() / "important.txt"

@given("the file exists")
def step_impl(context: Context):
    logger.info("Touching file")
    file.touch()

@when("I append a line to the file")
def step_impl(context: Context):
    logger.info("Appending a line to file")
    context.appended = "Test line appended\n"
    with open(file, mode="a") as f:
        f.write(context.appended)

@then("the line appears at the end of the file")
def step_impl(context: Context):
    logger.info("Checking if line was appended")
    with open(file, mode="r") as f:
        for line in f:
            pass
    logger.info(f"Last line is '{line.strip()}'")
    assert line == context.appended
I want to apply the fixture at the feature level, before any scenario runs, and have the file restored after all scenarios have run. However, this apparently does not happen.
When I run behave -w (no log capture, @wip tags only), I don't see any log lines from the fixture, and with every run another line is appended to the file. This means the file is neither backed up nor restored. The fixture is not applied even if I move the fixture tag down to the scenario level in the modify_file.feature file.
Can you help me understand what is going on here? I'm also curious why the fixture tag uses dot notation (@fixture.backup.file) rather than @fixture.backup_file, which would mirror the actual function name. There is no explanation of this in the behave documentation.

I also had trouble setting up a fixture, because the docs aren't super clear that you have to explicitly enable it. It isn't enough to decorate the function with @fixture in features/environment.py; you also have to call use_fixture(), for example inside before_tag(), like this:
def before_tag(context, tag):
    if tag == "fixture.backup.file":
        use_fixture(backup_file, context)
This example helped me figure that out:
https://github.com/behave/behave/blob/main/features/fixture.feature#L16

Here is how you can do it without @fixture, just by using the before_tag hook:
#!/usr/bin/env python
# FILE: features/environment.py
import logging
from pathlib import Path

from behave.model import Tag
from behave.runner import Context

logger = logging.getLogger(__name__)

def rename_file(origin: Path, target: Path) -> tuple[Path, Path]:
    """
    Renames `origin` to `target`, replacing an existing `target`.
    """
    origin.touch()
    backup = origin.replace(target)
    logger.info(f"File {str(origin)} renamed to {str(target)}")
    return origin, backup

def before_tag(context: Context, tag: Tag):
    if tag == "backup.file.important.txt":
        file = Path.home() / "important.txt"
        backup = file.with_suffix(".backup")
        context.file, context.backup_file = rename_file(file, backup)
        context.add_cleanup(rename_file, context.backup_file, context.file)
#!/usr/bin/env python
# FILE: features/steps/file_ops.py
import logging

from behave import given
from behave import when
from behave import then
from behave.runner import Context

logger = logging.getLogger(__name__)

@given("the file exists")
def step_impl(context: Context):
    logger.info("Touching file")
    context.file.touch()

@when("I append a line to the file")
def step_impl(context: Context):
    logger.info("Appending a line to file")
    context.appended = "Test line appended\n"
    with open(context.file, mode="a") as f:
        f.write(context.appended)

@then("the line appears at the end of the file")
def step_impl(context: Context):
    logger.info("Checking if line was appended")
    with open(context.file, mode="r") as f:
        for n, line in enumerate(f, start=1):
            logger.info(f"Line {n} is {line}")
    logger.info(f"Last line is '{line.strip()}'")
    assert line == context.appended
But using a fixture and the fixture registry, as described in the "Realistic example" section of behave's documentation, is superior: you will likely end up with many more setup/cleanup pairs later on.
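The setup/teardown shape of such a fixture is just a generator wrapped around a yield. Below is a stdlib-only sketch of the same backup/restore logic, with no behave involved: contextlib drives the generator instead of use_fixture, and the demo runs in a throwaway directory rather than on the real home directory.

```python
import contextlib
import tempfile
from pathlib import Path

@contextlib.contextmanager
def backup_file(file: Path, backup_suffix: str = ".backup"):
    # setup: move the original aside, as in the behave fixture above
    file.touch()
    backup = file.with_suffix(backup_suffix)
    file.replace(backup)
    try:
        yield backup
    finally:
        # teardown: put the original back, discarding test-run changes
        backup.replace(file)

# demo in a throwaway directory
with tempfile.TemporaryDirectory() as d:
    f = Path(d) / "important.txt"
    f.write_text("original\n")
    with backup_file(f):
        f.write_text("modified during the test\n")
    restored = f.read_text()
print(restored)  # -> original
```

The try/finally around the yield is what guarantees the restore runs even when a step fails, which is the same reason behave recommends doing cleanup via fixtures or context.add_cleanup.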

Related

Custom function to read file from current path in Python needs refactoring

I had to write a custom function to load a yaml file from the current working directory. The function itself works, and my intention was to write it in a pure fashion, but my senior colleague told me that the way I wrote it is utterly bad and that I have to rewrite it.
Which commandment in Python did I violate? Can anyone tell me what I did wrong here and what a "professional" solution would look like?
from typing import Dict
import yaml
from yaml import SafeLoader
from pathlib import Path
import os

def read_yaml_from_cwd(file: str) -> Dict:
    """[reads a yaml file from current working directory]

    Parameters
    ----------
    file : str
        [.yaml or .yml file]

    Returns
    -------
    Dict
        [Dictionary]
    """
    path = os.path.join(Path.cwd().resolve(), file)
    if os.path.isfile(path):
        with open(path) as f:
            content = yaml.load(f, Loader=SafeLoader)
        return content
    else:
        return None
content = read_yaml_from_cwd("test.yaml")
print(content)
The significant parts of your function can be reduced to this:
import yaml
from yaml import SafeLoader

def read_yaml_from_cwd(file):
    try:
        with open(file) as f:
            return yaml.load(f, Loader=SafeLoader)
    except Exception:
        pass
This way, the function returns either a dict object, or None when the file cannot be opened or parsed by the yaml loader.
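A middle ground keeps the explicit existence check but drops the os.path/pathlib mixing. Here is a hedged sketch: the parser is injected so the example runs without PyYAML installed; in real use you would pass yaml.safe_load as parse.

```python
import json
import os
import tempfile
from pathlib import Path
from typing import Any, Callable, Optional

def read_from_cwd(name: str, parse: Callable[[str], Any] = json.loads) -> Optional[Any]:
    """Read and parse `name` from the current working directory.

    Returns None if the file does not exist; parse errors propagate,
    so callers can tell "missing" apart from "malformed".
    """
    path = Path.cwd() / name
    if not path.is_file():
        return None
    return parse(path.read_text())

# demo: a JSON file stands in for the yaml file
with tempfile.TemporaryDirectory() as d:
    old_cwd = os.getcwd()
    os.chdir(d)
    try:
        Path("example.json").write_text('{"key": "value"}')
        result = read_from_cwd("example.json")   # {'key': 'value'}
        missing = read_from_cwd("nope.json")     # None
    finally:
        os.chdir(old_cwd)
```

Letting parse errors propagate, instead of returning None for every failure, is the main design difference from the reduced function above: silently swallowing all exceptions hides real problems.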

Creating Python submodule

I want to create a tool called unifile for saving and opening files
like this unifile.open.yaml("file.yaml").
This is my structure:
unifile
|
├-open
| └--__init__.py
|
└-save
└--__init__.py
Code that calls my module:
import unifile
a = unifile.open.yaml("file.yaml")
open/__init__.py:
import yaml

class open():
    def yml(self, file_path):
        try:
            with open(file_path, "r", encoding="utf-8") as yaml_conf:
                yaml_file = yaml.safe_load(yaml_conf)
            return yaml_file
        except OSError:
            print("Can't load yaml")
Error 1: if I import unifile, it always says:
module 'unifile' has no attribute 'open'
Error 2: in __init__.py I can't open the file:
[pylint] Context manager 'open' doesn't implement __enter__ and __exit__. [not-context-manager]
Here is a solution to your problem: make your project structure like this.
Add a unifile/__init__.py file in unifile itself, not only in the submodules.
Then the unifile/open/_open.py file content:
import yaml

class Open():
    def __init__(self):
        pass

    def yml(self, file_path):
        try:
            with open(file_path, "r", encoding="utf-8") as yaml_conf:
                yaml_file = yaml.safe_load(yaml_conf)
            return yaml_file
        except OSError:
            print("Can't load yaml")
Content of the unifile/__init__.py file:
from .open._open import Open
Then run the program from the terminal. Also, it is better to create an object instance first and then call its methods on it.
Two issues, two answers.
First, you should add an __init__.py file in unifile. With this, Python will understand that unifile is a package with sub-packages.
Second, open is a built-in function and you shadow it by naming your class open. Change your class name and it should work.
You are getting the first error because unifile is not a package: there isn't any __init__.py file at the top level, as there is in open and save. You also cannot call open.yml directly, because open is a class in the package open, so you will have to import open from open, create an instance, and then call yml on that instance:
from open import open
a = open().yml('file.yml')
You are getting the pylint error because you are shadowing the Python built-in open, which you should strictly avoid. Name your class anything except an existing built-in or reserved name.
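To see the suggested fix end to end, here is a self-contained sketch that builds the proposed layout in a temporary directory and imports it. The file names follow the answer above; yaml.safe_load is swapped for a plain read so the sketch runs without PyYAML.

```python
import sys
import tempfile
import textwrap
from pathlib import Path

# Build the suggested layout on disk: a top-level unifile/__init__.py
# that re-exports Open from unifile/open/_open.py.
root = Path(tempfile.mkdtemp())
pkg = root / "unifile"
(pkg / "open").mkdir(parents=True)
(pkg / "open" / "__init__.py").write_text("")
(pkg / "open" / "_open.py").write_text(textwrap.dedent("""\
    class Open:
        def yml(self, file_path):
            # a real implementation would use yaml.safe_load here
            with open(file_path, encoding="utf-8") as f:
                return f.read()
    """))
(pkg / "__init__.py").write_text("from .open._open import Open\n")

sys.path.insert(0, str(root))
import unifile

sample = root / "file.yml"
sample.write_text("key: value\n")
print(unifile.Open().yml(sample))  # prints the file contents
```

The key line is the top-level unifile/__init__.py re-exporting Open; without it, accessing anything through unifile raises the "no attribute" error from the question.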

Python import library in submodule

I have the following structure:
run.py
app/hdfs_lib_test.py
app/src/HDFSFileReader.py
This is a Flask app.
HDFSFileReader.py contains:
class HDFSFileReader:
    """This class represents a reader for accessing files in a HDFS."""

    def __init__(self):
        pass

    @classmethod
    def read(cls, file_path):
        lines = []
        try:
            client = Config().get_client('dev')
            with client.read('Serien_de', encoding='utf-8', delimiter='\n') as reader:
                for line in reader:
                    lines.append(line)
        except:
            print("ERROR: Could not read from HDFS.")
            raise
        return lines
When I run ./run.py I get the ImportError: No module named 'hdfs'. However the library is installed and I can call python hdfs_lib_test.py, which contains the following:
from hdfs import Config

try:
    client = Config().get_client('dev')
    with client.read('Serien_de', encoding='utf-8', delimiter='\n') as reader:
        for line in reader:
            print(line)
except:
    raise
So it seems to me that this happens because HDFSFileReader is part of a submodule while hdfs_lib_test.py isn't: the former doesn't work, but the latter does. How do I fix the import problem?
It can't be a circular import issue; I searched the whole project for imports of hdfs, and hdfs_lib_test.py is not used by the actual project.

how to concisely create a temporary file that is a copy of another file in python

I know that it is possible to create a temporary file, and write the data of the file I wish to copy to it. I was just wondering if there was a function like:
create_temporary_copy(file_path)
There isn't one directly, but you can use a combination of tempfile and shutil.copy2 to achieve the same result:
import os
import shutil
import tempfile

def create_temporary_copy(path):
    temp_dir = tempfile.gettempdir()
    temp_path = os.path.join(temp_dir, 'temp_file_name')
    shutil.copy2(path, temp_path)
    return temp_path
You'll need to deal with removing the temporary file in the caller, though.
This isn't quite as concise, and there may be issues with exception safety (e.g. what happens if original_path doesn't exist, or the temporary_copy object goes out of scope while you have the file open), but this code adds a little RAII to the cleanup. The difference from using NamedTemporaryFile directly is that rather than ending up with a file object, you end up with a file, which is occasionally desirable (e.g. if you plan to call out to other code to read it).
import os
import shutil
import tempfile

class temporary_copy(object):

    def __init__(self, original_path):
        self.original_path = original_path

    def __enter__(self):
        temp_dir = tempfile.gettempdir()
        base_path = os.path.basename(self.original_path)
        self.path = os.path.join(temp_dir, base_path)
        shutil.copy2(self.original_path, self.path)
        return self.path

    def __exit__(self, exc_type, exc_val, exc_tb):
        os.remove(self.path)
in your code you'd write:
with temporary_copy(path) as temporary_path_to_copy:
    ...  # do stuff with temporary_path_to_copy
# Here in the code, the copy should now have been deleted.
The following is more concise (the OP's ask) than the selected answer. Enjoy!
import os
import shutil
import tempfile

def create_temporary_copy(path):
    # delete=False: the file must outlive this function returning;
    # the caller is responsible for removing it
    tmp = tempfile.NamedTemporaryFile(delete=False)
    shutil.copy2(path, tmp.name)
    return tmp.name
A variation on @tramdas's answer, accounting for the fact that the file cannot be opened twice on Windows. This version does not preserve the file extension.
import os
import shutil
import tempfile

def create_temporary_copy(src):
    # create the temporary file in read/write binary mode (r+b)
    tf = tempfile.TemporaryFile(mode='r+b', prefix='__', suffix='.tmp')
    # On Windows, we can't open the file again, either manually or
    # indirectly via shutil.copy2, but we *can* copy the file directly
    # using file-like objects, which is what TemporaryFile returns to us.
    # `with open` automatically closes the source file.
    with open(src, 'r+b') as f:
        shutil.copyfileobj(f, tf)
    # display the name of the temporary file for diagnostic purposes
    print('temp file:', tf.name)
    # rewind the temporary file, otherwise things will go
    # tragically wrong on Windows
    tf.seek(0)
    return tf

# make a temporary copy of the file 'foo.txt'
name = None
with create_temporary_copy('foo.txt') as temp:
    name = temp.name
    # prove that it exists
    print('exists', os.path.isfile(name))  # prints True
    # read all lines from the file
    for i, line in enumerate(temp):
        print(i, line.strip())
# temp.close() is implicit using `with`
# prove that it has been deleted
print('exists', os.path.isfile(name))  # prints False
A slight variation (in particular I needed the preserve_extension feature for my use case, and I like the "self-cleanup" feature):
import os
import shutil
import tempfile

def create_temporary_copy(src_file_name, preserve_extension=False):
    '''
    Copies the source file into a temporary file.
    Returns a _TemporaryFileWrapper, whose destructor deletes the temp file
    (i.e. the temp file is deleted when the object goes out of scope).
    '''
    tf_suffix = ''
    if preserve_extension:
        _, tf_suffix = os.path.splitext(src_file_name)
    tf = tempfile.NamedTemporaryFile(suffix=tf_suffix)
    shutil.copy2(src_file_name, tf.name)
    return tf
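Combining the ideas above, here is a sketch of a context-manager variant that preserves the extension and removes the copy on exit. It is not from any of the answers; the names are mine.

```python
import contextlib
import os
import shutil
import tempfile

@contextlib.contextmanager
def temporary_copy(path, preserve_extension=True):
    suffix = os.path.splitext(path)[1] if preserve_extension else ""
    fd, tmp_path = tempfile.mkstemp(suffix=suffix)
    os.close(fd)  # only the path is needed; shutil.copy2 reopens it
    try:
        shutil.copy2(path, tmp_path)
        yield tmp_path
    finally:
        os.remove(tmp_path)

# demo
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "foo.txt")
    with open(src, "w") as f:
        f.write("hello")
    with temporary_copy(src) as copy_path:
        existed = os.path.isfile(copy_path)   # True while inside the block
        ext = os.path.splitext(copy_path)[1]  # '.txt'
    gone = not os.path.isfile(copy_path)      # True after the block
```

Yielding a path rather than a file object sidesteps the Windows double-open problem mentioned above, because the handle from mkstemp is closed before copy2 runs.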

Django test FileField using test fixtures

I'm trying to build tests for some models that have a FileField. The model looks like this:
class SolutionFile(models.Model):
    '''
    A file from a solution.
    '''
    solution = models.ForeignKey(Solution)
    file = models.FileField(upload_to=make_solution_file_path)
I have encountered two problems:
When saving data to a fixture using ./manage.py dumpdata, the file contents are not saved, only the file name is saved into the fixture. While I find this to be the expected behavior as the file contents are not saved into the database, I'd like to somehow include this information in the fixture for tests.
I have a test case for uploading a file that looks like this:
def test_post_solution_file(self):
    import tempfile
    import os
    filename = tempfile.mkstemp()[1]
    f = open(filename, 'w')
    f.write('These are the file contents')
    f.close()
    f = open(filename, 'r')
    post_data = {'file': f}
    response = self.client.post(self.solution.get_absolute_url() + 'add_solution_file/',
                                post_data, follow=True)
    f.close()
    os.remove(filename)
    self.assertTemplateUsed(response, 'tests/solution_detail.html')
    self.assertContains(response, os.path.basename(filename))
While this test works just fine, it leaves the uploaded file in the media directory after finishing. Of course, the deletion could be taken care of in tearDown(), but I was wondering if Django had another way of dealing with this.
One solution I was thinking of was using a different media folder for tests which must be kept synced with the test fixtures. Is there any way to specify another media directory in settings.py when tests are being run? And can I include some sort of hook to dumpdata so that it syncs the files in the media folders?
So, is there a more Pythonic or Django-specific way of dealing with unit tests involving files?
Django provides a great way to write tests on FileFields without mucking about in the real filesystem - use a SimpleUploadedFile.
from django.core.files.uploadedfile import SimpleUploadedFile
my_model.file_field = SimpleUploadedFile('best_file_eva.txt', b'these are the contents of the txt file')
It's one of django's magical features-that-don't-show-up-in-the-docs :). However it is referred to here.
You can override the MEDIA_ROOT setting for your tests using the @override_settings() decorator, as documented:
from django.test import override_settings

@override_settings(MEDIA_ROOT='/tmp/django_test')
def test_post_solution_file(self):
    # your code here
I've written unit tests for an entire gallery app before, and what worked well for me was using the python tempfile and shutil modules to create copies of the test files in temporary directories and then delete them all afterwards.
The following example is not working/complete, but should get you on the right path:
import os
import shutil
import tempfile

PATH_TEMP = tempfile.mkdtemp(dir=os.path.join(MY_PATH, 'temp'))

def make_objects():
    filenames = os.listdir(TEST_FILES_DIR)
    if not os.access(PATH_TEMP, os.F_OK):
        os.makedirs(PATH_TEMP)
    for filename in filenames:
        name, extension = os.path.splitext(filename)
        new = os.path.join(PATH_TEMP, filename)
        shutil.copyfile(os.path.join(TEST_FILES_DIR, filename), new)
    # Do something with the files/FileField here

def remove_objects():
    shutil.rmtree(PATH_TEMP)
I run those methods in the setUp() and tearDown() methods of my unit tests and it works great! You've got a clean copy of your files to test your filefield that are reusable and predictable.
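Stripped of the Django specifics, the same setUp/tearDown pattern looks like the runnable stdlib sketch below (the class and file names are placeholders, not from the answer above):

```python
import os
import shutil
import tempfile
import unittest

class TempMediaTest(unittest.TestCase):

    def setUp(self):
        # a fresh directory per test keeps fixtures predictable and reusable
        self.media_root = tempfile.mkdtemp()
        self.test_file = os.path.join(self.media_root, "photo.jpg")
        with open(self.test_file, "wb") as f:
            f.write(b"fake image bytes")

    def tearDown(self):
        # remove everything the test created, uploaded copies included
        shutil.rmtree(self.media_root)

    def test_file_is_available(self):
        self.assertTrue(os.path.isfile(self.test_file))

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TempMediaTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because every test gets its own directory and tearDown always runs, nothing leaks into a shared media folder, which is exactly the problem the original question describes.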
With pytest and pytest-django, I use this in my conftest.py file:
import shutil
import tempfile

import pytest
from pytest_django.lazy_django import skip_if_no_django
from pytest_django.fixtures import SettingsWrapper

@pytest.fixture(scope='session')
# @pytest.yield_fixture()
def settings():
    """A Django settings object which restores changes after the test run."""
    skip_if_no_django()
    wrapper = SettingsWrapper()
    yield wrapper
    wrapper.finalize()

@pytest.fixture(autouse=True, scope='session')
def media_root(settings):
    tmp_dir = tempfile.mkdtemp()
    settings.MEDIA_ROOT = tmp_dir
    yield settings.MEDIA_ROOT
    shutil.rmtree(tmp_dir)

@pytest.fixture(scope='session')
def django_db_setup(media_root, django_db_setup):
    print('inject_after')
These might be helpful:
https://dev.funkwhale.audio/funkwhale/funkwhale/blob/de777764da0c0e9fe66d0bb76317679be964588b/api/tests/conftest.py
https://framagit.org/ideascube/ideascube/blob/master/conftest.py
https://stackoverflow.com/a/56177770/5305401
This is what I did for my test. After uploading, the file should end up in the photo property of my Organization model object:
import tempfile
filename = tempfile.mkstemp()[1]
f = open(filename, 'w')
f.write('These are the file contents')
f.close()
f = open(filename, 'r')
post_data = {'file': f}
response = self.client.post("/org/%d/photo" % new_org_data["id"], post_data)
f.close()
self.assertEqual(response.status_code, 200)
## Check the file
## org is where the file should end up
org = models.Organization.objects.get(pk=new_org_data["id"])
self.assertEqual("These are the file contents", org.photo.file.read())
## Remove the file
import os
os.remove(org.photo.path)
