How to mimic OS variables in my mock test cases - Python

I have two Python files. The first, which is triggered on the server, dynamically fetches its result based on the environment the script is triggered in. For example, when the script is triggered in the dev environment, ${RPM_ENVIRONMENT} should return 'DEV'.
I have two files: my main file and my unit test cases.
import os
import json
import subprocess
import logging
from os import listdir
from os.path import isfile, join

_ENV = os.popen("echo ${RPM_ENVIRONMENT}").read().split('\n')[0]
SERVER_URL = {
    'DEV': {'ENV_URL': 'https://dev.net'},
    'UAT': {'ENV_URL': 'https://uat.net'},
    'PROD': {'ENV_URL': 'https://prod.net'}
}[_ENV]
Inside my test case script below, I wanted to mimic the dev environment using unittest.mock. I tried the script below, but it returns a KeyError for RPM_ENVIRONMENT.
test_env.py
import unittest, sys, tempfile, os, json, shutil
from unittest import mock

## I want to mock all the required variables before `import env_test` runs so that it won't raise any error.
with mock.patch.object(os, 'popen') as mock_popen:
    sys.path.insert(1, 'C:/home/test/conf')
    import env_test as conf

class test_tbrp_case(unittest.TestCase):
    def test_port(self):
        # function yet to be created
        pass

if __name__ == '__main__':
    unittest.main()
I tried using os.popen to mimic the call, but I am confused about how I can assign 'DEV' to the _ENV variable.
When I ran this script, it returned this error:
SERVER_URL = {
KeyError: <MagicMock name='popen().read().split().__getitem__()' id='1893950325424'>
**Approach 2 I have tried**
What I am trying to mock is the import: when I import my main.py, it should dynamically replace/mock _ENV as 'DEV', and the SERVER_URL variable should automatically pick the DEV entry.
In a scenario where I call conf._ENV after applying the mock below, it should return the value "DEV".
def rpm_environment():
    return os.popen("echo ${RPM_ENVIRONMENT}").read()

def test_rpm_environment():
    with mock.patch("os.popen") as popen_mock:
        popen_mock().read.return_value = "DEV"
        actual = rpm_environment()
        assert actual == "DEV"

## When I import env_test, RPM_ENVIRONMENT won't be mocked as 'DEV' per what was declared in test_rpm_environment
rpm_environment()
test_rpm_environment()

# How can we safely import our env_test file with the variables mocked, so that I can call the SERVER_URL variable?
sys.path.insert(1, 'C:/home/test/conf')
import env_test as conf

I didn't quite understand whether your code is inside a function or not.
If it is, the best way to do this is not patch.object; it's just a normal patch.
Consider this example:
Consider this example:
import os
from unittest.mock import patch

def question():
    return os.popen("what_ever").read()

def test_question():
    with patch("os.popen") as popen_mock:
        popen_mock().read.return_value = "DEV"
        actual = question()
        assert actual == "DEV"
In my opinion, patching os.popen and adding read to its structure is the best practice.
Good luck !

When you mock popen it will return a MagicMock object, and that object does not have a defined read response. You need to define what happens when someone calls read() on a MagicMock object that you have returned. Although it is not the most elegant solution, you can do this by adding this line in the with block:
mock_popen.return_value.read.return_value = "DEV"
This will instruct the MagicMock object to return the string "DEV" when read() is called on it.
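Putting the two pieces together, here is a self-contained sketch of the pattern the question is after. A stub module (env_stub.py, a hypothetical stand-in for the real env_test.py in C:/home/test/conf) is written to a temp directory, and os.popen is patched before the import, because the module body runs at import time:

```python
import os
import sys
import tempfile
import textwrap
from unittest import mock

# Hypothetical stand-in for env_test.py from the question.
stub = textwrap.dedent("""
    import os
    _ENV = os.popen("echo ${RPM_ENVIRONMENT}").read().split('\\n')[0]
    SERVER_URL = {
        'DEV': {'ENV_URL': 'https://dev.net'},
        'UAT': {'ENV_URL': 'https://uat.net'},
        'PROD': {'ENV_URL': 'https://prod.net'},
    }[_ENV]
""")

with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "env_stub.py"), "w") as f:
        f.write(stub)
    sys.path.insert(0, tmp)
    # Configure the read() chain *before* the import: the module body
    # (and its popen call) executes at import time.
    with mock.patch("os.popen") as mock_popen:
        mock_popen.return_value.read.return_value = "DEV\n"
        import env_stub as conf

print(conf._ENV)                   # DEV
print(conf.SERVER_URL['ENV_URL'])  # https://dev.net
```

The key point is ordering: any module-level code that touches os.popen is already finished by the time a patch applied after the import would take effect.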

Related

force subprocess to return a value before importing a module

Objective: to create unit test cases for main.py.
How can we run subprocess.run in a unit test case?
File: main.py
import os
import json
import subprocess

_ENV = os.environ['server']
BASE_PATH = '/home/test'
SERVER_URL = {
    'DEV': {'ENV_URL': 'https://dev.net'},
    'UAT': {'ENV_URL': 'https://uat.net'},
    'PROD': {'ENV_URL': 'https://prod.net'}
}[_ENV]

if _CYBERARK == 'DYNAMIC' and _ENV.upper() in ('DEV', 'UAT', 'PROD'):
    TOKEN = json.load(open(BASE_PATH + "secrets/cyberark_env.json"))
    APPID = TOKEN.get('desc').get('appID')
    ## --> error on this because it's running .sh files
    VALUE = subprocess.run(["sh", BASE_PATH + "bin/cyberark.sh", APPID], capture_output=True, text=True).stdout.strip()
In my test_main.py, I am trying to import main.py without any issues, but I find it challenging to manually make the VALUE variable return 'ABCD', similar to the script below.
Testing : test_main.py
import unittest, sys, os
from unittest import mock

os.environ['server'] = 'DEV'
# Similar to os.environ, I want to temporarily create VALUE before `import main` is called.
sys.path.insert(1, 'C:/home/src')
import main as conf

class test_tbrp_case(unittest.TestCase):
    def test_port(self):
        # function yet to be created
        pass

if __name__ == '__main__':
    unittest.main()
Is this achievable, or do I need to rework the logic of my whole function?
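It is achievable without changing main.py, using the same trick as with os.environ: patch subprocess.run before the import statement runs, since everything at module level executes at import time. A sketch, with a hypothetical stub module main_stub.py standing in for the real main.py in C:/home/src:

```python
import os
import sys
import tempfile
import textwrap
from unittest import mock

# Hypothetical, simplified stand-in for main.py from the question.
stub = textwrap.dedent("""
    import os, subprocess
    _ENV = os.environ['server']
    VALUE = subprocess.run(
        ["sh", "bin/cyberark.sh", "appid"],
        capture_output=True, text=True,
    ).stdout.strip()
""")

os.environ['server'] = 'DEV'

with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "main_stub.py"), "w") as f:
        f.write(stub)
    sys.path.insert(0, tmp)
    # Patch subprocess.run before the module body runs at import time,
    # so the .sh script is never actually executed.
    with mock.patch("subprocess.run") as run_mock:
        run_mock.return_value.stdout = "ABCD\n"
        import main_stub as conf

print(conf._ENV)   # DEV
print(conf.VALUE)  # ABCD
```

The same pattern works for any module-level side effect (file reads, shell calls): configure the mock first, import second.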

Python import statement is not working for parent package

I'm using Python 3.9.5.
Based on this post, I'm trying to reuse some functions from the parent directory. Here's my code hierarchy:
github_repository
└── src
    ├── base
    │   └── string_utilities.py
    └── validation
        └── email_validator.py
I also have __init__.py in all folders. In ALL of them.
Here's the string_utilities.py content:
def isNullOrEmpty(text: str):
    return text is None or len(text) == 0
And here's the email_validator.py content:
from src.base import string_utilities

def is_email(text: str):
    if string_utilities.isNullOrEmpty(text):
        return False
    # logic to check email
    return True
Now when I run python email_validator.py, I get this error:
ModuleNotFoundError: No module named 'src'
I have changed that frustrating import statement to all of these different forms, and I still get no results:
from ...src.base import string_utilities
which results in:
ImportError: attempted relative import with no known parent package
import src.base.string_utilities
which causes the interpreter not to recognize the isNullOrEmpty function.
import ...src.base.string_utilities
Which results in:
Relative imports cannot be used with "import .a" form; use "from . import a" instead
I'm stuck at this point on how to reuse that function in this file. Can someone please help?
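One common fix, keeping the absolute `from src.base import` line: run the file as a module from the repository root (`python -m src.validation.email_validator`) instead of running it as a script, so the root lands on sys.path and `src` resolves as a package. A self-contained sketch that rebuilds the layout in a temp directory to demonstrate (the file bodies are simplified stand-ins for the question's code):

```python
import os
import subprocess
import sys
import tempfile

# Recreate the question's layout: src/base and src/validation packages.
tree = {
    "src/__init__.py": "",
    "src/base/__init__.py": "",
    "src/base/string_utilities.py":
        "def isNullOrEmpty(text):\n"
        "    return text is None or len(text) == 0\n",
    "src/validation/__init__.py": "",
    "src/validation/email_validator.py":
        "from src.base import string_utilities\n"
        "def is_email(text):\n"
        "    if string_utilities.isNullOrEmpty(text):\n"
        "        return False\n"
        "    return True\n"
        "if __name__ == '__main__':\n"
        "    print(is_email('a@b.c'))\n",
}

with tempfile.TemporaryDirectory() as repo:
    for path, body in tree.items():
        full = os.path.join(repo, path)
        os.makedirs(os.path.dirname(full), exist_ok=True)
        with open(full, "w") as f:
            f.write(body)
    # `python email_validator.py` fails with ModuleNotFoundError, but
    # `python -m` run from the repo root puts the root on sys.path,
    # so the `src` package resolves:
    out = subprocess.run(
        [sys.executable, "-m", "src.validation.email_validator"],
        cwd=repo, capture_output=True, text=True,
    )
    print(out.stdout.strip())  # True
```

The rule of thumb: absolute imports like `src.base` only work when the directory containing `src` is on sys.path, which `-m` from the repo root guarantees.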

how to import a function in pytest

I am trying to create a pytest for a Python 2.x script executable with dashes included in its name. I tried to import it the usual way but I can't figure out how to make it work with the dashes.
My project structure is as follows:
package
-- tests
-- bin
-- subpackage
-- ...py
Specifically, I need to test a function called master_disaster() which exists inside bin/let-me-out (yes with -). let-me-out is an executable .py file and my folder has no setup.py file or anything similar.
How can I import this function inside my test? My test is going to be a simple fixture that checks the time with:
@pytest.fixture
def now():
    return timezone.now()
It then uses the now() function to create a new file which let-me-out will delete after a specific amount of time.
First of all, the dashes make let-me-out an invalid identifier in Python. To work around this, you have to invoke the imp (Python 2.7)
or importlib (Python 3.5+) machinery.
Python 3.5+
Here is an example of importing a new module with the qualified name let_me_out, using bin/let-me-out as the source file:
import importlib.machinery
import importlib.util

def test_master_disaster():
    loader = importlib.machinery.SourceFileLoader('let_me_out', 'bin/let-me-out')
    spec = importlib.util.spec_from_loader(loader.name, loader)
    let_me_out = importlib.util.module_from_spec(spec)
    loader.exec_module(let_me_out)
    # this is only a stub, to show an example of calling the master_disaster function
    assert let_me_out.master_disaster() == 'spam'
You can extract this code into a fixture to make it reusable:
import importlib.machinery
import importlib.util
import pytest

@pytest.fixture(scope='session')
def let_me_out():
    loader = importlib.machinery.SourceFileLoader('let_me_out', 'bin/let-me-out')
    spec = importlib.util.spec_from_loader(loader.name, loader)
    let_me_out = importlib.util.module_from_spec(spec)
    loader.exec_module(let_me_out)
    return let_me_out

def test_master_disaster(let_me_out):
    assert let_me_out.master_disaster() == 'spam'
Python 2.7
Things are even easier with Python 2.7:
import imp
import pytest

@pytest.fixture(scope='session')
def let_me_out():
    return imp.load_source('let_me_out', 'bin/let-me-out')

def test_master_disaster(let_me_out):
    assert let_me_out.master_disaster() == 'spam'

Python ImportLib 'No Module Named'

I'm trying to use a variable as a module to import from in Python.
Using ImportLib I have been successfully able to find the test...
sys.path.insert(0, sys.path[0] + '\\tests')
tool_name = selected_tool.split(".")[0]
selected_module = importlib.import_module("script1")
print(selected_module)
... and by printing selected_module I can see that it successfully finds the script:
<module 'script1' from 'C:\\Users\\.....">
However, when I try to use this variable in the code to import a module from it:
from selected_module import run
run(1337)
The program quits with the following error:
ImportError: No module named 'selected_module'
I have tried to add an __init__.py file to the main directory and the /tests directory where the scripts are, but to no avail. I'm sure it's just something stupidly small I'm missing. Does anyone know?
Import statements are not sensitive to variables! Their contents are treated as literals.
An example:
urllib = "foo"
from urllib import parse # loads "urllib.parse", not "foo.parse"
print(parse)
Note that from my_module import my_func will simply bind my_module.my_func to the local name my_func. If you have already imported the module via importlib.import_module, you can just do this yourself:
# ... your code here
run = selected_module.run # bind module function to local name
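The pattern from this answer, end to end, using a standard-library module so the sketch runs anywhere (json and dumps stand in for the question's variable module and function names):

```python
import importlib

module_name = "json"   # the variable part; could come from user input
func_name = "dumps"

# import_module accepts a string, unlike the `import` statement.
selected_module = importlib.import_module(module_name)

# Same effect as `from json import dumps`, but driven by variables:
dumps = getattr(selected_module, func_name)

print(dumps({"ok": True}))  # {"ok": true}
```

getattr covers the general case where the attribute name is also a variable; when it is a fixed name, plain attribute access (`selected_module.run`) is enough.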

Ignore ImportError when exec source code

I have an application that reads test scripts in python and sends them across the network for execution on a remote python instance. As the controlling program does not need to run these scripts I do not want to have all the modules the test scripts use installed on the controller's python environment. However the controller does need information from the test script to tell it how to run the test.
Currently what I do to read and import test script data is something like
with open('test.py', 'r') as f:
    source = f.read()

m = types.ModuleType("imported-temp", "Test module")
co = compile(source, 'test.py', 'exec')
exec co in m.__dict__
which yields a new module that contains the test. Unfortunately exec will raise ImportErrors if the test tries to import something the controller does not have. And worse, the module will not be fully imported.
If I can guarantee that the controller will not use the missing modules, is there some way I can ignore these exceptions? Or some other way to find out the names and classes defined in the test?
Example test:
from controller import testUnit
import somethingThatTheControllerDoesNotHave

_testAttr = ['fast', 'foo', 'function']

class PartOne(testUnit):
    def run(self):
        pass
What the controller needs to know is the data in _testAttr and the name of all class definitions inheriting from testUnit.
Write an import hook that catches the exception and returns a dummy module if the module doesn't exist.
import __builtin__
from types import ModuleType

class DummyModule(ModuleType):
    def __getattr__(self, key):
        return None
    __all__ = []  # support wildcard imports

def tryimport(name, globals={}, locals={}, fromlist=[], level=-1):
    try:
        return realimport(name, globals, locals, fromlist, level)
    except ImportError:
        return DummyModule(name)

realimport, __builtin__.__import__ = __builtin__.__import__, tryimport

import sys           # works as usual
import foo           # no error
from bar import baz  # also no error
from quux import *   # ditto
You could also write it to always return a dummy module, or to return a dummy module if the specified module hasn't already been loaded (hint: if it's in sys.modules, it has already been loaded).
I think, based on what you're saying, that you can just say:
try:
    exec co in m.__dict__
except ImportError:
    pass
Does that help?
You could use python ast module, and parse the script to an AST tree and then scan through the tree looking for elements of interest. That way you don't have to execute the script at all.
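A sketch of that ast approach against the example test above: the source is only parsed, never executed, so the missing import cannot raise. The node checks shown cover only the simple shapes in the example (a module-level list assignment and plain-name base classes):

```python
import ast

source = """
from controller import testUnit
import somethingThatTheControllerDoesNotHave

_testAttr = ['fast', 'foo', 'function']

class PartOne(testUnit):
    def run(self):
        pass
"""

tree = ast.parse(source)
test_attr = None
test_classes = []

for node in tree.body:
    # Module-level assignment to _testAttr: evaluate the literal safely.
    if isinstance(node, ast.Assign):
        for target in node.targets:
            if isinstance(target, ast.Name) and target.id == "_testAttr":
                test_attr = ast.literal_eval(node.value)
    # Class definitions whose bases include testUnit by name.
    elif isinstance(node, ast.ClassDef):
        bases = [b.id for b in node.bases if isinstance(b, ast.Name)]
        if "testUnit" in bases:
            test_classes.append(node.name)

print(test_attr)     # ['fast', 'foo', 'function']
print(test_classes)  # ['PartOne']
```

ast.literal_eval only handles literal expressions, which is exactly what makes this safe: nothing in the untrusted script runs on the controller.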
