force subprocess to return a value before importing a module - python

Objective: to create unit test cases for main.py.
How can we run subprocess.run inside a unittest case?
File: main.py
import json
import os
import subprocess

_ENV = os.environ['server']
BASE_PATH = '/home/test'
SERVER_URL = {
    'DEV': {'ENV_URL': 'https://dev.net'},
    'UAT': {'ENV_URL': 'https://uat.net'},
    'PROD': {'ENV_URL': 'https://prod.net'}
}[_ENV]
# _CYBERARK is defined elsewhere in the real file; it is not shown in this snippet
if _CYBERARK == 'DYNAMIC' and _ENV.upper() in ('DEV', 'UAT', 'PROD'):
    TOKEN = json.load(open(BASE_PATH + "secrets/cyberark_env.json"))
    APPID = TOKEN.get('desc').get('appID')
    ## --> error on this because it's running .sh files
    VALUE = subprocess.run(["sh", BASE_PATH + "bin/cyberark.sh", APPID], capture_output=True, text=True).stdout.strip()
In my test_main.py
I am trying to import main.py into my test file without any issues, but I find it challenging to manually make the VALUE variable return 'ABCD'.
Similar to the script below.
Testing: test_main.py
import os, sys, unittest
from unittest import mock

os.environ['server'] = 'DEV'
# Similar to os.environ, I want to temporarily create VALUE before "import main" is called.
sys.path.insert(1, 'C:/home/src')
import main as conf
class test_tbrp_case(unittest.TestCase):
    def test_port(self):
        # function yet to be created
        pass

if __name__ == '__main__':
    unittest.main()
Is this achievable, or do I need to rework the logic of my whole function?
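One possible way to achieve this (a sketch, not a definitive answer: it assumes _CYBERARK resolves to 'DYNAMIC' in the real main.py, and the fake secrets JSON content is made up for illustration) is to patch everything main.py touches at import time, and only perform the import while those patches are active:

import sys
import unittest
from unittest import mock

class test_value_mock(unittest.TestCase):
    def test_value(self):
        fake_run = mock.Mock()
        fake_run.stdout = 'ABCD\n'                       # subprocess.run(...).stdout.strip() -> 'ABCD'
        fake_secrets = '{"desc": {"appID": "app1"}}'     # assumed contents of cyberark_env.json
        with mock.patch.dict('os.environ', {'server': 'DEV'}), \
             mock.patch('subprocess.run', return_value=fake_run), \
             mock.patch('builtins.open', mock.mock_open(read_data=fake_secrets)):
            sys.path.insert(1, 'C:/home/src')
            import main as conf                          # module-level code runs against the mocks
        self.assertEqual(conf.VALUE, 'ABCD')

if __name__ == '__main__':
    unittest.main()

Because the patches are active while the import statement executes, the module-level subprocess.run call never runs cyberark.sh, and VALUE ends up as 'ABCD'.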

Related

To mimic os variables in my mock test cases

I have two Python files: a main file and a unit test file. The main file, which is triggered on the server, dynamically fetches a value based on the environment the script is triggered in. For example, when the script is triggered in the dev environment, ${RPM_ENVIRONMENT} should ideally return 'DEV'.
import os
import json
import subprocess
import logging
from os import listdir
from os.path import isfile, join
_ENV = os.popen("echo ${RPM_ENVIRONMENT}").read().split('\n')[0]
SERVER_URL = {
    'DEV': {'ENV_URL': 'https://dev.net'},
    'UAT': {'ENV_URL': 'https://uat.net'},
    'PROD': {'ENV_URL': 'https://prod.net'}
}[_ENV]
Inside my test case script below, I want to mimic the dev environment using unittest mock. I have tried the script below, but it returns a KeyError related to RPM_ENVIRONMENT.
test_env.py
import unittest, sys, tempfile, os, json, shutil
from unittest import mock

## I want to mock all the required variables before "import env_test" runs so that it won't raise any error.
with mock.patch.object(os, 'popen') as mock_popen:
    sys.path.insert(1, 'C:/home/test/conf')
    import env_test as conf

class test_tbrp_case(unittest.TestCase):
    def test_port(self):
        # function yet to be created
        pass

if __name__ == '__main__':
    unittest.main()
I have tried using os.popen to mimic it, but I am confused about how I can assign 'DEV' to the _ENV variable.
When I try to run this script, it returns the following error:
SERVER_URL = {
KeyError: <MagicMock name='popen().read().split().__getitem__()' id='1893950325424'
Approach 2 I have tried:
What I am trying to mock is the import: when I import my main file, it should dynamically replace/mock _ENV as 'DEV', and the SERVER_URL variable should automatically pick up the DEV entry.
In a scenario where I call conf._ENV after I have implemented the mock below, it should return the value "DEV".
def rpm_environment():
    return os.popen("echo ${RPM_ENVIRONMENT}").read()

def test_rpm_environment():
    with mock.patch("os.popen") as popen_mock:
        popen_mock().read.return_value = "DEV"
        actual = rpm_environment()
        assert actual == "DEV"

## When I import env_test, RPM_ENVIRONMENT won't be mocked as DEV the way it was in test_rpm_environment
rpm_environment()
test_rpm_environment()

# How can we safely import the env_test file with the variables mocked, so that I can use the SERVER_URL variable?
sys.path.insert(1, 'C:/home/test/conf')
import env_test as conf
I didn't quite understand whether your code is inside a function or not.
If it is, the best way to do this is not patch.object; it's just a normal patch.
Consider this example:
import os
from unittest.mock import patch

def question():
    return os.popen("what_ever").read()

def test_question():
    with patch("os.popen") as popen_mock:
        popen_mock().read.return_value = "DEV"
        actual = question()
        assert actual == "DEV"
In my opinion, patching os.popen and adding read to its structure is the best practice.
Good luck!
When you mock popen it will return a MagicMock object, and that object does not have a defined read response. You need to define what happens when someone calls read() on the MagicMock object you have returned. Although it is not the most elegant solution, you can do this by adding this line in the with block:
mock_popen.return_value.read.return_value = "DEV"
This will instruct the MagicMock object to return the string "DEV" when read() is called on it.
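Putting this together with the question's test file, a minimal sketch of test_env.py could look like the following (the module name env_test and the 'C:/home/test/conf' path are taken from the question; the trailing newline on 'DEV\n' is there because env_test splits the read() result on '\n'):

import sys
import unittest
from unittest import mock

with mock.patch('os.popen') as mock_popen:
    # define what read() returns before the import runs the module-level code
    mock_popen.return_value.read.return_value = 'DEV\n'
    sys.path.insert(1, 'C:/home/test/conf')
    import env_test as conf

class test_tbrp_case(unittest.TestCase):
    def test_env(self):
        # _ENV and SERVER_URL were resolved through the mocked os.popen at import time
        self.assertEqual(conf._ENV, 'DEV')
        self.assertEqual(conf.SERVER_URL['ENV_URL'], 'https://dev.net')

if __name__ == '__main__':
    unittest.main()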

Import variables from a config File

I have a project with multiple files and a config file, with all the variables stored in one Python file.
Folder structure:
Config file:
If I try to run the main file, which calls the imported head function, an error pops up saying that the config cannot be imported.
Imports:
Your Functions folder has an __init__.py file. If your app executes from Main.py (i.e. if Main.py satisfies __name__ == "__main__"), then wherever you are in your app you can import the config like this:
from Functions.Config import *
Edit:
However, from module import * is not recommended for local imports. You can give the import an alias and call name.variable instead of calling name directly.
Example:
def head():
    # import packages
    import Functions.Config as conf
    print(conf.more_200)

head()
>>> 50
Your syntax is wrong.
It is supposed to be from Config import * and not import .Config import *

Unable to import a function from outside the current working directory

I am currently having difficulties importing some functions located in a Python file that sits in the parent directory of the working directory of the main Flask application script. Here's what the structure looks like:
project_folder/
- public/
  -- app.py
- scripts.py
Here is a replica of the code in app.py:
def some_function():
    from scripts import func_one, func_two
    func_one()
    func_two()
    print('done')

if __name__ == "__main__":
    some_function()
scripts.py contains the functions, as such:
def func_one():
    print('function one successfully imported')

def func_two():
    print('function two successfully imported')
What is the pythonic way of importing those functions in my app.py?
Precede it with a dot so that it searches the current directory (project_folder) instead of your python path:
from .scripts import func_one, func_two
The details of relative imports are described in PEP 328
Edit: I assumed you were working with a package. Consider adding an __init__.py file.
Anyway, you can import anything in Python by altering the system path:
import sys
sys.path.append("/path/to/directory")
from x import y
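Applied to the layout in the question, that could look like this at the top of public/app.py (a sketch; it computes the project_folder path relative to app.py's own location):

import os
import sys

# add the parent directory (project_folder) to the import path
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))

from scripts import func_one, func_two

func_one()
func_two()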
1.
import importlib.util
import os

def loadbasic():
    spec = importlib.util.spec_from_file_location("basic", os.path.join(os.path.split(__file__)[0], 'basic.py'))
    basic = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(basic)
    return basic  # returns a module
Or create an empty file __init__.py in the directory.
And do not pollute your path with appends.
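For the question's layout, the same importlib technique could load scripts.py from the parent directory (a sketch; the load_scripts name is made up for illustration):

import importlib.util
import os

def load_scripts():
    # build an absolute path to scripts.py, one level above public/app.py
    path = os.path.join(os.path.dirname(os.path.abspath(__file__)), '..', 'scripts.py')
    spec = importlib.util.spec_from_file_location('scripts', path)
    scripts = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(scripts)
    return scripts

scripts = load_scripts()
scripts.func_one()
scripts.func_two()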

More elegant solution for python imports across app/tests?

I'm trying to keep my code reasonably organized by putting tests in a separate directory from my app. However, imports work either for the app or for the tests, but not both. Here's a contrived example that illustrates my current problem:
myapp/
  app/
    main.py
    settings.py
    moods.py
  test/
    test_moods.py
Contents of files as follows:
main.py
import settings
from moods import Moods
words = Moods(settings.EXAMPLE)
print(words.excited())
settings.py
EXAMPLE = "Wow$ Python$"
DELIM = "$"
moods.py
import settings

class Moods:
    def __init__(self, text):
        self.text = text

    def excited(self):
        return self.text.replace(settings.DELIM, "!!!")
test_moods.py
import sys, os, unittest
sys.path.insert(0, os.path.abspath('..'))
from app.moods import Moods

class TestMood(unittest.TestCase):
    def setUp(self):
        self.words = Moods("Broken imports$ So sad$")

    def test_mood(self):
        self.assertEqual(self.words.excited(), "Broken imports!!! So sad!!!")
        with self.assertRaises(AttributeError):
            self.words.angry()

if __name__ == "__main__":
    unittest.main()
In the current state, from myapp/ I can run the following successfully:
>> python3 app/main.py
Wow!!! Python!!!
But when I try to run tests, the import fails in moods.py:
>> python3 -m unittest discover test/
ImportError: Failed to import test module: test_moods
[ stack trace here pointing to first line of moods.py ]
ImportError: No module named 'settings'
If I modify line 1 of moods.py to read from app import settings, the test will pass, but normal execution of the app fails because it sees no module app.
The only solutions I can think of are
Putting the tests in the same directory as the code (which I was trying to avoid)
Adding the sys.path.insert(0, os.path.abspath('..')) to each file in my app to make the imports work the same as the test ones do (which seems messy)
Is there a more elegant way to solve this import problem?
You should have both the app and test directories on PYTHONPATH.
One way is to replace this:
sys.path.insert(0, os.path.abspath('..'))
from app.moods import Moods
with this:
sys.path.insert(0, os.path.abspath('../app'))
from moods import Moods
Or you can set the PYTHONPATH environment variable before running the tests.
Why? Because when you run main.py, app is on the path, not app/.., so you want the same to be true when running the tests.
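For example, from the myapp/ directory (a sketch; the exact syntax for setting an environment variable depends on your shell):
>> PYTHONPATH=app python3 -m unittest discover test/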

Python importing multiple scripts

I've been working on a Python script that requires multiple libraries to be imported.
At the moment my directory structure is:
program/
  main.py
  libs/
    __init__.py
    dbconnect.py
    library01.py
    library02.py
    library03.py
My dbconnect.py has the following contents:
import psycopg2

class dbconnect:
    def __init__(self):
        self.var_parameters = "host='localhost' dbname='devdb' user='temp' password='temp'"

    def pgsql(self):
        try:
            var_pgsqlConn = psycopg2.connect(self.var_parameters)
        except:
            print("connection failed")
        return var_pgsqlConn
I am able to import and use this in my main.py using
from libs.dbconnect import dbconnect
class_dbconnect = dbconnect()
var_pgsqlConn = class_dbconnect.pgsql()
This works as expected; however, I am trying to import all of the library scripts, each of which has contents similar to the below:
def library01():
    print("empty for now but this is library 01")
I have added this to my __init__.py script:
__all__ = ["library01", "library02"]
Then in my main.py I tried to import and use them as below:
from libs import *
library01()
I am getting the following error
TypeError: 'module' object is not callable
I'll suppose the contents of your library0x.py files are different (the functions/classes have different names).
The best way is to import all your subfiles' contents in the __init__.py:
# __init__.py
from .dbconnect import *
from .library01 import *
from .library02 import *
from .library03 import *
Then you can use the following :
from libs import library01, library02
If for some reason you want to restrict what the wildcard (*) imports from your library0x.py files, you can define an __all__ variable containing the names of the functions that will be imported with the wildcard:
# library01.py
__all__ = ["library01"]

def a_local_function():
    print("Local !")

def library01():
    print("My library function")
Then, by doing from .library01 import *, only the function library01 will be imported.
EDIT: Maybe I misunderstood the question: here are some ways to import the function library01 from the file library01.py:
# Example 1:
from libs.library01 import library01
library01()
# Example 2:
import libs.library01
libs.library01.library01()
# Example 3:
import libs.library01 as library01
library01.library01()
In your case library01 is a module which contains a function named library01. You import the library01 module and try to call it as a function. That's the problem. You should call the function like this:
library01.library01()
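So, with the original __init__.py (where __all__ lists the module names), a corrected main.py would look like this (a minimal sketch of the fix the answer describes):

from libs import *

# library01 here is the libs.library01 module, not the function,
# so the function is called as an attribute of the module
library01.library01()  # prints "empty for now but this is library 01"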
