I want to run a Python project on a remote server (an AWS cluster).
In the project there is a model ("model") at the path project/folder2/model; in model.py I have df.write.parquet(parquetFileName).
In order for the slaves to have that "model" from "folder2", I am using setup from setuptools.
This is my setup.py (path project/setup.py):
setup(
    name='dataFrameFromSpark',
    version='',
    packages=['project/config', 'project/folder2', 'project/folder3'],
    py_modules=['project/model1'],
    url='',
    license='',
    author='....',
    author_email='',
    description=''
)
Before running main.py I type this line on the remote server:
python project/setup.py build && python project/setup.py sdist && python project/setup.py bdist_egg
The error:
ImportError: No module named folder2.model
It seems the slaves don't get the folders when I run the "write to parquet" code.
What can I do?
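One likely issue in the setup.py above is that packages expects dotted package names rather than file-system paths, and the built egg also has to be shipped to the executors explicitly. Below is a minimal sketch (not the original code), assuming every folder contains an __init__.py so that the code is importable as project.folder2.model:

# setup.py -- sketch with dotted package names, assuming each folder has an __init__.py
from setuptools import setup, find_packages

setup(
    name='dataFrameFromSpark',
    version='0.1',
    packages=find_packages(),  # picks up 'project', 'project.config', 'project.folder2', ...
)

After python setup.py bdist_egg, the egg under dist/ still needs to reach the workers, for example via spark-submit --py-files dist/<egg file> or sc.addPyFile('dist/<egg file>') before the DataFrame code runs (the exact egg file name depends on your version string).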
I am new to Python. This looks like a very simple problem but I am unable to solve it after trying my best.
I am trying to publish a Python package that I developed to an artifact store. However, when I download the package onto a target machine, it fails with an error saying an inner module cannot be found. The packaging and installation both look fine, and the output messages show that the submodules are included.
I have a directory structure as per below.
samplepackage/
    hello.py
    __init__.py
    dir1/
        __init__.py
        dir1pkg.py
Below are the contents of the files. The init files are empty.
hello.py
import sys
from dir1.dir1pkg import dir1pkg

def main():
    dirpkg = dir1pkg('This is msg')
    dirpkg.printmsg()

if __name__ == "__main__":
    main()
dir1pkg.py
class dir1pkg:
    def __init__(self, msg):
        self.msg = msg

    def printmsg(self):
        print(self.msg)
setup.py
import setuptools
from setuptools import setup, find_packages, find_namespace_packages

setup(
    name="samplepackage",
    version="0.0.3",
    author="myname",
    author_email="myemail#email.com",
    description="This is a sample package",
    long_description="This is long description",
    long_description_content_type="text/markdown",
    packages=setuptools.find_packages(),
    include_package_data=True,
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
    entry_points={
        "console_scripts": [
            "samplepackage=samplepackage.hello:main"
        ]
    },
    python_requires='>=3.7'
)
Below is how I am packaging the project and publishing it to the artifacts repo.
python setup.py sdist bdist_wheel
twine upload --config-file ".pypirc" -r <artifact_feed> dist/*
Below is how I am installing on the target.
python -m pip install --upgrade samplepackage
python -m SamplePackage.hello.py
This gives me the error below
C:\Users\manan\Desktop>python -m samplepackage.hello.py
Traceback (most recent call last):
File "C:\Users\manan\AppData\Local\Programs\Python\Python38\lib\runpy.py", line 185, in _run_module_as_main
mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
File "C:\Users\manan\AppData\Local\Programs\Python\Python38\lib\runpy.py", line 111, in _get_module_details
__import__(pkg_name)
File "C:\Users\manan\AppData\Local\Programs\Python\Python38\lib\site-packages\samplepackage\hello.py", line 2, in
from dir1.dir1pkg import dir1pkg
ModuleNotFoundError: No module named 'dir1'
However, this runs just fine from where I am developing the package. I can execute below and it is able to find the inner module without any issues.
C:\Users\mdmehta\Desktop\PythonPackage\samplepackage>python hello.py
This is msg
I have tried a lot of tweaks to setup.py, but none of them work. Even the output of the installed package looks good; I do see dir1 being included as a package.
>>> help('samplepackage')
Help on package samplepackage:

NAME
    samplepackage

PACKAGE CONTENTS
    dir1 (package)
    hello

FILE
    c:\users\mdmehta\appdata\local\programs\python\python38\lib\site-packages\samplepackage\__init__.py
Figured out the problem. We have to use full imports for it to work.
The file hello.py should use
from samplepackage.dir1.dir1pkg import dir1pkg
Do not use Visual Studio for Python projects; it does not like fully qualified package names. I switched to PyCharm, changed the package imports to fully qualified ones, and everything started working.
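For reference, here is a minimal sketch of hello.py with the package-qualified import (an explicit relative import, from .dir1.dir1pkg import dir1pkg, would also work as long as the module is only ever run through the samplepackage console script or python -m samplepackage.hello, not as a plain script):

from samplepackage.dir1.dir1pkg import dir1pkg  # fully qualified import

def main():
    dirpkg = dir1pkg('This is msg')
    dirpkg.printmsg()

if __name__ == "__main__":
    main()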
Context
I have a Python project for which I wrap some C/C++ code (using the excellent PyBind library). I have a set of C and Python unit tests and I've configured Gitlab's CI to run them at each push.
The C tests use a minimalist unit test framework called minunit and I use Python's unittest suite.
Before the C tests run, all the C code is compiled and then tested. I'd like to also compile the C/C++ wrapper for Python before running the Python tests, but I'm having a hard time doing it.
Question in a few words
Is there a standard/good way to get Gitlab-CI to build a Python extension using setuptools before running unit-tests?
Question with more words / Description of what I tried
To compile the C/C++ wrapper locally, I use setuptools with a setup.py file including a build_ext command.
I locally compile everything with python setup.py build_ext --inplace (the last arg --inplace will just copy the compiled file to the current directory).
As far as I know, this is quite standard.
What I tried to do on Gitlab is to have a Python script (code below) that runs a few commands via os.system (which appears to be bad practice...).
The first command runs a script that builds and runs all the C tests. This works, but I'm happy to take recommendations (should I configure Gitlab CI to run the C tests separately?).
Now, the problem comes when I try to build the C/C++ wrapper, with os.system("cd python/ \npython setup.py build_ext --inplace"). This generates the error
File "setup.py", line 1, in <module>
from setuptools import setup, Extension
ImportError: No module named setuptools
So I tried to modify my gitlab's CI configuration file to install python-dev. My .gitlab-ci.yml looks like
test:
  script:
    - apt-get install -y python-dev
    - python run_tests.py
But, not having sudo on the Gitlab server, I get the following error: E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied).
Anyone knows a way around that, or a better way to tackle this problem?
Any help would be more than welcome!
run_tests.py file
import unittest
import os
from shutil import copyfile
import glob

class AllTests(unittest.TestCase):
    def test_all(self):
        # this automatically loads all tests in current dir
        testsuite = unittest.TestLoader().discover('tests/Python_tests')
        # run tests
        result = unittest.TextTestRunner(verbosity=2).run(testsuite)
        # send/print results
        self.assertEqual(result.failures, [], 'Failure')

if __name__ == "__main__":
    # run C tests
    print(' ------------------------------------------------------ C TESTS')
    os.system("cd tests/C_tests/ \nbash run_all.sh")
    # now python tests
    print(' ------------------------------------------------- PYTHON TESTS')
    # first build and copy shared library compiled from C++ in the python test directory
    # build lib
    os.system("cd python/ \npython setup.py build_ext --inplace")
    # copy lib to the right place
    dest_dir = 'tests/Python_tests/'
    for file in glob.glob(r'python/*.so'):
        print('Copying file to test dir : ', file)
        copyfile(file, dest_dir + file.replace('python/', ''))
    # run Python tests
    unittest.main(verbosity=0)
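As an aside on the os.system concern mentioned above, here is a sketch of the same build-and-copy step using subprocess and shutil, which raises an exception if the build fails instead of silently continuing (the paths are the ones from the script above):

import glob
import shutil
import subprocess
import sys

# build the extension in place; raises CalledProcessError if compilation fails
subprocess.check_call([sys.executable, 'setup.py', 'build_ext', '--inplace'], cwd='python')

# copy the compiled shared libraries next to the Python tests
for lib in glob.glob('python/*.so'):
    shutil.copy(lib, 'tests/Python_tests/')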
My suggestion would be to move the entire test-running logic into the setup script.
using the test command
First of all, setuptools ships a test command, so you can run the tests via python setup.py test. Even better, test calls the build_ext command under the hood and places the built extensions so that they are accessible in the tests, so there is no need for you to invoke python setup.py build_ext explicitly:
$ python setup.py test
running test
running egg_info
creating so.egg-info
writing so.egg-info/PKG-INFO
writing dependency_links to so.egg-info/dependency_links.txt
writing top-level names to so.egg-info/top_level.txt
writing manifest file 'so.egg-info/SOURCES.txt'
reading manifest file 'so.egg-info/SOURCES.txt'
writing manifest file 'so.egg-info/SOURCES.txt'
running build_ext
building 'wrap_fib' extension
creating build
creating build/temp.linux-aarch64-3.6
aarch64-unknown-linux-gnu-gcc -pthread -fPIC -I/data/gentoo64/usr/include/python3.6m -c wrap_fib.c -o build/temp.linux-aarch64-3.6/wrap_fib.o
aarch64-unknown-linux-gnu-gcc -pthread -fPIC -I/data/gentoo64/usr/include/python3.6m -c cfib.c -o build/temp.linux-aarch64-3.6/cfib.o
creating build/lib.linux-aarch64-3.6
aarch64-unknown-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,--as-needed -L. build/temp.linux-aarch64-3.6/wrap_fib.o build/temp.linux-aarch64-3.6/cfib.o -L/data/gentoo64/usr/lib64 -lpython3.6m -o build/lib.linux-aarch64-3.6/wrap_fib.cpython-36m-aarch64-linux-gnu.so
copying build/lib.linux-aarch64-3.6/wrap_fib.cpython-36m-aarch64-linux-gnu.so ->
test_fib_0 (test_fib.FibonacciTests) ... ok
test_fib_1 (test_fib.FibonacciTests) ... ok
test_fib_10 (test_fib.FibonacciTests) ... ok
----------------------------------------------------------------------
Ran 3 tests in 0.002s
OK
(I used the code from the Cython Book examples repository to play with, but the output should be pretty similar to what PyBind produces).
using the extra keywords
Another feature that may come in handy is the set of extra keywords setuptools adds: test_suite, tests_require, test_loader (docs). Here's an example of embedding a custom test suite like the one in your run_tests.py:
# setup.py
import unittest

from Cython.Build import cythonize
from setuptools import setup, Extension

exts = cythonize([Extension("wrap_fib", sources=["cfib.c", "wrap_fib.pyx"])])

def pysuite():
    return unittest.TestLoader().discover('tests/python_tests')

if __name__ == '__main__':
    setup(
        name='so',
        version='0.1',
        ext_modules=exts,
        test_suite='setup.pysuite'
    )
extending the test command
The last requirement is running C tests. We can embed them by overriding the test command and invoking some custom code from there. The advantage of that is that distutils offers a command API with many useful functions, like copying files or executing external commands:
# setup.py
import os
import unittest

from Cython.Build import cythonize
from setuptools import setup, Extension
from setuptools.command.test import test as test_orig

exts = cythonize([Extension("wrap_fib", sources=["cfib.c", "wrap_fib.pyx"])])

class test(test_orig):
    def run(self):
        # run python tests
        super().run()
        # run c tests
        self.announce('Running C tests ...')
        pwd = os.getcwd()
        os.chdir('tests/C_tests')
        self.spawn(['bash', 'run_all.sh'])
        os.chdir(pwd)

def pysuite():
    return unittest.TestLoader().discover('tests/python_tests')

if __name__ == '__main__':
    setup(
        name='so',
        version='0.1',
        ext_modules=exts,
        test_suite='setup.pysuite',
        cmdclass={'test': test}
    )
I extended the original test command to run some extra stuff after the Python unit tests finish (note the call to an external command via self.spawn). All that is left is replacing the default test command with the custom one by passing cmdclass to the setup function.
Now you have everything collected in the setup script, and python setup.py test will do all the dirty work.
But, not being sudo on the gitlab's server, I get the following error
I don't have any experience with Gitlab CI, but I can't imagine there is no possibility to install packages on the build server. Maybe this question will be helpful: How to use sudo in build script for gitlab ci?
If there really is no other option, you can bootstrap a local copy of setuptools with ez_setup.py. Note, however, that although this method still works, it was deprecated recently.
Also, if you happen to use a recent version of Python (3.4 and newer), then you should have pip bundled with the Python distribution, so it should be possible to install setuptools without root permissions with
$ python -m pip install --user setuptools
I am using PyCharm to generate a setup.py file for my project. The following file is generated:
from setuptools import setup
setup(
name='untitled',
version='',
packages=['venv.lib.python3.6.site-packages.PyQt5', 'venv.lib.python3.6.site-packages.PyQt5.uic',
'venv.lib.python3.6.site-packages.PyQt5.uic.Loader', 'venv.lib.python3.6.site-packages.PyQt5.uic.port_v2',
'venv.lib.python3.6.site-packages.PyQt5.uic.port_v3',
'venv.lib.python3.6.site-packages.PyQt5.uic.Compiler', 'venv.lib.python3.6.site-packages.py2app',
'venv.lib.python3.6.site-packages.py2app.recipes', 'venv.lib.python3.6.site-packages.py2app.recipes.PIL',
'venv.lib.python3.6.site-packages.py2app.bootstrap', 'venv.lib.python3.6.site-packages.py2app.converters',
'venv.lib.python3.6.site-packages.py2app.apptemplate',
'venv.lib.python3.6.site-packages.py2app.bundletemplate', 'venv.lib.python3.6.site-packages.altgraph',
'venv.lib.python3.6.site-packages.macholib', 'venv.lib.python3.6.site-packages.modulegraph',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip.req',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip.vcs',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip.utils',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip.compat',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip.models',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.distlib',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.distlib._backport',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.colorama',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.html5lib',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.html5lib._trie',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.html5lib.filters',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.html5lib.treewalkers',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.html5lib.treeadapters',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.html5lib.treebuilders',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.lockfile',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.progress',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.requests',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.requests.packages',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.requests.packages.chardet',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.requests.packages.urllib3',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.requests.packages.urllib3.util',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.requests.packages.urllib3.contrib',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.requests.packages.urllib3.packages',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.requests.packages.urllib3.packages.ssl_match_hostname',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.packaging',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.cachecontrol',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.cachecontrol.caches',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.webencodings',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip._vendor.pkg_resources',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip.commands',
'venv.lib.python3.6.site-packages.pip-9.0.1-py3.6.egg.pip.operations'],
url='',
license='',
author='',
author_email='',
description=''
)
The period in python3.6 is causing problems because when I run python setup.py develop I get the following error:
error: package directory 'venv/lib/python3/6/site-packages/PyQt5' does not exist
I'm not sure how to go about fixing this. If I simply change python3.6 to python36, it just breaks things further. How do I fix this?
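PyCharm has generated that packages list by walking the virtual environment rather than your own sources, so the generated file should not be used as-is. Below is a minimal sketch of a hand-written setup.py instead, assuming your own modules live outside venv and PyQt5 is wanted as an installable dependency rather than being bundled from site-packages:

from setuptools import setup, find_packages

setup(
    name='untitled',
    version='0.1',
    # only the project's own packages; the virtualenv directory is excluded entirely
    packages=find_packages(exclude=['venv', 'venv.*']),
    # third-party code is declared as a dependency instead of being copied from site-packages
    install_requires=['PyQt5'],
)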
I am a beginner in Python and Django. This error keeps coming up, even after installing cx_Freeze from http://www.lfd.uci.edu/~gohlke/pythonlibs/#cx_freeze
I am trying to make an executable file and I want to run my server through it, which I normally do by:
manage.py runserver
I am currently using commands:
setup.py build
I have already tried:
pyinstaller --name=mysite myproject_dir/manage.py
and my setup.py file contains
import sys
from distutils.core import setup
from cx_Freeze import setup, Executable

setup(
    name="Management System",
    version="1.0",
    description="A Database Management System",
    py_modules=['virtualenv'],
    executables=[Executable("manage.py", base="Win32GUI")]
)
I have also tried py2exe; it doesn't work. You can also suggest something to read to learn more about this.
Here is the image of the error that keeps appearing when running the exe file:
If I use this command:
Barcode.exe runserver
Another error appears:
WindowsError: [Error 3] The system cannot find the path specified:'C:\\Users\\D ell\\AppData\\Local\\Temp\\_MEI85~1\\Entry\\migrations/*.*'
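One detail worth noting about the cx_Freeze setup.py above: base="Win32GUI" is intended for GUI applications and suppresses the console, while runserver is a console program. Below is a minimal sketch of a console-oriented cx_Freeze setup; the build_exe options only illustrate where extra packages would be declared, and the package list is an assumption, not something tested against this project:

# setup.py -- sketch of a console build with cx_Freeze
from cx_Freeze import setup, Executable

setup(
    name="Management System",
    version="1.0",
    description="A Database Management System",
    options={
        "build_exe": {
            # packages cx_Freeze may not pick up automatically; adjust for your project
            "packages": ["django"],
        }
    },
    # base=None keeps the console window that runserver needs
    executables=[Executable("manage.py", base=None)],
)

The frozen executable would then be invoked the same way as manage.py itself, e.g. something like build\exe.win-amd64-3.x\manage.exe runserver.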
I'm currently looking into PyInstaller. I've been using py2exe so far, but it would be nice to use only one tool for all the platforms we target. My py2exe setup.py looks like this:
from distutils.core import setup
import py2exe

setup(
    name='agent',
    description='Service Test',
    version='1.00.00',
    service=['agent'],
    console=['agent.py'],
    zipfile=None,
    options={
        "py2exe": {
            "includes": "win32service,win32serviceutil,win32event,servicemanager,autobahn",
            "packages": 'twisted, autobahn',
            'bundle_files': 1
        },
    },
)
I've managed to compile the Windows service, but as soon as I start using twisted it fails.
Command lines I've used to compile with PyInstaller:
python PyInstaller.py --onefile c:\path\here\agent.py
python PyInstaller.py --hidden-import=twisted --onefile c:\path\here\agent.py
The error I get when I try to install my service
agent.exe?175104\twisted\python\modules.py:758:
UserWarning: C:\dist\path\agent.exe?175104 (for module twisted.web.util) not in path importer cache (PEP 302 violation - check your local configuration).