How do I run a Python script from a subdirectory? - python

I want to be able to write a line of code like this and have it run smoothly:
python /path/to/python_file.py -arg1 -arg2 -etc
I figured out an easy way to discover all modules and add them to the current Python path, but it still doesn't seem to recognize the .py file, even though it's supposedly in the sys.path. I know the sys.path addition is working because I can perform this in the interpreter just fine:
>>> import ModuleManager # My Python script to discover modules in lower directories
>>> import testModule # Module I want to run from lower directory
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named testModule
>>> ModuleManager.discoverModules() # Now, I find the modules and add them to path
Discovering Modules...Complete!
Discovered 6 Modules.
>>> import testModule # No error now.
>>>
What I would like to do at this point is be able to go into my terminal and say:
python testModule -arg1 -arg2 -etc
and have it perform how I would expect.
To be clear, I want to write a line of code in ModuleManager.py (from my application root folder) that allows me to access a file named testModule.py which is found in /root/path/to/testModule.py in a way such that I can use arguments as in python testModule.py -arg1 -arg2 -etc. Any suggestions?

sys.path.append(os.getcwd()) works for me.
Now you can run commands like ./manage/my_cli_command.py
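A minimal, self-contained version of this answer (the os and sys imports are required for it to work):

```python
import os
import sys

# Make modules in the current working directory importable.
sys.path.append(os.getcwd())

# Any .py file directly under the current directory can now be imported,
# e.g. `import my_cli_command` if ./my_cli_command.py exists.
```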

How to write python scripts efficiently
There are multiple ways to write a Python script (a script that is meant to be called from the command line).
For flexible use, forget about manipulating PYTHONPATH - such solutions are difficult to maintain and are rather shaky.
Script with Python stdlib imports only
Betting purely on packages and modules that are part of the Python stdlib makes the script easy to run from anywhere.
An extension of this is importing modules and packages that are globally installed. Globally installing packages is mostly considered bad practice, as it pollutes the global Python environment. I have globally installed just a small set of packages that I use often: pyyaml, docopt, lxml, jinja2.
Script importing any installed package
Packages and modules, even your own, can be installed by means of setup.py. It took me a while to get used to this - I initially dismissed setup.py as too complex a solution, but later I came to use it for any home-made package I need to import in my programs.
Importing packages installed into virtualenv
This is by far my most-used solution now.
You will need your packages to have a setup.py.
After the virtualenv is created, you use pip to install all the packages you need.
With the given virtualenv active, you can then simply run the script, which will see the packages available for import regardless of where you start it from.
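A sketch of that workflow, shown here with the stdlib venv module (the virtualenv tool works the same way); the package and script names are placeholders:

```shell
python3 -m venv .venv                # or: virtualenv .venv
. .venv/bin/activate                 # the env's python now comes first on PATH
python -c 'import sys; print(sys.prefix)'   # prints a path inside .venv
# pip install pyyaml docopt          # install whatever the script imports
# python my_script.py                # the script now sees those packages
```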
Script installed by means of setup.py
Again, this solution at first seems too complex "just for a simple script", but in the end it can become the simplest method.
You have to create a simple Python project containing only your script and a setup.py. The setup.py shall declare the script for installation. There are two methods, either the scripts argument or entry_points, the latter being more flexible - use that one.
The script is best installed from within a virtualenv - after the installation is complete, find the location of the script and move it wherever you want to use it. To use the script, there is no need to activate the virtualenv; it is enough to have it on PATH. The installed script includes a reference to the Python interpreter from the given virtualenv, which ensures it runs with that interpreter, including all installed packages.
Note that the best way to have required packages installed for your script is to mention them in the install_requires argument in setup.py; this ensures they get installed with your script automatically.
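A minimal setup.py along these lines might look like the following sketch; the project, module, and function names are placeholders:

```python
from setuptools import setup

setup(
    name='myscript',                  # placeholder project name
    version='0.1',
    py_modules=['mymodule'],          # mymodule.py defines main()
    install_requires=['pyyaml'],      # dependencies get installed automatically
    entry_points={
        'console_scripts': [
            # installs a "myscript" executable that calls mymodule.main()
            'myscript = mymodule:main',
        ],
    },
)
```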
Using zc.buildout
zc.buildout used to be my favourite solution until pip and virtualenv became mature.
You have to create a buildout.cfg file in which you define everything your script needs. This mostly includes having a setup.py in place, but that is not always necessary.
Personally, I found zc.buildout powerful but rather difficult to learn, and I would recommend the virtualenv solution instead.
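For reference, a minimal buildout.cfg for installing a script from an egg might look roughly like this (part and package names are placeholders; check the zc.buildout documentation for the exact recipe options):

```ini
[buildout]
parts = myscript

[myscript]
recipe = zc.recipe.egg
eggs = mypackage
```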
Using other solutions like PyInstaller
There are other solutions which allow turning a Python script into a single executable file.
PyInstaller looks like a good way to go. There may be other similar tools.
Conclusions
Forget about making your modules and packages importable by manipulating PYTHONPATH; such solutions are quite shaky.
My recommendation is to go for a project with a setup.py that uses entry_points to install your script.
Installation into a virtualenv seems to be the cleanest method.
Setting up this environment (the project) the first time takes some effort, but after that you can simply copy and adapt the solution, and it will become the natural way to go.

How to call another Python script from Python
You want to call a Python script that works correctly only when run from a particular directory, because it has imports that only resolve from that location.
Assuming there is a subdirectory named "script_dir" and there is the "local_script_name.py" file:
import subprocess
subprocess.call(["python", "local_script_name.py"], cwd="script_dir")
If your script accepts command line arguments "arg1", "arg2", "-etc"
import subprocess
subprocess.call(["python", "local_script_name.py", "arg1", "arg2", "-etc"], cwd="script_dir")
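One refinement worth noting: using sys.executable instead of the bare string "python" guarantees the child script runs under the same interpreter as the caller. A self-contained sketch (it creates a throwaway script directory purely for illustration):

```python
import os
import subprocess
import sys
import tempfile

# Create a stand-in for "script_dir/local_script_name.py" for this demo.
script_dir = tempfile.mkdtemp()
with open(os.path.join(script_dir, "local_script_name.py"), "w") as f:
    f.write("import sys\nprint(sys.argv[1:])\n")

# cwd= makes the child run from script_dir, so its local imports resolve.
rc = subprocess.call(
    [sys.executable, "local_script_name.py", "arg1", "arg2", "-etc"],
    cwd=script_dir,
)
print("exit code:", rc)
```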

Append the parent directory temporarily to the path:
import os
import sys
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), os.pardir)))

Related

How to robustly retrieve the bin path of a python environment?

I already figured out how one can retrieve the include and site-package paths of a python environment. For instance the following is one of multiple possibilities:
from distutils.sysconfig import get_python_lib, get_python_inc
print(get_python_lib()) # Prints the location of site-packages
print(get_python_inc()) # Prints the location of the include dir
However, I was not able to find a robust method to retrieve the bin folder of a python environment, that is, the folder where python itself and tools like pip, pyinstaller, easy_install, etc., typically reside. Does anyone know how I can get this path from within python?
Some may want to suggest binpath = os.path.dirname(sys.executable). On Mac however, this does not work if python was installed as a Framework (binpath would point at: /Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS)
What could I use instead that works cross-platform?
The stdlib way to get this location is via the sysconfig module. It works on 2.7 and 3.3+, whether or not you are using a virtual environment.
>>> from sysconfig import get_path
>>> get_path('scripts')
'/usr/local/Cellar/python/3.7.0/Frameworks/Python.framework/Versions/3.7/bin'
Almost any package with scripts to install will put them there. However, note that with an imperative installer the setup.py code is actually executed, which means it could literally lay down scripts anywhere on the filesystem, given permission.

Run a python script in an environment where a library may not be present

I have a python script where I am including a third party library:
from docx import Document
Now, I need to run this script in an environment where bare-bones Python is present but not this library.
Installing this library in the target environment is beyond my scope, and I tried using distutils but couldn't get far with it. The target environment just needs to run the script, not install a package.
I am from Java background and in Java I would have just exported and created a jar file which would have included all the libraries I needed. I need to do similar with python.
Edit: With distutils, I tried creating a setup.py:
from distutils.core import setup
setup(name='mymodule',
      version='1.0',
      py_modules=['mymodule', 'docx'])
But I am not sure this works.
PyInstaller won't work if you can't make a pyc file, and you cannot make a pyc file unless your code runs without fatal errors.
You could put the import in a try block that catches ImportError, but that will just result in NameErrors wherever the package is referenced. Long story short: if the package is integral to the script, no amount of avoiding the import will fix your problem. You need the dependencies.
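The pattern being warned against looks like this; it only postpones the failure to the first point of use (docx here stands for any hard dependency):

```python
try:
    import docx  # third-party; may be absent on the target machine
except ImportError:
    docx = None

def make_document():
    # Every caller now has to cope with the dependency being missing.
    if docx is None:
        raise RuntimeError("python-docx is required for this feature")
    return docx.Document()
```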
You said installing the package is beyond your scope - well then, it is time to expand your scope. docx is an open-source package you can find on github here.
You can download it and run its setup.py.
You can include the modules for docx in your application. Just distribute them together.
But docx depends on the lxml package, which has operating-system-level dependencies and needs its own setup run. You can't just copy it to the target machine.
I'm not sure PyInstaller supports docx, especially as it has the non-Python dependency.
Really, using pip or easy_install is the way to go.
PyInstaller is a program that converts (packages) Python programs into stand-alone executables, under Windows, Linux, Mac OS X, Solaris and AIX.

How to "build" a python script with its dependencies

I have a simple Python shell script (no GUI) that uses a couple of dependencies (requests and BeautifulSoup4).
I would like to share this simple script across multiple computers. Each computer already has Python installed, and they are all Linux powered.
At the moment, on my development environment, the application runs inside a virtualenv with all its dependencies.
Is there any way to share this application with all its dependencies without needing to install them with pip?
I would like to just run python myapp.py to run it.
You will need to either create a single-file executable using something like bbfreeze or pyinstaller, or bundle your dependencies (assuming they're pure-Python) into a .zip file and then use it as your PYTHONPATH (e.g. PYTHONPATH=deps.zip python myapp.py).
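The zip approach can be sketched like this; mymod is a placeholder standing in for a pure-Python dependency such as requests:

```shell
# Build a zip of pure-Python dependencies...
mkdir -p deps/mymod
printf 'def hello():\n    return "hi"\n' > deps/mymod/__init__.py
(cd deps && python3 -m zipfile -c ../deps.zip mymod)

# ...and let the interpreter import straight out of it.
PYTHONPATH=deps.zip python3 -c 'import mymod; print(mymod.hello())'
```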
The much better solution would be to create a setup.py file and use pip. Your setup.py file can create dependency links to files or repos if you don't want those machines to have access to the outside world. See this related issue.
As long as you make the virtualenv relocatable (use the --relocatable option on it in its original place), you can literally just copy the whole virtualenv over. If you create it with --copy-only (you'll need to patch the bug in virtualenv), then you shouldn't even need to have python installed elsewhere on the target machines.
Alternatively, look at http://guide.python-distribute.org/ and learn how to create an egg or wheel. An egg can then be run directly by python.
I haven't tested your particular case, but you can find the source code (either mirrored or original) on a site like github.
For example, for BeautifulSoup, you can find the code here.
You can put the code into the same folder (a rename is probably a good idea, so as not to shadow an existing package). Just note that you won't get any updates.

Is there a way to embed dependencies within a python script?

I have a simple script that has a dependency on dnspython for parsing zone files. I would like to distribute this script as a single .py that users can run just so long as they have 2.6/2.7 installed. I don't want to have the user install dependencies site-wide as there might be conflicts with existing packages/versions, nor do I want them to muck around with virtualenv. I was wondering if there was a way to embed a package like dnspython inside the script (gzip/base64) and have that script access that package at runtime. Perhaps unpack it into a dir in /tmp and add that to sys.path? I'm not concerned about startup overhead, I just want a single .py w/ all dependencies included that I can distribute.
Also, there would be no C dependencies to build, only pure python packages.
Edit: The script doesn't have to be a .py. Just so long as it is a single executable file.
You can package multiple Python files up into a .egg. Egg files are essentially just zip archives with well-defined metadata - look at the setuptools documentation to see how to do this. Per the docs, you can make egg files directly executable by specifying the entry point. This would give you a single executable file that can contain your code plus any other dependencies.
EDIT: Nowadays I would recommend building a pex to do this. pex is basically an executable zip file with your non-stdlib dependencies. It doesn't contain a Python distribution (unlike py2app/py2exe) but holds everything else, and can be built with a single command-line invocation. https://pex.readthedocs.org/en/latest/
The simplest way is just to put your Python script, named __main__.py, together with its pure-Python dependencies in a zip archive, example.
Otherwise, PyInstaller could be used to produce a stand-alone executable.
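That zip trick in full; the printed message is just a placeholder payload:

```shell
mkdir -p app
printf 'print("running from a zip")\n' > app/__main__.py
(cd app && python3 -m zipfile -c ../app.zip __main__.py)

# The interpreter treats the archive as the program and runs its __main__.py.
python3 app.zip
```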
Please don't do this. If you do, DO NOT make a habit of it.
pydns is BSD-licensed, but if you try to "embed" a GPL module this way, you could get in trouble.
You can learn to use setuptools and you will be much happier in the long run.
setuptools will handle the install of the dependencies you identified (I'm not sure if the pydns you are using is pure Python, so you might create problems for your users if you try to bundle it yourself without knowing their environment).
You can set up a URL on PyPI so that people can upgrade your script with easy_install -U.

xcopy python deployment

I'm new to Python and I'm writing my first program. Once I finish, I would like to be able to run the program from the source code on a Windows or Mac machine. My program has dependencies on third-party modules.
I read about virtualenv, but I don't think it helps me because it says it's not relocatable and not cross-platform (see Making Environments Relocatable http://pypi.python.org/pypi/virtualenv).
The best scenario would be to install the third-party modules locally in my project, aka an xcopy installation.
I would be really surprised if Python didn't support this easily, especially since it promotes simplicity and frictionless programming.
You can do what you want, you just have to make sure that the directory containing your third-party modules is on the python path.
There's no requirement to install modules system-wide.
Note: while packaging your whole app with py2exe may not be an option, you can use it to make a simple launcher environment. You make a script which imports your module/package and launches the main() entry point. Package this with py2exe, but keep your application code outside it, as Python code or an egg. I do something similar where I read a .pth text file to learn which paths to add to sys.path in order to import my application code.
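The .pth idea mentioned there can be sketched as follows; launcher.pth is a hypothetical plain-text file listing one directory per line, relative to the launcher:

```python
import os
import sys

def extend_path_from_pth(pth_file):
    """Prepend each directory listed in pth_file to sys.path."""
    base = os.path.dirname(os.path.abspath(pth_file))
    with open(pth_file) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines and comments, as site .pth files do.
            if line and not line.startswith("#"):
                sys.path.insert(0, os.path.join(base, line))

# A launcher script would then do something like:
#   extend_path_from_pth(os.path.join(here, "launcher.pth"))
#   import myapp; myapp.main()
```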
Simply put, that's generally not how Python works. Modules are installed site-wide and used that way. Are you familiar with pip and/or easy_install? Those, plus PyPI, let you automatically install dependencies no matter what you need.
If you want to create a standalone executable typically you'd use py2exe, py2app or something like that. Then you would have no dependencies on python at all.
I also found out about zc.buildout, which can be used to include dependencies in an automatic way.
