How do I ship some standard modules from Python together with my code?
I'm writing an add-on for Anki, for which I need the Queue and threading modules from the Python 2.7 standard library.
When I try launching Anki, I get ImportError: No module named Queue. I assume that is because Anki does not ship with a full Python interpreter, so if I am missing any standard modules, I have to bundle them myself.
From Anki docs on addons:
Standard Modules
Anki ships with only the standard modules necessary to run the program - a full copy of Python is not included. For that reason, if you need to use a standard module that is not included with Anki, you’ll need to bundle it with your add-on.
So my question is: what steps do I take to bundle standard Python modules threading and Queue together with my add-on?
Note that add-ons in Anki are just Python scripts that have certain extra modules available.
From the Anki doc:
For a simple one-file add-on, you can upload the .py file. For multi-file add-ons, please create a subfolder that acts as a Python package, and create a small .py file that imports that package. Using the Japanese support add-on as an example, the structure looks like:
japanese/file1.py
japanese/file2.py
japanese/__init__.py # can be empty; marks the folder as a package
japanese/<binary support files>
jp.py
To upload a multi-file add-on, please zip up the folder and the loader .py file and upload the zip.
The <binary support files> can be the modules you want.
Check out html_cleaner and image-occlusion-enhanced on GitHub if you want to see how others do it.
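As a rough sketch (the add-on name my_addon and the layout below are made up for illustration, not taken from the Anki docs), bundling a standard-library module can amount to copying its source file into the add-on package and putting that folder on sys.path before anything imports it:
my_addon/__init__.py   # your add-on code
my_addon/Queue.py      # copied from the Python 2.7 standard library (Lib/Queue.py)
my_addon.py            # small loader file that Anki picks up
where the loader my_addon.py might contain:
import os
import sys
# make the bundled copy importable as a top-level module (import Queue)
sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'my_addon'))
import my_addon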
For anybody else who is wondering how to import a .so file (I was using a library that tried to import parser and discovered that parser.__file__ was a .so file), the answer is that it's the same as for a .py file:
Create a directory (mkdir parser), copy the .so file into that directory (cp parser.cpython-37m-x86_64-linux-gnu.so parser/) and then add an __init__.py to the directory (touch parser/__init__.py).
This is almost certainly not cross platform, but it worked for my needs.
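One caveat to flag as my own assumption, not part of the answer above: with an empty __init__.py the extension module is importable as a submodule of the wrapper package (parser.parser), so code expecting its names at the top level may need the __init__.py to re-export them:
# parser/__init__.py -- hypothetical variant of the empty file described above
from .parser import *   # the .so is still importable as "parser"; the ABI tags in the filename are ignored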
Related
I have a .py file that imports from other python modules that import from config files, other modules, etc.
I need to move the code required to run that .py file, but only the files the script actually reads from (I am not talking about packages installed with pip install; it's more about other Python files in the project directory, mostly classes, functions and .ini files).
Is there a way to find out only the external files used by that particular python script? Is it something that can be found using PyCharm for example?
Thanks!
Static analysis tools (such as PyCharm's refactoring tools) can (mostly) figure out the module import tree for a program (unless you do dynamic imports using e.g. importlib.import_module()).
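For the import side specifically, the standard library's modulefinder module gives a first approximation without an IDE. A minimal sketch, assuming a hypothetical entry script app.py:
from modulefinder import ModuleFinder

finder = ModuleFinder()
finder.run_script('app.py')   # follows the imports reachable from app.py
for name, mod in finder.modules.items():
    # the source file each imported module was loaded from (None for built-ins)
    print(name, getattr(mod, '__file__', None))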
However, it's not quite possible to statically definitively know what other files are required for your program to function. You could use Python's audit events (or strace/ptrace or similar OS-level functions) to look at what files are being opened by your program (e.g. during your tests being run (you do have tests, right?), or during regular program use), but it's likely not going to be exhaustive.
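A rough sketch of the audit-event approach (available since Python 3.8); my_script below is a placeholder for whatever actually runs your program or tests:
import sys

opened = set()

def _audit(event, args):
    # the "open" event fires for open(), io.open(), os.open() and friends
    if event == "open":
        opened.add(args[0])

sys.addaudithook(_audit)

import my_script   # or invoke your test runner / main() here

print(sorted(str(path) for path in opened if path is not None))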
I have a Maven project with multiple modules. Many of the modules contain the code and config for a separate application. I have a python script with each application that will take care of launching the application. I also have a separate "helper" python file that has functionality intended to be used by each of the application launcher scripts.
I'd like to avoid having to have a copy of the helper in each application module, so I was hoping I could figure out a way to either make the python file a "dependency", or to copy it into the built distribution (zip file containing the needed jar files and other configuration) during the maven build process.
I already have a "common" module that contains Java classes commonly used by code in many of the other modules, so this module is already a dependency required by other module pom files. Seems like the perfect place to put the helper python file. The thing is: maven seems to very good at handling maven artifact dependencies, but I'm not sure how (or if) it can handle this type of situation.
Has anyone done something like this?
I have used gr_modtool to add custom blocks in python to an OOT module. It appears that all the source python I write must reside in the gr-my_oot_module/python directory.
I will be writing a lot of code spread over many python files. I would like to organize those files into sub-directories (presumably) under gr-my_oot_module/python. Simply creating those directories and putting code there does not lead to a successful installation.
What is the correct approach to organizing the python files I write for this module into sub-directories?
More specifically:
I added a block via gr_modtool. The associated Python file was put in the python directory.
I then moved that .py file into a sub-directory (sub_dir) under python/.
I modified __init__.py and CMakeLists.txt under the python directory to reflect the sub-directory location and then did the install.
The block appears in GRC. When I try to use it, it complains
File "/home/my_name/devel/gnuradio3_8/lib/python3.6/dist-packages/my_module/__init__.py"
from .sub_dir.sub_dir_test_blk import sub_dir_test_blk
ModuleNotFoundError: No module named 'my_module.sub_dir'
You're right: Python code resides under python/. You should then use gr_modtool add to add GNU Radio Python blocks. That will also add them to CMakeLists.txt, which will in turn make sure they get installed.
No, just putting files in sub-directories doesn't make them part of the installed module. That is no different from any other Python code. If you want things to be part of a package, you need to import them in an __init__.py. The python.org tutorial on packages is your friend!
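As a sketch using the names from the question (my_module, sub_dir, sub_dir_test_blk), the package side would look roughly like this:
# python/sub_dir/__init__.py
from .sub_dir_test_blk import sub_dir_test_blk

# python/__init__.py
from .sub_dir import sub_dir_test_blk
The ModuleNotFoundError in the question suggests the sub-directory was never installed at all, so the sub_dir folder and its files also have to be listed for installation in the python directory's CMakeLists.txt, not just referenced from __init__.py.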
I'm very new to Python, as I'm embedding it (in the form of a static lib) in an iOS/Obj-C project. It's not possible for me to dynamically load Python modules, so I would like to compile my modules along with Python.
For modules shipped with the Python source this works (by modifying setup.py or Modules/Setup), but when I downloaded a third-party module I noticed that I don't fully understand the mechanism.
The modules shipped with python come with a .c file in the Modules dir as well as a .py file in the Lib dir.
My third party module just comes with .py files.
1. Why do those modules have different file extensions?
2. How do I integrate a module that comes with .py files into an embedded Python build? Obviously pasting them into Modules/Setup requires some .c files.
3. Do these .c files have something to do with the Python C API?
I guess I'm missing something essential :) Help is much appreciated.
I have a Python project that has the following structure:
package1
    class.py
    class2.py
    ...
package2
    otherClass.py
    otherClass2.py
    ...
config
    dev_settings.ini
    prod_settings.ini
I wrote a setup.py file that converts this into an egg with the same file structure. (When I examine it using a zip program the structure seems identical.) The funny thing is, when I run the Python code from my IDE it works fine and can access the config files; but when I try to run it from a different Python script using the egg, it can't seem to find the config files in the egg. If I put the config files into a directory relative to the calling Python script (external to the egg), it works - but that sort of defeats the purpose of having a self-contained egg that has all the functionality of the program and can be called from anywhere. I can use any classes/modules and run any functions from the egg as long as they don't use the config files... but if they do, the egg can't find them and so the functions don't work.
Any help would be really appreciated! We're kind of new to the egg thing here and don't really know where to start.
The problem is, the config files are not files anymore - they're packaged within the egg. It's not easy to find the answer in the docs, but it is there. From the setuptools developer's guide:
Typically, existing programs manipulate a package's __file__ attribute in order to find the location of data files. However, this manipulation isn't compatible with PEP 302-based import hooks, including importing from zip files and Python Eggs.
To access them, you need to follow the instructions for the Resource Management API.
In my own code, I had this problem with a logging configuration file. I used the API successfully like this:
import logging.config                  # also binds the logging package itself
from pkg_resources import resource_stream

_log_config_file = 'logging.conf'
# open the packaged logging.conf relative to this module's package
_log_config_location = resource_stream(__name__, _log_config_file)
logging.config.fileConfig(_log_config_location)   # fileConfig() accepts a file-like object
_log = logging.getLogger('package.module')
See Setuptools' discussion of accessing packaged data files at runtime. You have to get at your configuration file a different way if you want the script to work inside an egg. Also, for that to work, you may need to make your config directory a Python package by tossing in an empty __init__.py file.
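A small sketch of what that might look like, assuming config has been turned into a package as described (the file names come from the question):
from pkg_resources import resource_string
import configparser   # named ConfigParser on Python 2

# resource_string() returns the packaged file's contents as bytes,
# whether the egg is zipped or unpacked
settings = configparser.ConfigParser()
settings.read_string(resource_string('config', 'dev_settings.ini').decode('utf-8'))
print(settings.sections())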