No write permissions to package_data files by default - Python

I would like to include certain user-defined files, such as cfg.ini, under package_data with write permissions. But by default write permissions are disabled because I installed my package with sudo:
sudo python setup.py install
By the way, I am using setuptools in my setup.py.
Any suggestions, please?

There are two solutions to this.
Install the package for the local user (instead of system-wide) by simply running, without sudo:
python setup.py install
This allows the user to edit the custom files.
As an alternative to shipping user-defined files under package_data, you could generate user-specific files in user-specific directories, which your module then reads and the user can edit. On Windows these would be stored under the user's AppData directory.
The appdirs package provides easy access to user directories for storing application data (it is MIT licensed).
To install appdirs from PyPI:
sudo pip install appdirs
Or you can download the source, and then install it with:
sudo python setup.py install
Here's an example:
>>> import appdirs
>>> user_dir = appdirs.user_data_dir("YourAppOrModule", "YourName")
>>> print(user_dir)
C:\Users\TheUser\AppData\Local\YourName\YourAppOrModule
And then you can generate the user files under user_dir and allow the user to edit them from there.
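For instance, a minimal sketch (reusing the names above and assuming a hypothetical cfg.ini with default contents) could create a user-writable copy of the config on first run:

import os
import appdirs

# create the per-user data directory if it does not exist yet
user_dir = appdirs.user_data_dir("YourAppOrModule", "YourName")
if not os.path.isdir(user_dir):
    os.makedirs(user_dir)

# write an editable default config only once
cfg_path = os.path.join(user_dir, "cfg.ini")
if not os.path.exists(cfg_path):
    with open(cfg_path, "w") as f:
        f.write("[settings]\noption = default\n")  # hypothetical defaults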


Question about Packaging in Python with pip

I saw this nice explanation video (link) about packaging with pip and I have two questions:
The first one is:
I wrote some code that I want to share with my colleagues, but I do not aim to share it via PyPI. Instead, I want to share it internally, so everyone can install it within their own environment.
I actually don't need to create a wheel file with python setup.py bdist_wheel, right? I create the setup.py file and install it with pip install -e . (for editable use), and everyone else can do the same after cloning the repository. Is this right?
My second question is more technical:
I create the setup.py file:
from setuptools import setup

setup(
    name='helloandbyemate',
    version='0.0.1',
    description="Say hello in slang",
    py_modules=['hellomate'],
    package_dir={"": "src"}
)
To test it, I wrote a file hellomate.py containing a function that prints hello, mate!. I put this file in src/. In the setup.py file I listed only this module in py_modules. In src/ there is another module called byemate.py. When I install the whole package, it installs byemate.py as well, although I only put hellomate in py_modules. Does anyone have an explanation for this behaviour?
I actually don't need to create a wheel file ... everyone else can do the same after cloning the repository. Is this right?
This is correct. However, the installation from source is slower, so you may want to publish wheels to an index anyway if you would like faster installs.
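For reference, building and installing a wheel might look roughly like this (the exact wheel filename depends on your Python version and tags, so treat it as approximate):

python setup.py bdist_wheel
pip install dist/helloandbyemate-0.0.1-py3-none-any.whl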
When I install the whole package, it installs byemate.py as well, although I only put hellomate in py_modules. Does anyone have an explanation for this behaviour?
Yes, this is an artifact of the "editable" installation mode. It works by putting the src directory onto the sys.path, via a line in the path configuration file .../lib/pythonX.Y/site-packages/easy-install.pth. This means that the entire source directory is exposed and everything in there is available to import, whether it is packaged up into a release file or not.
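For illustration, the line added to easy-install.pth is just the absolute path of the source directory, roughly like this (hypothetical path):

/home/you/projects/helloandbyemate/src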
The benefit is that the source code is "editable" without reinstallation (adding, removing, or modifying files in src will be reflected in the package immediately).
The drawback is that an editable installation is not exactly the same as a "real" installation, where only the files specified in setup.py are copied into site-packages directly.
If you don't want other files such as byemate.py to be available to import, use a regular install pip install . without the -e option. However, local changes to hellomate.py won't be reflected until the installation step is repeated.
Strict editable installs
It is possible to get a mode of installation where byemate.py is not exposed at all, but live modifications to hellomate.py are still possible. This is the "strict" editable mode of setuptools. However, it is not possible using setup.py; you have to use a modern build-system declaration in pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "helloandbyemate"
version = "0.0.1"
description = "Say hello in slang"

[tool.setuptools]
py-modules = ["hellomate"]
include-package-data = false

[tool.setuptools.package-dir]
"" = "src"
Now you can perform a strict install with:
pip install -e . --config-settings editable_mode=strict
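As a quick sanity check (a hypothetical interpreter session; the exact error text may differ), only the declared module should now be importable:

>>> import hellomate   # still live-editable from src/
>>> import byemate
Traceback (most recent call last):
  ...
ModuleNotFoundError: No module named 'byemate'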

Change .egg-info directory with pip install --editable

Is there a way to modify the location of the .egg-info directory that is generated upon:
pip install --editable .
I'm asking because I store my source code on (locally synchronized) cloud storage, and I want to install the package in editable mode on independent computers. So, ideally, the package directory would not be polluted with anything related to a given installation of the package.
I have tried using the --src option but this did not work; I don't understand what this option is meant to do.
You can achieve this by adding the egg_base option to setup.cfg:
[egg_info]
egg_base = relative/path/to/egg_info_folder
I have used this successfully in pip 19.3.1.
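For orientation, here is a hypothetical layout after running pip install --editable . with that setting (all names are placeholders):

project/
    setup.py
    setup.cfg
    package_name/
    relative/path/to/egg_info_folder/
        package_name.egg-info/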
In my environment, the actual files that this altered are:
/anaconda/envs/my_env/lib/python3.6/site-packages/easy-install.pth
/anaconda/envs/my_env/lib/python3.6/site-packages/package_name.egg-link
Note that pip install raises an error if the egg_base value is not a relative path, but directly altering the files appears to work:
/anaconda/envs/my_env/lib/python3.6/site-packages/easy-install.pth:
/path/to/repository/folder
/anaconda/envs/my_env/lib/python3.6/site-packages/package_name.egg-link:
/path/to/egg_info/folder
/path/to/repository/folder/
Not sure if still relevant, but here is a setup.py based solution: https://jbhannah.net/articles/python-docker-disappearing-egg-info

Python Import error on installing ruamel.yaml in custom directory

I am using Python 2.7.13 and I am facing problems importing ruamel.yaml when I install it in a custom directory:
ImportError: No module named ruamel.yaml
The command used is as follows:
pip install --target=Z:\XYZ\globalpacks ruamel.yaml
I have added this custom directory to the PYTHONPATH env variable, and I also have a .pth file in this location with the following lines:
Z:\XYZ\globalpacks\anotherApp
Z:\XYZ\globalpacks\ruamel
There is another app installed similarly with the above settings
and it works.
What am I missing here?
PS: It works when I install in the site-packages folder; it also worked in the custom folder when I created an __init__.py file in the ruamel folder.
EDIT:
Since our content creation software uses Python 2.7, we are restricted to using the same. We have chosen to install the same version of Python on all machines and set import paths to point to modules/apps located on a shared network drive.
As mentioned, it works in Python's site-packages but not on the network drive, which is on the PYTHONPATH env variable.
The ruamel.yaml-*-nspkg.pth and ruamel.ordereddict-*-nspkg.pth files are dutifully installed. Sorry for not giving complete details earlier. Your inputs are much appreciated.

Installing PRAW

I would like to install PRAW so I can make reddit bots and such, but all the install guides are confusing to me, so could someone explain how as noob-friendly as possible? I've had some experience with vanilla Python. Thanks!
praw is best installed, according to the documentation, via pip. To install pip, you need setuptools. Here is a simple guide on installing pip via setuptools.
Basically, download ez_setup.py and get-pip.py, two Python scripts that automatically get and install setuptools and pip. You'll want to run the following commands in the terminal in the same directory as the location of the files, in order:
python ez_setup.py
python get-pip.py
Finally, you'll want to use pip to get praw. pip is an executable that is usually located in your Python installation directory; for example, on Windows it's located in C:\Python27\Scripts. You can add that directory to your system PATH variable, but for now you can just navigate to the directory where pip.exe is installed. Then, run the following command in the terminal:
pip install praw
I recently had trouble with this so I thought I would add what I did.
Install pip - https://pip.pypa.io/en/stable/installing/
Install praw with pip install praw; this is done on your PC/Mac/Linux machine (installation guide).
Register on reddit as a developer and register the app. To be able to use the API you need a client_id and a client_secret. Get those by registering here. More information about the types of applications can be found here.
Now you are ready to begin coding. This is a good script to verify that you are connecting to the reddit API. The client_id and client_secret are from the previous step, and the user_agent is a string that is unique to your app. I used something like 'my first app by /u/myUsername'. The password and username are your reddit login credentials.
Run this code and it should output your username.
import praw

reddit = praw.Reddit(client_id='CLIENT_ID',
                     client_secret='CLIENT_SECRET',
                     password='PASSWORD',
                     user_agent='USERAGENT',
                     username='USERNAME')
print(reddit.user.me())
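As a small follow-up sketch (the subreddit name here is just an example), you could then read a few post titles with the same reddit object:

for submission in reddit.subreddit("learnpython").hot(limit=3):
    print(submission.title)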

How to build debian package with CPack to execute setup.py?

Until now, my project had only .cpp files that were compiled into different binaries and I managed to configure CPack to build a proper debian package without any problems.
Recently I wrote a couple of Python applications and added them to the project, as well as some custom modules that I would also like to incorporate into the package.
After writing a setup.py script, I'm wondering how to add these files to the CPack configuration so that setup.py gets executed automatically when the user installs the package on the system with dpkg -i package.deb.
I'm struggling to find relevant information on how to configure CPack to install custom Python applications/modules. Has anyone tried this?
I figured out a way to do it, but it's not very simple. I'll do my best to explain the procedure, so please be patient.
The idea of this approach is to use postinst and prerm to install and remove the Python application from the system.
In the CMakeLists.txt that defines the project, you need to state that CPack is going to be used to generate a .deb package. There are some variables that need to be filled with info about the package itself, but one named CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA is very important because it specifies the location of postinst and prerm, which are standard scripts of the Debian packaging system that are automatically executed by dpkg when the package is installed/removed.
At some point of your main CMakeLists.txt you should have something like this:
add_subdirectory(name_of_python_app)
set(CPACK_COMPONENTS_ALL_IN_ONE_PACKAGE 1)
set(CPACK_PACKAGE_NAME "fake-package")
set(CPACK_PACKAGE_VENDOR "ACME")
set(CPACK_PACKAGE_DESCRIPTION_SUMMARY "fake-package - brought to you by ACME")
set(CPACK_PACKAGE_VERSION "1.0.2")
set(CPACK_PACKAGE_VERSION_MAJOR "1")
set(CPACK_PACKAGE_VERSION_MINOR "0")
set(CPACK_PACKAGE_VERSION_PATCH "2")
set(CPACK_SYSTEM_NAME "i386")
set(CPACK_GENERATOR "DEB")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "ACME Technology")
set(CPACK_DEBIAN_PACKAGE_DEPENDS "libc6 (>= 2.3.1-6), libgcc1 (>= 1:3.4.2-12), python2.6, libboost-program-options1.40.0 (>= 1.40.0)")
set(CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA "${CMAKE_SOURCE_DIR}/name_of_python_app/postinst;${CMAKE_SOURCE_DIR}/name_of_python_app/prerm;")
set(CPACK_SET_DESTDIR "ON")
include(CPack)
Some of these variables are optional, but I'm filling them with info for educational purposes.
Now, let's take a look at the scripts:
postinst:
#!/bin/sh
# postinst script for fake_python_app
set -e
cd /usr/share/pyshared/fake_package
sudo python setup.py install
prerm:
#!/bin/sh
# prerm script
#
# Removes all files installed by: ./setup.py install
sudo rm -rf /usr/share/pyshared/fake_package
sudo rm /usr/local/bin/fake_python_app
As you may have noticed, the postinst script enters /usr/share/pyshared/fake_package and executes the setup.py lying there to install the app on the system. Where does this file come from and how does it end up there? This file is created by you and will be copied to that location when your package is installed on the system. This action is configured in name_of_python_app/CMakeLists.txt:
install(FILES setup.py
    DESTINATION "/usr/share/pyshared/fake_package"
)
install(FILES __init__.py
    DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_python_app
    DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_1.py
    DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_2.py
    DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
As you can probably tell, besides the Python application I want to install, there are also two custom Python modules I wrote that need to be installed. Below I describe the contents of the most important files:
setup.py:
#!/usr/bin/env python
from distutils.core import setup

setup(name='fake_package',
      version='1.0.5',
      description='Python modules used by fake-package',
      py_modules=['fake_package.fake_module_1', 'fake_package.fake_module_2'],
      scripts=['fake_package/fake_python_app']
      )
__init__.py: an empty file.
fake_python_app: your Python application that will be installed in /usr/local/bin.
And that's pretty much it!
A setup.py file is the equivalent of the configure && make && make install dance for a standard Unix source distribution and as such is inappropriate to run as part of a distribution's package install process. See this discussion of the different ways to include Python modules in a .deb package.
