Issue using Flask-Assets to compile less files - python

I'm currently trying to set up a Flask web app, and trying to use Flask-Assets to compile my less files into minified css.
Here is my assets.py file that creates the bundle.
from flask_assets import Bundle
common_css = Bundle(
    'vendor/less/theme.less',
    filters='less',
    output='static/css/common.css',
)
The error that I am getting is:
OSError: [Errno 2] No such file or directory
In the webassets documentation for the less filter, it says that:
This depends on the NodeJS implementation of less, installable via npm. To use the old Ruby-based version (implemented in the 1.x Ruby gem), see Less.
...
LESS_BIN (binary)
Path to the less executable used to compile source files. By default, the filter will attempt to run lessc via the system path.
I installed less using $ npm install less, but for some reason it looks like webassets can't use it.
When I try different filters, webassets creates the bundle successfully.
Thanks!

npm install installs packages into the current directory by default (you should be able to find a node_modules directory there). You have two choices:
Install lessc globally:
$ npm install -g less
This way webassets will be able to find it itself.
Provide the full path to the lessc executable:
from flask_assets import Environment
assets = Environment(app)
assets.config['less_bin'] = '/path/to/lessc'
The path should be <some_directory>/node_modules/.bin/lessc.
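For reference, here is a minimal sketch that ties the pieces together; the lessc path, bundle name, and file names are illustrative assumptions, not taken from the question:
from flask import Flask
from flask_assets import Bundle, Environment

app = Flask(__name__)
assets = Environment(app)

# Point webassets at a project-local lessc (example path; adjust to your node_modules)
assets.config['less_bin'] = './node_modules/.bin/lessc'

common_css = Bundle(
    'vendor/less/theme.less',
    filters='less',
    # the output path is resolved relative to the app's static folder
    output='css/common.css',
)
assets.register('common_css', common_css)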

Related

Change .egg-info directory with pip install --editable

Is there a way to modify the location of the .egg-info directory that is generated upon:
pip install --editable .
I'm asking because I store my source code on (locally synchronized) cloud storage, and I want to install the package in editable mode on independent computers. So, ideally, the package directory would not be polluted with anything related to a given installation of the package.
I have tried using the --src option but this did not work; I don't understand what this option is meant to do.
You can achieve this by adding the egg_base option to setup.cfg:
[egg_info]
egg_base = relative/path/to/egg_info_folder
I have used this successfully in pip 19.3.1.
In my environment, the actual files that this altered are:
/anaconda/envs/my_env/lib/python3.6/site-packages/easy-install.pth
/anaconda/envs/my_env/lib/python3.6/site-packages/package_name.egg-link
Note: pip install raises an error if the egg_base value is not a relative path. But directly altering the files appears to work:
/anaconda/envs/my_env/lib/python3.6/site-packages/easy-install.pth:
/path/to/repository/folder
/anaconda/envs/my_env/lib/python3.6/site-packages/package_name.egg-link:
/path/to/egg_info/folder
/path/to/repository/folder/
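Putting it together, the sequence looks roughly like this (the path is just the placeholder from the setup.cfg snippet above, and the egg_base directory generally has to exist before installing):
$ mkdir -p relative/path/to/egg_info_folder
$ pip install --editable .
$ ls relative/path/to/egg_info_folder
package_name.egg-info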
Not sure if still relevant, but here is a setup.py based solution: https://jbhannah.net/articles/python-docker-disappearing-egg-info

Python Import error on installing ruamel.yaml in custom directory

I am using Python 2.7.13 and I am facing problems importing ruamel.yaml when I install it in a custom directory.
ImportError: No module named ruamel.yaml
The command used is as follows:
pip install --target=Z:\XYZ\globalpacks ruamel.yaml
I have added this custom directory to the PYTHONPATH environment variable, and I also have a .pth file in this location with the following lines:
Z:\XYZ\globalpacks\anotherApp
Z:\XYZ\globalpacks\ruamel
There is another app installed similarly with the above settings
and it works.
What am I missing here?
PS: It works when I install into the site-packages folder. It also worked in the custom folder when I created an __init__.py file in the ruamel folder.
EDIT:
Since our content creation software uses Python 2.7, we are restricted to using the same. We have chosen to install the same version of Python on all machines and to set import paths to point to modules/apps located on a shared network drive.
As mentioned, it works in Python's site-packages but not on the network drive, which is on the PYTHONPATH environment variable.
The ruamel.yaml-*-nspkg.pth and ruamel.ordereddict-*-nspkg.pth files are dutifully installed. Sorry for not giving complete details earlier. Your inputs are much appreciated.
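One mechanical detail worth illustrating: *-nspkg.pth files are only processed for directories that Python treats as site directories, and a folder that is merely on PYTHONPATH does not get that treatment. A minimal sketch (using the path from the question) that forces the .pth processing by hand:
# Sketch: register the custom install folder as a site directory so that
# the *-nspkg.pth files installed there are actually executed.
import site
site.addsitedir(r'Z:\XYZ\globalpacks')

import ruamel.yaml  # the ruamel namespace package should now resolve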

How do I build pysvn on Windows32?

I'm trying to build PySVN from source on my Windows 7 PC. It's running 64bit Windows, but for various reasons I need to compile it so that it works on 32bit Python. That's what we run on a lot of our automation servers.
I've downloaded the PySVN extension source, and I've got Visual Studio 2008 Express Edition installed. I've written a batch file to automate the process; it looks like this:
set PROJECT_DIR=%~dp0
set SRC_DIR=%PROJECT_DIR%pysvn-1.7.8
cd %SRC_DIR%\Builder
set SVN_VER_MAJ_MIN=1.8
call builder_custom_init.cmd
cd %SRC_DIR%\Source
python setup.py configure --platform=win32
When I get to the last line I get the error message:
Info: Configure for python 2.7.6 in exec_prefix c:\python27
('Error:', 'cannot find PyCXX include CXX/Version.hxx - use --pycxx-dir')
My Python include directory does not contain a file called Version.hxx. Where do I get this file, and how do I fix this?
One way is to install PyCXX by hand - it installs a Version.hxx under the Include folder on Windows.
The source for PYCXX is here: http://cxx.sourceforge.net/
Another way is to point the command-line parameter --pycxx-dir at the Import folder under the pysvn root, where the version of PyCXX corresponding to that pysvn release is kept.
(On top of this you will have to build the Subversion libraries on Windows.)
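For example, the configure line from the batch file above could become something like the following; the pycxx folder name under Import is a placeholder that depends on the pysvn release, so check what is actually shipped there:
python setup.py configure --platform=win32 --pycxx-dir=%SRC_DIR%\Import\pycxx-<version>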

How to build debian package with CPack to execute setup.py?

Until now, my project had only .cpp files that were compiled into different binaries and I managed to configure CPack to build a proper debian package without any problems.
Recently I wrote a couple of python applications and added them to the project, as well as some custom modules that I would also like to incorporate to the package.
After writing a setup.py script, I'm wondering how to add these files to the CPack configuration in such a way that setup.py gets executed automatically when the user installs the package on the system with dpkg -i package.deb.
I'm struggling to find relevant information on how to configure CPack to install custom python applications/modules. Has anyone tried this?
I figured out a way to do it but it's not very simple. I'll do my best to explain the procedure so please be patient.
The idea of this approach is to use postinst and prerm to install and remove the python application from the system.
In the CMakeLists.txt that defines the project, you need to state that CPack is going to be used to generate a .deb package. There are some variables that need to be filled with info related to the package itself, but one named CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA is very important because it's used to specify the location of postinst and prerm, which are standard scripts of the Debian packaging system that are automatically executed by dpkg when the package is installed/removed.
At some point of your main CMakeLists.txt you should have something like this:
add_subdirectory(name_of_python_app)
set(CPACK_COMPONENTS_ALL_IN_ONE_PACKAGE 1)
set(CPACK_PACKAGE_NAME "fake-package")
set(CPACK_PACKAGE_VENDOR "ACME")
set(CPACK_PACKAGE_DESCRIPTION_SUMMARY "fake-package - brought to you by ACME")
set(CPACK_PACKAGE_VERSION "1.0.2")
set(CPACK_PACKAGE_VERSION_MAJOR "1")
set(CPACK_PACKAGE_VERSION_MINOR "0")
set(CPACK_PACKAGE_VERSION_PATCH "2")
SET(CPACK_SYSTEM_NAME "i386")
set(CPACK_GENERATOR "DEB")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "ACME Technology")
set(CPACK_DEBIAN_PACKAGE_DEPENDS "libc6 (>= 2.3.1-6), libgcc1 (>= 1:3.4.2-12), python2.6, libboost-program-options1.40.0 (>= 1.40.0)")
set(CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA "${CMAKE_SOURCE_DIR}/name_of_python_app/postinst;${CMAKE_SOURCE_DIR}/name_of_python_app/prerm;")
set(CPACK_SET_DESTDIR "ON")
include(CPack)
Some of these variables are optional, but I'm filling them with info for educational purposes.
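For completeness, once this configuration is in place the .deb is generated with the usual CMake/CPack invocation from a build directory, for example:
cmake ..          # configure the project from an out-of-source build directory
make package      # or equivalently: cpack -G DEB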
Now, let's take a look at the scripts:
postinst:
#!/bin/sh
# postinst script for fake_python_app
set -e
cd /usr/share/pyshared/fake_package
sudo python setup.py install
prerm:
#!/bin/sh
# prerm script
#
# Removes all files installed by: ./setup.py install
sudo rm -rf /usr/share/pyshared/fake_package
sudo rm /usr/local/bin/fake_python_app
As you may have noticed, the postinst script enters /usr/share/pyshared/fake_package and executes the setup.py lying there to install the app on the system. Where does this file come from and how does it end up there? It is created by you and copied to that location when your package is installed on the system. This action is configured in name_of_python_app/CMakeLists.txt:
install(FILES setup.py
    DESTINATION "/usr/share/pyshared/fake_package"
)
install(FILES __init__.py
    DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_python_app
    DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_1.py
    DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_2.py
    DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
As you can probably tell, besides the Python application I want to install, there are also two custom Python modules that I wrote which need to be installed. Below I describe the contents of the most important files:
setup.py:
#!/usr/bin/env python
from distutils.core import setup
setup(
    name='fake_package',
    version='1.0.5',
    description='Python modules used by fake-package',
    py_modules=['fake_package.fake_module_1', 'fake_package.fake_module_2'],
    scripts=['fake_package/fake_python_app'],
)
__init__.py: an empty file.
fake_python_app: your Python application that will be installed in /usr/local/bin.
And that's pretty much it!
A setup.py file is the equivalent of the configure && make && make install dance for a standard Unix source distribution and, as such, is inappropriate to run as part of a distribution's package install process. See this discussion of the different ways to include Python modules in a .deb package.

Problem installing OpenERP server with buildout!

I'm trying to deploy OpenERP with a buildout and my own piece of code. In fact, I would like to build a complete deployment structure allowing me to use OpenERP with custom modules and patches.
First of all, before adding any personal configuration, I am trying to create a buildout which will have the responsibility of configuring everything.
Buildout Configuration
My buildout.cfg configuration file looks like this:
[buildout]
parts = eggs
versions = versions
newest = false
extensions = lovely.buildouthttp
unzip = true
find-links =
    http://download.gna.org/pychart/

[versions]

[eggs]
recipe = zc.recipe.egg
interpreter = python
eggs =
    Paste
    PasteScript
    PasteDeploy
    psycopg2
    PyChart
    pydot
    openerp-server
Configuration problem
But when trying to launch the buildout, I get a couple of errors while installing the last needed egg (openerp-server):
On my side it just cannot find these modules, but they are in my eggs dir:
Error: python module psycopg2 (PostgreSQL module) is required
Error: python module libxslt (libxslt python bindings) is required
Error: python module pychart (pychart module) is required
Error: python module pydot (pydot module) is required
error: Setup script exited with 1
An error occured when trying to install openerp-server 5.0.0-3. Look above this message for any errors that were output by easy_install.
Is it possible that openerp hardcoded its search path somewhere?
easy_install, a try
I decided to give a clean virtualenv a try, with no relation to the main site-packages. But when using easy_install on openerp-server:
$ source openerp-python/bin/activate
$ easy_install openerp-server
...
File "build/bdist.linux-i686/egg/pkg_resources.py", line 887, in extraction_error
pkg_resources.ExtractionError: Can't extract file(s) to egg cache
The following error occurred while trying to extract file(s) to the Python egg
cache:
SandboxViolation: mkdir('/home/mlhamel/.python-eggs/psycopg2-2.0.13-py2.5-linux-x86_64.egg-tmp', 511) {}
I always get this error message, whether or not psycopg2 is installed on my machine.
System's Configuration
Ubuntu 9.10 x86-64
Tried on Python 2.5/Python 2.6
Ok I did this recently:
Don't try to install the egg, openerp is not really standard.
I used this buildout snippet:
# get the openerp-stuff as a distutils package
[openerp-server]
recipe = zerokspot.recipe.distutils
urls = http://www.openerp.com/download/stable/source/openerp-server-5.0.6.tar.gz
# similar idea for the web component
[openerp-web]
recipe = zc.recipe.egg:scripts
find-links = http://www.openerp.com/download/stable/source/openerp-web-5.0.6.tar.gz
# add some symlinks so you can run it out of bin
[server-symlinks]
recipe = cns.recipe.symlink
symlink = ${buildout:parts-directory}/openerp-server/bin/openerp-server = ${buildout:bin-directory}
The key however, is that I did not use virtualenv. You don't need to with buildout. Buildout + virtualenv is like Trojan + Ramses... one is enough, unless you are ... well one is enough. ;)
Now, for this particular project I had followed the Debian instructions and installed the required libs via aptitude. This was only because I was new to buildout at the time; one could just as easily install the psycopg2 module via buildout.
Here are some excellent instructions. Ignore the django stuff if you don't need it. Dan Fairs is both a great writer and great buildout tutor. Check it out. Disclaimer: I am a disciple of the man, based on his buildout usage.
I am certain you do not want to use the egg on PyPI; it never worked for me. OpenERP is not eggified, it's a distutils package.
Good luck!
Just for the record: there is a buildout recipe for OpenERP available on PyPI.
I'm not familiar with buildout, but if I were going to try building an OpenERP installer, I'd start by looking at the nice one from Open Source Consulting. I've used it and been pretty happy with it.
Last time I checked, it doesn't set up the CRM e-mail gateway, but everything else I need was covered.
