How to fix a locale issue in a Red Hat distro? - python

I'm having a strange problem today on my RHEL system. My Python script is returning:
>>> locale.setlocale(locale.LC_ALL, '')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib64/python2.6/locale.py", line 513, in setlocale
return _setlocale(category, locale)
locale.Error: unsupported locale setting
When I run...
$ locale
The output is...
locale: Cannot set LC_ALL to default locale: No such file or directory
LANG=en_US.UTF-8
LC_CTYPE="en_US.UTF-8"
LC_NUMERIC="en_US.UTF-8"
...
I have been trying many suggestions, but none of them has solved my issue yet.
For example:
Reinstall glibc-common.
Export LC_ALL as an environment variable in ~/.bashrc.
Change the file /etc/sysconfig/i18n.
locale-gen does not exist in RHEL.
Does anyone have a good suggestion to solve my issue? Remember that I'm using RHEL, not Ubuntu (there are many tutorials about locale issues on Ubuntu).

Add this to your /etc/environment:
LC_ALL=en_US.UTF-8
LC_CTYPE=en_US.UTF-8
Then logout and login to shell again and try executing your commands.
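Once you're back in, a quick way to confirm the fix is to re-run the call from the question; this minimal check should now succeed:
import locale
# Should no longer raise locale.Error: unsupported locale setting
locale.setlocale(locale.LC_ALL, '')
print(locale.getlocale())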

In my case it was iTerm setting the locale variables automatically. I fixed it by going to the iTerm menu and then
Preferences ... > Profiles > "select your profile" > Terminal > uncheck "Set locale variables automatically"

Adding the following variables to /etc/environment fixed my issue:
LC_ALL=en_US.UTF-8
LC_CTYPE=en_US.UTF-8
Good answer above.

Related

Sublime Text environment variables not present

I have installed yapf and the PyYapf package (https://github.com/jason-kane/PyYapf) for Sublime Text for Python code formatting.
Furthermore, I'm using pipenv for my Python projects.
When executing the formatting command in Sublime I receive the following error:
Traceback (most recent call last):
File "~/.pyenv/versions/3.6.8/bin/pipenv", line 10, in <module>
sys.exit(cli())
File "~/.pyenv/versions/3.6.8/lib/python3.6/site-packages/pipenv/vendor/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "~/.pyenv/versions/3.6.8/lib/python3.6/site-packages/pipenv/vendor/click/core.py", line 696, in main
_verify_python3_env()
File "~/.pyenv/versions/3.6.8/lib/python3.6/site-packages/pipenv/vendor/click/_unicodefun.py", line 124, in _verify_python3_env
' mitigation steps.' + extra
RuntimeError: Click will abort further execution because Python 3 was configured to use ASCII as encoding for the environment. Consult https://click.palletsprojects.com/en/7.x/python3/ for mitigation steps.
This system supports the C.UTF-8 locale which is recommended.
You might be able to resolve your issue by exporting the
following environment variables:
export LC_ALL=C.UTF-8
export LANG=C.UTF-8
I have now tried the suggestion to use the two exports above, setting them globally in my profile (and rebooting).
Opening a terminal and checking, they seem to be available:
$ printenv | grep "C.UTF-8"
LC_ALL=C.UTF-8
LANG=C.UTF-8
However, Sublime Text still gives the original error. Looking at the package source, the environment is read in like this inside PyYapf.py:
self.popen_env = os.environ.copy()
When printing the contents of self.popen_env, LANG and LC_ALL don't seem to be set, or not to the C.UTF-8 value.
Where is Sublime Text getting the envs from?
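For reference, a minimal check you can paste into Sublime's console (View > Show Console) to see what the plugin host, and therefore the os.environ.copy() above, actually inherits:
import os
# The values PyYapf will copy into popen_env
print(os.environ.get("LANG"), os.environ.get("LC_ALL"))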
Quick workaround:
$ cd ~/.config/sublime-text-3/Packages
$ mkdir -p Default
$ echo 'import os; os.environ["LC_ALL"] = os.environ["LANG"] = "C.UTF-8"' > Default/echo.py
Well, for God's sake, DO NOT try this nasty hack. Please take some time to set up your locales properly; otherwise you will likely encounter more locale problems in the future.
For Linux Mint and other Debian variants, simply run $ sudo dpkg-reconfigure locales and tick en_US.UTF-8 (or any other locale that you prefer) on the list. Follow these two posts for more details about locale errors.
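To double-check from Python that the locale was actually generated, here is a small sketch (shelling out to locale -a is my addition, not part of the original answer):
import subprocess
# Same as running `locale -a` in a shell; glibc lists the generated locale as en_US.utf8
installed = subprocess.check_output(["locale", "-a"]).decode("utf-8", "replace")
print("en_US.utf8" in installed.split())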

Set language of GTK Buttons to a different language - Linux - Python

How can I set the default language of the Gtk3 Stock Buttons to another language?
I tried:
sudo apt-get install language-pack-en language-pack-gnome-en
But:
import locale
from pprint import pprint
pprint(locale.getlocale(locale.LC_ALL))
locale.setlocale(locale.LC_ALL, 'en_EN.utf8')
response:
('de_DE', 'UTF-8')
Traceback (most recent call last):
File "tp_tools.py", line 41, in <module>
locale.setlocale(locale.LC_ALL, 'en_EN.utf8')
File "/usr/lib/python2.7/locale.py", line 581, in setlocale
return _setlocale(category, locale)
locale.Error: unsupported locale setting
Working on Linux Mint 18 with Python 2.7 and Gtk3
Neither en_EN.utf8 nor en_EN is a valid locale (unlike de_DE). You can list your installed locales with locale -a in your shell.
A correct locale would be for example en_US or en_GB. And a correct instruction would be locale.setlocale(locale.LC_ALL, 'en_US').
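Put together, assuming en_US shows up in your locale -a output:
import locale
# 'en_US' is a valid locale name here, where 'en_EN.utf8' is not
locale.setlocale(locale.LC_ALL, 'en_US')
print(locale.getlocale())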
The problem is that gi.repository.Gtk calls Gtk.init() while being imported, and after that it is nearly impossible to make any changes to localization. Furthermore, setting the locale to hardcoded strings makes your application almost unportable, since the only locale you can assume to exist is "C", which doesn't even include UTF-8 support.
The only solution I found so far is to set the environment variable LANGUAGE before any import of GLib modules; gettext gives it priority, and it does not need an encoding definition appended to it. This works for me:
import os
os.environ["LANGUAGE"] = "en"
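For context, a minimal sketch of the import ordering this relies on (the gi.require_version call is standard PyGObject boilerplate, added here as an assumption):
import os
os.environ["LANGUAGE"] = "en"  # must be set before gi.repository is imported

import gi
gi.require_version("Gtk", "3.0")
from gi.repository import Gtk  # Gtk initializes at import time and reads LANGUAGE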
PS: Stop using Python 2, it's outdated.
PPS: Gtk+ 3 stock buttons are deprecated.

environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON

I recently installed pyspark, and it installed correctly. But when I run the following simple program in Python, I get an error.
>>> from pyspark import SparkContext
>>> sc = SparkContext()
>>> data = range(1,1000)
>>> rdd = sc.parallelize(data)
>>> rdd.collect()
While running the last line I get an error whose key lines seem to be:
[Stage 0:> (0 + 0) / 4]18/01/15 14:36:32 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/pyspark/python/lib/pyspark.zip/pyspark/worker.py", line 123, in main
("%d.%d" % sys.version_info[:2], version))
Exception: Python in worker has different version 2.7 than that in driver 3.5, PySpark cannot run with different minor versions.Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.
I have the following variables in .bashrc
export SPARK_HOME=/opt/spark
export PYTHONPATH=$SPARK_HOME/python3
I am using Python 3.
By the way, if you use PyCharm, you can add PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON to your run/debug configurations.
You should set the following environment variables in $SPARK_HOME/conf/spark-env.sh:
export PYSPARK_PYTHON=/usr/bin/python
export PYSPARK_DRIVER_PYTHON=/usr/bin/python
If spark-env.sh doesn't exist, you can rename spark-env.sh.template to spark-env.sh.
This may also happen if you're working inside a virtual environment. In that case it may be harder to retrieve the correct path to the Python executable (and anyway I think it's not a good idea to hardcode the path if you want to share it with others).
If you run the following lines at the beginning of your script/notebook (at least before you create the SparkSession/SparkContext), the problem is solved:
import os
import sys
os.environ['PYSPARK_PYTHON'] = sys.executable
os.environ['PYSPARK_DRIVER_PYTHON'] = sys.executable
The os module lets you set environment variables; sys.executable is the absolute path of the Python interpreter that is running your script.
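A minimal end-to-end sketch of this approach (assuming pyspark is importable in the current environment):
import os
import sys

# Point both the driver and the workers at the interpreter running this script
os.environ['PYSPARK_PYTHON'] = sys.executable
os.environ['PYSPARK_DRIVER_PYTHON'] = sys.executable

from pyspark import SparkContext

sc = SparkContext()
print(sc.parallelize(range(1, 1000)).collect()[:5])  # [1, 2, 3, 4, 5]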
I got the same issue, and I set both variables in .bash_profile:
export PYSPARK_PYTHON=/usr/local/bin/python3
export PYSPARK_DRIVER_PYTHON=/usr/local/bin/python3
But my problem was still there.
Then I found out that the problem was that my default Python version was 2.7 (checked by typing python --version).
So I solved the problem by following the page below:
How to set Python's default version to 3.x on OS X?
Just run the code below at the very beginning of your script. I am using Python 3.7. You might need to run locate python3.7 to get your Python path.
import os
os.environ["PYSPARK_PYTHON"] = "/Library/Frameworks/Python.framework/Versions/3.7/bin/python3.7"
os.environ["PYSPARK_DRIVER_PYTHON"] = "/Library/Frameworks/Python.framework/Versions/3.7/bin/python3.7"
I'm using Jupyter Notebook to study PySpark, and that's what worked for me.
Find where python3 is installed by running in a terminal:
which python3
Here it points to /usr/bin/python3.
Now at the beginning of the notebook (or .py script), do:
import os
# Set Spark environment variables
os.environ['PYSPARK_PYTHON'] = '/usr/bin/python3'
os.environ['PYSPARK_DRIVER_PYTHON'] = '/usr/bin/python3'
Restart your notebook session and it should work!
Apache-Spark 2.4.3 on Archlinux
I've just installed Apache-Spark-2.3.4 from the Apache-Spark website. I'm using the Archlinux distribution, a simple and lightweight distribution. I installed it and put the apache-spark directory in /opt/apache-spark/, and now it's time to export our environment variables. Remember, I'm using Archlinux, so keep in mind to use your own $JAVA_HOME, for example.
Exporting environment variables
echo 'export JAVA_HOME=/usr/lib/jvm/java-7-openjdk/jre' >> /home/user/.bashrc
echo 'export SPARK_HOME=/opt/apache-spark' >> /home/user/.bashrc
echo 'export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH' >> /home/user/.bashrc
echo 'export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.10.7-src.zip:$PYTHONPATH' >> /home/user/.bashrc
source ~/.bashrc
Testing
emanuel@hinton ~ $ echo 'export JAVA_HOME=/usr/lib/jvm/java-7-openjdk/jre' >> /home/emanuel/.bashrc
emanuel@hinton ~ $ echo 'export SPARK_HOME=/opt/apache-spark' >> /home/emanuel/.bashrc
emanuel@hinton ~ $ echo 'export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH' >> /home/emanuel/.bashrc
emanuel@hinton ~ $ echo 'export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.10.7-src.zip:$PYTHONPATH' >> /home/emanuel/.bashrc
emanuel@hinton ~ $ source .bashrc
emanuel@hinton ~ $ python
Python 3.7.3 (default, Jun 24 2019, 04:54:02)
[GCC 9.1.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyspark
>>>
Everything works fine once you've correctly exported the environment variables for SparkContext.
Using Apache-Spark on Archlinux via a Docker image
For my own purposes I've created a Docker image with Python, jupyter-notebook, and apache-spark-2.3.4.
Running the image:
docker run -ti -p 8888:8888 emanuelfontelles/spark-jupyter
then just go to your browser and type:
http://localhost:8888/tree
You will be prompted with an authentication page; go back to the terminal, copy the token, and voilà, you will have an Archlinux container running an Apache-Spark distribution.
If you are using PyCharm, go to Run > Edit Configurations and click on Environment variables to add PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON (basically, they should point to the same version of Python). This solution worked for me. Thanks to the above posts.
To make it easier to see: instead of having to set a specific path like /usr/bin/python3, you can do this.
I put these lines in my ~/.zshrc:
export PYSPARK_PYTHON=python3.8
export PYSPARK_DRIVER_PYTHON=python3.8
When I type python3.8 in my terminal, Python 3.8 starts. I think it's because I installed pipenv.
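To confirm that a bare name like python3.8 resolves on your PATH the same way Spark will see it, a quick sketch (my addition, not from the original answer):
import shutil
# Prints the full path the name resolves to, or None if it is not on PATH
print(shutil.which("python3.8"))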
Another good reference for setting up your SPARK_HOME is https://towardsdatascience.com/how-to-use-pyspark-on-your-computer-9c7180075617
(for permission denied issues use sudo mv)
1. Download and install Java (JRE).
2. Then choose one of the following solutions:
## -------- Temporary Solution -------- ##
Just put the paths in your Jupyter notebook with the following code and run it every time:
import os
os.environ["PYSPARK_PYTHON"] = r"C:\Users\LAPTOP0534\miniconda3\envs\pyspark_v3.3.0"
os.environ["PYSPARK_DRIVER_PYTHON"] = r"C:\Users\LAPTOP0534\miniconda3\envs\pyspark_v3.3.0"
os.environ["JAVA_HOME"] = r"C:\Program Files\Java\jre1.8.0_333"
----OR----
## -------- Permanent Solution -------- ##
Set the above 3 variables in your Environment Variables.
I had gone through many answers and nothing had worked for me, but both of these resolutions did. This resolved my error.
Thanks
import os
import sys

# Raw strings keep the Windows backslashes from being treated as escapes
os.environ["JAVA_HOME"] = r"C:\Program Files\Java\jdk-19"
os.environ["SPARK_HOME"] = r"C:\Program Files\Spark\spark-3.3.1-bin-hadoop2"
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable
This worked for me in a Jupyter notebook, as the os library makes it easy to set the environment variables. Make sure to run this cell before creating the SparkSession.
I tried two methods for this question; the one below works.
Add these environment variables (e.g. in your IDE's run configuration):
PYSPARK_PYTHON=/usr/local/bin/python3.7;PYSPARK_DRIVER_PYTHON=/usr/local/bin/python3.7;PYTHONUNBUFFERED=1

Sublime Text & Linux KDE. System locale is set to a value that can not handle non-ASCII characters

I'm getting this error when I open ST3. Package Control is not working.
Package Control: Your system's locale is set to a value that can not
handle non-ASCII characters. Package Control can not properly work
unless this is fixed.
On Linux, please reference your distribution's docs for information on
properly setting the LANG environment variable. As a temporary work-around,
you can launch Sublime Text from the terminal with:
LANG=en_US.UTF-8 sublime_text
The temporary workaround doesn't work. Also, when I run locale in the terminal I get:
locale: Cannot set LC_CTYPE to default locale: No such file or directory
locale: Cannot set LC_MESSAGES to default locale: No such file or directory
locale: Cannot set LC_ALL to default locale: No such file or directory
LANG=en_EC.UTF-8
LANGUAGE=en:es:en
LC_CTYPE="en_EC.UTF-8"
LC_NUMERIC=en_EC.UTF-8
LC_TIME=en_EC.UTF-8
LC_COLLATE="en_EC.UTF-8"
LC_MONETARY=en_EC.UTF-8
LC_MESSAGES="en_EC.UTF-8"
LC_PAPER=en_EC.UTF-8
LC_NAME=en_EC.UTF-8
LC_ADDRESS=en_EC.UTF-8
LC_TELEPHONE=en_EC.UTF-8
LC_MEASUREMENT=en_EC.UTF-8
LC_IDENTIFICATION=en_EC.UTF-8
LC_ALL=
Please help. I'm on Ubuntu+KDE, and I'm coding blindly now with no Sublime packages installed.
I had the same issue while attempting to install Package Control on Ubuntu 14.04 LTS (via the prescribed method) within a chroot on ChromeOS, Sublime Text Build 3083.
A sensible solution for this problem exists in a duplicate of this issue here: https://askubuntu.com/a/440341/200027
The solution involves adding bash -c "LANG=en_US.UTF-8 /opt/sublime_text/sublime_text" to the appropriate Exec lines of Sublime's .desktop Unity launch file, located at /usr/share/applications/sublime_text.desktop, as follows:
[Desktop Entry]
Version=1.0
Type=Application
Name=Sublime Text
GenericName=Text Editor
Comment=Sophisticated text editor for code, markup and prose
Exec=bash -c "LANG=en_US.UTF-8 /opt/sublime_text/sublime_text %F"
Terminal=false
MimeType=text/plain;
Icon=sublime-text
Categories=TextEditor;Development;
StartupNotify=true
Actions=Window;Document;
[Desktop Action Window]
Name=New Window
Exec=bash -c "LANG=en_US.UTF-8 /opt/sublime_text/sublime_text -n"
OnlyShowIn=Unity;
[Desktop Action Document]
Name=New File
Exec=bash -c "LANG=en_US.UTF-8 /opt/sublime_text/sublime_text --command new_file"
OnlyShowIn=Unity;
Note that you will need sudo to edit that desktop launch file.

loading custom gstreamer plugin raises gst.ElementNotFoundError

I'm trying to set the GST_PLUGIN_PATH environment variable before loading a custom gstreamer plugin in an integration test, so I need to change GST_PLUGIN_PATH programmatically.
But without GST_PLUGIN_PATH set in the shell, gst.element_factory_make fails.
I'm using gstreamer-0.10, python2.6 and linux 3.2.0 (debian 6).
Example:
import gst, os
os.environ['GST_PLUGIN_PATH'] = '/tmp'
print gst.element_factory_make('myelem')
Without GST_PLUGIN_PATH set in the shell:
$ export GST_PLUGIN_PATH=
$ ./gstpathtest.py
Traceback (most recent call last):
File "./gstpathtest.py", line 7, in <module>
print gst.element_factory_make('myelem')
gst.ElementNotFoundError: myelem
With GST_PLUGIN_PATH set in the shell:
$ GST_PLUGIN_PATH=/tmp ./gstpathtest.py
/MyElem:myelem0 (__main__.MyElem)
or with GST_PLUGIN_PATH exported in the shell:
$ export GST_PLUGIN_PATH=/tmp
$ ./gstpathtest.py
/MyElem:myelem0 (__main__.MyElem)
When run with GST_DEBUG=6 I noticed that myelem gets created but is immediately unref-ed and destroyed, which is probably the cause of the error.
I even tried to add the path to os.environ['PATH'], or directly to sys.path but it didn't change anything.
My main question is: am I doing something wrong at the Python level (and what exactly), or does it indicate some bug in the myelem plugin itself?
Oh, stupid me - if I set os.environ['GST_PLUGIN_PATH'] before importing gst, everything works as expected.
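In other words, the fix is just reordering the example from the question (same code, with the assignment moved above the import):
import os
os.environ['GST_PLUGIN_PATH'] = '/tmp'  # must be set before gst is imported

import gst
print gst.element_factory_make('myelem')  # now prints /MyElem:myelem0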
Also, scanning the path via gst.registry_get_default().scan_path works:
import gst
gst.registry_get_default().scan_path('/tmp')
print gst.element_factory_make('myelem')
