Meson cannot find pykeepass module, I am certain it is installed - python

I am attempting to build an application, GNOME PasswordSafe, using Meson. I have successfully built it on Arch, but have since moved to PureOS (Debian-based).
When running:
$ meson . _build --prefix=/usr
it tells me:
meson.build:36:4: ERROR: Problem encountered: Missing dependency pykeepass >= master
What am I missing?
Thanks!
I installed pykeepass using pip. When that did not work, I tried using pip3. When that did not work, I tried both again, but with sudo. Still no dice.
I then installed it from the repo/source (https://github.com/pschmitt/pykeepass). No dice.
Currently, Python's help() recognizes pykeepass as being installed at:
/home/dc3p/.local/lib/python3.7/site-packages/pykeepass/__init__.py
/usr/local/lib/python3.7/dist-packages/pykeepass/__init__.py
/home/dc3p/.local/lib/python2.7/site-packages/pykeepass/__init__.py
/usr/local/lib/python2.7/dist-packages/pykeepass/__init__.py
pip list and pip3 list both show pykeepass as present.
While I currently have it installed in all four locations, I have also tried with it installed in only one location at a time.
I have also tried the meson command both with and without sudo. Regardless of what I do, Meson throws the same error.
The expected result is a successful build.

The meson.build file in PasswordSafe tests for the presence of a directory in the file system, which can lead to false negatives if the installation directory varies. See the code extract below.
# Python Module Check
pykeepass_dir = join_paths(python_dir, 'pykeepass')
construct_dir = join_paths(python_dir, 'construct')

if run_command('[', '-d', pykeepass_dir, ']').returncode() != 0
    error('Missing dependency pykeepass >= master')
endif
if run_command('[', '-d', construct_dir, ']').returncode() != 0
    error('Missing dependency python-construct >= 2.9.45')
endif
You can replace the above with the following to test for dependencies based on imports:
python3_required_modules = ['pykeepass', 'construct']

foreach p : python3_required_modules
    script = 'import importlib.util; import sys; exit(1) if importlib.util.find_spec(\'' + p + '\') is None else exit(0)'
    if run_command(python_bin, '-c', script).returncode() != 0
        error('Required Python3 module \'' + p + '\' not found')
    endif
endforeach
This should solve the issue, provided that pykeepass is importable by the Python interpreter Meson invokes (python_bin).
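To confirm which interpreter can actually see the module, you can run the same kind of check by hand with the python3 binary that Meson invokes (a minimal diagnostic sketch; the interpreter is whatever python_bin resolves to on your system):
import importlib.util
import sys

print(sys.executable)                          # the interpreter actually running
print(importlib.util.find_spec('pykeepass'))   # None means this interpreter cannot import pykeepass
If this prints None for Meson's python3, the module was installed for a different interpreter, for example the Python 2.7 site-packages listed above.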

Related

Install USD pixar library

I am having some trouble installing the USD library on Ubuntu. Here is the tutorial I want to follow.
I cloned the repository from GitHub, ran the build_usd.py script and changed the environment variables. But when I try to run this simple code:
from pxr import Usd, UsdGeom
stage = Usd.Stage.CreateNew('HelloWorld.usda')
xformPrim = UsdGeom.Xform.Define(stage, '/hello')
spherePrim = UsdGeom.Sphere.Define(stage, '/hello/world')
stage.GetRootLayer().Save()
Depending on how I change the environment variables, either the python command can't be found or the pxr module can't be found.
I tried moving the lib and include directories into /usr/lib, /usr/local/lib and /usr/local/include, but pxr still does not seem to be found.
I am really confused about how to install a library on Ubuntu in general so that Python can find it.
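A quick way to narrow this down is to ask the interpreter you are actually running where it looks for modules (a generic diagnostic sketch, not specific to USD's install layout):
import importlib.util
import sys

print(sys.executable)                     # which python is actually running
print(sys.path)                           # the directories searched for pxr
print(importlib.util.find_spec('pxr'))    # None means pxr is not on that search path
If find_spec returns None, the directory containing the pxr package needs to be added to PYTHONPATH for that interpreter.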

Metaphone-pt-br library for Windows 10 returns "No such file or directory" error when pip install

I downloaded this library from GitHub and am trying to install it with pip install ., but the following error message appears:
metaphone_ptbrpy.c (32): fatal error C1083
Cannot open include file: '../source/metaphone_ptbr.h'
No such file or directory
error: command 'C:\\Program Files(x86)\\Microsoft Visual Studio 14.0\\VC\\BIN\\ x86_amd64\\
cl.exe' failed with exit status 2
The file that the error message says does not exist does, in fact, exist (see the screenshot below):
What do I do?
Edit 1:
Screenshot of the source folder below:
Not sure, but try it with the -e argument: pip install -e yourpackage
I got in touch with the library developer, and he helped me a lot. I also applied the suggestions that @phd made in the question comments. The complete solution:
copy /source folder into the /python folder
edit /python/setup.py
change this:
c_ext = Extension("metaphoneptbr", ["metaphone_ptbrpy.c", join("..", "source", "metaphone_ptbr.c")])

setup(
    name='Metaphone-ptbr',
    version='1.17',
    ext_modules=[c_ext],
    include_dirs=[".", join("..", "source")]
)
for this:
c_ext = Extension("metaphoneptbr", ["metaphone_ptbrpy.c", join("source", "metaphone_ptbr.c")])

setup(
    name='Metaphone-ptbr',
    version='1.17',
    ext_modules=[c_ext],
    include_dirs=[".", join("source")]
)
edit /python/metaphone_ptbrpy.c:
change this:
#include "../source/metaphone_ptbr.h"
for this:
#include "source/metaphone_ptbr.h"
replace the files /source/metaphone_ptbr.c and /source/metaphone_ptbr.h with this and this
run the python setup.py build command in the /python folder to build the project, and then run pip install . to complete the installation.
To test that everything is okay, just run:
from metaphoneptbr import phonetic
print(phonetic('hello'))
PS: if a warning appears when executing the command above, just replace the file /python/metaphone_ptbrpy.c with that version.
The author said that all the modifications needed to run the Metaphone-ptbr library on Windows 10 will be added to the repository itself, but in the meantime I am sharing the step-by-step solution here.

Install all the site packages of Django rest framework project at once [duplicate]

What is the most efficient way to list all dependencies required to deploy a working project elsewhere (on a different OS, say)?
Python 2.7, Windows dev environment, not using a virtualenv per project, but a global dev environment, installing libraries as needed, happily hopping from one project to the next.
I've kept track of most (not sure all) libraries I had to install for a given project. I have not kept track of any sub-dependencies that came auto-installed with them. Doing pip freeze lists both, plus all the other libraries that were ever installed.
Is there a way to list what you need to install, no more, no less, to deploy the project?
EDIT In view of the answers below, some clarification. My project consists of a bunch of modules (that I wrote), each with a bunch of imports. Should I just copy-paste all the imports from all modules into a single file, sort it to eliminate duplicates, and throw out everything from the standard library (and how do I know which modules are standard library)? Or is there a better way? That's the question.
pipreqs solves the problem. It generates a project-level requirements.txt file.
Install pipreqs: pip install pipreqs
Generate the project-level requirements.txt file: pipreqs /path/to/your/project/
The requirements file will be saved in /path/to/your/project/requirements.txt
If you want to read more about the advantages of pipreqs over pip freeze, read about it here
Scan your import statements. Chances are you only import things you explicitly wanted to import, and not the dependencies.
Make a list like the one pip freeze does, then create and activate a virtualenv.
Do pip install -r your_list, and try to run your code in that virtualenv. Heed any ImportError exceptions, match them to packages, and add to your list. Repeat until your code runs without problems.
Now you have a list to feed to pip install on your deployment site.
This is extremely manual, but requires no external tools, and forces you to make sure that your code runs. (Running your test suite as a check is great but not sufficient.)
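A rough sketch of that loop using Python 3's built-in venv module (my_list.txt is a placeholder name for the list described above; the original question targets Python 2.7, where virtualenv would be used instead):
import subprocess
import venv

venv.create('deploy-test', with_pip=True)             # fresh, empty virtual environment
pip = 'deploy-test/bin/pip'                           # use r'deploy-test\Scripts\pip.exe' on Windows
subprocess.check_call([pip, 'install', '-r', 'my_list.txt'])
# now run your code with deploy-test's interpreter and add a package to
# my_list.txt for every ImportError you hit, until it runs cleanly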
In your terminal, type:
pip install pipdeptree
cd <your project root>
pipdeptree
I found the answers here didn't work too well for me, as I only wanted the imports from inside our repository (e.g. I don't need import requests, but I do need from my.module.x import y).
I noticed that PyInstaller had perfectly good functionality for this though. I did a bit of digging and managed to find their dependency graph code, then just created a function to do what I wanted with a bit of trial and error. I made a gist here since I'll likely need it again in the future, but here is the code:
import os
from PyInstaller.depend.analysis import initialize_modgraph


def get_import_dependencies(*scripts):
    """Get a list of all imports required.
    Args: script filenames.
    Returns: list of imports
    """
    script_nodes = []
    scripts = set(map(os.path.abspath, scripts))

    # Process the scripts and build the map of imports
    graph = initialize_modgraph()
    for script in scripts:
        graph.run_script(script)
    for node in graph.nodes():
        if node.filename in scripts:
            script_nodes.append(node)

    # Search the imports to find what is in use
    dependency_nodes = set()

    def search_dependencies(node):
        for reference in graph.getReferences(node):
            if reference not in dependency_nodes:
                dependency_nodes.add(reference)
                search_dependencies(reference)

    for script_node in script_nodes:
        search_dependencies(script_node)

    return list(sorted(dependency_nodes))


if __name__ == '__main__':
    # Show the PyInstaller imports used in this file
    for node in get_import_dependencies(__file__):
        if node.identifier.split('.')[0] == 'PyInstaller':
            print(node)
All the node types are defined in PyInstaller.lib.modulegraph.modulegraph, such as SourceModule, MissingModule, Package and BuiltinModule. These will come in useful when performing checks.
Each of these has an identifier (path.to.my.module), and depending on the node type, it may have a filename (C:/path/to/my/module/__init__.py), and packagepath (['C:/path/to/my/module']).
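For illustration, a minimal sketch of that kind of check, filtering the returned nodes by type (this assumes the get_import_dependencies function above and the module path the answer mentions; it is not the author's pyarmor setup):
from PyInstaller.lib.modulegraph.modulegraph import MissingModule, Package

for node in get_import_dependencies(__file__):
    if isinstance(node, MissingModule):
        print('unresolved import:', node.identifier)
    elif isinstance(node, Package):
        print('package:', node.identifier, node.packagepath)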
I can't really post any extra code as it is quite specific to our setup with using pyarmor with PyInstaller, I can happily say it works flawlessly so far though.
You could use the findpydeps module I wrote:
Install it via pip: pip install findpydeps
If you have a main file: findpydeps -l -i path/to/main.py (the -l will follow the imports in the file)
Or your code is in a folder: findpydeps -i path/to/folder
Most importantly, the output is pip-friendly:
do findpydeps -i . > requirements.txt (assuming . is your project's directory)
then pip install -r requirements.txt
You can of course search through multiple directories and files for requirements, like: findpydeps -i path/to/file1.py path/to/folder path/to/file2.py, etc.
By default, it will remove the packages that are in the python standard library, as well as local packages. Refer to the -r/--removal-policy argument for more info.
If you don't want imports that are done in if, try/except or with blocks, just add --no-blocks. The same goes for functions with --no-functions.
Anyway, you get the idea: there are a lot of options (most of them are not discussed here). Refer to the findpydeps -h output for more help!
The way to do this is to analyze your imports. To automate that, check out Snakefood. Then you can make a requirements.txt file and get on your way to using virtualenv.
The following will list the dependencies, excluding modules from the standard library:
sfood -fuq package.py | sfood-filter-stdlib | sfood-target-files
Related questions:
Get a list of python packages used by a Django Project
list python package dependencies without loading them?
You can simply use pipreqs, install it using:
pip install pipreqs
Then run pipreqs . in the project directory.
A requirements.txt file will be created for you, which looks like this:
numpy==1.21.1
pytest==6.2.4
matplotlib==3.4.2
PySide2==5.15.2
I would just run something like this:
import importlib
import importlib.machinery
import os
import pathlib
import re
import sys, chardet
from sty import fg

sys.setrecursionlimit(100000000)

dependenciesPaths = list()
dependenciesNames = list()
paths = sys.path
red = fg(255, 0, 0)
green = fg(0, 200, 0)
end = fg.rs

def main(path):
    try:
        print("Finding imports in '" + path + "':")
        file = open(path)
        contents = file.read()
        wordArray = re.split(" |\n", contents)
        currentList = list()
        nextPaths = list()
        skipWord = -1
        for wordNumb in range(len(wordArray)):
            word = wordArray[wordNumb]
            if wordNumb == skipWord:
                continue
            elif word == "from":
                currentList.append(wordArray[wordNumb + 1])
                skipWord = wordNumb + 2
            elif word == "import":
                currentList.append(wordArray[wordNumb + 1])
        currentList = set(currentList)
        for i in currentList:
            print(i)
        print("Found imports in '" + path + "'")
        print("Finding paths for imports in '" + path + "':")
        currentList2 = currentList.copy()
        currentList = list()
        for i in currentList2:
            if i in dependenciesNames:
                print(i, "already found")
            else:
                dependenciesNames.append(i)
                try:
                    fileInfo = importlib.machinery.PathFinder().find_spec(i)
                    print(fileInfo.origin)
                    dependenciesPaths.append(fileInfo.origin)
                    currentList.append(fileInfo.origin)
                except AttributeError as e:
                    print(e)
                    print(i)
                    print(importlib.machinery.PathFinder().find_spec(i))
                    # print(red, "Odd noneType import called ", i, " in path ", path, end, sep='')
        print("Found paths for imports in '" + path + "'")
        for fileInfo in currentList:
            main(fileInfo)
    except Exception as e:
        print(e)

if __name__ == "__main__":
    # args
    args = sys.argv
    print(args)
    if len(args) == 2:
        p = args[1]
    elif len(args) == 3:
        p = args[1]
        open(args[2], "a").close()
        sys.stdout = open(args[2], "w")
    else:
        print('Usage')
        print('PyDependencies <InputFile>')
        print('PyDependencies <InputFile> <OutputFile>')
        sys.exit(2)
    if not os.path.exists(p):
        print(red, "Path '" + p + "' is not a real path", end, sep='')
    elif os.path.isdir(p):
        print(red, "Path '" + p + "' is a directory, not a file", end, sep='')
    elif "".join(pathlib.Path(p).suffixes) != ".py":
        print(red, "Path '" + p + "' is not a python file", end, sep='')
    else:
        print(green, "Path '" + p + "' is a valid python file", end, sep='')
        main(p)
        deps = set(dependenciesNames)
        print(deps)
    sys.exit()
If you're using an Anaconda virtual environment, you can run the below command inside the environment to create a txt file of all the dependencies used in the project.
conda list -e > requirements.txt
This answer is for anyone who wants to list all dependencies, with versions, from within a Python script itself. It will list all dependencies installed in the current virtual environment.
from pip._internal.operations import freeze

x = freeze.freeze()
for dependency in x:
    print(dependency)
For this approach you need the pip package itself installed in the environment. You can install it with:
pip install pip
The print output would look like the following.
certifi==2020.12.5
chardet==4.0.0
idna==2.10
numpy==1.20.3
oauthlib==3.1.0
pandas==1.2.4
pip==21.1.2
python-dateutil==2.8.1
pytz==2021.1
requests==2.25.1
requests-oauthlib==1.3.0
setuptools==41.2.0
six==1.16.0
urllib3==1.26.4
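Since pip._internal is, as the name suggests, an internal API that may change between pip releases, a hedged alternative on Python 3.8+ is the standard library's importlib.metadata, which produces the same kind of listing without importing pip:
from importlib.metadata import distributions

for dist in distributions():
    print(f"{dist.metadata['Name']}=={dist.version}")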

Pip install upgrade unable to remove temp files

In the course of maintaining a CLI utility, I want to add an update action that will grab the latest version of that package from PyPI and upgrade the existing installation.
$ cli -V
1.0.23
$ cli update
// many lines of pip spam
$ cli -V
1.0.24 // or etc
This is working perfectly on all machines that have Python installed system-wide (in C:\Python36 or similar), but machines that have Python installed as a user (in C:\users\username\AppData\Local\Programs\Python\Python36) receive this error as the old version is uninstalled:
Could not install packages due to an EnvironmentError: [WinError 5] Access is denied: 'C:\\Users\\username\\AppData\\Local\\Temp\\pip-uninstall-f5a7rk2y\\cli.exe'
Consider using the `--user` option or check the permissions.
I had assumed that this is because the cli.exe called out in the error text is running when pip tries to remove it; however, the path here is not %LOCALAPPDATA%\Programs\Python\Python36\Scripts, where that exe lives, but %TEMP%. How is it allowed to move the file there, but not to remove it once it's there?
Including --user in the install args, as recommended by the error message, does not (contrary to an earlier edit of this question) resolve the issue, but moving the cli executable elsewhere does.
I'm hoping for an answer that:
Explains the underlying issue of failing to delete the executable from the TEMP directory, and...
Provides a solution to the issue, either to bypass the permissions error, or to query to see if this package is installed as a user so the code can add --user to the args.
While the question is fairly general, an MCVE is below:
import subprocess

def update(piphost):
    args = ['pip', 'install',
            '--index-url', piphost,
            '-U', 'cli']
    subprocess.check_call(args)

update('https://mypypiserver:8001')
As originally surmised, the issue here was trying to delete a running executable. Windows isn't a fan of that sort of nonsense, and throws PermissionErrors when you try. Curiously though, you can definitely rename a running executable, and in fact several questions from different tags use this fact to allow an apparent change to a running executable.
This also explains why the executable appeared to be running from %LOCALAPPDATA%\Programs\Python\Python36\Scripts but failing to delete from %TEMP%. It has been renamed (moved) to the %TEMP% folder during execution (which is legal) and then pip attempts to remove that directory, also removing that file (which is illegal).
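A minimal illustration of that behaviour (the path below is hypothetical; the effect only shows up against an executable that is actually running):
from pathlib import Path

# hypothetical path to a console script that is currently running
exe = Path(r"C:\Users\username\AppData\Local\Programs\Python\Python36\Scripts\cli.exe")

exe.rename(exe.with_suffix('.tmp'))   # renaming a running exe is allowed on Windows
# exe.with_suffix('.tmp').unlink()    # deleting it while it runs raises PermissionError ([WinError 5])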
The implementation goes like so:
Rename the current executable (Path(sys.argv[0]).with_suffix('.exe'))
pip install to update the package
Add logic to your entrypoint that deletes the renamed executable if it exists.
import subprocess

import click  # I'm using click for my CLI, but YMMV
from pathlib import Path
from sys import argv


def entrypoint():
    # setup.py's console_scripts points cli.exe to here
    tmp_exe_path = Path(argv[0]).with_suffix('.tmp')
    try:
        tmp_exe_path.unlink()
    except FileNotFoundError:
        pass
    return cli_root


@click.group()
def cli_root():
    pass


def update(pip_host):
    exe_path = Path(argv[0])
    tmp_exe_path = exe_path.with_suffix('.tmp')
    handle_renames = False
    if exe_path.with_suffix('.exe').exists():
        # we're running on Windows, so we have to deal with this tomfoolery.
        handle_renames = True
        exe_path.rename(tmp_exe_path)
    args = ['pip', 'install',
            '--index-url', pip_host,
            '-U', 'cli']
    try:
        subprocess.check_call(args)
    except Exception:  # in real code you should probably break these out to handle stuff
        if handle_renames:
            tmp_exe_path.rename(exe_path)  # undo the rename if we haven't updated


@cli_root.command('update')
@click.option("--host", default='https://mypypiserver:8001')
def cli_update(host):
    update(host)
Great solution provided in the previous answer: Pip install upgrade unable to remove temp files by https://stackoverflow.com/users/3058609/adam-smith
I want to add two remarks that made the code work for me (using Python 3.8):
Missing parentheses on the return of the entrypoint function (however, the function I'm pointing to is not decorated with @click.group(), so I'm not sure whether that's the reason):
def entrypoint():
    # setup.py's console_scripts points cli.exe to here
    tmp_exe_path = Path(argv[0]).with_suffix('.tmp')
    try:
        tmp_exe_path.unlink()
    except FileNotFoundError:
        pass
    return cli_root()  # <-- added parentheses
Missing with_suffix('.exe') when attempting the rename in the update function. If I use the original code I get FileNotFoundError: [WinError 2] The system cannot find the file specified: '..\..\..\updater' -> '..\..\..\updater.tmp'
def update(pip_host):
    exe_path = Path(argv[0])
    tmp_exe_path = exe_path.with_suffix('.tmp')
    handle_renames = False
    if exe_path.with_suffix('.exe').exists():
        # we're running on Windows, so we have to deal with this tomfoolery.
        handle_renames = True
        exe_path.with_suffix('.exe').rename(tmp_exe_path)  # <-- added with_suffix('.exe')
...
