I apologize in advance since this seems like a basic question...
I am trying to learn to use mujoco (link here), and inside its Python bindings' Makefile it has:
upload:
	rm -rf dist
	python setup.py sdist
	twine upload dist/*
What does the twine upload dist/* command do?
In addition, this asks me for a username and password like this:
Uploading distributions to https://pypi.python.org/pypi
Enter your username: guest
Enter your password:
Uploading mujoco-py-0.5.7.tar.gz
HTTPError: 401 Client Error: You must be identified to edit package information for url: https://pypi.python.org/pypi
Makefile:2: recipe for target 'upload' failed
Is this asking for my computer username and password?
Twine is a commonly used system for uploading project builds to PyPI (the Python Package Index).
It takes care of securely transferring your project's build artifacts, whether in wheel, sdist, or other formats, to PyPI or some other user-defined index server.
When you run twine upload <files>, twine will attempt to upload those files to PyPI, and to do so it requires you to authenticate yourself. This is because PyPI wants to protect a project from having its published packages "hijacked" by a ne'er-do-well. For this step to proceed, you have to supply credentials that are authorized for the project your uploaded artifacts belong to.
It looks like the mujoco project's Makefile includes a target to make uploading updates of the project to PyPI easier by using Twine. This target is only meant to be used by the package maintainer(s).
Oh, and in case you were wondering, the python setup.py sdist command is what builds a source artifact that can be uploaded to PyPI. It places this artifact in the ./dist/ directory as project-name-version.tar.gz, which is exactly what the twine upload dist/* line then picks up.
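For concreteness, here is a minimal sketch of the flow that Makefile target automates. The account name is a placeholder; the -u/-p flags and the TWINE_USERNAME/TWINE_PASSWORD environment variables are standard twine options, and whichever you use, the credentials must belong to a PyPI account that owns the project:

# build a fresh source distribution into ./dist/
$ rm -rf dist
$ python setup.py sdist

# upload it; twine prompts for your PyPI username and password...
$ twine upload dist/*

# ...or you can supply them yourself (my-pypi-user / my-pypi-pass are placeholders)
$ twine upload -u my-pypi-user -p my-pypi-pass dist/*
$ TWINE_USERNAME=my-pypi-user TWINE_PASSWORD=my-pypi-pass twine upload dist/*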
I have been able to setup an Azure pipeline that publishes a python package to our internal Azure Feed. Now I am trying to have it publish directly to PyPi. This is what I have done already:
I have setup a PyPi "Service Connection" in the Azure Project with the following configuration
Authentication Method = Username and Password
Python Repository Url for upload = https://upload.pypi.org/legacy
EndpointName: I wasn't too sure about this, but I set it as the package name on PyPI
And I named this Service Connection PyPi.
In the pipeline I will run the following authentication task:
- task: TwineAuthenticate@1
  inputs:
    pythonUploadServiceConnection: 'PyPi'
Then I build the wheel for publishing.
Whenever I try to publish to the internal Azure feed it works, but when I try to upload that same package to pypi it gets stuck on this:
Uploading distributions to https://upload.pypi.org/legacy/
Are there any clear issues anyone can see that can have it get stuck trying to upload to pypi?
Twine authenticate probably isn't actually providing credentials to the twine upload command, so it's hanging while waiting for user input. Try adding --non-interactive to your twine command, like this: twine upload --non-interactive dist/*. It will probably end up showing an error instead of hanging.
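As a rough sketch of what the publish step might look like, assuming the TwineAuthenticate task really does expose its generated .pypirc through a PYPIRC_PATH variable (verify that name against the task's documentation and your pipeline logs):

# contents of a script step that runs after TwineAuthenticate and the wheel build
python -m pip install --upgrade twine
# use the credentials file the task generated and fail fast instead of waiting for input
twine upload --non-interactive --config-file "$PYPIRC_PATH" dist/*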
I've always understood the rule #1 of secrets is you keep them out of public source control.
So, I was prepping to upload a new package to pypi.
In .travis.yml I see:
deploy:
  provider: pypi
  distributions: sdist bdist_wheel
  user: mysuser
  password:
    secure: PLEASE_REPLACE_ME
  on:
    tags: true
    repo: myuser/pynoorm
    condition: $TOXENV == py27
Fair enough, and I guess I forgot to replace it.
But... the part that is really puzzling is when I pick a random .travis.yml on github and it has:
ANzomjrVPkzLO7MG9Zekl1Tz/Gxxx ... HmSQ3GRNPHMIRqf1xle+8/0IwDBuC/eTsOkit7WU1j9lgurCj8snXuTLUVEqf/SecAcLpmLrelRFvz//ZcOopIbwD66RJWT8pYGBH/L3MMIDFj1bIf0UIpXdBXgeTJhxW054+BhdFPGI66IvWU/kOlOcE606wqRqI9bdvop34OewJFnOQ9El...71dROWO4ETzz1wGXmO0dTVfCWMbqk7dT8OPft+tHsWWJqqeCEL3wj1uYEIYpCwLo9oSyVXwrhzRW0dysZfTCx/XfDaws3eFA6iMg6dUoBt12kwGZ5vCbgjBwPOmQrRMUEmYoyZz8n20HKojoxzUpwueFN/nbLv76arJbN8bLeb/GyE6r1Rw0DEzs8f0fBtv5agUnIpMh6EPOFYN4rwHMxt52HU7BB/Kg=
What is the point of adding a secure password to a file that you are uploading to github? What does it do? I thought the usual process was to log into github and then link your account to travis. In which case both services ought to know how to authenticate you, if you're logged into either one, without having to go through a password in a public settings file.
How dangerous/sensitive is this particular part of a github travis configuration?
Can I do without it?
The initial pypi package files were generated with CookieCutter cookiecutter-pypackage.
A repository’s .travis.yml file can have "encrypted values", such as environment variables, notification settings, and deploy API keys. These encrypted values can be added by anyone, but are only readable by Travis CI.
This is what the secure: field name indicates. It's safe to include these encrypted values in your .travis.yml and safe to upload them to GitHub as well.
You can generate secure values by installing the travis gem and running it:
$ gem install travis
$ travis encrypt "secretvalue"
<encrypted string>
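If you'd rather have the tool write the encrypted value into .travis.yml for you, the travis gem can do that too. A sketch, with the password obviously a placeholder:

$ cd path/to/your/repo
$ travis encrypt "your-pypi-password" --add deploy.password
# .travis.yml now contains deploy.password with a secure: <encrypted string> entry
# that only Travis CI can decrypt at build time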
I want to upload packages to pypi.org as mentioned in the Migrating to PyPI.org documentation, but Twine uploads to https://upload.pypi.org/legacy/.
It's available on pypi.python.org/pypi/mypolr, but is not found on pypi.org.
I've tried to read several other questions, tutorials, and guides.
My pip.ini-file (I'm on Windows 10) looks like this:
[distutils]
index-servers =
    pypi

[pypi]
I don't have my username or password stored, so the [pypi] section is empty (as mentioned in migration docs).
I've put the .ini-file in my user folder, and confirmed (per this answer) that it's actually using the one I've set (using environment variable PIP_CONFIG_FILE).
Afraid that I had got something wrong, I also tried without a pip.ini-file to make Twine use its defaults.
I'm using Python 3.6.3 (from Anaconda), and my tools' versions are:
Twine 1.9.1 (the migration docs say it should be 1.8+)
setuptools 38.2.3 (the migration docs say it should be 27+)
Whether or not it's relevant, here is some more info:
Link to my setup.py
setup is imported from setuptools and not distutils.core
README.rst is used as the long description, but on the PyPI page only the first 8 asterisks of the header are shown. (Compare this with this)
The package I upload is version 0.2.1 (at the time of posting this)
setuptools_scm is used to fetch versions from git tags
build is made with python setup.py sdist bdist_wheel
Please let me know if there is any other information that could be useful to figure this out.
You appear to be doing everything correctly. Twine is not uploading via legacy PyPI (https://pypi.python.org). It is uploading to the new PyPI (https://pypi.org, a.k.a. "Warehouse") via the original (and so far only) PyPI API, and this API just happens to be named "legacy".
Also, your package is present on Warehouse at https://pypi.org/project/mypolr/; Warehouse search is apparently not production-ready.
The docs for Warehouse explain this confusing nomenclature. Quotes below are from the front page and from the page about the Legacy API:
Warehouse is a web application that implements the canonical Python package index (repository); its production deployment is PyPI. It replaces an older code base that powered pypi.python.org.
Legacy API
The “Legacy API” provides feature parity with pypi-legacy, hence the term “legacy”.
...
Upload API
The API endpoint served at upload.pypi.org/legacy/ is Warehouse’s emulation of the legacy PyPI upload API. This is the endpoint that tools such as twine and distutils use to upload distributions to PyPI.
In other words, as I understand it:
PyPI was once a web application hosted at pypi.python.org. That old application, which no longer runs, is now referred to by the name pypi-legacy.
PyPI is now a web application hosted at pypi.org. This new application is named Warehouse. The old pypi.python.org is now just a redirect to pypi.org.
In addition to some new endpoints, Warehouse still exposes a couple of API endpoints that pypi-legacy used to have. Because these endpoints were copied across from pypi-legacy, they are together known as the "Legacy API".
In addition to that, the upload endpoint within Warehouse's Legacy API is served from the URL path /legacy, a naming choice which again reflects the fact that it is a (partial) reimplementation of the endpoint used for uploads in pypi-legacy.
This all seems more confusing than it needs to be, but it is what it is.
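If you prefer to be explicit rather than rely on twine's default, you can point it straight at the Warehouse upload endpoint with the --repository-url option (available in twine 1.9+); this is just a sketch naming the same URL twine already uses:

$ twine upload --repository-url https://upload.pypi.org/legacy/ dist/*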
In case anyone else is coming here from google, mystified why their uploads are failing, don't forget to check https://status.python.org/ to make sure there isn't an outage. Sometimes you just gotta wait :p
I have a package for which local installation and distribution work correctly. However, when I try to upload it to pypi with twine, I get the following error message:
$ twine upload dist/mypackage.tar.gz
Uploading distributions to https://upload.pypi.org/legacy
Uploading mypackage.tar.gz
HTTPError: 400 Client Error: author_email: Invalid email address. for url: https://upload.pypi.org/legacy
By the way, I also tried commenting out the author_email field in setup.py, but the error remains, and I don't think it is related to setup.py.
I tried creating .pypirc with no repository specification, as well as without using it at all.
I also tried python setup.py upload, but this failed as well, with:
Upload failed (410) gone: (This API has been deprecated...
As I understand pypi repos are currently in transition phase, but I can't figure out where the problem is.
I'm running twine 1.9.1. and python 3.5.2 (Ubuntu 16.04).
Hope you can help me!
[SOLVED]
It turned out that the email address did not match the author name. Changing the email address accordingly worked for me.
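For anyone hitting the same 400, a quick way to sanity-check the metadata before re-uploading is to ask setup.py for it directly; a small sketch using the standard distutils display options:

$ python setup.py --name --version --author --author-email
# each field is printed on its own line; make sure author_email is a single, valid
# address that goes with the author named in setup.py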
I've been working on an update to a server that requires a private GitHub repository. I can download the repo on my machine, because I'm able to enter a password when prompted. When I try to do this on my server, which is running on an Amazon EC2 instance, I don't get these prompts, so the module from the GitHub repository is not installed. Is there a way for me to provide the username and password in the requirements file that I'm using with pip, so the private repo module installs successfully?
I'm using the -e git+<url>#egg=<name> in my requirements.txt
You can use SSH links instead of HTTPS links, like git@github.com:username/projectname.git instead of https://github.com/username/projectname.git, and use authentication keys instead of a password.
Step by step, you have to:
Change the URL in requirements.txt to the git@... form (see the sketch after these steps).
Create a key pair for your deployment machine and store it in the ~/.ssh/ directory.
Add the key to your GitHub account.
Read the GitHub help pages for more detailed instructions.
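Putting those steps together, a rough sketch of the deployment-machine setup (key type, file names, and the repository URL are placeholders to adapt):

# 1. create a key pair on the EC2 instance (no passphrase, so pip can run unattended)
$ ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa -N ""

# 2. show the public key, then add it on GitHub (account SSH keys or a per-repo deploy key)
$ cat ~/.ssh/id_rsa.pub

# 3. confirm GitHub accepts the key
$ ssh -T git@github.com

# 4. requirements.txt then uses the SSH form of the URL, e.g.
#    -e git+ssh://git@github.com/username/projectname.git#egg=projectname
$ pip install -r requirements.txt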