Upload file to a Tomcat server using Python

I have written a Java servlet that uploads multiple files. I used cURL to upload a file:

curl -F filedata=@myfile.txt http://127.0.0.1/test_Server/multipleupload

This uploads the file to an uploads folder located in the webapps folder. I'm in the middle of writing a Python module that can be used instead of cURL; this server is going to be used by a build farm, so shelling out to cURL is not an option, and the same goes for pycURL. The Python module I'm working on was previously written to do this for Pastebin, so all I'm doing is editing it to use my server, and it looks like urllib doesn't do multipart/form-data. If anyone could point me in the right direction it would be great. I haven't posted the code, but I will if anyone wants it. There isn't much to it so far: all I did was change the URL to my server, and that's when I found out that it's sending application/x-www-form-urlencoded (thank you, Wireshark!).

You can use the Request class (urllib.request.Request, or urllib2.Request on Python 2) to send your own headers, but building a multipart/form-data body by hand is tedious. You may want to use requests instead; it makes life easier.
EDIT: uploading files with requests
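A minimal sketch of that upload using requests, with the URL and the filedata field name taken from the cURL command in the question:

```python
import requests

# Endpoint and field name mirror: curl -F filedata=@myfile.txt <url>
URL = "http://127.0.0.1/test_Server/multipleupload"

def upload(path, url=URL):
    with open(path, "rb") as f:
        # requests builds the multipart/form-data body and the
        # Content-Type header (including the boundary) automatically
        return requests.post(url, files={"filedata": f})
```

The files= argument is what switches the request from application/x-www-form-urlencoded to multipart/form-data, which is exactly the difference Wireshark was showing.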

Related

Python deployment on webserver with CGI

I have a Python script that I want to make accessible through a website with a user interface.
I was experimenting with Flask, but I'm not sure this is the right tool for what I want to do.
My script takes user data (.doc/.txt files), does something with it, and returns it to the user. I don't want to save anything permanently, and I don't think I need a database for this (is that right?). The file will be temporarily saved on the server, and everything will be deleted once the user has downloaded the modified file.
My web hosting provider supports Python but only accepts CGI. I read that WSGI is the preferred method for Python and that CGI has scaling issues and can only process one request at a time. I'm not sure I understand this correctly: if several users uploaded files at the same time, would the server only accept one request, or overwrite previous requests? Or can it handle one request per unique IP address/user?
Would CGI be okay for the simple get/process/return task of my Python script, or should I look into a hosting service that supports WSGI?
I had a look at Heroku and Render for deploying a Flask app, but I think I could also do that through my web hosting provider.
For anyone interested in this topic: I decided to deploy my app on render.com, which supports Gunicorn (a WSGI server).

How to request for GOOGLE_APPLICATION_CREDENTIALS json file through Python

I have a Python script that accesses Google Cloud Platform. I have set up a service account; I can request and save the JSON key file through the Cloud Console web page after logging in with my Google account, and set GOOGLE_APPLICATION_CREDENTIALS to that JSON file so the Python script has access.
Now I want to share the script with others. I have a requirements.txt that installs the Google Cloud client library, but I don't want to force others to install the gcloud SDK, and I don't want to share that JSON file with them. I would like others to be able to run the script, and if the JSON credential file is not found, the script should ask them to:
log in to gcloud
generate and save a JSON credential, e.g., to a default directory
set GOOGLE_APPLICATION_CREDENTIALS to that JSON file
Ideally, all of these steps would work without a browser. Is there a way to do this in Python? I did some research and googling, but no luck.
I believe I could do this by having Python invoke curl or use requests, but I wonder if there is a simpler way.
UPDATE
Thanks for the comments, but I just want to release a single Python script file to others.
I read through the service account and workload identity federation docs; I don't have the infrastructure to set up an identity provider. Based on my reading and the comments, if I want to use something like OAuth, I need to register my script as a client with Google. I am not sure whether that is feasible or considered good practice...
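One low-tech approach that avoids any browser flow inside the script: check for the key file yourself before the Google library needs it, and print instructions if it is missing. This is only a sketch; the default path and the wording of the instructions are invented:

```python
import os
import sys

# Hypothetical default location for the saved credential file.
DEFAULT_CREDS = os.path.expanduser("~/.config/myscript/service-account.json")

def ensure_credentials():
    """Fail early with instructions instead of an opaque auth error.

    Honors an already-set GOOGLE_APPLICATION_CREDENTIALS; otherwise
    falls back to DEFAULT_CREDS and exports it for the client library.
    """
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS", DEFAULT_CREDS)
    if not os.path.isfile(path):
        sys.exit(
            "No service-account key found.\n"
            "Create one in the Cloud Console (IAM & Admin > Service "
            f"Accounts), save the JSON to {path}, and re-run."
        )
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path
    return path
```

Each user still has to create their own key once through the Cloud Console, but the script itself never needs the gcloud SDK, and you never ship your own JSON file.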

Setup for running simple python scripts on (Nginx) web server?

I would like to run a couple of very simple Python 3 scripts on the web. As an example, say, the script just reads certain parameters from the URL, sends an email and prints out a simple HTML page saying "success".
I have a virtual private server with Nginx, so I am free to set up whatever framework is best.
My question is: what is the state-of-the-art setup to do this?
More particularly: what do I need to install on my server alongside Nginx, and what do I use in my Python scripts for, e.g., reading the URL content? My idea is that once the server setup is done, I can just put any new script_xy.py file into some directory and have it accessible via its URL, without a full-blown deployment for each script.
Flask: If I were to use Flask (or Django), each new script would need its own continuously running process and its own Nginx setup. That seems like total overkill to me. Another alternative is web2py; is it the same there, or would that be an option?
CGI: 20 years ago I used to write such simple things as Perl CGI scripts. I read that this can in principle be done with Python, but CGI is slow. Then there is FastCGI. However, my impression is that this is still a bit outdated?
WSGI: WSGI seems to be the state-of-the-art alternative to CGI for Python. Which Python modules would I need to import in my scripts, and what would the Nginx setup look like?
Something else? As you can see, I might just need a few hints on what to search for. I am not even sure whether to search for "Python web framework", "Python server", etc.
Any help is greatly appreciated.
Thanks a lot for your ideas!
juxeku
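For the "drop a script in a directory" feel without a framework, a plain WSGI callable is enough. A sketch (the name query parameter is invented) that reads URL parameters from the WSGI environ and returns the "success" page:

```python
from urllib.parse import parse_qs
from wsgiref.simple_server import make_server

def application(environ, start_response):
    # Nginx (via a WSGI server) passes the query string through the
    # environ dict; parse_qs turns "a=1&b=2" into a dict of lists.
    params = parse_qs(environ.get("QUERY_STRING", ""))
    name = params.get("name", ["world"])[0]
    body = f"<html><body>success, {name}</body></html>".encode("utf-8")
    start_response("200 OK", [
        ("Content-Type", "text/html; charset=utf-8"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

# Local testing only -- in production Nginx would proxy_pass to a WSGI
# server such as Gunicorn or uWSGI instead:
#   make_server("127.0.0.1", 8000, application).serve_forever()
```

With Gunicorn, the Nginx side is a single proxy_pass to the Gunicorn socket; there is no per-script Nginx configuration as long as routing happens inside the WSGI app.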

Writing into text file on web server using urllib/urllib3

I am wondering if there is a way to write to a .txt file on a web server using urllib or urllib3. I tried a urllib3 POST, but that doesn't do anything. Do either of these libraries have the ability to write to files, or do I have to use some other library?
It's not clear from your question, but I'm assuming that the Python code in question is not running on the web server. (Otherwise, it would be a matter of using the regular open() call.)
The answer is no: HTTP servers do not usually provide the ability to update files, and urllib does not support writing files over FTP or SCP. You would either need to run some sort of upload service on the server that exposes an API over HTTP (via a POST endpoint or otherwise), so that your requests cause the file to be updated on the server's filesystem, or use a protocol other than HTTP, such as FTP or SCP, with an appropriate Python library for that protocol, such as ftplib.
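A sketch of the FTP route using the standard-library ftplib, assuming the server actually offers FTP access (the host, login, and path are placeholders):

```python
import ftplib
from io import BytesIO

def write_remote_text(host, user, password, remote_path, text):
    # STOR uploads the bytes we provide and replaces the remote file
    # wholesale -- there is no "append one line" semantics here.
    payload = BytesIO(text.encode("utf-8"))
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.storbinary(f"STOR {remote_path}", payload)
```

For SCP/SFTP you would reach for a third-party library such as paramiko instead, since the standard library has no SSH support.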

Importing Python files into each other on a web server

Using CGI scripts, I can run single Python files on my server and then use their output on my website.
However, I have a more complicated program on my computer that I would like to run on the server. It involves several modules I have written myself, as well as the sqlite3 module built into Python. The program reads data from a .db file and then uses it.
When I run my main Python script from a browser, I get a "500: Internal Server Error".
I just want to know whether I need to change permission settings or something else so that Python files are allowed to import other Python files, or to read from a .db file.
I appreciate any guidance, and sorry if I'm unclear about anything; I'm new to this site and to coding in general.
FOLLOW-UP: So, as I understand it, there isn't anything inherently wrong with importing Python files on a server?
I suggest you look in the log of your server to find out what caused the 500 error.
You can get an extensive error message by adding the following lines at the beginning of your CGI script:
import cgitb
cgitb.enable()
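Two CGI-specific pitfalls cause most of these 500 errors: the script usually runs with a working directory that is not its own (so relative imports and a relative .db path fail), and the traceback is hidden. A sketch combining the cgitb tip above with absolute paths (the data.db name is a placeholder; cgitb was removed in Python 3.13, hence the guard):

```python
#!/usr/bin/env python3
try:
    import cgitb
    cgitb.enable()   # full traceback in the browser instead of a bare 500
except ImportError:  # cgitb was removed in Python 3.13
    pass

import os
import sqlite3
import sys

# CGI scripts often run with an unexpected working directory, so resolve
# both imports and the database file relative to the script itself.
HERE = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, HERE)                 # lets `import mymodule` find siblings
DB_PATH = os.path.join(HERE, "data.db")  # placeholder database name

print("Content-Type: text/html\n")
conn = sqlite3.connect(DB_PATH)  # the web server's user needs read (and,
                                 # if you write, write) permission here
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'").fetchall()
conn.close()
print(f"<p>tables: {tables}</p>")
```

And to answer the follow-up: no, there is nothing inherently wrong with importing your own Python files on a server; the import just has to succeed under the web server's user, permissions, and working directory.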
