Is there a way to connect to a website from a Python Tkinter GUI using HTTP requests? Essentially, I want the following functionality:
Press a button on the GUI
A signal is sent to a website (from the GUI) that this button was pressed
Send information along with that signal
I only need to focus on the GUI side of this. I have no code to go along with this - I was just wondering if it's even possible.
There are three components to what you ask. Your GUI can send information to your web server using some URL you would have to invent. It can be as simple as:
import requests
requests.get("http://example.com/information?name=Joe+Smith")
Then, your web server needs to respond to that request by saving the information somewhere. Your web server also needs to handle a similar request to return the information. Then, your web page needs JavaScript on a timer doing an AJAX request to fetch that info and change some field on the page in response. How exactly depends on what JavaScript tools you're using.
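For the second and third pieces, here is a minimal sketch of the server side (assuming Flask, which the answer does not prescribe), with one endpoint that saves the information and one that returns it for the page's AJAX poll:
from flask import Flask, jsonify, request

app = Flask(__name__)
latest_info = {}  # in-memory store; use a database for anything persistent

@app.route("/information")
def save_information():
    # Save whatever query parameters the GUI sent, e.g. ?name=Joe+Smith
    latest_info.update(request.args.to_dict())
    return "ok"

@app.route("/latest")
def latest():
    # The page's JavaScript polls this on a timer (AJAX GET) and updates a field.
    return jsonify(latest_info)

if __name__ == "__main__":
    app.run()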
I am trying to implement a chatbot that I created in Python on my website. I have looked at so many tutorials, and most of them use some sort of Socket.IO connection that they run on localhost.
My current setup
I have a static website made with HTML, CSS, and JS, which is hosted on GitHub Pages (for now). As a separate project I have a Python program with a function that takes an input message and returns the "correct" bot reply.
What I want
I now want to create an HTML interface where a user can type a message and press send. Then (in JS, I assume) I will take this message, somehow send it to the Python function, get a response back, and print it out in JS. No other users should see these messages, just the current user.
My problem
I know the JS steps, but I have no clue how to make a JS function "talk" to a Python function/script/project and then use the response from it. I have seen so many tutorials on Flask/Socket.IO, but they all seem to target localhost and some sort of live chat between users.
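A minimal sketch of the Python side of that bridge, assuming Flask plus the Flask-CORS extension and a hypothetical get_bot_response function (none of which are in the question); the page's JS would send the message with a fetch/AJAX POST and print the JSON reply:
from flask import Flask, jsonify, request
from flask_cors import CORS

app = Flask(__name__)
CORS(app)  # the GitHub Pages site is on a different origin, so allow cross-origin calls

def get_bot_response(message):
    # Hypothetical stand-in for the existing chatbot function.
    return "You said: " + message

@app.route("/chat", methods=["POST"])
def chat():
    message = request.get_json()["message"]
    return jsonify(reply=get_bot_response(message))

if __name__ == "__main__":
    app.run()
The catch is that GitHub Pages only serves static files, so a Flask app like this has to run on some other host, and the JS points its POST at that host.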
I'm trying to automate a login on a popular website. This website uses Discord OAuth.
I have gotten to the stage where I have monitored the requests being made to Discord (which contain the site's callback URL).
However, the issue I am facing is that Discord's authorize button doesn't return the OAuth code via requests. Instead, when the button is clicked, some obfuscated JS redirects the user to the OAuth callback URL with a generated code.
Unfortunately I do not know of a way to get this code, since it cannot be monitored in the network tab.
Is there a way I can get around this? For example, invoking the JS file (simulating that I clicked the authorize button in some way or another)?
I know I could use Selenium, but Selenium isn't great for performance, and it breaks whenever websites change their UI. API endpoints are a much better way of doing it.
I'm using the Python httpx module.
An example login URL for this is:
https://discord.com/oauth2/authorize?client_id=896549597550358548&redirect_uri=https://www.monkeebot.xyz/oauth/discord&response_type=code&scope=identify%20guilds
When you click authorize, it sends you via a callback URL to the site in question. The goal is to automate logging in via this link using Python requests only.
I am logging in to a website with Python requests by sending a POST with the required data.
I am trying to capture the other HTTP requests that are sent after that initial HTTP POST.
Is there a way to do it?
If I log in manually in the browser, I can see all the other requests that are sent after logging in (the login itself is the first POST in the screenshot); I want to grab them all (the ones marked with the green marker):
I assume that when you log in, a new HTML page is returned to your web browser.
During the rendering of this page, some files such as images or JavaScript are requested from the server side. With Selenium you can automate user interactions with a web browser and log the traffic, as described in this example.
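A minimal sketch of that traffic-logging idea, assuming the third-party selenium-wire package (which may or may not be what the linked example uses):
from seleniumwire import webdriver  # pip install selenium-wire

driver = webdriver.Chrome()
driver.get("https://example.com/login")  # placeholder URL; perform the login here

# Every request the browser made, including the ones fired after the login POST.
for req in driver.requests:
    if req.response:
        print(req.method, req.url, req.response.status_code)

driver.quit()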
I created a simple Python script that takes a URL as input and, once it is passed, runs curl through multiple proxies and shows the response code for each. Now I want to create a webpage that others (my colleagues) can use, since it will help them too: a simple page where they select a set of proxy addresses, enter a URL, and on submission the script runs on a machine (web server) and the results populate the page using a framework such as Dynatable or DataTables. I'm not sure how, or even whether, this is possible, because I haven't worked much with web servers. I want to know what tools I will need and how to design it.
If the Python script can be called in a terminal (as it needs to run curl), how can I show the result on a webpage based on the script's output (which I will export to a CSV file)? What should I use: XAMPP, WAMP, LAMP, etc.?
You need a framework for this, something that will listen to requests coming from the front end (the webpage). There are tons of Python frameworks out there; you can check the Bottle framework as a starting point.
So the flow would be something like the below:
1. From the webpage, a request is sent to the backend
2. The backend receives the request and runs your logic (connecting to some server, computing, etc.)
3. Once the backend process is done, the backend sends the response back to the webpage
You can either use a REST approach or use the templating functionality of the framework.
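A minimal sketch with Bottle, assuming a hypothetical check_with_proxies(url, proxies) function that wraps your existing curl logic:
from bottle import Bottle, request

app = Bottle()

def check_with_proxies(url, proxies):
    # Hypothetical wrapper around the existing script: returns a list of
    # {"proxy": ..., "status": ...} dicts for the given URL.
    raise NotImplementedError

@app.post("/check")
def check():
    url = request.forms.get("url")
    proxies = request.forms.getall("proxy")
    # Bottle converts a returned dict to JSON, which DataTables/Dynatable can consume.
    return {"results": check_with_proxies(url, proxies)}

app.run(host="0.0.0.0", port=8080)
The webpage's form (or an AJAX call) POSTs the URL and the selected proxies to /check and renders the JSON it gets back into the table.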
You will need a request. You can do this in JavaScript with an AJAX request; there are frameworks that hook it up straight to your Python so that you don't have to write the JavaScript yourself: https://www.w3schools.com/xml/ajax_intro.asp. There are many JavaScript frameworks that will make this easier/shorter to code.
I've written an algorithm in Python and a web interface around it. After you submit the form and start the algorithm, I'd like to push updated data to the page as it runs. How can I accomplish this?
To have real-time or semi-real-time communication between the web page and the server, the options are:
Automatically refresh the page every few seconds using a meta refresh tag in the HTML <head>
Fetch updated data with JavaScript using an AJAX HTTP GET: https://api.jquery.com/jquery.get/
Use server-sent events: http://www.html5rocks.com/en/tutorials/eventsource/basics/
Use WebSockets: http://www.html5rocks.com/en/tutorials/websockets/basics/
All approaches except the first one require rudimentary JavaScript skills in addition to server-side Python. The latter two require a more advanced understanding of real-time communication. Thus, if you are not familiar with web development, I recommend starting with the meta refresh tag.
On the server side you need to start a process or a thread to handle the long-running job, then have it write its progress to a database. When the web UI updates itself, it reads the latest results from the database and pushes/pulls them back to the browser.
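A minimal sketch of that pattern, assuming Flask for the web side and SQLite for the progress store (the answer doesn't prescribe either):
import sqlite3
import threading
import time

from flask import Flask, jsonify

app = Flask(__name__)
DB = "progress.db"

def init_db():
    with sqlite3.connect(DB) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS progress (id INTEGER PRIMARY KEY, percent INTEGER)")
        conn.execute("INSERT OR REPLACE INTO progress (id, percent) VALUES (1, 0)")

def long_running_algorithm():
    # Stand-in for the real algorithm: it writes its progress to the database as it runs.
    for percent in range(0, 101, 10):
        with sqlite3.connect(DB) as conn:
            conn.execute("UPDATE progress SET percent = ? WHERE id = 1", (percent,))
        time.sleep(1)

@app.route("/start", methods=["POST"])
def start():
    threading.Thread(target=long_running_algorithm, daemon=True).start()
    return jsonify(started=True)

@app.route("/progress")
def progress():
    with sqlite3.connect(DB) as conn:
        percent = conn.execute("SELECT percent FROM progress WHERE id = 1").fetchone()[0]
    return jsonify(percent=percent)

if __name__ == "__main__":
    init_db()
    app.run()
The page then either reloads itself via the meta refresh tag or polls /progress with an AJAX GET and updates a progress element.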