Python and Selenium - Reboot program and reuse same browser session

Scenario:
I am working on an automatic WhatsApp responder using WhatsApp Web.
I log in via chromedriver with Selenium on Python 3.
I run a function that does some work inside a while True loop.
Problem:
Sometimes, due to a lack of connectivity with the phone, or other problems, the program stops running the right way.
There are many factors that can make the whole thing lose the right flow. I am analyzing them all and fixing them as best I can.
Question:
I came up with the idea that maybe if I restart the whole thing every hour (or every few thousand iterations) it would become more robust, since it would recover the flow no matter what happens, even for bugs I have not caught yet.
Is it possible to restart the whole thing without losing the browser session? WhatsApp Web requires a QR scan, but it offers a "keep session alive in further connections" option (I do not really know how that works... cookies or something else).
Note: I know that a Python script can be restarted; the bigger problem here is reusing the browser session. Of course I am doing my research, but none of what I have read so far led me to a solid solution, which is why I am asking all the super cool brains out there.

WhatsApp stores its session in the browser's localStorage.
You can extract localStorage and save it to a file when closing a session.
When instantiating a new session, check whether this file exists; if so, parse it and restore the saved values into localStorage before opening the URL.
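The steps above can be sketched with Selenium's execute_script. This is a minimal sketch, not WhatsApp-specific code: the helper names are mine, and restoring only works after the driver has already navigated to the same origin (localStorage is scoped per domain).

```python
import json
import os

def save_local_storage(driver, path):
    """Dump every key/value pair from the page's localStorage to a JSON file."""
    items = driver.execute_script(
        "var out = {};"
        "for (var i = 0; i < localStorage.length; i++) {"
        "  var k = localStorage.key(i);"
        "  out[k] = localStorage.getItem(k);"
        "}"
        "return out;")
    with open(path, "w") as f:
        json.dump(items, f)

def restore_local_storage(driver, path):
    """Load saved pairs back into localStorage. The driver must already be
    on the right origin (e.g. after driver.get('https://web.whatsapp.com'))
    before calling this. Returns True if a saved file was restored."""
    if not os.path.exists(path):
        return False
    with open(path) as f:
        items = json.load(f)
    for key, value in items.items():
        driver.execute_script(
            "localStorage.setItem(arguments[0], arguments[1]);", key, value)
    return True
```

After restoring, call driver.refresh() so the page picks the session up. An even simpler alternative is to point Chrome at a persistent profile directory (options.add_argument('--user-data-dir=/some/path')) so cookies and localStorage survive restarts on their own.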

Related

Run a python program from browser

I want to run a Python program (like the one below) from a browser.
Anyway, as you can see it has a few inputs, and I would like to "translate" that into a kind of form:
def addNew():
    appendBase = open('dBase.cfg','a')
    uname = input('Username: ')
    pword = input('Password: ')
    appendBase.write(uname+','+pword+'\n')
    print('\nAdded new profile: \n'+uname+','+pword)
    appendBase.close()
Also, I don't know how to get the print output onto the page so it can be shown.
I've just started learning, so go easy on me, please.
It is not possible to actually run this in the browser, for various reasons:
you can't run Python in browsers, only JavaScript
you can't open local files from a browser
there's no command line to read input from a terminal
Most things you see on the web have two parts:
a part that actually runs in the browser, written in HTML and JavaScript
another part that the browser connects to in order to send and receive data; that part can be written in any language, including Python, but it is not visible in the browser
The two parts communicate using the HTTP protocol. So, start by reading a bit about HTML/JavaScript (W3Schools is an easy way to get started). When you feel comfortable with that, practice with a Python web framework (Django is the most popular, but Flask is the easiest to get started with), and see how JavaScript uses HTTP to connect to it.
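To make the two-part split concrete, here is a minimal sketch of the server part using only Python's standard library (a real project would use Flask or Django as suggested above). The field names uname/pword mirror the original script; everything else here is an illustrative assumption:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

DB_FILE = "dBase.cfg"

# The "browser part": a plain HTML form that POSTs back to the server.
FORM = (b'<form method="post">'
        b'Username: <input name="uname"><br>'
        b'Password: <input name="pword"><br>'
        b'<input type="submit" value="Add"></form>')

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the form instead of reading from a terminal with input().
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(FORM)

    def do_POST(self):
        # The submitted form fields replace the input() calls.
        length = int(self.headers["Content-Length"])
        fields = parse_qs(self.rfile.read(length).decode())
        uname = fields.get("uname", [""])[0]
        pword = fields.get("pword", [""])[0]
        with open(DB_FILE, "a") as f:
            f.write(uname + "," + pword + "\n")
        # The response body replaces the print(): it is what the browser shows.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(("Added new profile: " + uname + "," + pword).encode())

    def log_message(self, *args):
        pass  # keep the console quiet

# To run standalone:
# HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()
```

The same idea, with far less boilerplate, is what Flask routes and Django views give you.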

Transfer selected web sockets to browser

My understanding of web programming isn't the best, so I might have some misconceptions here about how it works in general, but hopefully what I'm trying to do is possible.
Recently my friend and I have been challenging each other to break web systems we've set up, and in order to break his next one I need to use the requests module while doing part of it by myself. I'm perfectly happy with the requests module, but after a while I want to manually take over that session with the server in my browser. I've tried webbrowser.open, but this effectively loads the page as if I had never connected before, since it has none of the cookies from the other session. Is this possible, or do I have a misunderstanding of the situation? Thanks in advance for any help.

Python - Cookies & BeautifulSoup

I wrote a simple Python script that authenticates to a website, gets the cookie, writes it to a file, and does some scraping on the website. I write the cookie to a file so I can reuse it and don't need to authenticate over and over.
On my personal computer the script works fine, but when I upload it to my server it refuses to work.
The strangest part is that if I upload the cookie created on my personal computer to my server, it works fine. So I must have some issue in the function that saves the cookie...
As far as I know, if I had library issues Python would warn me about them, so I guess my problem is more complex.
I also tried to run it as root, but no luck.
What do you think may be causing this?
BTW: both Pythons are 2.7.
Refer to the tags for more info.

How to monitor the Internet connectivity on two PCs simultaneously?

I have two PCs and I want to monitor the Internet connectivity in both of them and make it available in a page as to whether they're currently online and running. How can I do that?
I'm thinking of a cron job, executed every minute, that sends a POST to a script located on a server, which in turn writes the connectivity status "online" to a file. The page where the statuses are displayed reads from both status files and shows whether each PC is online. But this feels like a sloppy idea. What alternative would you suggest?
(The answer doesn't necessarily have to be code; I'm not looking for copy-paste solutions. I just want an idea, a nudge in the right direction.)
I would suggest just a GET request (you only need a ping to indicate that the PC is on) sent periodically to, say, a Django server; when you query a page on that server, it shows a webpage indicating the status of each PC.
On the Django server, record the time each GET is received; if the time between the last GET and the current time is too large, set a flag to false.
That flag will later be visible when the URL is queried, via the views.
I don't think this would end up sloppy; it's a simple solution where you don't really have to dig too deep to make it work.
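Framework aside, the core of that answer is just a last-seen timestamp per machine plus a staleness threshold. A minimal sketch (the names and the 120-second timeout are my own choices, not anything Django-specific):

```python
import time

HEARTBEAT_TIMEOUT = 120  # seconds without a ping before a PC counts as offline
last_seen = {}           # pc name -> unix timestamp of its most recent ping

def record_ping(pc, now=None):
    """Call this from the view that receives each GET heartbeat."""
    last_seen[pc] = time.time() if now is None else now

def is_online(pc, now=None):
    """Call this from the status page view to render each PC's flag."""
    now = time.time() if now is None else now
    return pc in last_seen and now - last_seen[pc] <= HEARTBEAT_TIMEOUT
```

On each monitored PC, the cron job is then just a one-line GET (e.g. with urllib.request.urlopen) against the heartbeat URL.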
I have used Nagios in the past and I like it a lot. It is free and open source. I have used it to monitor several web, DNS, and mail servers and a proxy. You can check it out here: https://www.nagios.com/products/nagioscore

pulling info from a log file and sending a notification email using python

I am working on a project for work that requires me to pull information from a log file and send a notification any time it finds specific information. The exact issue I am working on: I need to create a Python script that will look into my /var/log/auth.log (FreeBSD system) and pull any invalid SSH login attempts, then email me and a co-worker whenever there is an offense.
I've been looking all over for a few days now and have had minimal success; any help would be greatly appreciated.
I think what you're really after is a daemon like fail2ban, which is specifically designed to examine log files for intrusion attempts.
From the fail2ban wiki:
Fail2ban scans log files (e.g. /var/log/apache/error_log) and bans IPs
that show the malicious signs -- too many password failures, seeking
for exploits, etc. Generally Fail2Ban is then used to update firewall
rules to reject the IP addresses for a specified amount of time,
although any arbitrary other action (e.g. sending an email, or
ejecting CD-ROM tray) could also be configured. Out of the box
Fail2Ban comes with filters for various services (apache, courier, ssh,
etc).
This would probably work better than any solution you baked yourself.
That said, if you did want to roll your own, the naive way to implement periodic checking of a file is simply to read it every five minutes and see if it's changed.
The smarter way is to use the operating system's file monitoring service, which hooks into the filesystem driver and notifies you as soon as the file changes. This has the dual benefits that your code will take less CPU time, and it will respond immediately whenever the file changes.
On Linux the service is called inotify. BSD and Windows have equivalent features (kqueue and ReadDirectoryChangesW, respectively).
You could run a cron job every few minutes that checks for changes in that file. If there are any changes, it can email you, for example by using smtplib. Here is an example of smtplib usage with SendGrid: http://docs.sendgrid.com/documentation/get-started/integrate/examples/python-email-example-using-smtp/
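The email itself can be built with the standard library's email module and handed to smtplib. A hedged sketch (the helper name, subject line, and the commented-out localhost SMTP server are assumptions about your mail setup):

```python
import smtplib
from email.message import EmailMessage

def build_alert(matches, sender, recipients):
    """Build a notification email from the matching log lines."""
    msg = EmailMessage()
    msg["Subject"] = "auth.log alert: %d suspicious lines" % len(matches)
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg.set_content("\n".join(matches))
    return msg

# Sending is site-specific; with a local MTA it might look like:
# with smtplib.SMTP("localhost") as s:
#     s.send_message(build_alert(lines, "me@example.com", ["you@example.com"]))
```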
How do you find out if a file was modified?
You keep a copy of the file as it looked on the previous script run, and compare that to the current contents.
You check the file's last modification time.
This is just a general idea that can be tweaked, and all the 'ingredients' are easy to look up, so you should be able to implement it yourself.
Hope this helps.
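The cheaper of the two checks above, the modification time, boils down to a couple of lines. A sketch (the function name is mine; the caller persists the returned mtime between cron runs):

```python
import os

def has_changed(path, last_mtime):
    """Return (changed, current_mtime), comparing against the mtime
    recorded on the previous run (pass 0.0 on the first run)."""
    mtime = os.path.getmtime(path)
    return mtime > last_mtime, mtime
```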
As a rough idea for a cron job:
with open('/var/log/auth.log') as auth:
    for line in auth:
        if 'blahblah' in line:
            pass  # send email here
You'll want to check out the email module for the emailing details. You'll also want a way to keep track of what has already been scanned, so you don't end up sending duplicate emails.
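One way to track what has already been scanned is to remember the byte offset reached on the previous run and only read lines appended after it. A sketch (the function name is mine; it also resets to the start if the file shrank, i.e. was rotated):

```python
import os

def scan_new_lines(path, offset, pattern):
    """Return (matching_lines, new_offset), reading only what was
    appended since `offset`. If the file shrank (log rotation),
    start again from the beginning."""
    if os.path.getsize(path) < offset:
        offset = 0
    matches = []
    with open(path) as f:
        f.seek(offset)
        for line in f:
            if pattern in line:
                matches.append(line.rstrip("\n"))
        new_offset = f.tell()
    return matches, new_offset
```

The cron job persists new_offset (e.g. in a small state file) between runs, so each invalid-login line is emailed at most once.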
