I am currently trying to write a Python script that will open my company's inventory system (a link opened in Google Chrome), sign in, and then click the "save as Excel" button at the top of a specific page. This will hopefully automate the daily process of opening the link, navigating over to the right tab, and clicking export.
Any idea where to start? I was thinking I might be able to get this done with web scraping, but I'm not sure how to handle the login details needed. Also, how can I export the file once I'm in? Just need some ideas to start me on this journey. Any and all help is appreciated!
Simply start with Selenium automation in Python. Divide your whole process into smaller tasks and write Python code for each task :)
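To make that concrete, here is a minimal Selenium sketch of those tasks. Every URL, field name, and button label below is a placeholder assumption; inspect your actual inventory page to find the real locators:

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()  # recent Selenium versions fetch chromedriver automatically
    driver.implicitly_wait(10)   # wait up to 10 s for elements to appear

    # Task 1: open the login page and sign in (placeholder URL and field names)
    driver.get("https://inventory.example.com/login")
    driver.find_element(By.NAME, "username").send_keys("my_user")
    driver.find_element(By.NAME, "password").send_keys("my_password")
    driver.find_element(By.XPATH, "//button[@type='submit']").click()

    # Task 2: navigate to the page that holds the export button (placeholder URL)
    driver.get("https://inventory.example.com/reports")

    # Task 3: click "Save as Excel"; the file lands in Chrome's default
    # download directory unless you configure a different one
    driver.find_element(By.XPATH, "//button[contains(., 'Save as Excel')]").click()

    driver.quit()

For the daily part, schedule the script with cron (Linux/macOS) or Task Scheduler (Windows) rather than leaving it running.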
Related
I have created an automation via Selenium to reduce the time spent downloading raw data per user.
What I found is that when a user has no data to download, the "download" button is not visible, and the script stops running.
Sorry for posting screenshots, but it is impossible to share the URL since you have to log in first.
Is there a way to avoid this?
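One common way around this is to probe for the button instead of assuming it exists: find_elements() returns an empty list rather than raising an exception, so the script can skip that user and continue. A sketch, reusing your existing driver; the locator is a placeholder:

    from selenium.webdriver.common.by import By

    # "download-button" is a placeholder id -- use your page's real locator
    buttons = driver.find_elements(By.ID, "download-button")
    if buttons:
        buttons[0].click()
    else:
        print("No data for this user, skipping")

(Wrapping a find_element call in try/except NoSuchElementException works just as well.)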
I need to automate the download of a PDF recipe from a webpage that contains a list of PDFs. Each PDF can be downloaded by clicking a button, and every month the system replaces the first item-button of the list with the latest recipe.
I'm trying to understand how to create a Python script that automates the monthly download of the PDF by clicking the first button in the item list.
It would be perfect if I could also convert the PDF into a CSV file and write it into a MySQL DB.
Can anyone suggest the steps? Thanks!
Look into:
Using cron jobs if you need the script to run monthly
Parsing the website's HTML and pulling out the relevant elements
Downloading files over HTTP with Python
A sketch covering the last two points follows this list.
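For instance, a minimal sketch with requests and BeautifulSoup. The URL, and the assumption that each item-button is an <a> tag whose href ends in .pdf, are placeholders to verify against the real page:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    LIST_URL = "https://example.com/recipes"  # placeholder URL

    soup = BeautifulSoup(requests.get(LIST_URL).text, "html.parser")

    # Take the first PDF link on the page -- per the question, the site
    # swaps it for the newest recipe every month
    first = soup.select_one("a[href$='.pdf']")
    pdf_url = urljoin(LIST_URL, first["href"])  # resolve a relative href

    with open("latest_recipe.pdf", "wb") as f:
        f.write(requests.get(pdf_url).content)

A crontab entry like 0 6 1 * * /usr/bin/python3 /path/to/fetch_recipe.py would run it on the first of every month. Turning the PDF into CSV rows for MySQL is a separate step; a table-extraction library such as pdfplumber or camelot is a usual starting point.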
I want to scrape data from a website where I have to log in first. The problem is that there is robot protection too (I have to verify that I am not a robot by passing a reCAPTCHA), and my chance of passing the captcha is only ~30%, which is horrible for me.
Is there perhaps another possibility: I log in with my browser (for example Chrome or Firefox), and then use that session ID in my Python script to scrape the data automatically?
To put it more simply: I want to scrape tables from a website, so I have to log in first. The ~30% success rate is not good enough for me, so I hope there is another possibility: log in manually, and then reuse that session in Python?!
After that, there is a textbox on the page where I want to type what I'm searching for; that navigates to the page where I'll find the table and the data.
Any ideas, or is this even possible?
(Right now I only have a script where I have to download the HTML of the data page and then change some names in the code manually. It is a very big waste of time; I hope I can automate it further.) - Python 2.7
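Yes, this is possible: log in manually, copy the session cookie out of your browser's developer tools, and hand it to a requests session so Python reuses the authenticated login. A minimal sketch; the cookie name, search URL, and parameter name are all assumptions to check against the real site:

    import requests

    session = requests.Session()
    # Paste the value of the site's session cookie from your browser's
    # developer tools; "sessionid" is a placeholder name
    session.cookies.set("sessionid", "paste-cookie-value-here")

    # Send the same request the search textbox submits (placeholder URL and
    # parameter -- watch the browser's network tab to see the real one)
    response = session.get("https://example.com/search", params={"q": "my query"})
    print(response.text)  # HTML containing the table to scrape

The cookie expires whenever the site ends the session, so you will have to refresh it from the browser now and then. requests works on Python 2.7 as well.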
I have a .csv file with a list of URLs I need to extract data from. I need to automate the following process: (1) Go to a URL in the file. (2) Click the Chrome extension that redirects me to another page displaying some of the URL's stats. (3) Click the link on the stats page that lets me download the data as a .csv file. (4) Save the .csv. (5) Repeat for the next n URLs.
Any idea how to do this? Any help greatly appreciated!
There is a Python package called mechanize. It helps you automate processes that can be done in a browser, so check it out. I think mechanize should give you all the tools required to solve the problem.
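A minimal mechanize sketch of the loop. One caveat: neither mechanize nor Selenium can click a Chrome extension's toolbar icon, but if the extension merely redirects to a stats page whose address is derived from the original URL, you can build that address yourself. The URL pattern, link text, and file names below are all assumptions:

    import csv
    import mechanize

    br = mechanize.Browser()
    br.set_handle_robots(False)  # don't let robots.txt abort the requests

    with open("urls.csv") as f:
        for i, row in enumerate(csv.reader(f)):
            target = row[0]
            # Steps 1-2: open the stats page the extension would redirect to
            # ("?url=" is an assumed pattern -- check where it really goes)
            br.open("https://stats.example.com/?url=" + target)
            # Steps 3-4: follow the download link and save the .csv to disk
            link = br.find_link(text="Download CSV")  # placeholder link text
            br.retrieve(link.absolute_url, "stats_%d.csv" % i)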
How can I download all the PDFs (or files with a specific extension, like .tif or .pdf) from a webpage that requires login? I don't want to log in every time for every PDF, so I can't use the approach of generating each link and pushing it to the browser.
The solution was simple; just posting it for others who may have the same question. Embedding the credentials in the URL sends HTTP basic auth with the request (note the @, not a #, between the credentials and the host):

    mydriver.get("https://username:password@www.somewebsite.com/somelink")
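If the goal is to grab every PDF in one pass rather than drive a browser, the same basic-auth credentials work with requests, which keeps the login across all downloads. A sketch; the URL and the assumption that the files are plain <a> links are placeholders:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    BASE = "https://www.somewebsite.com/somelink"

    session = requests.Session()
    session.auth = ("username", "password")  # HTTP basic auth, reused on every request

    soup = BeautifulSoup(session.get(BASE).text, "html.parser")
    for a in soup.select("a[href$='.pdf']"):  # or "a[href$='.tif']"
        pdf_url = urljoin(BASE, a["href"])
        filename = pdf_url.rsplit("/", 1)[-1]
        with open(filename, "wb") as f:
            f.write(session.get(pdf_url).content)

This assumes the site really uses HTTP basic auth, as in the answer above; a form-based login would instead need a session.post to the login form before the loop.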