I need beautifulsoup4 to scrape through a website's HTML and get information. I want to use that information in my Alexa skill.
How do I import/use bs4 in my Alexa developer console?
I've already read how to make a deployment package (https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html), but I don't understand how to download/zip bs4.
I am new to Python, AWS and Alexa developer console, so I am sorry if that question is very easy to answer.
Kind regards,
Dany
I think you'll find everything you need in this documentation: https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html
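In case the link goes stale: those docs boil down to installing the library next to your handler and zipping the folder's *contents* (not the folder itself), so that lambda_function.py and the bs4/ directory sit at the root of the zip. Here is a minimal stdlib sketch of the zip step with a stand-in handler; the pip step is shown as a comment because it needs network access:

```python
import pathlib
import zipfile

# Assumes dependencies were already installed into ./package with:
#   pip install beautifulsoup4 -t package
pkg = pathlib.Path("package")
pkg.mkdir(exist_ok=True)

# Stand-in handler so the sketch runs as-is; replace with your real code.
(pkg / "lambda_function.py").write_text(
    'def lambda_handler(event, context):\n    return "ok"\n'
)

with zipfile.ZipFile("deployment.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for path in pkg.rglob("*"):
        # Archive names must be relative to the package root: Lambda expects
        # lambda_function.py (and the bs4/ folder) at the TOP of the zip,
        # not nested inside a package/ directory.
        zf.write(path, path.relative_to(pkg))

print(zipfile.ZipFile("deployment.zip").namelist())
```

If import bs4 still fails after uploading, the usual cause is exactly that nesting: the files ended up inside a subfolder of the zip instead of at its root.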
Related
This question was asked before but never answered well.
I tried to create a zipped folder named lambda and upload it under Upload Code, but running my skill with import bs4 just errors.
I'm trying to create an automation for a site I need to access daily.
I can't install any python libraries at the moment.
I need to set a value into an input. And click on a button on this page.
I already managed to get the page content with urllib. But I can't figure out how to control the page.
It feels like if I could execute JavaScript code on the page, my problem would be solved.
Is there a way to execute JavaScript with urllib?
Or any other way to automate a site without external libraries?
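No stdlib code can execute the page's JavaScript, so if the button's click handler runs JS, a real browser is required. But if the button simply submits an HTML form, the "click" is really just an HTTP POST you can replay yourself. A sketch using only html.parser and urllib; the HTML snippet, URL, and field names here are made up for illustration:

```python
import urllib.parse
import urllib.request
from html.parser import HTMLParser

class FormParser(HTMLParser):
    """Collect the first form's action and its input names/values."""

    def __init__(self):
        super().__init__()
        self.action = None
        self.fields = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form" and self.action is None:
            self.action = attrs.get("action", "")
        elif tag == "input" and attrs.get("name"):
            self.fields[attrs["name"]] = attrs.get("value", "")

# Stand-in for the page you already fetched with urllib.
html = '''<form action="/login" method="post">
  <input name="user" value="">
  <input name="csrf" value="abc123" type="hidden">
  <input type="submit" value="Go">
</form>'''

parser = FormParser()
parser.feed(html)
parser.fields["user"] = "dany"  # set the value into the input

# "Clicking" the button = POSTing the form fields to its action URL.
data = urllib.parse.urlencode(parser.fields).encode()
req = urllib.request.Request("https://example.com" + parser.action, data=data)
# urllib.request.urlopen(req) would submit the form (network call omitted here).
print(parser.action, parser.fields)
```

Note the hidden csrf field: carrying over hidden inputs like this is usually what makes a replayed POST succeed where a hand-built one fails.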
I made an Amazon web scraper and I want it to work as an extension. What will happen is that it will display the price if it is lower than the previously recorded price. I don't know JavaScript. I went through things like Transcrypt but didn't understand much.
You cannot. Chrome extensions are written in JS.
The only way to accomplish what you want is to use the extension as a bridge from the user's browser to your script. You'll need to turn the script into a server of some kind that can accept requests from the extension and respond.
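To illustrate the bridge idea: the Python side becomes a small HTTP endpoint that the extension can call with fetch(). A minimal sketch with the stdlib http.server; the product ID, price store, and JSON shape are all invented for the example:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical store of previously recorded prices, keyed by product ID.
recorded_prices = {"B08EXAMPLE": 299.0}

class PriceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # The extension POSTs {"asin": ..., "price": ...} scraped from the page.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        previous = recorded_prices.get(payload["asin"])
        lower = previous is not None and payload["price"] < previous
        body = json.dumps({"lower": lower, "previous": previous}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep request logging quiet

server = HTTPServer(("127.0.0.1", 0), PriceHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
```

A real extension would additionally need host permissions in its manifest and appropriate CORS headers on the response; this only shows the request/response shape of the bridge.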
While reading through the WebTest documentation, I found a somewhat cryptic note about webtest.http.StopableWSGIServer: "StopableWSGIServer is a WSGIServer which run in a separated thread. This allow to use tools like casperjs or selenium."
I know what WebTest does: it is a package that simulates a web browser, with a very nice API, for testing web pages.
I know what Selenium does: it is a package that lets a programmer drive a real web browser to test web pages.
I use both of these tools in my codebase, only separately.
Somehow I can't get my head around the thought of using WebTest with Selenium.
Could anybody please elaborate on that? Am I missing something?
My first, somewhat vague idea is to use the WebTest API to access pages running inside a browser controlled by Selenium. Was that the idea behind the webtest-selenium package?
As a side note, the webtest-selenium package seems to have been stuck at a very early stage of development for a long time. Does anybody know whether it is still alive?
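What the note most likely means is this: WebTest normally calls your WSGI app in-process, so nothing outside that process can see it, whereas Selenium or casperjs drive an external browser that needs a real URL. StopableWSGIServer bridges the two by serving the app on a background thread. The same mechanism rebuilt with only the stdlib, as an illustration (the app and response are made up):

```python
import threading
import urllib.request
from wsgiref.simple_server import make_server

# Trivial WSGI app standing in for the application under test.
def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello"]

# The StopableWSGIServer idea: serve the app on a background thread so an
# out-of-process browser (driven by Selenium or casperjs) can reach it over
# a real HTTP socket while the test code keeps running.
server = make_server("127.0.0.1", 0, app)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

# A Selenium test would now do driver.get(url); here we just fetch it:
body = urllib.request.urlopen(url).read()
print(body)  # b'hello'
```

So the two tools are not combined on the same request; the WebTest side provides the live server, and Selenium talks to it like any other website.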
I want to try to use the Yahoo Fantasy Sports API for my fantasy football league, but it seems like I can only access the data if I authenticate with Yahoo. Makes sense. However, I am new to Python and have no idea how to get started.
I found a Python module on the web, but I don't know how to "install" a file that has a .gz extension on a Windows machine.
Simply put, any help you can provide on how to use OAuth with Yahoo in Python will be greatly appreciated.
Thanks in advance
Just use easy_install:
c:/python25/scripts/easy_install.exe yql
Replace the path with your computer's Python path if necessary.
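For the OAuth part of the question: the signing step itself needs no third-party package. OAuth 1.0a (which Yahoo uses) signs each request with HMAC-SHA1 over a canonical base string, per RFC 5849. A stdlib sketch; the endpoint, keys, nonce, and timestamp below are made-up placeholders:

```python
import base64
import hashlib
import hmac
import urllib.parse

def oauth1_signature(method, url, params, consumer_secret, token_secret=""):
    """HMAC-SHA1 signature for an OAuth 1.0a request (RFC 5849)."""
    enc = lambda s: urllib.parse.quote(str(s), safe="")
    # Parameters are percent-encoded, sorted, and joined into the base string.
    param_str = "&".join(f"{enc(k)}={enc(v)}" for k, v in sorted(params.items()))
    base = "&".join([method.upper(), enc(url), enc(param_str)])
    key = f"{enc(consumer_secret)}&{enc(token_secret)}"
    return base64.b64encode(
        hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    ).decode()

# Hypothetical values; real requests need your Yahoo consumer key/secret and
# a fresh oauth_nonce and oauth_timestamp on every request.
params = {
    "oauth_consumer_key": "my-key",
    "oauth_nonce": "abc123",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_timestamp": "1300000000",
    "oauth_version": "1.0",
}
sig = oauth1_signature(
    "GET", "https://fantasysports.yahooapis.com/fantasy/v2/game/nfl",
    params, "my-secret",
)
print(sig)
```

The resulting signature goes into the request as the oauth_signature parameter alongside the others; a library like yql wraps exactly this dance for you.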