I'm looking for a Google API for Python (other than the official API) that lets me make "elaborate" queries with operators like "intitle", "intext", "filetype", and so on.
I did some research on the internet but didn't find what I need. Maybe I searched badly, or maybe it doesn't exist.
Can you give me some leads or advice?
Thank you.
The Google Search API was deprecated long ago. You could use something like pygoogle to screen-scrape the results page, but you're liable to be blocked by Google's anti-bot CAPTCHAs if you make a large number of queries.
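If you do go the scraping route, note that the advanced operators are just text prefixes inside the query string itself, so you can assemble them with plain Python before handing the query to whatever library you use. A minimal sketch (the helper name and operator set are my own, not from any particular library):

```python
from urllib.parse import urlencode

# Google's advanced search operators are plain prefixes in the query
# text itself, e.g. "intitle:python filetype:pdf". This helper just
# assembles such a query string.
def build_query(terms, intitle=None, intext=None, filetype=None, site=None):
    parts = list(terms)
    if intitle:
        parts.append('intitle:%s' % intitle)
    if intext:
        parts.append('intext:%s' % intext)
    if filetype:
        parts.append('filetype:%s' % filetype)
    if site:
        parts.append('site:%s' % site)
    return ' '.join(parts)

query = build_query(['python', 'tutorial'], intitle='beginner', filetype='pdf')
# urlencode takes care of percent-encoding the colons and spaces
url = 'https://www.google.com/search?' + urlencode({'q': query})
```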
I have never used an API before, but I am trying to learn how to use Scopus for a project I'm doing with a few colleagues. I have gotten about this far:
import requests

response = requests.get("https://api.elsevier.com/content/search/scopus/",
                        headers={'Accept': 'application/json',
                                 'X-ELS-APIKey': '[My_API_Key]'})
I keep getting a 400 error in response to this, even though my API key is valid and I've entered it correctly. I'm guessing I'm getting the error because the query is too broad, since I am just searching all of Scopus rather than looking for a specific author ID or ISSN.
I want to run queries to get all of the author data for a handful of specific ISSNs. As someone who is very uncomfortable using Python (I'm a web developer, not a Python programmer) and also someone who has never used an API themselves, I have no idea how to proceed from here. I've read the guides provided by Elsevier, but as I don't understand this stuff, I haven't found them helpful at all. I've also watched and read some tutorials about APIs, but none of them have helped me figure out how to make an actual specific request.
If any of you have used Scopus before, can you please tell me how to make a request based on the parameters I need? Am I supposed to put the ISSN at the end of the URL? If so, how should I format it, and how do I specify what other data I want for that specific ISSN?
I apologize for the lack of specificity in this particular question; I am just completely lost here. I am currently using Jupyter notebooks to write and run my code.
As other responders have said, you are on the right track; however, your query is essentially blank, in that you are not actually asking the API for anything.
This request
"https://api.elsevier.com/content/serial/title/issn/[ISSN]?apiKey=[My_API_Key]"
will bring back general metadata about an ISSN, but to get deeper author and institution metadata, you should start with a Scopus Search API query:
"https://api.elsevier.com/content/search/scopus?query=issn([ISSN])?apiKey=[My_API_Key]"
The API docs page at https://dev.elsevier.com/api_docs.html has more technical details about the APIs.
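As a sketch of what that search call looks like from Python (the ISSN and key are placeholders, and I'm assuming the query/apiKey parameters described on the Elsevier docs page; passing them via params lets the library handle URL encoding for you):

```python
# Pure helper: the endpoint, query parameters and headers for a
# Scopus Search request for one ISSN
def build_request(issn, api_key):
    return ("https://api.elsevier.com/content/search/scopus",
            {"query": "issn(%s)" % issn, "apiKey": api_key},
            {"Accept": "application/json"})

def search_scopus(issn, api_key):
    # requests is imported here so the helper above can be used
    # even without the package installed
    import requests
    url, params, headers = build_request(issn, api_key)
    response = requests.get(url, params=params, headers=headers)
    response.raise_for_status()  # surfaces a 400 instead of failing silently
    return response.json()
```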
I'm new to Python and am very confused about how to begin this assignment:
Write Python code to connect to Twitter search API at:
https://api.twitter.com/1.1/search/tweets.json
with at least the following parameters:
q for a search topic of your interest;
count to 100 for 100 records
to retrieve Twitter data and assign the data to a variable.
I already created the app to access Twitter's API. Thanks.
I had this same issue. According to Twitter, you must register a developer account even to use the basic search features for public tweets, and there is no way around it. I followed the guide for the complete setup here and can confirm it works.
Edit: the article is for PHP, but changing it to Python shouldn't be very difficult. All the methods are outlined in the article.
Just as a hint, you might want to use tweepy for this. It has very good documentation, and it's much easier than trying to reinvent the wheel.
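For example, a minimal tweepy sketch for the assignment above (the credential strings are placeholders from your app's settings page, and I'm assuming tweepy 4.x, where the search method is called search_tweets; in older versions it was api.search):

```python
# Pure helper: the two parameters the assignment asks for,
# q and count (the search endpoint caps count at 100)
def search_params(query, count=100):
    return {"q": query, "count": min(count, 100)}

def fetch_tweets(query, count=100):
    # tweepy is imported inside the function so the helper above
    # can be used even without tweepy installed
    import tweepy

    # Placeholders: fill these in from your Twitter app's credentials
    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
    api = tweepy.API(auth)
    return api.search_tweets(**search_params(query, count))
```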
I've searched on Google and taken a look at the Facebook site for the APIs, but Facebook does not have an official SDK for Python. I looked at the third-party API for Python listed on their site that can be used to communicate with Facebook. After visiting its official site and GitHub repository, I found a small readme file that shows basic usage; it seems to assume that you are already connected to Facebook, and the example at the end of that page shows a cookie example.
The short examples seem easy enough, but there is no explanation of anything, and I can't find any more documentation about anything else. There does not seem to be any information about all the available methods you can use with the API. Where do the people who are using this API get the documentation for the available methods so they are able to work with it?
Since I guess people are pretty tired of signing up for yet another service, I would like to offer sign-in with their Facebook and Twitter accounts (although that's a no-no for the ad people, who would like access to the user profile in order to show targeted words/links that generate revenue). I'm using Django and have taken a look at the django-facebook API as well, but its documentation seems to just point to the GitHub repository, which doesn't have any documentation either, almost just like the other API pointed out above. Basically, I can't find any documentation about how to use these APIs except for the small examples.
And like always, I appreciate your time answering this. It's always nice to add an explanation to any code so the answer is a little more useful. Thanks.
My info might be a little out of date, as it comes from working at a startup about a year and a half ago, implementing a Python backend on Google App Engine that interfaced with Facebook (FQL, the App Engine datastore, etc.).
There are several third-party APIs you can use, for instance https://github.com/jgorset/facepy or https://github.com/pythonforfacebook/facebook-sdk. The reason there is no 'documentation' on the GitHub site is that it implements access to the API that IS documented on Facebook's developer pages: https://developers.facebook.com/docs/graph-api/.
But that is in a perfect world. My experience with the Facebook APIs is that they don't always do what the dev pages say: you don't get consistent return data, and the FB Realtime API fails to notify (or notifies inconsistently) for certain connections (music, movies, books, TV). Unfortunately, I think they have many undocumented APIs that are only available to the big app players.
Where I got my real world working info and learned how to access Facebook using Python was right here on stack overflow.
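For what it's worth, here is a minimal sketch of the facebook-sdk usage pattern (the access token is a placeholder, and GraphAPI/get_object/get_connections are the calls shown in that package's readme, mapping onto the paths documented on the Graph API pages):

```python
# Pure helper: the Graph API path for a user's feed, as documented
# on Facebook's developer pages
def feed_path(user_id="me"):
    return "%s/feed" % user_id

def fetch_profile_and_feed(access_token):
    # facebook-sdk is imported inside the function so the helper
    # above can be used even without the package installed
    import facebook

    graph = facebook.GraphAPI(access_token)      # token is a placeholder
    profile = graph.get_object("me")             # dict of profile fields
    feed = graph.get_connections("me", "feed")   # dict with a 'data' list
    return profile, feed
```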
I'm trying to find a solution to my current problem. Let me explain: I need to find a search UI that can consume a REST service of my choice and is highly configurable. I've searched the web and found the Blacklight search UI (written in Ruby) for Solr. I've also looked at Haystack (for Django), which seems more promising, because somewhere in the docs I found that you can link Haystack to your custom search engine. Out of the box, Haystack supports Solr, Xapian and two others which I can't remember now.
What I'm trying to find is a UI written in Java, PHP (last resort!) or Python that will allow me to specify the endpoints for my APIs and, with a few configuration changes (I'm not expecting it to run out of the box), will be able to query the APIs and return results.
If that is not possible, could somebody suggest something that gets close to what I described and allows me to write my own backend code that will link to the APIs? A Haystack example will also do...
Thanks
I'm interested in this topic as well. I know about the SESAT framework, which supports FAST, Solr, Yahoo!, generic XML and more, but it is old and not well maintained, and it also tries to do much more than a simple front-end.
You also have AJAX-Solr which obviously only supports Solr.
I have forwarded your question on Twitter, hope others will fill in as well.
I'm a SimpleDB newb and am looking for some step-by-step instructions on how to generate the correct signature. Some of the tutorials/sample code I'm seeing say it's a combination of a timestamp, the service you're using, and the method you're requesting be performed on that service. Is this correct? I speak Python, if someone can provide code to generate it...
You'll want to use boto. Its documentation is really great, and they have a page specifically about SimpleDB.
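boto handles the signing for you, but since you asked about the mechanics: SimpleDB requests use AWS Signature Version 2, where the string to sign combines the HTTP verb, the lowercased host, the request path, and the sorted, percent-encoded query parameters (which is where the timestamp, the action/method name, and your access key come in). A rough sketch with placeholder credentials:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

# AWS Signature Version 2: sign
#   verb \n lowercase-host \n path \n canonical-query-string
# with HMAC-SHA256 and base64-encode the digest.
def sign_request(secret_key, verb, host, path, params):
    # Parameters sorted by name, percent-encoded per RFC 3986
    canonical = "&".join(
        "%s=%s" % (quote(k, safe="-_.~"), quote(str(v), safe="-_.~"))
        for k, v in sorted(params.items())
    )
    string_to_sign = "\n".join([verb, host.lower(), path, canonical])
    digest = hmac.new(secret_key.encode("utf-8"),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

params = {
    "Action": "GetAttributes",            # the method being called
    "AWSAccessKeyId": "AKID_PLACEHOLDER", # placeholder access key
    "DomainName": "MyDomain",
    "ItemName": "Item123",
    "SignatureMethod": "HmacSHA256",
    "SignatureVersion": "2",
    "Timestamp": "2012-01-01T12:00:00Z",  # the timestamp you mentioned
    "Version": "2009-04-15",
}
signature = sign_request("SECRET_PLACEHOLDER", "GET",
                         "sdb.amazonaws.com", "/", params)
```

The resulting signature is then sent as one more query parameter (Signature) on the request; boto builds exactly this kind of string under the hood.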
You may like to view this: http://www.youtube.com/watch?v=9YEGwmYejt4