Sending straight JSON to API using requests? - python

The gist of my problem is that there is an open API containing road info that I'm trying to pull very specific information from. The JSON request I'm trying to make looks like:
data1 = """
<REQUEST>
<LOGIN authenticationkey=""/>
<QUERY objecttype="RoadData" schemaversion="1.0">
<INCLUDE>SpeedLimit</INCLUDE>
<INCLUDE>RoadMainNumber</INCLUDE>
<INCLUDE>RoadSubNumber</INCLUDE>
</QUERY>
<QUERY objecttype="RoadGeometry" schemaversion="1.0">
<INCLUDE>Geometry.WGS843D</INCLUDE>
<INCLUDE>RoadMainNumber</INCLUDE>
<INCLUDE>RoadSubNumber</INCLUDE>
</QUERY>
</REQUEST>
"""
and I'm simply sending a request to the correct URL with this as my json code, i.e:
response = requests.post(url, data=data1) but I'm only getting a 400 response, which leads me to believe my JSON request is wrong. However, they supply an in-house console on their website for testing requests directly, and there it works. Am I simply being stupid here and sending it incorrectly?
I tried looking at similar problems on Stack Overflow and elsewhere online, but every single issue uses a dictionary to build the request, and that doesn't seem possible here? At least I don't know how to express the keyword as a dictionary condition, since the size of the dataset requires this.
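For what it's worth, the payload above is XML rather than a JSON dictionary, and requests won't set a matching Content-Type header on its own, which is a common cause of a 400. Here is a minimal sketch of sending it as a raw XML body; the URL and key are placeholders, not the real API's:

```python
import requests

# Hypothetical endpoint -- substitute the API's real URL.
url = "https://api.example.com/v2/data.json"

xml_query = """<REQUEST>
  <LOGIN authenticationkey="YOUR_KEY"/>
  <QUERY objecttype="RoadData" schemaversion="1.0">
    <INCLUDE>SpeedLimit</INCLUDE>
  </QUERY>
</REQUEST>"""

# The body is XML, not JSON, so label it as such; many APIs answer
# 400 when the Content-Type header doesn't match the payload.
req = requests.Request(
    "POST", url,
    data=xml_query.encode("utf-8"),
    headers={"Content-Type": "text/xml"},
).prepare()

print(req.headers["Content-Type"])
# response = requests.Session().send(req)  # uncomment to actually send
```

Building a `PreparedRequest` first just lets you inspect exactly what will go over the wire before sending it.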

Related

Is there any way to get the HTTP response code without receiving HTML files in Python?

I don't know, maybe I am asking some kind of weird question.
Normally, when we use a module in Python like requests or urllib3, we get a response for each request: the status code, cookies, headers, and the HTML content.
The problem is that this HTML data is huge and I don't need it. I just need the response code for a request. So, is there any method or module to do so?
import requests

# HEAD asks the server for the status line and headers only -- no body
r = requests.head(url)
print(r.status_code)

How to handle a post request from a Python program

So basically I would like to know how to handle a POST request from a Python program and store it on the web server, so I can make a GET request to retrieve that information. I'm hoping you can help me. Currently this is my code:
import requests
url = 'mywebsitehere.com'
source_code = "print('Hello World')"
data = {'code': source_code, 'format': 'python'}
r = requests.post(url = url, data = data)
print(r.text)
I'm trying to send some code and its format in the POST request, but I'm not sure how to handle the POST request once it reaches the website so other programs can access it with GET requests. I know how to send POST and GET requests in Python, just not how to handle them once they reach the website/server. From my research, it seems like you have to make a PHP file or something and specify individual fields or variables for the program to enter the information into.
I know it's a really noob question but I am just starting to get into more advanced stuff with Python and modules and stuff.
I'm going to learn more about general web development so that, instead of just barely understanding it, I can get a good grasp of POST requests and develop my website into something custom, rather than copying and pasting other people's work without completely understanding it.
...also I'm not sure how to close a post to "answered" or something but yeah.
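Since the question is about the server side: one minimal way to handle such a POST and hand the data back to later GETs can be sketched with Python's built-in http.server. This is purely illustrative (an in-memory store that forgets everything when the server stops); a real site would use a web framework and a database:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

stored = {}  # in-memory store; the data lives only while the server runs

class CodeStore(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read exactly Content-Length bytes of the submitted form body.
        length = int(self.headers.get("Content-Length", 0))
        stored["payload"] = self.rfile.read(length)
        self.send_response(200)
        self.end_headers()

    def do_GET(self):
        # Serve back whatever the last POST stored.
        body = stored.get("payload", b"")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(body)

# To run: HTTPServer(("localhost", 8000), CodeStore).serve_forever()
```

A client could then POST with requests.post(url, data=data) and read the stored payload back with requests.get(url), exactly as in the code above.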

LinkedIn Webscrape

I need to fetch basic profile data (the complete HTML page) of a LinkedIn profile. I tried Python packages such as Beautiful Soup, but I get access denied.
I have generated the API tokens for LinkedIn, but I am not sure how to incorporate those into the code.
Basically, I want to automate the process of scraping by just providing the company name.
Please help. Thanks!
Beautiful Soup is a web scraper. Typically, people use this library to parse data from public websites or websites that don't have APIs. For example, you could use it to scrape the top 10 Google Search results.
Unlike web scrapers, an API lets you retrieve data behind non-public websites. Furthermore, it returns the data in an easily readable XML or JSON format, so you don't have to "scrape" an HTML file for the specific data you care about.
To make an API call to LinkedIn, you need to use a Python HTTP request library. See this Stack Overflow post for examples.
Take a look at Step 4 of the LinkedIn API documentation. It shows a sample HTTP GET call.
GET /v1/people/~ HTTP/1.1
Host: api.linkedin.com
Connection: Keep-Alive
Authorization: Bearer AQXdSP_W41_UPs5ioT_t8HESyODB4FqbkJ8LrV_5mff4gPODzOYR
Note that you also need to send an "Authorization" header along with the HTTP GET call. This is where your token goes. You're probably getting access denied right now because you didn't set this header in your request.
Here's an example of how you would add that header to a request with the requests library.
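A sketch of what that looks like (the token is a placeholder; building the request with requests.Request first just makes the header visible before anything is sent):

```python
import requests

token = "YOUR_ACCESS_TOKEN"  # placeholder: the OAuth token you generated

# The Bearer token travels in the Authorization header of the GET call.
req = requests.Request(
    "GET",
    "https://api.linkedin.com/v1/people/~",
    headers={"Authorization": "Bearer " + token},
).prepare()

print(req.headers["Authorization"])
# response = requests.Session().send(req)  # uncomment to actually call
```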
And that should be it. When you make that request, it should return XML or JSON containing the data you want. You can then use an XML or JSON parser to get the specific fields you care about.

No JSON object could be decoded for the web APIs

I am new to Python, and I am currently working through the solved examples in the book "Python for Data Analysis".
In one example, I need to access the Twitter search API. Here is the code:
import requests
url='https://twitter.com/search?q=python%20pandas&src=typd'
resp=requests.get(url)
Everything works okay up to here. The problem comes when I use json:
import json
data=json.loads(resp.text)
Then I receive the error message ValueError: No JSON object could be decoded.
I tried switching to another URL instead of Twitter, but I still receive the same error message.
Does anyone have some ideas? Thanks
You need a response with JSON content. Searching Twitter that way requires api.twitter.com, and you need to get an OAuth key first. It's more complicated than the five lines of code you have. See http://nbviewer.ipython.org/github/chdoig/Mining-the-Social-Web-2nd-Edition/blob/master/ipynb/Chapter%201%20-%20Mining%20Twitter.ipynb
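The error itself just means the body wasn't JSON: that twitter.com/search URL returns an ordinary HTML page. A tiny illustration of the failure mode and how to guard against it (the HTML string below is a stand-in for the real response body):

```python
import json

# Stand-in for resp.text when the server returns an HTML page.
html = "<!DOCTYPE html><html><body>search results...</body></html>"

try:
    data = json.loads(html)
except ValueError as exc:  # reported as "No JSON object could be decoded" on Python 2
    data = None
    print("Body is not JSON:", exc)
```

The same guard works around resp.json() with the requests library, since its decoding error is also a ValueError subclass.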

How do I send an xml request and receive an xml response in Python?

I'm currently using a web service where I send an XML request to a URL and then receive an XML response. The request might look like the following:
<RequestColour>
...
</RequestColour>
The response looks like:
<ResponseColourOutput>
...
</ResponseColourOutput>
Which libraries should I use in Python to send those xml requests and receive the responses back?
You can use the urllib2 module that comes with Python (urllib.request in Python 3) to make requests to a URL that can consume this.
The Python website has a decent tutorial on how to use this module to fetch internet resources. The next step is learning how to generate/consume XML.
Related SO answers for those steps:
Generating XML
Consuming XML
I have posted a small example of plain XML request and response in Python here:
http://it.toolbox.com/blogs/lim/request-get-reply-and-display-xml-in-python-beauty-of-simplicity-49791
Please note that my example posts arbitrary XML and gets back valid XML that reports an error, because the content of the request was not recognized.
You could use your own URL and XML, or adjust the example not to send an XML request and just parse the response against the provided XML.
I would also suggest looking into eBays example for their x.Commerce XML API accessed from Python.
Hope this helps.
If the XML processing you do is simple, try looking at the built-in API, xml.etree.ElementTree.
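To make the pieces concrete, here is a small sketch using urllib.request (the Python 3 successor to urllib2) together with the built-in ElementTree; the endpoint and both XML snippets are made up for illustration:

```python
import urllib.request
import xml.etree.ElementTree as ET

body = b"<RequestColour><Name>teal</Name></RequestColour>"  # hypothetical payload

# A Request object holds the URL, headers, and body; nothing is sent
# until urlopen() is called. Supplying data makes it a POST.
req = urllib.request.Request(
    "https://ws.example.com/colour",           # hypothetical endpoint
    data=body,
    headers={"Content-Type": "text/xml"},
)
# with urllib.request.urlopen(req) as resp:   # uncomment to actually send
#     reply = resp.read()

# Parsing an XML reply with ElementTree:
reply = b"<ResponseColourOutput><Hex>#008080</Hex></ResponseColourOutput>"
root = ET.fromstring(reply)
print(root.find("Hex").text)
```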
