Create a middleware which listens on localhost:3332 and prints the endpoints called - Python

I am trying to create a system in Python which listens on localhost:3332 and prints the endpoints that are called. I have an application running on that port where I can apply various actions to a table.
I have tried to run this script :
url = url = 'http://127.0.0.1:3332/'
while True:
    with requests.Session() as session:
        response = requests.get(url)
        if response.status_code == 200:
            print("Succesful connection with API.")
            // here I should print the endpoints
Unfortunately, it doesn't work. Any suggestion is more than welcome

The script doesn't work as posted: the doubled url = url = assignment is a typo, the // comment is not valid Python (comments use #), and the session is created but never used, so requests.get(url) bypasses it. Correcting these will at least make the script run.
Also, it's not clear what you mean by printing the endpoints. A GET against the root URL only returns that one response; it cannot tell you which endpoints exist or are being called. Depending on the application, you may need to make an API call that returns the endpoints, in this way:
url = "http://127.0.0.1:3333/endpoints"
with requests.Session() as session:
response = requests.get(url)
if response.status_code == 200:
print("Succesful connection with API.")
// here I should print the endpoints - assuming the API call gives json data
data = response.json()
if data:
for endpoint in data:
print("Endpoint:", endpoint)
Hope this helps.

Related

Python api request to gitlab unexpectedly returns empty result

import requests
response = requests.get("https://gitlab.com/api/v4/users/ahmed_sh/projects")
print(response.status_code) # 200
print(response.text) # []
print(response.json()) # []
I'm trying to get a list of my GitLab projects through the API with Python, but the output is an empty list, even though the same URL returns a non-empty response in the browser. How can I solve this problem?
This is because you don't have any public projects in your user namespace. If you want to see your private projects in your namespace, you'll need to authenticate with the API by passing a personal access token in the PRIVATE-TOKEN header.
Note, this also won't show projects you work on in other namespaces.
import requests

# authenticate with a personal access token via the PRIVATE-TOKEN header
headers = {'PRIVATE-TOKEN': 'Your API key here!'}
resp = requests.get('https://gitlab.com/api/v4/users/ahmed_sh/projects', headers=headers)
print(resp.json())
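If you also want the projects you work on in other namespaces, the authenticated /projects endpoint with the membership parameter should cover them (a sketch; field names follow the GitLab v4 API):

import requests

headers = {'PRIVATE-TOKEN': 'Your API key here!'}
# membership=true lists every project the token's user is a member of,
# including projects owned by other namespaces
resp = requests.get('https://gitlab.com/api/v4/projects',
                    params={'membership': True},
                    headers=headers)
for project in resp.json():
    print(project['path_with_namespace'])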

Python script http request working locally, but not when tested as a Google Cloud Function

I have a Python script that I would like to run at a set interval using Google Cloud Functions and Google Cloud Scheduler. The script works fine when tested locally, but when I test it in the Google Cloud Functions panel I get a network connection error. Do I need to do something special to get the requests library to work when the Python script runs as a Google Cloud Function?
Python script:
import datetime
import logging

import requests

from config import config_vars

APIKEY = config_vars['APIKEY']
NOW = datetime.datetime.now()
LAST = NOW - datetime.timedelta(seconds=config_vars['UPDATE_INTERVAL'])

def getOrders(nextPage=None):
    url = "https://api.squarespace.com/1.0/commerce/orders"
    if nextPage is None:
        params = {
            'modifiedAfter': f"{LAST.isoformat()}Z",
            'modifiedBefore': f"{NOW.isoformat()}Z"
        }
    else:
        params = {'cursor': nextPage}
    headers = {"Authorization": f"Bearer {APIKEY}"}
    r = requests.get(url, params=params, headers=headers)
    if not r.ok:
        logging.error(f"Unable to get orders. Response: {r.text}")
        return []
    res = r.json()
    pagination = res['pagination']
    if pagination['hasNextPage']:
        return res['result'] + getOrders(pagination['nextPageCursor'])
    return res['result']

def main(data=None, context=None):
    """Triggered from a message on a Cloud Pub/Sub topic.
    Args:
        data (dict): Event payload.
        context (google.cloud.functions.Context): Metadata for the event.
    """
    orders = getOrders()
    for order in orders:
        # do something with each order
        pass

if __name__ == '__main__':
    main()
Error message:
HTTPSConnectionPool(host='api.squarespace.com', port=443): Max retries exceeded with url: /1.0/commerce/orders?modifiedAfter=2020-02-09T23%3A01%3A44.372818Z&modifiedBefore=2020-02-09T23%3A01%3A45.372818Z (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7eedecb76850>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
You need to enable billing for your project. You won't be able to make outbound requests to any URL until it is enabled.
The "Billing enabled" answer worked for me initially - However, I was mystified by a later occurrence of this same message on a Function / project where billing was definitely enabled, and in fact I could make some outbound requests, but one in particular was failing. It turned out to be a \n at the end of the URL string I had been sending to the function as a parameter. In my particular case, since I was using PHP to generate the string, a simple trim() call removed the cruft and the function continued to work as expected. Posting just in case it helps someone else, as this had me scratching my head for a bit.

Test GET and POST calls

I need to test POST and GET calls against an NGINX server.
I need to capture the error codes and verify the response. I was able to test the GET requests by hitting localhost:8080 (NGINX is running on docker exposing 8080), but I'm not sure how to test the POST calls.
Can we construct a dummy request to test the POST call? NGINX runs with the default page.
Below is one way to make a POST request to an endpoint in Python:
import requests

API_ENDPOINT = "http://pastebin.com/api/api_post.php"

data = {'param1': 'value1',
        'param2': 'value2'}

# sending post request and saving response as response object
r = requests.post(url=API_ENDPOINT, data=data)

# extracting response text
pastebin_url = r.text
print("The pastebin URL is: %s" % pastebin_url)

Splunk Python Connection Lost

I'm using python to execute a splunk search query and return the results. I connect with the following string:
service = client.connect(
    host=HOST,
    port=PORT,
    username=USERNAME,
    password=PASSWORD
)
The variables have been tested to work, and it connects to splunk, but sometimes, when I run these lines of code:
print "Installed App Names \n"
for app in service.apps:
print app.name
It returns this error:
Request Failed: Session is not logged in
About 50% of the time the code works and executes. Is this inconsistency due to the service = lines not actually connecting to the Splunk server? Can these connections time out?
connect can take an autologin=True argument to allow the bindings to try to re-connect when authentication fails, instead of raising that error immediately.
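For example, reusing the connection variables from the question:

import splunklib.client as client

# autologin=True makes the bindings re-authenticate automatically when the
# session expires, instead of raising "Request Failed: Session is not logged in"
service = client.connect(
    host=HOST,
    port=PORT,
    username=USERNAME,
    password=PASSWORD,
    autologin=True
)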
Alternatively, you can fetch a session token from Splunk directly in your Python code. The code below may help:
import json
import requests

BASE_URL = "https://<your Splunk LB / web URL>"

def getToken():
    # body for token request
    payload = {'username': "", 'password': ""}
    TOKEN_URL = "/services/auth/login?output_mode=json"
    # post token request
    res = requests.post(BASE_URL + TOKEN_URL, data=payload, verify=False)
    if res.status_code == 200:
        # get the token out of the response
        resJson = json.loads(res.content)
        return resJson.get('sessionKey')
    else:
        print(res.status_code, res.content)
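Continuing the snippet above: once you have the session key, later REST calls can pass it in an Authorization header. Splunk's REST API uses the "Splunk <sessionKey>" scheme, and /services/apps/local lists the installed apps, which matches what the question iterates over:

session_key = getToken()

headers = {'Authorization': 'Splunk %s' % session_key}
res = requests.get(BASE_URL + '/services/apps/local?output_mode=json',
                   headers=headers, verify=False)
print(res.status_code)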

Unit testing Python Flask Stream

Does anybody have any experience/pointers on testing a Flask content-streaming resource? My application uses Redis Pub/Sub; when a message arrives on a channel, it streams the text 'data: {"value":42}'.
The implementation follows the Flask streaming docs,
and my unit tests follow the Flask testing docs too. The messages to Redis Pub/Sub are sent by a second resource (POST).
I'm creating a thread to listen to the stream while I POST on the main application to the resource that publishes to Redis.
Although the connection assertions pass (I receive OK 200 and a mimetype 'text/event-stream'), the data object is empty.
My unit test is like this:
def test_04_receiving_events(self):
    headers = [('Content-Type', 'application/json')]
    data = json.dumps({"value": 42})
    headers.append(('Content-Length', len(data)))

    def get_stream():
        rv_stream = self.app.get('stream/data')
        rv_stream_object = json.loads(rv_stream.data)  # error (empty)
        self.assertEqual(rv_stream.status_code, 200)
        self.assertEqual(rv_stream.mimetype, 'text/event-stream')
        self.assertEqual(rv_stream_object, "data: {'value': 42}")
        t.stop()

    threads = []
    t = Thread(target=get_stream)
    threads.append(t)
    t.start()
    time.sleep(1)
    rv_post = self.app.post('/sensor', headers=headers, data=data)

    threads_done = False
    while not threads_done:
        threads_done = True
        for t in threads:
            if t.is_alive():
                threads_done = False
        time.sleep(1)
The app resource is:
@app.route('/stream/data')
def stream():
    def generate():
        pubsub = db.pubsub()
        pubsub.subscribe('interesting')
        for event in pubsub.listen():
            if event['type'] == 'message':
                yield 'data: {"value":%s}\n\n' % event['data']
    return Response(stream_with_context(generate()),
                    direct_passthrough=True,
                    mimetype='text/event-stream')
Any pointers or examples of how to test a Content Stream in Flask? Google seems to not help much on this one, unless I'm searching the wrong keywords.
Thanks in advance.
You are starting the thread to get_stream() before POSTing any data to REDIS.
This means that it won't find any events and will return without streaming any data.
I believe you don't need threads at all: you can simply use the POST to set up data for your integration test and then call the stream.
def test_04_receiving_events(self):
    headers = [('Content-Type', 'application/json')]
    data = json.dumps({"value": 42})
    headers.append(('Content-Length', len(data)))

    rv_post = self.app.post('/sensor', headers=headers, data=data)

    rv_stream = self.app.get('stream/data')
    rv_stream_object = json.loads(rv_stream.data)

    self.assertEqual(rv_stream.status_code, 200)
    self.assertEqual(rv_stream.mimetype, 'text/event-stream')
    self.assertEqual(rv_stream_object, "data: {'value': 42}")
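Note that json.loads(rv_stream.data) reads the whole body, which never ends while the event stream stays open, so the test can still hang. One workaround is to pull a single chunk off the streamed response instead; a sketch under the same test setup (the Werkzeug test client exposes the streamed body as the response.response iterable):

def test_first_event(self):
    data = json.dumps({"value": 42})
    self.app.post('/sensor', data=data,
                  headers=[('Content-Type', 'application/json')])

    rv_stream = self.app.get('/stream/data')
    self.assertEqual(rv_stream.status_code, 200)
    self.assertEqual(rv_stream.mimetype, 'text/event-stream')

    # read exactly one chunk from the generator instead of the full body
    first_chunk = next(iter(rv_stream.response))
    if isinstance(first_chunk, bytes):
        first_chunk = first_chunk.decode()
    self.assertEqual(first_chunk, 'data: {"value":42}\n\n')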
If you are interested rather in integration testing than unit testing, my suggestion would be to use the Postman App and its command line integration with newman.
You can have all variables and parameters in an environment and all requests to be tested in a collection. Postman offers docs and tutorials for this, see https://learning.postman.com/. You can do a lot with the free version.
Set Authorization, Params, and Headers to match the incoming requests from the service that calls the instance you are testing.
Once your requests run successfully in Postman, you can write tests in Postman that assert on specific parts of a request's response body. Postman also offers pre-built snippets for writing these.
pm.test("Body includes 'some string'", function () {
    pm.expect(pm.response.text()).to.include("some string");
});
You can also export your collection of requests (including its tests) and the environment as JSON and run them in a shell with newman:
$ newman run mycollection.json -e dev_environment.json
This is useful if you want to integrate the tests into a CI/CD pipeline.
