What is the equivalent of TestCase.client in a normal script? - python

For example, with TestCase I can check login, POST, and so on with self.client:
class TestMyProj(TestCase):
    def test_upload(self):
        response = self.client.login(username="user@example.com", password="qwpo1209")
        response = self.client.post('/cms/content/up',
                                    {'name': 'test', '_content_file': fp},
                                    follow=True)
However, now I want to use this in a script rather than in a test case, because it is very useful for building an initial database. I want to do something like this:
def run():
    response = client.login(username="user@example.com", password="qwpo1209")
    with open('_material/content.xlsx', 'rb') as fp:
        response = client.post('/cms/content/up',
                               {'name': 'faq_test', 'scenario_content_file': fp})
What is the equivalent of TestCase.client in a normal script?
More details
If there were no file upload, I could create the database records directly from the models. However, I want to upload a file, parse it, and then put it into the database the same way a user does (via form_valid and so on), so I want to POST to the URL from a script.
My Solution
Use from django.test.client import Client, as Willem Van Onsem mentioned.
For some reason client.login returns True rather than a response, so I use a POST to log in instead:
def run():
    client = Client()
    # response = client.login(username="guest@guest.com", password="guest")  # this doesn't work somehow
    response = client.post('/login/', {'username': 'guest@guest.com', 'password': 'guest'}, follow=True)
    with open('_material/content.xlsx', 'rb') as fp:
        response = client.post('/cms/content/up',
                               {'name': 'test', 'is_all': "True", '_content_file': fp})
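For reference, this run() signature matches the django-extensions runscript convention. If you instead run the file as a plain standalone script, Django has to be configured first; here is a minimal sketch, assuming a hypothetical myproject.settings module (not from the question):

import os
import django

# Point Django at the settings module before importing the test client or any models.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
django.setup()

from django.test.client import Client

def run():
    client = Client()
    response = client.post('/login/', {'username': 'guest@guest.com', 'password': 'guest'}, follow=True)
    with open('_material/content.xlsx', 'rb') as fp:
        response = client.post('/cms/content/up',
                               {'name': 'test', 'is_all': "True", '_content_file': fp})
    print(response.status_code)

if __name__ == '__main__':
    run()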

It is a Client object [Django-doc], so:
from django.test.client import Client

def run():
    client = Client()
    response = client.login(username="user@example.com", password="qwpo1209")
    response = client.post(
        '/cms/content/up',
        {'name': 'test', '_content_file': fp},
        follow=True
    )
The documentation discusses the parameters that can be passed when constructing a Client object.

Generally I recommend creating a custom management command to populate the initial database instead of using the Client class that is meant for testing the application:
https://docs.djangoproject.com/en/4.0/howto/custom-management-commands/
You can also take advantage of bulk methods to create many instances in a single query.
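As a rough illustration, such a command could look like the sketch below; the cms app, the Content model, and the command name are assumptions made for the example, not taken from the question:

# cms/management/commands/load_initial_content.py (hypothetical app and model names)
from django.core.management.base import BaseCommand
from cms.models import Content  # assumed model

class Command(BaseCommand):
    help = 'Load initial data into the database'

    def handle(self, *args, **options):
        # bulk_create inserts many rows with a single query
        Content.objects.bulk_create([
            Content(name='faq_test'),
            Content(name='another_entry'),
        ])
        self.stdout.write(self.style.SUCCESS('Initial data loaded'))

It can then be run with python manage.py load_initial_content.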

Related

How to pass unencoded URL in FastAPI/Swagger UI via GET method?

I would like to write a FastAPI endpoint with a Swagger page (or something similar) that will accept a non-encoded URL as input. It should preferably use the GET method, not POST.
Here's an example of a GET endpoint that does require double URL encoding:
import urllib.parse
import requests

@app.get("/get_from_dub_encoded/{double_encoded_url}")
async def get_from_dub_encoded(double_encoded_url: str):
    """
    Try https%253A%252F%252Fworld.openfoodfacts.org%252Fapi%252Fv0%252Fproduct%252F7622300489434.json
    """
    original_url = urllib.parse.unquote(urllib.parse.unquote(double_encoded_url))
    response = requests.get(original_url)
    return response.json()
This generates a Swagger interface like the one below.
The following POST request does solve my problem, but the simplicity of a GET request with a form would be better for my co-workers:
from pydantic import AnyUrl, BaseModel, Field

class InputModel(BaseModel):
    unencoded_url: AnyUrl = Field(description="An unencoded URL for an external resource", format="url")

@app.post("/unencoded-url")
def unencoded_url(inputs: InputModel):
    response = requests.get(inputs.unencoded_url)
    return response.json()
How can I offer a convenient interface like that without requiring users to write the payload for a POST request or to perform double URL encoding?
This post has some helpful related discussion, but doesn't explicitly address the Form solution: How to pass URL as a path parameter to a FastAPI route?
You can use Form instead of query parameters as payload.
from fastapi import FastAPI, Form
import requests

app = FastAPI()

@app.post("/")
def get_url(url: str = Form()):
    """
    Try https://world.openfoodfacts.org/api/v0/product/7622300489434.json
    """
    response = requests.get(url)
    return response.json()
The Swagger interface would then look like this:
You'll need to install python-multipart for the form parsing to work.
Tip: don't use an async endpoint if you are using requests or any other non-async library.
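For reference, a caller can then submit the URL as an ordinary form field; the snippet below is only an illustrative client, with the localhost address assumed:

import requests

# Send the target URL as plain form data; no URL encoding tricks are needed.
resp = requests.post(
    "http://localhost:8000/",
    data={"url": "https://world.openfoodfacts.org/api/v0/product/7622300489434.json"},
)
print(resp.json())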

Python3 : Records not getting pushed to Splunk

I have created a custom class that pushes my logs to Splunk, but somehow it is not working. Here is the class:
import json
import logging
import requests

class Splunk(logging.StreamHandler):
    def __init__(self, url, token):
        super().__init__()
        self.url = url
        self.headers = {'Authorization': f'Splunk {token}'}
        self.propagate = False

    def emit(self, record):
        mydata = dict()
        mydata['sourcetype'] = 'mysourcetype'
        mydata['event'] = record.__dict__
        response = requests.post(self.url, data=json.dumps(mydata), headers=self.headers)
        return response
I call the class from my logger class, roughly like this (adding an additional handler), so that it logs to the console as well as sending to Splunk:
if splunk_config is not None:
    splunk_handler = Splunk(splunk_config["url"], splunk_config["token"])
    self.default_logger.addHandler(splunk_handler)
But somehow I am not able to see any logs in Splunk, though I can see the logs in the console.
When I run a stripped-down version of the above logic from the python3 terminal, it succeeds:
import requests
import json
url = 'myurl'
token = 'mytoken'
headers = {'Authorization': 'Splunk mytoken'}
propagate = False
mydata = dict()
mydata['sourcetype'] = 'mysourcetype'
mydata['event'] = {'name': 'root', 'msg': 'this is a sample message'}
response = requests.post(url, data=json.dumps(mydata), headers=headers)
print(response.text)
Things I have already tried: making my dictionary data JSON serializable using the link below, but it didn't help.
https://pynative.com/make-python-class-json-serializable/
Any other things to try?
I've successfully used this Python class for sending events to the Splunk HTTP Event Collector instead of writing a dedicated class:
https://github.com/georgestarcher/Splunk-Class-httpevent
The advantage is that it implements batchEvent() and flushBatch() methods to submit multiple events at once across multiple threads.
The example here should get you started:
https://github.com/georgestarcher/Splunk-Class-httpevent/blob/master/example.py
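If you would rather keep the hand-rolled handler, it is worth making failures inside emit() visible: a non-2xx response from Splunk, or a serialization problem with record.__dict__ (attributes such as exc_info or args are not always JSON serializable), can otherwise go unnoticed. A minimal, hedged sketch of a more defensive emit(); the default=str fallback, the timeout, and the handleError() call are my additions, not part of the original class:

    def emit(self, record):
        try:
            mydata = {
                'sourcetype': 'mysourcetype',
                'event': record.__dict__,
            }
            # default=str converts anything json cannot handle into its string form
            response = requests.post(
                self.url,
                data=json.dumps(mydata, default=str),
                headers=self.headers,
                timeout=5,
            )
            response.raise_for_status()
        except Exception:
            # Report the problem instead of failing silently
            self.handleError(record)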

Hosting an image with Flask and then processing it using another view function in the same code

So I am hosting an image using Flask, and then I want to make a POST request to an API using that image's URL, all in the same code:
import requests
from flask import Flask, jsonify, send_from_directory

app = Flask(__name__)

@app.route('/host')
def host():
    return send_from_directory("C:/images", "image1.png")

@app.route('/post')
def post():
    response = requests.post(url, data={'input': '<url for host>'}, headers=headers)
    return jsonify(response.json())
I believe that because both these view functions are in the same Python file, post() gets blocked.
Is there a workaround for this problem?
PS: if I host the images on a different machine it works, but that's not what I want.
Thanks!
I think there are some problems with your code.
First, I don't believe there is an @app.post() decorator in Flask. My guess is that you were trying to specify that the route should accept POSTs from your users. The way to do that would be @app.route('/post', methods=['POST']).
Next, it seems like you want the /post endpoint to send a POST request to a user-specified(?) URL when the user sends an HTTP request to this endpoint. The way you would do that for a user-specified / user-POSTed URL is something like this (I haven't run this code to test it):
import json
import requests
from flask import jsonify, request, url_for

@app.route('/send_post_request', methods=['POST'])
def send_post_request():
    user_posted_data = json.loads(request.data)
    user_specified_url = user_posted_data['url']
    dict_to_post = {'input': url_for('host')}
    headers = {}  # Fill these in
    response = requests.post(user_specified_url, json=dict_to_post, headers=headers)
    return jsonify(response.json())
If the URL to send the POST request to is known by the server, you could have your user simply send a GET request:
@app.route('/send_post_request', methods=['GET'])
def send_post_request():
    dict_to_post = {'input': url_for('host')}
    headers = {}  # Fill these in
    server_specified_url = ''  # Fill this in
    response = requests.post(server_specified_url, json=dict_to_post, headers=headers)
    return jsonify(response.json())
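On the original question of post() blocking: if both endpoints are served by a single-threaded Flask development server, the nested request to /host cannot be answered while /post is still waiting for it, which looks exactly like a hang. A small, hedged sketch of a development-only workaround (whether this is the actual cause in your setup is an assumption):

if __name__ == '__main__':
    # Let the dev server handle the nested request to /host
    # while /post is still waiting on requests.post(...)
    app.run(threaded=True)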

Redirect to external url while sending a JSON object or String from flask app

I have a Flask application where I need to redirect to a URL outside the Flask root path (www.externalurl.com). This URL, on the client side, requires consuming some information (either a JSON object or a string). How do I send this information along while redirecting?
Consider the example below:
from flask import Flask, redirect

app = Flask(__name__)

@app.route('/<some_path>')
def some_method(some_path):
    # some_info = {'key1': 'value1'}
    response = redirect("http://www.externalurl.com", code=302)
    return response
I have come across setting headers and cookies, but I understand that cookies are limited to about 4 KB and I don't want that limitation. Are there any standard ways of doing this?
You mentioned that you have control over externalurl.com.
You can associate a key or hash with the JSON object and store it, to be retrieved by a script on externalurl.com later.
So, store it in a database at example.com and set up an endpoint for it to be requested later:
@app.route('/<some_path>')
def some_method(some_path):
    some_info = {'key1': 'value1'}
    db_entry = JsonInfo(some_info)
    db.add(db_entry)
    db.commit()
    hash = db_entry.hash
    response = redirect("http://www.externalurl.com?hash={}".format(hash), code=302)
    return response

@app.route('/get-json')
def get_json():
    hash = request.args.get("hash")
    json_info = JsonInfo.query.filter_by(hash=hash).first()
    return jsonify(json_info)
Then, on the externalurl.com side, parse the hash from the GET request and use it to make a request to https://example.com/get-json?hash=abc123 to retrieve the info.
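For illustration, the lookup on the externalurl.com side could be as simple as the sketch below; example.com and the abc123 hash are placeholders carried over from the answer, not real endpoints:

import requests

# The hash arrives as a query parameter on the redirect, e.g. ?hash=abc123
incoming_hash = "abc123"
resp = requests.get("https://example.com/get-json", params={"hash": incoming_hash})
some_info = resp.json()  # the JSON object stored before the redirect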

Trying to write a unit test for file upload to a django Restless API

I'm writing a fairly small, lightweight REST API, so I chose restless as the quickest/easiest option. I didn't seem to need all the complexity and support of the django-REST module. My service will only receive and send JSON, but users need to upload files to one single endpoint. Currently my view/API code for the file upload looks like:
class SubmissionCreate(Endpoint):
    def post(self, request):
        # get the data from the post request
        data = {}
        data['input_data'] = request.FILES['input_data'].read().decode('UTF-8')
        data['submission_name'] = request.FILES['submission_name'].read().decode('UTF-8')
        submission_form = SubmissionForm(data)
        if submission_form.is_valid():
            s = submission_form.save()
            return {'message': 'file uploaded'}
        else:
            return {'error': 'Input information is not correctly formatted'}
I also wrote a small client with Requests to upload files
import os
import requests
import json
url = 'http://127.0.0.1:8000/submission/create/'
payload = {'input_data': ('input.txt', open('./static/files/file1.txt', 'rb')), 'submission_name': 'test'}
r = requests.post(url, files=payload)
This works great and I can push files to the database with my client. But obviously I want some proper testing before I give this any more complex behaviour, so I looked at the docs and wrote the following test:
from django.core.files.uploadedfile import SimpleUploadedFile
from django.test import TestCase
from django.urls import reverse

class SubmissionCreateTests(TestCase):
    def test_submissioncreate_will_accept_data(self):
        f = SimpleUploadedFile("file.txt", bytes("file_content", 'utf-8'))
        response = self.client.post(reverse('submission_data'),
                                    {'input_data': f, 'submission_name': 'test'})
        self.assertEqual(response.status_code, 200)
However, this produces the following error:
django.utils.datastructures.MultiValueDictKeyError: "'submission_name'"
If I set the content_type to 'application/x-www-form-urlencoded' I get the same error.
If I set the content_type to 'multipart/form-data' I get a 400 error, but the tests run with no exception thrown.
I tried to fix this, but in the end it was quicker and easier to switch to Django REST framework. The docs are so much better for Django REST framework that it was trivial to set it up and build tests. There seem to be no time savings to be had with either the restless or django-restless modules for Django.
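For anyone making the same switch, here is a hedged sketch of what the equivalent upload test might look like with Django REST framework's test tools; the submission_data URL name is reused from the question, while APITestCase and format='multipart' are standard DRF testing features rather than code from this thread:

from django.core.files.uploadedfile import SimpleUploadedFile
from django.urls import reverse
from rest_framework import status
from rest_framework.test import APITestCase

class SubmissionCreateTests(APITestCase):
    def test_submissioncreate_will_accept_data(self):
        f = SimpleUploadedFile("file.txt", b"file_content")
        response = self.client.post(
            reverse('submission_data'),
            {'input_data': f, 'submission_name': 'test'},
            format='multipart',  # encode the body as a multipart form rather than JSON
        )
        self.assertEqual(response.status_code, status.HTTP_200_OK)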
