I need to send a JSON file from a Raspberry Pi (programmed in Python) to Laravel. Looking on the internet I found the Python requests library:
https://docs.python-requests.org/en/latest/
but when I go to view the JSON data in Laravel, nothing appears.
The Python code is this:
import json
import requests

# mac_dict, ip and headers are defined elsewhere in the script.

def readFile():
    with open("/home/pi/Desktop/Progetti SIoTD/Bluetooth/device.txt", "r") as file:
        for i in file:
            line, *lines = i.split()
            if line in mac_dict:
                mac_dict[line] += lines
            else:
                mac_dict[line] = lines

    print(mac_dict)
    print("\n")

    json_obj = json.dumps(mac_dict, indent=4)
    with open("/home/pi/Desktop/Progetti SIoTD/Bluetooth/mac_addr.json", "w") as json_file:
        json_file.write(json_obj)

    r = requests.get(ip, data=json_obj, headers=headers)
    print(r.text)
I use this function to read a txt file (containing the various Bluetooth MAC addresses and their respective RSSI values) and then transform it into JSON. Now I wanted to understand how to send it to Laravel and display something
The function in Laravel is this:
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;

class DictionaryController extends Controller
{
    public function index()
    {
        return view('backend.auth.user.dictionary');
    }

    public function store(Request $request)
    {
        return $request;
    }

    public function show($id)
    {
        //
    }

    public function update(Request $request, $id)
    {
        //
    }

    public function destroy($id)
    {
        //
    }
}
The route in Laravel is in api.php:
Route::get('dictionary', [DictionaryController::class, 'store'])->name('dictionary');
I hope I have explained this well and that someone can help me; I have been stuck on this for several days.
I also need to know how to use PHP/Blade in Laravel to display the JSON file (if that is possible).
Thanks in advance
I use this function to read a txt file (containing the various Bluetooth MAC addresses and their respective RSSI values) and then transform it into JSON. Now I wanted to understand how to send it to Laravel and display something
I assume that writing the JSON via Python has worked, so what you're asking is how Laravel can read the JSON file.
class DictionaryController extends Controller
{
    public function index()
    {
        $json = file_get_contents('/home/pi/Desktop/Progetti SIoTD/Bluetooth/mac_addr.json');

        // Decode first so the response contains the data itself rather than a double-encoded string.
        return response()->json(json_decode($json, true));
    }
...
Route:
Route::get('dictionary', [DictionaryController::class, 'index'])->name('dictionary.index');
Or, if you want to work with a template, you can use:
class DictionaryController extends Controller
{
    public function index()
    {
        $json = file_get_contents('/home/pi/Desktop/Progetti SIoTD/Bluetooth/mac_addr.json');
        $data['macs'] = json_decode($json, true);

        return view('backend.auth.user.dictionary', $data);
    }
...
And then you can process it however you want in the Blade template.
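On the Python side, note that the question's script sends the data with requests.get, while a JSON payload is normally POSTed. A minimal sketch of the sending side, assuming the api.php route is switched to Route::post('dictionary', [DictionaryController::class, 'store']) and that <laravel-host> is a placeholder for your server's address:

import json
import requests

# Hypothetical host; the path assumes the api.php route above, which Laravel prefixes with /api.
url = "http://<laravel-host>/api/dictionary"

with open("/home/pi/Desktop/Progetti SIoTD/Bluetooth/mac_addr.json") as json_file:
    payload = json.load(json_file)

# requests serializes the dict and sets the Content-Type: application/json header itself.
r = requests.post(url, json=payload, headers={"Accept": "application/json"})
print(r.status_code, r.text)

With a POST route like that, store() should see the decoded payload via $request->all().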
Related
I am pretty new to Jackson and to how it really works under the hood when things get complicated, which is the issue I am facing. I have a record coming from the database which is stored as a JSON type column (using Postgres). Below is a sample of how it looks in the database:
{"flutterwave": {"secret": "SECRET KEYS"}, "dlocal": {"xkey": X KEY VALUE", "xlogin": "X LOGIN VALUE"}}
Coming from the Python world, I would have just done json.loads(DATA_FROM_DB_IN_JSON), which automatically converts the result into a dictionary from which I can easily retrieve and use the keys as I want. However, with the Jackson library in Java, I haven't been able to get it to work.
Below is what I have done in Java; it does not work the way I would have expected it to had this been Python.
public class PaymentConfigDTO {

    @JsonAlias({"secrets"})
    @JsonDeserialize(using = KeepAsJsonDeserializer.class)
    private String processorCredentials;
}
DESERIALIZER CLASS
public class KeepAsJsonDeserializer extends JsonDeserializer<String> {

    @Override
    public String deserialize(JsonParser jsonParser, DeserializationContext deserializationContext) throws IOException, JacksonException {
        TreeNode tree = jsonParser.getCodec().readTree(jsonParser);
        return tree.toString();
    }
}
In summary, what I want to achieve is to convert the JSON coming from the DB into a Map<String, Map<String, String>>, or a better approach that lets me get the nested values without much stress.
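For reference, the Python behaviour being compared against here is just json.loads followed by nested dictionary access; a small sketch using the sample JSON above:

import json

raw = '{"flutterwave": {"secret": "SECRET KEYS"}, "dlocal": {"xkey": "X KEY VALUE", "xlogin": "X LOGIN VALUE"}}'

config = json.loads(raw)            # nested dict: {"flutterwave": {...}, "dlocal": {...}}
print(config["dlocal"]["xlogin"])   # -> X LOGIN VALUE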
If you want to get a Map<String, Map<String, String>> you can do:
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Map;
...
var mapper = new ObjectMapper();
var result = mapper.readValue(json, new TypeReference<Map<String, Map<String, String>>>() {});
But it would be better to use types:
record Config(Flutterwave flutterwave, DLocal dlocal) {}
record Flutterwave(String secret) {}
record DLocal(String xkey, String xlogin) {}
...
var mapper = new ObjectMapper();
var result = mapper.readValue(json, Config.class);
var secret = result.flutterwave().secret();
Assuming that you already have the JSON string, it is very simple to deserialize it to a Map<String, Object>. You don't need to write your own deserializer class. All you need to do is:
ObjectMapper om = new ObjectMapper();

try {
    Map<String, Object> map = om.readValue(jsonStr, Map.class);
} catch (IOException ioe) {
    ...
}
See JavaDoc for ObjectMapper
Also, if you want it even simpler, I wrote my own JsonUtils class where you don't even have to instantiate an ObjectMapper. Your code would look like this:
try {
    Map<String, Object> map = JsonUtils.readObjectFromJsonString(jsonStr, Map.class);
} catch (IOException ioe) {
    ...
}
In this example, the JsonUtils class comes with the open-source MgntUtils library, written and maintained by me. See the Javadoc for the JsonUtils class. The MgntUtils library can be obtained from Maven Central as a Maven artifact or from GitHub along with its source code and Javadoc.
So what I did, basically, was update the custom deserializer class to use a JsonNode instead of a TreeNode.
It was rewritten as:
@Override
public String deserialize(JsonParser jsonParser, DeserializationContext deserializationContext) throws IOException, JacksonException {
    JsonNode node = jsonParser.getCodec().readTree(jsonParser);
    return node.get("value").textValue();
}
Then, to convert it to a map, I did:
Map<String, Object> toMap = objectMapper.readValue(JSON, Map.class);
We have a C#/.NET Windows service that parses big log files for us and updates a meta table as it does so. The problem is that whenever you need to stop the service (or services; we have multiple of them running), you must manually delete the files that are being parsed in the local folder and also update the queue DB table that tracks the files to process.
I want to automate this. I am much more familiar with Python, so ideally it would be a Python script as opposed to .NET. Is it possible to have a script that triggers when the service is stopped? How would one do this?
I have tried doing this internally in the .NET service, but since it's multithreaded, files don't get cleaned up neatly. There's always a "can't stop service because another process is using it" error. It is as if the service gets stuck trying to delete files when the OnStop() method is called. This was how I had tried to do it internally within the service:
protected override void OnStop()
{
    ProducerConsumerQueue.Dispose();
    Logger.Info($"{ProducerConsumerQueue.Count()} logs will be canceled");
    CancellationTokenSource.Cancel();

    FileUtil.DeleteFilesInProgress(Constants.ODFS_STAGING);
    MetadataDbContext.UpdateServiceEntriesOnServiceReset();

    //look into some staging directory, delete all files.
    Logger.Info($"{ProducerConsumerQueue.Count()} logs canceled");
}
public static void DeleteFilesInProgress(string directory)
{
    var constantsutil = new ConstantsUtil();
    constantsutil.InitializeConfiguration();

    try
    {
        System.IO.DirectoryInfo di = new DirectoryInfo(directory);
        foreach (FileInfo file in di.GetFiles())
        {
            file.Delete();
        }
    }
    catch (Exception ex)
    {
        Logger.Error(ex.Message);
        string subject = Constants.GENERAL_EMAIL_SUBJECT;
        string body = "The following error occured in Client.Util.ConstantsUtil:";
        string error = ex.ToString(); //ex.ToString makes it more verbose so you can trace it.
        var result = EmailUtil.Emailalert(subject, body, error);
    }
}
public static int UpdateServiceEntriesOnServiceReset()
{
    int rowsAffected = 0;

    try
    {
        string connectionString = GetConnectionString();

        using (SqlConnection connection = new SqlConnection())
        {
            connection.ConnectionString = connectionString;

            SqlCommand cmd = new SqlCommand();
            cmd.CommandType = CommandType.Text;
            cmd.CommandText = $"UPDATE {Constants.SERVICE_LOG_TBL} SET STATUS = '0'";
            cmd.Connection = connection;

            connection.Open();
            rowsAffected = cmd.ExecuteNonQuery();
        }
    }
    catch (Exception ex)
    {
        Logger.Error($"{ex.Message.ToString()}");
        string subject = Constants.GENERAL_EMAIL_SUBJECT;
        string body = "The following error occured in Client.MetadatDbContext while Parser was processing:";
        string error = ex.ToString(); //ex.ToString makes it more verbose so you can trace it.
        var result = EmailUtil.Emailalert(subject, body, error);
    }

    return rowsAffected;
}
You can run your script from OnStop:
using System.Diagnostics;
Process.Start("python", "yourscript.py");
// or whatever the command for executing your python script is on your system.
And then use something like pywin32's win32service to find out the status of the service that launched the script, and then wait for it to die and release its hold on the files.
Then wipe them.
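A rough sketch of such a watcher script with pywin32, assuming a service name of "LogParserService" and a staging folder of C:\odfs\staging (both placeholders here, not names from the question):

import time
from pathlib import Path

import win32service
import win32serviceutil

SERVICE_NAME = "LogParserService"        # placeholder: use the real service name
STAGING_DIR = Path(r"C:\odfs\staging")   # placeholder: folder the service writes to

def wait_for_stop(poll_seconds=2):
    # QueryServiceStatus returns a SERVICE_STATUS tuple; index 1 is the current state.
    while win32serviceutil.QueryServiceStatus(SERVICE_NAME)[1] != win32service.SERVICE_STOPPED:
        time.sleep(poll_seconds)

def wipe_staging():
    for path in STAGING_DIR.glob("*"):
        if path.is_file():
            path.unlink()

if __name__ == "__main__":
    wait_for_stop()   # block until the service has fully stopped and released its files
    wipe_staging()

The queue-table update could be handled from the same script (for example with pyodbc) once the files are gone, mirroring what UpdateServiceEntriesOnServiceReset does.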
I am trying to send a file from a Python script to my .NET Core web server.
In Python I am doing this using the requests library, and my code looks like so:
filePath = "run-1.csv"

with open(filePath, "rb") as postFile:
    file_dict = {filePath: postFile}
    response = requests.post(server_path + "/batchUpload/create", files=file_dict, verify=validate_sql)
    print(response.text)
This code executes fine, and I can see the request arrive in my web server code, which looks like so:
[HttpPost]
[Microsoft.AspNetCore.Authorization.AllowAnonymous]
public string Create(IFormFile file) //Dictionary<string, IFormFile>
{
    var ms = new MemoryStream();
    file.CopyTo(ms);

    var text = Encoding.ASCII.GetString(ms.ToArray());
    Debug.Print(text);

    return "s";
}
However, the file parameter always comes through as null.
Also, I can see the file parameter fine when the data is posted from Postman.
I suspect that this problem has to do with how .NET Core model binding works, but I am not sure...
Any suggestions on how to get my file showing up on the server?
Solved my issue - the problem was that in Python I was assigning my file to the upload dictionary with the actual file name "./run1.csv" rather than the literal string "file".
Updating this fixed my issue:
file_dict = {"file": postFile}
This is what I believe @nalnpir mentioned above.
I figured this out by posting from Postman and also from my Python code to http://httpbin.org/post and comparing the responses.
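The httpbin comparison is simply posting the same multipart request to httpbin.org and inspecting the echoed files section, which shows exactly which field name went over the wire; a minimal sketch:

import requests

with open("run-1.csv", "rb") as post_file:
    # httpbin echoes the request back, including the multipart field names it received.
    r = requests.post("http://httpbin.org/post", files={"file": post_file})

print(r.json()["files"])   # the uploaded content, keyed by field name

Doing the same from Postman and diffing the two responses makes a mismatched field name obvious.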
The example from the requests docs is mostly correct, except that the key has to match the parameter name in the controller method signature.
url = 'https://www.url.com/api/post'
files = {'parameterName': open('filename.extension', 'rb')}
r = requests.post(url, files=files)
So in this case the controller action should be
[HttpPost]
public string Post(IFormFile parameterName)
Summary
I am trying to set my FormData properly using JavaScript.
I need to be able to upload JPG/PNG files, but I might need to upload some other file types (PDF/CSV) in the future using fetch.
Expected
I expect it to append the data to the form
Working
This snippet is working fine:
const formData = new FormData(document.querySelector('form'));
formData.append("extraField", "This is some extra data, testing");

return fetch('http://localhost:8080/api/upload/multi', {
  method: 'POST',
  body: formData,
});
Not working
const formData = new FormData();
const input = document.querySelector('input[type="file"]');
formData.append('files', input.files);
Question
Does fetch support multiple file upload natively?
If you want multiple files, you can use this:
var input = document.querySelector('input[type="file"]')
var data = new FormData()

for (const file of input.files) {
  data.append('files', file, file.name)
}

fetch('http://localhost:8080/api/upload/multi', {
  method: 'POST',
  body: data
})
The issue with your code is in the line formData.append('files', input.files);
Instead of that, you should append each file in a loop with unique keys, like this:
const fileList = document.querySelector('input[type="file"]').files;

for (var i = 0; i < fileList.length; i++) {
  formData.append('file' + i, fileList.item(i));
}
I have created a simple error fiddle here with your code. You can check its submitted post data here, where you can see that no file has been uploaded.
I have corrected the fiddle here with the fix. You can check its post data from the server, where it shows the details of the two files that I uploaded.
I mentioned this on a similar question: I had the same problem, but with a PHP backend. The unique formData keys work, but I found that the classic HTML notation worked just fine and simply results in an array on the server.
formData.append('file[]', data[i]);
I like that a lot better, since I can use the same methods to process this as with a classic <input type="file" multiple />.
I am using Node.js restify server code that accepts a text file upload, and Python client code that uploads the text file.
Here is the relevant Node.js server code:
server.post('/api/uploadfile/:devicename/:filename', uploadFile);
//http://127.0.0.1:7777/api/uploadfile/devname/filename

function uploadFile(req, res, next) {
    var path = req.params.devicename;
    var filename = req.params.filename;
    console.log("Upload file");

    var writeStream = fs.createWriteStream(path + "/" + filename);
    var r = req.pipe(writeStream);

    res.writeHead(200, {"Content-type": "text/plain"});

    r.on("drain", function () {
        res.write(".", "ascii");
    });

    r.on("finish", function () {
        console.log("Upload complete");
        res.write("Upload complete");
        res.end();
    });

    next();
}
This is the Python 2.7 client code:
import requests

file_content = 'This is the text of the file to upload'

r = requests.post('http://127.0.0.1:7777/api/uploadfile/devname/filename.txt',
                  files={'filename.txt': file_content},
                  )
The file filename.txt did appear on the server filesystem. However, the problem is that its contents are empty. If things had gone right, the content "This is the text of the file to upload" should appear, but it did not. What is wrong with the code? I am not sure whether it is the server code, the client code, or both that are wrong.
It looks like you're creating a file but never actually getting the uploaded file contents. Check out the bodyParser example at http://restify.com/#bundled-plugins. You need to give the bodyParser a function for handling multi-part data.
Alternatively, you could just use bodyParser without your own handlers and look for the uploaded file information in req.files, including where the temporary uploaded file is for copying to wherever you like.
var restify = require('restify');
var server = restify.createServer();

server.use(restify.bodyParser());

server.post('/upload', function (req, res, next) {
    console.log(req.files);
    res.end('upload');
    next();
});

server.listen(9000);
server.listen(9000);