I have an iOS mobile application that sends an encoded image to a Python3 server.
static func prepareImageAndUpload(imageView: UIImageView) -> String? {
    if let image = imageView.image {
        // Create NSData from the image
        let imageData = UIImageJPEGRepresentation(image, 0.5)
        // Create a base64 string
        let base64String = imageData!.base64EncodedStringWithOptions([])
        // Percent-encode it to avoid problems with special characters
        let encodeImg = base64String.stringByAddingPercentEncodingWithAllowedCharacters(.URLHostAllowedCharacterSet())
        return encodeImg
    }
    return nil
}
And I am trying to receive that image using the following code:
imageName = "imageToSave.jpg"
fh = open(imageName, "wb")
imgDataBytes = bytes(imgData, encoding="ascii")
imgDataBytesDecoded = base64.b64decode(imgDataBytes)
fh.write(imgDataBytesDecoded)
fh.close()
The image file is created successfully and nothing breaks. I can see that the file size is correct, but the image itself is not: it can't be opened and is reported as broken.
I am not sure where the error can be, since the logic is as follows (a short server-side sketch follows the list):
Encode image with base64 on iOS device
Send it
Decode image with base64 on Python3 server
Save image from decoded bytes
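For reference, here is a minimal sketch of what steps 3 and 4 could look like on the server if the percent-encoding from the Swift code is undone first; the function and variable names are illustrative, not part of the original code.
import base64
import urllib.parse

def save_image(img_data: str, path: str = "imageToSave.jpg") -> None:
    # Undo the percent-encoding applied on the iOS side. Note that if the value
    # arrived as form data, '+' may already have been turned into a space,
    # which would corrupt the base64 payload before it reaches this point.
    b64_string = urllib.parse.unquote(img_data)
    with open(path, "wb") as fh:
        fh.write(base64.b64decode(b64_string))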
I have tried two new variants:
Remove stringByAddingPercentEncodingWithAllowedCharacters on the iOS side; the result was the same
Add urldecode on the Python3 server; the result was the same
I need to compress a file into a specific format required by our country's tax regulation entity, and it has to be sent base64-encoded.
I work in Python 3 and attempted the compression with the following code:
import base64
import gzip

# Work file generated earlier and stored in bytes_buffer
my_file = bytes_buffer.getvalue()

def compress(work_file):
    encoded_work_file = base64.b64encode(work_file)
    compressed_work_file = gzip.compress(encoded_work_file)
    return base64.b64encode(compressed_work_file)

compress(my_file)
Now the tax entity returns an error message about an unknown compression format.
Luckily, they provided us with the following Java example code:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class DemoGZIP {

    private final byte[] BUFFER = new byte[1024];

    /**
     * @param work_file File to compress.
     *                  The file is compressed over the original file name with the extension .zip
     * @return boolean
     *         TRUE  success
     *         FALSE failure
     */
    public boolean compress(File work_file) {
        try (GZIPOutputStream out = new GZIPOutputStream(new FileOutputStream(work_file.getAbsolutePath() + ".zip"));
             FileInputStream in = new FileInputStream(work_file)) {
            int len;
            while ((len = in.read(BUFFER)) != -1) {
                out.write(BUFFER, 0, len);
            }
            out.close();
        } catch (IOException ex) {
            System.err.println(ex.getMessage());
            return false;
        }
        return true;
    }
}
The problem is that I do not have any experience working in Java and do not understand much of the provided code.
Can someone please help me adapt my Python code to do what the provided Java code does?
As noted in the comment, the Java code does not do any Base64 encoding, and it names the resulting file incorrectly: it is most definitely not a zip file but a gzip file, so the suffix should be ".gz". That said, I doubt the name matters to your tax agency.
More importantly, you are encoding with Base64 twice. From your description, you should only do that once, after gzip compression. From the Java code, you shouldn't do Base64 encoding at all! You need to get clarification on that.
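For what it's worth, here is a minimal Python sketch of what the Java example does (gzip only, no Base64), plus a variant with a single Base64 pass applied after compression in case that is what the agency's spec actually requires; which of the two is correct depends on the clarification mentioned above.
import base64
import gzip

def compress(work_file: bytes) -> bytes:
    # Plain gzip of the original bytes, matching the Java example.
    return gzip.compress(work_file)

def compress_and_encode(work_file: bytes) -> bytes:
    # Single Base64 pass applied after compression, per the question's
    # description of the requirement.
    return base64.b64encode(gzip.compress(work_file))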
I have a POST endpoint in Flask that accepts an image file, and I want it to return another image so I can retrieve it in Flutter and put it on screen.
In Flutter, I can send the image through the post request, but I don't know how to retrieve an image and put it on screen.
I know I can save the image in the static folder at Flask, and retrieve the URL from Flutter, and it works, but I think this is too inefficient for what I'm doing.
So I want to send the image directly without saving it.
This was my last attempt, but it didn't work.
@app.route("/send-image", methods=['POST'])
def send_image():
    if request.method == 'POST':
        user_image = request.files["image"]
        image = cv2.imdecode(np.frombuffer(user_image.read(), np.uint8), cv2.IMREAD_COLOR)
        # data is a NumPy array returned by the predict function; this array is an image
        data = predict(image)
        data_object = {}
        data = data.reshape(data.shape[0], data.shape[1], 1)
        data2 = array_to_img(data)
        b = BytesIO()
        data2.save(b, format="jpeg")
        b.seek(0)
        data_object["img"] = str(b.read())
        return json.dumps(data_object)
Here I returned a Uint8List because I read on the internet that I can pass it to Image.memory() to put the image on the screen.
Future<Uint8List> makePrediction(File photo) async {
  const url = "http://192.168.0.11:5000/send-image";
  try {
    FormData data = new FormData.fromMap({
      "image": await MultipartFile.fromFile(photo.path),
    });
    final response = await dio.post(url, data: data);
    String jsonResponse = json.decode(response.data)["img"].toString();
    List<int> bytes = utf8.encode(jsonResponse.substring(2, jsonResponse.length - 1));
    Uint8List dataResponse = Uint8List.fromList(bytes);
    return dataResponse;
  } catch (error) {
    print("ERRORRR: " + error.toString());
  }
}
Sorry if what I did here doesn't make sense, but after trying a lot of things I wasn't thinking clearly. I really need your help.
You can convert the image to base64 and display it with Flutter.
On server:
import base64
...
data_object["img"] = base64.b64encode(b.read()).decode('ascii')
...
On client:
...
String imageStr = json.decode(response.data)["img"].toString();
Image.memory(base64Decode(imageStr));
...
The problem with your server-side code is that it tries to coerce a bytes object to str by calling str().
However, in Python 3, str() falls back to bytes.__repr__ since bytes.__str__ is not defined. This results in something like this:
str(b'\xf9\xf3') == "b'\\xf9\\xf3'"
It makes the JSON response look like:
{"img": "b'\\xf9\\xf3'"}
Without writing a dedicated parser, you cannot read image data in that format in Flutter. Base64, however, is a well-known encoding for binary data, and Flutter already ships a parser for it: base64Decode.
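Putting the pieces together, a sketch of the corrected route could look like the following; predict() and array_to_img() are assumed to be the same helpers used in the question, and only the last line really changes from the original code.
import base64
import json
from io import BytesIO

import cv2
import numpy as np
from flask import Flask, request

app = Flask(__name__)

@app.route("/send-image", methods=['POST'])
def send_image():
    user_image = request.files["image"]
    image = cv2.imdecode(np.frombuffer(user_image.read(), np.uint8), cv2.IMREAD_COLOR)
    data = predict(image)  # NumPy array produced by the model, as in the question
    data = data.reshape(data.shape[0], data.shape[1], 1)
    img = array_to_img(data)  # PIL image, via the same Keras helper the question uses
    b = BytesIO()
    img.save(b, format="jpeg")
    b.seek(0)
    # The actual fix: base64-encode the JPEG bytes instead of calling str() on them.
    return json.dumps({"img": base64.b64encode(b.read()).decode("ascii")})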
I'm trying to retrieve a picture over an input stream and then set my ImageView to that picture. Below is the code I use to read the data from the stream.
let bufferSize = 1024
var buffer = Array<UInt8>(repeating: 0, count: bufferSize)
while (inputStream.hasBytesAvailable) {
    let len = inputStream.read(&buffer, maxLength: buffer.count)
    if (len > 0) {
        let imageData = NSData(bytes: &buffer, length: buffer.count)
        print("\(imageData)")
        let newImage = UIImage(data: imageData as Data)
        ImageView.image = newImage
    }
}
I'm using CFStream sockets for receiving data. All of this works fine, as I'm able to send and receive all my other data, but for some reason the picture doesn't come through.
Many thanks.
UPDATE.
The code from the answer below doesn't throw any errors, but the picture is still not showing. Is there anything wrong with my server code?
f = open("/home/pi/Documents/server/foo.jpg", "rb")
byte = f.read()
print "Done reading"
self.transport.write(byte)
f.close()
print "Done sending"
The server code is written with Twisted (Python), and the connection itself is working, since I'm able to transfer messages other than the picture.
So, to accumulate the data fragments into one Data, you need to write something like this:
let bufferSize = 1024
var buffer = Array<UInt8>(repeating: 0, count: bufferSize)
//You need to assign an actual Data to the variable.
//With initial value given, you have no need to work with Optionals.
var imageData = Data()
while (inputStream.hasBytesAvailable) {
    let len = inputStream.read(&buffer, maxLength: buffer.count)
    if (len > 0) {
        //Append only the bytes actually read, not the whole buffer.
        imageData.append(&buffer, count: len)
    }
}
//You need to try converting the data to image after reading all data fragments.
let newImage = UIImage(data: imageData)
pictureView.image = newImage
If you find other issues with this code, please report as comments to this answer.
My mistake was not sending each byte of the file. When I changed my server code to:
with open("/home/pi/Documents/server/foo.jpg", "rb") as f:
    byte = f.read()
    print "Done reading"
    while byte:
        self.transport.write(byte)
        byte = f.read(1)
    print "Done sending"
    f.close()
It worked fine. However, I will accept the answer above, because it might also have been a problem with how I was reading the data into the new image file.
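As an aside, a more conventional sketch of the same server-side idea is to stream the file in fixed-size chunks rather than one byte at a time; self.transport here is the same Twisted transport used in the question, and the chunk size is arbitrary.
# Sketch only: stream the file in 1 KB chunks through the existing transport.
with open("/home/pi/Documents/server/foo.jpg", "rb") as f:
    while True:
        chunk = f.read(1024)
        if not chunk:
            break
        self.transport.write(chunk)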
I'm attempting to grab an image attached to an email using Jython 2.5.3. I get the email (using the Jython version of the Python imap library). I can get the attachment by looping through the parts and finding the correct part type using get_content_type():
image, img_ext = None, None
for part in self.mail.get_payload():
    part_type, part_ext = part.get_content_type().split('/')
    part_type = part_type.lower().strip()
    part_ext = part_ext.lower().strip()
    if part_type == 'image':
        image = part.get_payload(decode=True)
        img_ext = part_ext
return image, img_ext
'image' is returned as a big block of bytes, which in regular Python I'd write out directly to a file. However, when I try the same thing in Jython I get the following error:
TypeError: write(): 1st arg can't be coerced to java.nio.ByteBuffer[], java.nio.ByteBuffer
What's the right way to make Jython recognize my big blob of data as a byte array?
PS: the writing code uses tempfile.mkstemp(), which defaults to writing binary...
For future readers, here's how I got around it. In the code that does the writing:
from org.python.core.util import StringUtil
from java.nio import ByteBuffer
tmp, filename = tempfile.mkstemp(suffix = "." + extension, text=True)
bytes = StringUtil().toBytes(attachment)
bb = ByteBuffer.wrap(bytes)
tmp.write(bb)
tmp.close()
I'm trying to read an image file from a server with the code below. It keeps going into the exception. I know the correct number of bytes is being sent, as I print it out when received. I'm sending the image file from Python like so:
# Open the image file and read it into an object
imgfile = open(marked_image, 'rb')
obj = imgfile.read()
# Get the number of bytes in the image and convert it to a string
bytes = str(len(obj))
# Send the number of bytes
self.conn.send(bytes + '\n')
if self.conn.sendall(obj) == None:
    imgfile.flush()
    imgfile.close()
    print 'Image Sent'
else:
    print 'Error'
Here is the Android part; this is where I'm having the problem. Any suggestions on the best way to go about receiving the image and writing it to a file?
// Read the number of bytes in the image
String noOfBytes = in.readLine();
Toast.makeText(this, noOfBytes, 5).show();
byte bytes[] = new byte[Integer.parseInt(noOfBytes)];
// Create a file to store the retrieved image
File photo = new File(Environment.getExternalStorageDirectory(), "PostKey.jpg");
DataInputStream dis = new DataInputStream(link.getInputStream());
try {
    os = new FileOutputStream(photo);
    byte buf[] = new byte[1024];
    int len;
    while ((len = dis.read(buf)) > 0)
        os.write(buf, 0, len);
    Toast.makeText(this, "File recieved", 5).show();
    os.close();
    dis.close();
} catch (IOException e) {
    Toast.makeText(this, "An IO Error Occured", 5).show();
}
EDIT: I still can't seem to get it working. I have been at it since, and all my efforts have resulted in either a file that is not the full size or the app crashing. I know the file is not corrupt on the server side before sending. As far as I can tell it is definitely being sent too, since sendall in Python sends everything or throws an exception on error, and so far it has never thrown one. So the client side is messed up. I have to send the file from the server, so I can't use the suggestion from Brian.
The best way to get a bitmap from a server is to execute the following.
HttpClient client = new DefaultHttpClient();
HttpGet get = new HttpGet("http://yoururl");
HttpResponse response = client.execute(get);
InputStream is = response.getEntity().getContent();
Bitmap image = BitmapFactory.decodeStream(is);
That will give you your bitmap; to save it to a file, do something like the following.
FileOutputStream fos = new FileOutputStream("yourfilename");
image.compress(CompressFormat.PNG, 1, fos);
fos.close();
You can also combine the two and just write straight to disk:
HttpClient client = new DefaultHttpClient();
HttpGet get = new HttpGet("http://yoururl");
HttpResponse response = client.execute(get);
InputStream is = response.getEntity().getContent();
FileOutputStream fos = new FileOutputStream("yourfilename");
byte[] buffer = new byte[256];
int read = is.read(buffer);
while (read != -1) {
    fos.write(buffer, 0, read);
    read = is.read(buffer);
}
fos.close();
is.close();
Hope this helps.
I'm not sure I understand your code. You are calling dis.readFully(bytes); to read the content of dis into your byte array. But then you don't do anything with the array, and instead try to write the content of dis through a buffer into your FileOutputStream.
Try commenting out the line dis.readFully(bytes);.
As a side note, I would write to the log rather than popping up a toast for things like the number of bytes or when an exception occurs:
...
} catch (IOException e) {
    Log.e("MyTagName", "Exception caught " + e.toString());
    e.printStackTrace();
}
You could look at these links for examples of writing a file to the SD card:
Android download binary file problems
Android write to sd card folder
I solved it with the help of a Ubuntu Forums member. The reading of the bytes was the problem: it was cutting some of the bytes from the image. The solution was to just send the image whole and remove the sending of the byte count from the equation altogether.
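For completeness, here is a minimal sketch of what that ends up looking like on the server side; self.conn and marked_image are the same names used in the question's sending code, and the byte-count line is simply dropped.
# Sketch only: send the whole image in one go and let sendall handle it.
with open(marked_image, 'rb') as imgfile:
    self.conn.sendall(imgfile.read())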