pysqlite - how to save images - python

I need to save an image file into an SQLite database in Python. I could not find a solution. How can I do it?
Thanks in advance.

Write:

    cursor.execute('insert into File (id, name, bin) values (?, ?, ?)',
                   (id, name, sqlite3.Binary(file.read())))

Read:

    file = cursor.execute('select bin from File where id=?', (id,)).fetchone()

If you need to return the binary data in a web app:

    return cStringIO.StringIO(file['bin'])

(On Python 3, use io.BytesIO instead of cStringIO.StringIO; also note that indexing the row by name as file['bin'] requires setting conn.row_factory = sqlite3.Row, otherwise use file[0].)
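Put together, a minimal self-contained sketch of the same approach (the database and image file names are illustrative):

    import io
    import sqlite3

    conn = sqlite3.connect('images.db')
    conn.execute('create table if not exists File (id integer primary key, name text, bin blob)')

    # Write: read the image as raw bytes and bind it as a BLOB parameter.
    with open('photo.png', 'rb') as f:
        conn.execute('insert into File (name, bin) values (?, ?)',
                     ('photo.png', sqlite3.Binary(f.read())))
    conn.commit()

    # Read: fetch the bytes back and wrap them in a file-like object.
    row = conn.execute('select bin from File where name=?', ('photo.png',)).fetchone()
    data = io.BytesIO(row[0])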

Do you have to store the image in the database? I would write the image to the filesystem and store its path in the DB. (You may not be able to do this, depending on your particular case.)
If you absolutely must, look here.
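If you can take that route, a minimal sketch (the directory, table layout, and file names are illustrative):

    import os
    import shutil
    import sqlite3

    IMAGE_DIR = 'images'  # hypothetical directory for stored files
    os.makedirs(IMAGE_DIR, exist_ok=True)

    conn = sqlite3.connect('app.db')
    conn.execute('create table if not exists File (id integer primary key, name text, path text)')

    # Copy the image into the managed directory and record only its path.
    dest = os.path.join(IMAGE_DIR, 'photo.png')
    shutil.copyfile('photo.png', dest)
    conn.execute('insert into File (name, path) values (?, ?)', ('photo.png', dest))
    conn.commit()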

I am not sure whether pysqlite is the same as sqlite3, which is currently the default in the Python standard library. But if you use sqlite3 you can store the image in a buffer object and save that in a BLOB field in SQLite.
Be aware of the following, though:
storing images in a database is frowned upon by some; storing the files on disk and their paths in the database is the other possibility.
make sure you return the proper MIME type
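On the MIME type point, a small sketch using the stdlib's mimetypes module (assuming name holds the stored filename, as in the snippet above):

    import mimetypes

    # Guess the MIME type from the stored filename so the browser renders
    # the image instead of treating it as opaque bytes.
    mime_type, _ = mimetypes.guess_type(name)   # e.g. 'image/png' for 'photo.png'
    if mime_type is None:
        mime_type = 'application/octet-stream'  # safe fallback for unknown types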

It's never a good idea to store raw binary data in a database. Couldn't you just save the file on the filesystem and record its path in the database?

Related

How do I store downloaded PDF files in MongoDB

I downloaded some PDFs and stored them in a directory. I need to insert them into a MongoDB database with Python code; how can I do that? I need to store them with three fields (pdf_name, pdf_ganerateDate, FlagOfWork), something like that.
You can use GridFS. Please check this URL.
It will help you store any file in MongoDB and get it back. You can save the file metadata in another collection.
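A minimal sketch with pymongo and GridFS (the database and file names are illustrative; the metadata fields are taken from the question):

    import datetime
    import gridfs
    from pymongo import MongoClient

    client = MongoClient('mongodb://localhost:27017')
    db = client['pdf_db']  # hypothetical database name
    fs = gridfs.GridFS(db)

    # Store the PDF itself in GridFS, which chunks large files automatically.
    with open('report.pdf', 'rb') as f:
        file_id = fs.put(f, filename='report.pdf')

    # Keep the metadata in a separate collection, as suggested above.
    db.pdf_metadata.insert_one({
        'file_id': file_id,
        'pdf_name': 'report.pdf',
        'pdf_ganerateDate': datetime.datetime.utcnow(),
        'FlagOfWork': False,
    })

    # Later, read the PDF back by its id.
    data = fs.get(file_id).read()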

How to save an SVG image in MySQL (from Python 3.6)

How can one save a large SVG image in a MySQL table?
My problem is that my SVGs run up to 200K characters, which appears to be too much to save in my table.
When I try to save the column as TEXT (using Python 3.6 with Anaconda), SQLAlchemy tells me the following:
sqlalchemy.exc.DataError: (pymysql.err.DataError) (1406, "Data too long for column 'cantons_svg' at row 27") [SQL: 'INSERT INTO ...]
I encountered this problem today when trying to store videos in TiDB. I am using Flask as the backend framework and SQLAlchemy as the ORM, connecting to my database with the MySQL Python connector.
The log is as follows:
sqlalchemy.exc.DataError: (pymysql.err.DataError) (1406, "Data too long for column 'video' at row 1")
[SQL: INSERT INTO videos (user_id, token_id, video) VALUES (%(user_id)s, %(token_id)s, %(video)s)]
I found little advice about this situation; among it, one suggestion was to look for user-defined storage types in SQLAlchemy, which seems quite complicated (if anyone finds a doc or something that gives clear guidance on this, please tell me).
As for me, I just used SQLAlchemy's BLOB type to initialize the database, then ran

    ALTER TABLE videos MODIFY COLUMN video LONGBLOB DEFAULT NULL;

to manually change the column type. This works fine for me.
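If you would rather avoid the manual ALTER, a sketch of declaring the column with MySQL's LONGBLOB dialect type directly in SQLAlchemy (the table layout and connection string are illustrative):

    from sqlalchemy import Column, Integer, create_engine
    from sqlalchemy.dialects.mysql import LONGBLOB
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()

    class Video(Base):
        __tablename__ = 'videos'
        id = Column(Integer, primary_key=True)
        user_id = Column(Integer)
        # MySQL's plain BLOB caps at 64 KB, which is what triggers the
        # "Data too long" error; LONGBLOB holds up to 4 GB.
        video = Column(LONGBLOB)

    engine = create_engine('mysql+pymysql://user:pass@localhost/mydb')  # hypothetical DSN
    Base.metadata.create_all(engine)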

How to store and download a zip file in a Postgres database

I'm trying to store and then download a zip file from a Postgres database. I know this is not the best approach (I should only save the path to the file), but I need to do it this way, just for learning and practice.
I wrote a Python script to store the content of the file in a bytea field, but that was not my final goal. I really want to know how to save the zip file itself.
Any ideas? I only know Python, so I'm trying to do this in Python.
Thank you guys!
If you can store a file in a bytea field, then storing a zipped file is just the same.
Postgres doesn't have a concept of a "file" field; you simply store the content (as you did for the original content) in a bytea field.
If you're asking about zipping a file on the fly, take a look at zlib; it's one of the common modules for such tasks (for .zip archives specifically, the stdlib's zipfile module is the usual choice).
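A minimal sketch of the store/download round trip with psycopg2 (the DSN, table, and file names are illustrative):

    import psycopg2

    conn = psycopg2.connect('dbname=test user=postgres')  # hypothetical DSN
    cur = conn.cursor()
    cur.execute('create table if not exists files (id serial primary key, name text, data bytea)')

    # Store: the zip's bytes go into the bytea column like any other binary content.
    with open('archive.zip', 'rb') as f:
        cur.execute('insert into files (name, data) values (%s, %s)',
                    ('archive.zip', psycopg2.Binary(f.read())))
    conn.commit()

    # Download: fetch the bytes back and write them to disk.
    cur.execute('select data from files where name = %s', ('archive.zip',))
    blob = cur.fetchone()[0]
    with open('downloaded.zip', 'wb') as f:
        f.write(bytes(blob))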
Regards
Jony

Insert users into Active Directory

I am trying to determine how best to insert users into Active Directory from a SQL Server table.
I figured I could use the LDAP server to do an insert, but the research I've done suggests otherwise: that I can only pull data from Active Directory into SQL Server.
Then I thought I could use a Python program to query the table and spit out a CSV file for a bulk insert, but I am not sure whether this would modify existing users if data changes.
Any insight would be appreciated.
Here's a general idea of the algorithm:
1. Load user data from SQL Server.
2. Convert it into an LDIF (LDAP Data Interchange Format) file.
3. Import the LDIF file into Active Directory using the LDIFDE command-line tool.
Python, or any other programming language, can help you with step 2. Note that the details of the conversion are very specific to how your data is represented: you'll have to carefully map each database field to an LDAP attribute and determine the classes to use for the LDAP objects.
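A rough sketch of step 2 (the DSN, query, DN template, and attribute mapping are all assumptions to adapt to your schema):

    import pyodbc  # one common way to query SQL Server from Python

    conn = pyodbc.connect('DSN=mydb')  # hypothetical DSN
    cur = conn.cursor()
    cur.execute('select username, first_name, last_name from users')

    with open('users.ldif', 'w') as ldif:
        for username, first, last in cur:
            # Each record becomes one LDIF entry; a blank line terminates it.
            ldif.write(f'dn: CN={first} {last},OU=Staff,DC=example,DC=com\n')
            ldif.write('objectClass: user\n')
            ldif.write(f'sAMAccountName: {username}\n')
            ldif.write(f'givenName: {first}\n')
            ldif.write(f'sn: {last}\n\n')

You would then import the result with ldifde -i -f users.ldif (step 3).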
Will the above modify existing users? Yes, of course. You could write the LDIF in such a way that it updates existing data, or, if that's a problem, first verify whether a user already exists in Active Directory and leave those entries out of the LDIF file.
Alternatively, you could use CSVDE to import data in CSV format, but either way you'll have to design a mapping strategy for each field you want to import into Active Directory.

How will Python DB-API read JSON-format data into an existing database?

If we have a JSON data file that stores all of our database content (table names, rows, columns, and so on), how can we use a DB-API object to insert, update, or delete data from the JSON file in a database such as SQLite or MySQL? Or please share if you have a better idea for handling it. People say it is good to save database data in JSON format, which makes it much more convenient to work with the database in Python.
Thanks so much! Please advise!
There's no magic way; you'll have to write a Python program to load your JSON data into a database. SQLAlchemy is a good tool to make that easier.
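As an illustration, a minimal sketch against SQLite using the stdlib's DB-API module (the JSON layout here is an assumption; adapt it to however your file encodes tables and rows):

    import json
    import sqlite3

    # Assumed layout: {"table": "users", "rows": [{"id": 1, "name": "alice"}, ...]}
    with open('data.json') as f:
        payload = json.load(f)

    conn = sqlite3.connect('app.db')
    table = payload['table']
    for row in payload['rows']:
        cols = ', '.join(row)
        marks = ', '.join('?' for _ in row)
        # Identifiers can't be bound as parameters, so the table and column
        # names are interpolated directly; this assumes trusted input.
        conn.execute(f'insert into {table} ({cols}) values ({marks})',
                     list(row.values()))
    conn.commit()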
