How to read UniDAC .log files (database dumps) with Python

I have an old Delphi program that exports electrical measurement data from a device as .log files, which are database dumps. I need to convert them to .csv or some other easily accessible data format.
As I have no access to the source code, my options for finding out which database format was actually used are limited. I do know that the author was using UniDAC and that the password for database access is "admin".
When I open the .log files in Python's binary-read mode, the contents don't make much sense to me, as I don't know the correct encoding. Here is an example line:
b'\xa0=\xb1\xfc\xaa<\xb0\xd6\x8e=\xbbr\xd7<\xde\x92\x1a=\x82\xab"=|5%>o\xa4\xbe=\xce:\xe3>\xdf\xb9\xe2=~\x14\x8b=\xbb\xac(=-\x8bE=g\r\x03=\xeb\xc9-=\xcb\x81\xbf</(\x86=\xb8\x07\xad<\xaf\xd0)=\x94\x18\xa6<j#\x05=N\xd0\xa7<\xfe\x92\xbd=\xf0I\xfa<\x9dl\xec<\x08t\xba<9(\xf5<R}\xc8<E\xecI=?\x17\xaa=\xd9\xf4\x01=\xeet\x92=}\xb8\x82<>D\xe1<\x83w\x99<\xab\xd7\xe9<\x92\n'
As you can see, it is a mix of hex escapes and ASCII characters, and some parts of the file even contain comprehensible text headers.
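One thing worth noting: after skipping the first two bytes (apparently the tail of a value cut off by the line break), every four-byte group in the example ends in 0x3c, 0x3d or 0x3e, which is typical of the exponent byte of a small little-endian 32-bit float. Here is a minimal sketch testing that hypothesis on the first few groups; the float32 interpretation is only a guess:

import struct

# first 16 bytes of the example line, minus the leading two stray bytes
chunk = b'\xb1\xfc\xaa<\xb0\xd6\x8e=\xbbr\xd7<\xde\x92\x1a='
# interpret them as four little-endian float32 values
print(struct.unpack('<4f', chunk))
# prints four small positive numbers, plausible for measurement data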
I have also tried opening the database with pyodbc:
import pyodbc as db
conn = db.connect('DRIVER={SQL Server};DBQ={link_to_.log};UID=admin;PWD=admin')
But as I am not sure which database format is behind the UniDAC layer, this doesn't work.
I am not used to working with databases, so even basic information is welcome. Which formats do I need to try with pyodbc when I want to open UniDAC databases? Do I need to find the correct encoding first? Are there other possibilities for making the .log files accessible?
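For reference, UniDAC itself is a data-access library (Devart's Universal Data Access Components), not a file format of its own, so the .log files could hold any of the databases it supports. One generic first step is to compare the file header against the magic bytes of common embedded databases; a minimal sketch, with a hypothetical file name and a deliberately short signature list:

KNOWN_SIGNATURES = {
    b'SQLite format 3\x00': 'SQLite 3',
    b'\x00\x01\x00\x00Standard Jet DB': 'MS Access (Jet/.mdb)',
    b'\x00\x01\x00\x00Standard ACE DB': 'MS Access (ACE/.accdb)',
}

with open('measurement.log', 'rb') as f:   # hypothetical file name
    header = f.read(32)

for magic, name in KNOWN_SIGNATURES.items():
    if header.startswith(magic):
        print('Header matches', name)
        break
else:
    print('No known signature; first bytes:', header[:16])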
I can upload an example .log if someone is interested in this problem.

Related

Need code for importing .csv file via python or ruby code to Cassandra 3.11.3 DB (Production use)

We have a 7-node Cassandra 3.11.3 production cluster and get a dump of ticket details onto a mid server as a .csv file; I need to read this .csv file and import its data into a Cassandra table. I tried Ruby code, which was easy for me to write, but it does not handle all the column values (the .csv contains special characters, embedded line breaks, UTF encoding issues, and long free-text descriptions straight from the ticketing tool), and the data changes from row to row.
I want to know whether Ruby or Python is a good fit for this activity in production, and whether anyone has good sample code for mitigating the issues mentioned above in a production environment.
Both Ruby and Python are perfectly suited to this kind of task, but if your source file is badly formatted, any tool can fail; there is no magic-button tool that can deduce the context from a (broken) data file and fix all the problems for you automatically.
I'd suggest splitting the task into two parts: 1) fix the encoding and data quality problems (and perform any data transformations, if necessary), and then 2) import the clean data.
Task 2 can easily be done in almost any programming language that has a Cassandra driver available, but if you have a well-formatted CSV source you probably don't need any custom code at all (depending on the use case, of course): Cassandra supports a COPY ... FROM command that imports data from CSV directly (https://docs.datastax.com/en/cql/3.3/cql/cql_reference/cqlshCopy.html).
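If COPY ... FROM doesn't fit your case, here is a minimal sketch of task 2 using the DataStax Python driver; the node address, keyspace, table and column names are made up, and the CSV is assumed to be cleaned up already. Note that Python's csv module already copes with quoted fields containing commas and embedded line breaks:

import csv
from cassandra.cluster import Cluster

cluster = Cluster(['10.0.0.1'])            # any reachable node of the cluster
session = cluster.connect('tickets')       # hypothetical keyspace
insert = session.prepare(
    'INSERT INTO details (ticket_id, description) VALUES (?, ?)')

with open('tickets.csv', newline='', encoding='utf-8') as f:
    for row in csv.DictReader(f):          # handles quoted multi-line fields
        session.execute(insert, (row['ticket_id'], row['description']))

cluster.shutdown()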

How can I reduce the access time on large Excel files?

I would like to process a large data set from a mechanical testing device with Python. The software of this device only allows exporting the data as an Excel file. Therefore, I use the xlrd package, which works fine for small *.xlsx files.
The problem I have is that when I open a typical data set (3-5 MB) with
xlrd.open_workbook(path_wb)
the access time is about 30 to 60 seconds. Is there any more effective and faster way to access Excel files?
You could access the file as a database via PyPyODBC instead, which may (or may not) be faster - you'd have to try it out and compare the results.
This method should work for both .xls and .xlsx files. Unfortunately, it comes with a couple of caveats:
As far as I am aware, this will only work on Windows machines, since you're relying on the Microsoft Jet database driver.
The Microsoft Jet database driver can be rather buggy, especially with dates.
It's not possible to create or modify Excel files (a note in the PyPyODBC exceltests.py file says: "I have not been able to successfully create or modify Excel files."). Your question seems to indicate that you're only interested in reading files, though, so hopefully this will not be a problem.
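A minimal sketch of that approach, assuming the Microsoft Excel ODBC driver is installed (Windows only) and the data sits on a sheet named Sheet1; the file path is hypothetical:

import pypyodbc

conn = pypyodbc.connect(
    r'Driver={Microsoft Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)};'
    r'DBQ=C:\data\results.xlsx;')
cursor = conn.cursor()
cursor.execute('SELECT * FROM [Sheet1$]')  # a worksheet is queried like a table
rows = cursor.fetchall()
conn.close()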
It turned out that the access time was not actually the problem: I was creating an object in the same step. Now that I create the object separately, everything works fast and nicely.

How to save program settings to computer?

I'm looking to store some individual settings to each user's computer. Things like preferences and a license key. From what I know, saving to the registry could be one possibility. However, that won't work on Mac.
One of the easy but not-so-proper techniques is just saving them to a settings.txt file and reading it on load.
Is there a proper way to save this kind of data? I'm hoping to use my wx app on Windows and Mac.
There is no proper way. Use whatever works best for your particular scenario. Some common ways for storing user data include:
Text files (e.g. Windows INI, cfg files)
Binary files (sometimes compressed)
Windows registry
System environment variables
Online profiles
There's nothing wrong with using text files. A lot of proper applications use them, precisely because they are easy to implement and, additionally, human readable. The only thing you need to worry about is making sure you have some form of error handling in place, in case the user decides to replace your config file's content with some rubbish.
Take a look at Data Persistence in the Python docs. One option, as you said, could be to persist them to a simple text file. Or you can save your data using a serialization format such as pickle (see the previous link) or json, but that gets pretty inefficient, or overly complex, if you have many keys and values.
Also, you could save user preferences in an .ini file using Python's ConfigParser module, as shown in this SO answer.
Finally, you can use a database like sqlite3, which is simple to handle from your code, in order to save and retrieve preferences.
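To illustrate the ConfigParser suggestion above, here is a minimal sketch (using the Python 3 module name, configparser); the file name, keys and values are made up, and in practice you would put the file in a per-user location such as os.path.expanduser('~'):

import configparser

config = configparser.ConfigParser()
config['preferences'] = {'theme': 'dark', 'autosave': 'yes'}
config['license'] = {'key': 'XXXX-XXXX-XXXX'}   # placeholder key

with open('settings.ini', 'w') as f:            # write settings on save
    config.write(f)

config = configparser.ConfigParser()            # read them back on load
config.read('settings.ini')
print(config['preferences']['theme'])           # -> dark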

Streamlining spreadsheet-to-DB copying in Python

A friend of mine has asked me to write a quick Python script.
He has a small SQLite database (3 tables) and has to copy a bunch of data into it from an Excel spreadsheet. The spreadsheet only has 2 data fields but a lot of rows of data.
He asked if I would write a quick Python script to transfer the spreadsheet data to the db so he doesn't have to spend a crapload of time copying it manually. I told him that I would do my best.
My question is: where do I start? What do I need to research for this? Does anyone know if there is a pre-existing module to do this? I'm trying to research this myself, but haven't come up with anything concrete yet and am not sure what other search terms to use to narrow down my search.
I'm just hoping someone won't mind giving me some guidance in the right direction.
Blessings and thanks
F
Do you really need Python?
SQLite Database Browser is a freeware, public domain, open source visual tool used to create, design and edit database files compatible with SQLite. Controls and wizards are available for users to:
Import and export tables from/to CSV files
If the file is simple enough (no commas inside the field content), you can import it directly with SQLite:
For simple CSV files, you can use the SQLite shell to import the file into your SQLite database.
If you need to import a complex CSV file and the SQLite shell doesn't handle it, you may want to try a different front end, such as SQLite Database Browser.
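If he does end up wanting a Python script after all, the standard library is enough; a minimal sketch, assuming the spreadsheet has been saved as CSV and using made-up table and column names:

import csv
import sqlite3

conn = sqlite3.connect('data.db')
conn.execute('CREATE TABLE IF NOT EXISTS readings (name TEXT, value REAL)')

with open('data.csv', newline='') as f:
    rows = [(r[0], r[1]) for r in csv.reader(f)]   # two fields per row

conn.executemany('INSERT INTO readings VALUES (?, ?)', rows)
conn.commit()
conn.close()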

Sending sqlite db over network

I have an SQLite database whose data I need to transfer over the network; the server needs to modify the data, and then I need to get the db back and either update my local version or overwrite it with the new one. How should I do this? My coworker at first wanted to scrap the db and just use an .ini file, but this is data that we have to parse pretty frequently (it's a user-defined schedule that can change at the user's will, as well as the server's). I said we should just transfer the entire .db as a binary file, let the server do with it what it will, and then take it back. Or is there a way in SQLite to dump the db to a .sql file, as you can in MySQL, so that we can transfer it as text?
Any other solutions? This is in Python, if it makes a difference.
Update: this is on an embedded platform running Linux (I'm not sure which version/kernel or which OS commands we have, beyond the obvious basics).
Use the copy command in your OS. No reason to overthink this.
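As for the side question (dumping the db to a .sql file the way mysqldump does): Python's built-in sqlite3 module can do that via Connection.iterdump(); a minimal sketch, with a hypothetical file name:

import sqlite3

conn = sqlite3.connect('schedule.db')            # hypothetical file name
with open('schedule_dump.sql', 'w') as f:
    for statement in conn.iterdump():            # yields the dump one SQL statement at a time
        f.write(statement + '\n')
conn.close()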
