I have Apache + MySQL + Django. I store some data in MySQL and use Django to interact with it.
Now I need to export data from MySQL into an external text file (like CSV) in a proper format (I want to parse it later with Jenkins as a source of parameters). How can I create some kind of event that will dump data from a DB table into a text file using Django (since I already have Django installed, and I know Python itself can do this easily)?
I also want it to be a repeatable event, e.g. the data gets synced every minute. Something like that.
Thanks!
I would suggest writing a custom management command that takes the information you want and writes it to your CSV. As you said, the process of writing data to a text file is easy enough in Python - using a management command would let you use the ORM directly, greatly simplifying your interaction with the database.
The second benefit of using a management command is that it's easy to run as a cronjob. Once you've written your custom command (let's call it csvsync), you can write something like this in your crontab:
* * * * * python /your/path/to/manage.py csvsync
Here's the relevant documentation for custom management commands: https://docs.djangoproject.com/en/1.7/howto/custom-management-commands/
The developers note the following in the linked docs:
Standalone scripts
Custom management commands are especially useful for running standalone scripts or for scripts that are periodically executed from the UNIX crontab or from Windows scheduled tasks control panel.
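A minimal sketch of such a command, assuming a hypothetical app called yourapp with a Measurement model (the model, fields and output path are placeholders, not from the question):

# yourapp/management/commands/csvsync.py
import csv

from django.core.management.base import BaseCommand

from yourapp.models import Measurement  # hypothetical model


class Command(BaseCommand):
    help = "Dump Measurement rows to a CSV file for Jenkins to parse."

    def handle(self, *args, **options):
        # Write every row to a fixed path; adjust the path and fields as needed.
        with open("/var/data/measurements.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["id", "name", "value"])  # header row
            for obj in Measurement.objects.all():
                writer.writerow([obj.pk, obj.name, obj.value])
        self.stdout.write("CSV export finished")

With something like that in place, the crontab entry above regenerates the file every minute.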
I am trying to create a batch script that formats my PC and creates a backup file for me while the formatting process starts. How am I able to make this type of batch script? Please help me, guys.
No, you can't, and it's not advisable to run these kinds of scripts from cmd/PowerShell.
It's a complete waste of time. Even if we built such a script, it would have to bypass all the relevant permissions, and even if we managed to build it after a lot of effort, it wouldn't work on different systems, because a whole new set of permissions would have to be bypassed manually. So it's not advisable to run such scripts on a PC.
You can't format a PC as such, but you can format a disk using the format command. And for the backup you can copy files to a folder or another disk using xcopy.
You should first create and run one script before the process starts, because you could potentially overlap with the time the OS needs to prepare the process; that script saves your work, files, statuses, etc.
Write another one that is able to read and execute the first.
Then you can upload them (or have the script auto-upload them) to any cloud-based storage.
Finally, when the PC is restored, you just need to download one of them from the cloud, and it should be able to download the other one and run it, re-establishing your previous state.
I'm using Python (3.7.4) to make a text/password manager, where the user can store text under different tabs (using tkinter for the interface) and use a "master login" to access all the data.
The only experience I've got with saving/storing data is using CSV files and looping through them to get values.
Can anyone recommend any way I can store text so that it can't simply be opened from Windows Explorer and needs some sort of key to be opened?
The natural alternative to using CSV files is a database. A solution like sqlite might be enough for your needs. Directly from the documentation:
SQLite is a C library that provides a lightweight disk-based database that doesn’t require a separate server process and allows accessing the database using a nonstandard variant of the SQL query language. Some applications can use SQLite for internal data storage. It’s also possible to prototype an application using SQLite and then port the code to a larger database such as PostgreSQL or Oracle.
Once you have learned sqlite, you can think of encrypting your database: in this post you will find many ideas for encrypting your sqlite database.
Otherwise you can switch to other more complete and complex DBMSs. The important thing is that you consider moving from the csv to using a db.
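As a rough illustration of the switch from CSV to sqlite3 (the table and column names here are made up, and this stores the text unencrypted; encryption would still be a separate step as discussed above):

import sqlite3

conn = sqlite3.connect("manager.db")  # creates the file if it doesn't exist
conn.execute("CREATE TABLE IF NOT EXISTS entries (tab TEXT, title TEXT, secret TEXT)")

# Insert a record instead of appending a CSV row.
conn.execute(
    "INSERT INTO entries (tab, title, secret) VALUES (?, ?, ?)",
    ("email", "work account", "hunter2"),
)
conn.commit()

# Query only the values you need instead of looping through a whole CSV file.
for title, secret in conn.execute("SELECT title, secret FROM entries WHERE tab = ?", ("email",)):
    print(title, secret)

conn.close()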
Take a look at SQLite as a lightweight local database and use something like SQLCipher for strong AES 256-bit encryption.
I haven't read it fully, but take a look at this blog for a Python implementation.
I'm currently using a mix of Smart View and Power Query (SQL) to load data into Excel models, however my Excel always crashes when Smart View is used. I'm required to work in Excel, but I'm now looking at finding a way to periodically load data from Essbase into my SQL Server database and only use Power Query (SQL) for all my models. What would be my best options for doing this? Being a Python enthusiast I found essbasepy.py, however there isn't much documentation on it. Please help.
There are a couple of ways to go. The most straightforward is to export all of the data from your Essbase database using a column export, then design a process to load the data into SQL Server (such as using the import functionality, BULK INSERT, or SSIS...).
Another approach is to use the DataExport calc script command to export either to a file (that you then load into SQL) or directly to the relational database (DataExport can be configured to export data directly to relational).
In either case, you will need privileges that are greater than normal user privileges, and either approach involves Essbase automation that may require you to coordinate with the Essbase admin.
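If you go the export-file route and want to stay in Python, a hedged sketch of the load step could look like the following (the connection string, table name and column layout are assumptions; it needs the third-party pyodbc package and an ODBC driver for SQL Server):

import csv
import pyodbc  # pip install pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=yourserver;DATABASE=yourdb;UID=youruser;PWD=yourpassword"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # speeds up bulk inserts

# Assume a comma-delimited Essbase column export with the data value in the last column.
with open("essbase_export.txt", newline="") as f:
    rows = [tuple(r) for r in csv.reader(f)]

cursor.executemany(
    "INSERT INTO essbase_data (entity, account, period, value) VALUES (?, ?, ?, ?)",
    rows,
)
conn.commit()
conn.close()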
I want to create a Ruby on Rails web app (or one in another dynamic language such as Python) that should interact with MATLAB.
The web app sends some info to the MATLAB server (or simply runs a MATLAB file with the necessary data, if that's possible), which processes this stuff and returns the results back to the Rails server. Is it possible to do something like this?
If I understand correctly, you want to send info to MATLAB from a web app, run some process on it using MATLAB, and then have it send the info back to Ruby to display it.
I'm no MATLAB expert, but if you did it in Python (which you said you could), you could write your results to a database and then read the database in Ruby; I expect you can export results from MATLAB into a database as well.
To use an SQLite database in Python you can import the module:
import sqlite3
The relevant documentation is here https://docs.python.org/2/library/sqlite3.html
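A minimal sketch of the Python side writing results that the Rails app could then read (the file name, table and columns are made up for illustration):

import sqlite3

# Open (or create) the shared database file that the Rails app also points at.
conn = sqlite3.connect("shared_results.db")
conn.execute("CREATE TABLE IF NOT EXISTS results (job_id TEXT, output REAL)")

# Pretend this value came back from the MATLAB processing step.
conn.execute("INSERT INTO results (job_id, output) VALUES (?, ?)", ("job-42", 3.14))
conn.commit()
conn.close()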
I would like to add some interactive capability to a Python CLI application I've written that stores data in a SQLite3 database. Currently, my app reads in a certain type of file, parses and analyzes it, puts the analysis data into the DB, and spits the formatted records to stdout (which I generally pipe to a file). There are on the order of a million records in this file. Ideally, I would like to eliminate that text-file situation altogether and just loop after that "parse and analyze" part, displaying a screen's worth of records and allowing the user to page through them and enter some commands that will edit the records. The backend part I know how to do.
Can anyone suggest a good starting point for creating that pager frontend either directly in the console (like the pager "less"), through ncurses, or some other system?
You might want to take a look at urwid. It is a console user interface library for Python. The examples should be more than enough to convince you that this is what you want, if you really want to go text-console UI.
I'd use something like pygtk instead though.
After looking around a bit, I found that less and other pagers actually use curses. When I thought of curses I always imagined the blue-boxed interfaces with menus and mouse interaction; those are library add-ons built on top of curses, while curses itself offers exactly the basic terminal selection and editing control functionality I'm looking for.
Tutorial on Python Curses Programming
Curses Programming with Python
On the backend, when the user attempts to move the cursor above or below the currently displayed records, I'll have sqlite fetch me the next appropriate set of records for display.
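A bare-bones sketch of that idea using the standard curses module (the database path, table, columns and keybindings are placeholders, not taken from the question):

import curses
import sqlite3

DB_PATH = "records.db"  # hypothetical database produced by the analysis step


def fetch_page(conn, offset, limit):
    # Fetch one screenful of records starting at `offset`.
    cur = conn.execute(
        "SELECT id, summary FROM records ORDER BY id LIMIT ? OFFSET ?",
        (limit, offset),
    )
    return cur.fetchall()


def pager(stdscr):
    conn = sqlite3.connect(DB_PATH)
    offset = 0
    while True:
        height, width = stdscr.getmaxyx()
        page_size = height - 1  # leave one line for the status bar
        rows = fetch_page(conn, offset, page_size)

        stdscr.erase()
        for i, (rec_id, summary) in enumerate(rows):
            stdscr.addstr(i, 0, f"{rec_id:8} {summary}"[: width - 1])
        status = "j/k or arrows to scroll, q to quit"
        stdscr.addstr(height - 1, 0, status[: width - 1], curses.A_REVERSE)
        stdscr.refresh()

        key = stdscr.getch()
        if key == ord("q"):
            break
        elif key in (curses.KEY_DOWN, ord("j")):
            offset += 1
        elif key in (curses.KEY_UP, ord("k")):
            offset = max(0, offset - 1)

    conn.close()


if __name__ == "__main__":
    curses.wrapper(pager)

Editing commands would hook in the same way: catch a key, update the row through sqlite, and redraw the page.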