I'm currently using a mix of Smart View and Power Query (SQL) to load data into Excel models, but Excel always crashes when Smart View is used. I'm required to work in Excel, so I'm now looking for a way to periodically load data from Essbase into my SQL Server database and use only Power Query (SQL) for all my models. What would be my best options for doing this? Being a Python enthusiast I found essbasepy.py, but there isn't much documentation on it. Please help.
There are a couple of ways to go. The most straightforward is to export all of the data from your Essbase database using a column export, then design a process to load the data into SQL Server (such as the import functionality, BULK INSERT, or SSIS...).
Another approach is to use the DataExport calc script command to export either to a file (that you then load into SQL Server) or directly to the relational database (DataExport can be configured to write straight to a relational target).
In either case, you will need privileges that are greater than normal user privileges, and either approach involves Essbase automation that may require you to coordinate with the Essbase admin.
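Since you mentioned Python, here is a minimal sketch of the file-based route. It assumes pyodbc with a SQL Server ODBC driver and a tab-delimited column export; the connection details, file name, staging table, and its columns are all placeholders, not anything from your environment.

    import csv
    import pyodbc

    # Placeholders: driver/connection details, export file name, and the
    # staging table with its four columns are assumptions.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sqlhost;DATABASE=finance;UID=loader;PWD=secret"
    )
    cursor = conn.cursor()
    cursor.fast_executemany = True  # speeds up bulk inserts in pyodbc

    # Essbase column exports are plain text; parse them into rows of values.
    with open("essbase_export.txt", newline="") as f:
        rows = [row for row in csv.reader(f, delimiter="\t") if row]

    cursor.executemany(
        "INSERT INTO essbase_staging (entity, account, period, amount) "
        "VALUES (?, ?, ?, ?)",
        rows,
    )
    conn.commit()
    conn.close()

A scheduled task or cron job could then run a script like this periodically, so Power Query always sees reasonably fresh data.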
Usually I load data from the SQL database on a server first, then manipulate it later with pandas on my computer.
However, many others preprocess some of the data in SQL first (with CASE expressions etc.) and do the rest with pandas.
So I wonder which approach is better, and why? Thanks!
This question is quite general. For a more specific answer, we would need to know more about your setup.
To answer your question, I'll make some assumptions: your database is running on a server, and your Python code is executed on your local machine.
In this case, you have to consider at least two things:
the amount of data transmitted over the network
where the data processing happens
If you make a general SQL request, large amounts of data are transmitted over the network. Next, your machine has to process the data. Your local machine might be less powerful than the server.
On the other hand, if you submit a specific SQL request, the powerful server can process the data and return only the data you are actually interested in.
However, SQL queries can get long and hard to understand, since you have to pass them as one statement. In Python, you can process the data step by step over multiple lines of code.
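To make the trade-off concrete, here is a small sketch using pandas with SQLAlchemy; the connection string and the sales table are hypothetical.

    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical connection string and table - adjust for your setup.
    engine = create_engine("postgresql://user:password@dbhost/mydb")

    # Option 1: pull the whole table and aggregate locally with pandas.
    # Easy to write and debug, but the entire table crosses the network
    # and your local machine does all the work.
    df = pd.read_sql("SELECT * FROM sales", engine)
    monthly = df.groupby("month")["amount"].sum()

    # Option 2: let the server aggregate and return only the result.
    # Much less data is transferred and the server does the heavy lifting.
    monthly = pd.read_sql(
        "SELECT month, SUM(amount) AS total FROM sales GROUP BY month",
        engine,
    )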
I'm using Python (3.7.4) to make a text/password manager, where the user can store text under different tabs (using tkinter for the interface), and use a "master login" to access all data.
The only experience I've got with saving/storing data is using CSV files, and looping through them to get values.
Can anyone recommend a way I can store text so that it can't simply be opened from Windows Explorer and needs some sort of key to be opened?
The natural alternative to using CSV files is a database. A solution like SQLite might be enough for your needs. Directly from the documentation:
SQLite is a C library that provides a lightweight disk-based database
that doesn’t require a separate server process and allows accessing
the database using a nonstandard variant of the SQL query language.
Some applications can use SQLite for internal data storage. It’s also
possible to prototype an application using SQLite and then port the
code to a larger database such as PostgreSQL or Oracle.
Once you have learned SQLite, you can think about encrypting your database: in this post you will find many ideas for encrypting your SQLite database.
Otherwise you can switch to other, more complete and complex DBMSs. The important thing is that you consider moving from CSV files to a database.
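As a starting point, here is a minimal sqlite3 sketch. The database file, table, and column names are made up, and the entries are stored unencrypted, so you would still add encryption on top (see the next answer) or at least hash the master password.

    import sqlite3

    # Placeholders: file name, table, and columns are assumptions.
    # Note: values are stored in plain text here; real secrets still
    # need encryption on top of this.
    conn = sqlite3.connect("vault.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS entries (tab TEXT, title TEXT, secret TEXT)"
    )

    def add_entry(tab, title, secret):
        with conn:  # commits automatically if the block succeeds
            conn.execute(
                "INSERT INTO entries (tab, title, secret) VALUES (?, ?, ?)",
                (tab, title, secret),
            )

    def entries_for_tab(tab):
        return conn.execute(
            "SELECT title, secret FROM entries WHERE tab = ?", (tab,)
        ).fetchall()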
Take a look at SQLite as a lightweight local database and use something like SQLCipher for strong AES 256-bit encryption.
I haven't read it fully, but take a look at this blog for a Python implementation.
I am trying to connect to a database and insert data with Python. I found many connector drivers, such as mysql-connector, PyMySQL, and MySQLdb. I want to know which is the better way to communicate with a database in Python 3.4.
In our project we have a remote machine with the default MySQL client installed; we open an SSH connection to it and run SQL queries through that traditional client. It's a safe approach and supports essentially everything the native client does.
However, if you want to work with a SQL connection directly in Python, I would suggest going with PyMySQL, as it is updated regularly, very stable, and easy to use compared to the others.
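For example, a minimal PyMySQL sketch could look like this; the host, credentials, and the measurements table are placeholders.

    import pymysql

    # Placeholders: connection details and the target table are assumptions.
    conn = pymysql.connect(
        host="localhost",
        user="app_user",
        password="secret",
        database="mydb",
        charset="utf8mb4",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO measurements (sensor, value) VALUES (%s, %s)",
                ("temp_1", 21.5),
            )
        conn.commit()  # PyMySQL does not autocommit by default
    finally:
        conn.close()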
I would check out SQLAlchemy http://www.sqlalchemy.org/. Or if you want more of an object-relational mapper (ORM), set up a small Django project https://www.djangoproject.com/.
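As a rough sketch of the SQLAlchemy (1.4+) route, using PyMySQL as the driver underneath; credentials, database, and table are again placeholders.

    from sqlalchemy import create_engine, text

    # Placeholders: credentials, host, database, and table are assumptions.
    engine = create_engine("mysql+pymysql://app_user:secret@localhost/mydb")

    with engine.begin() as conn:  # begin() commits on success, rolls back on error
        conn.execute(
            text("INSERT INTO measurements (sensor, value) VALUES (:sensor, :value)"),
            {"sensor": "temp_1", "value": 21.5},
        )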
I have Apache + MySQL + Django. I store some data in MySQL and use Django to interact with it.
Now I need to export data from MySQL into some external text file (like CSV) in a proper format (I want to parse it later with Jenkins as a source for parameters). How can I create some kind of event which will dump data from a DB table into a text file using Django (since I already have Django installed, and I know Python itself can do this easily)?
And I want it to be a repeatable event, e.g. the data is synced every minute. Something like that.
Thanks!
I would suggest writing a custom management command that takes the information you want and writes it to your CSV. As you said, the process of writing data to a text file is easy enough in Python - using a management command would let you use the ORM directly, greatly simplifying your interaction with the database.
The second benefit of using a management command is that it's easy to run as a cronjob. Once you've written your custom command (let's call it csvsync), you can write something like this in your crontab:
* * * * * python /your/path/to/manage.py csvsync
Here's the relevant documentation for custom management commands: https://docs.djangoproject.com/en/1.7/howto/custom-management-commands/
The developers note the following in the linked docs:
Standalone scripts
Custom management commands are especially useful for running
standalone scripts or for scripts that are periodically executed from
the UNIX crontab or from Windows scheduled tasks control panel.
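To give an idea of the shape of such a command, here is a minimal sketch; the app name, the Record model, its fields, and the output path are all placeholders, not anything from your project.

    # yourapp/management/commands/csvsync.py
    import csv

    from django.core.management.base import BaseCommand

    from yourapp.models import Record  # hypothetical app and model


    class Command(BaseCommand):
        help = "Dump the Record table to a CSV file for Jenkins to parse"

        def handle(self, *args, **options):
            with open("/var/data/records.csv", "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(["id", "name", "value"])
                for record in Record.objects.all():
                    writer.writerow([record.id, record.name, record.value])
            self.stdout.write("records.csv updated")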
I want to create a Ruby on Rails web app (or one in another dynamic language such as Python) that should interact with MATLAB.
The web app sends some info to the MATLAB server (or simply runs a MATLAB file with the necessary data, if that's possible), which processes it and returns the results back to the Rails server. Is it possible to do something like this?
If I understand correctly, you want to send info to MATLAB from a web app, run some process on it using MATLAB, and then send the results back to Ruby to display.
I'm no MATLAB expert, but if you did it in Python (which you said you could), you could write your results to a database and then read the database in Ruby. I expect you can export results from MATLAB into a database as well.
To use an SQLite database in Python, you can import the module:
import sqlite3
The relevant documentation is here https://docs.python.org/2/library/sqlite3.html
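A tiny sketch of that idea on the Python side; the file name, table, and result values are hypothetical, and the Rails app would read the same results.db file (or a shared database server) afterwards.

    import sqlite3

    # Placeholders: file name, table, and the results list are assumptions.
    results = [("run_1", 0.93), ("run_2", 0.87)]

    conn = sqlite3.connect("results.db")
    conn.execute("CREATE TABLE IF NOT EXISTS results (run_id TEXT, score REAL)")
    with conn:  # commits automatically if the block succeeds
        conn.executemany(
            "INSERT INTO results (run_id, score) VALUES (?, ?)", results
        )
    conn.close()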