Can I query Grok's ZODB instance outside the web application scope?

I have a Grok-based webapp that persists data using ZODB. Can I query the object db offline, i.e. from a Python script run on the webserver hosting the grok/paste webapp instance?
And would there be any issues in doing so while the web server is interacting with the database simultaneously?

You can open the ZODB with Python and inspect the data, yes. To do so while the web site is also running, you'll need a concurrency layer such as ZEO or RelStorage; plain FileStorage does not support concurrent access from a second process.
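For example, if the app is configured to use ZEO, a standalone script can connect through the same ZEO server. A minimal sketch, assuming a ZEO server is already listening on localhost:8100 (the address is illustrative; match your zeo.conf):

    # Going through ZEO lets this script and the running web app share
    # the storage safely; opening the FileStorage file directly would
    # fail (or risk corruption) while the server holds its lock.
    from ZEO.ClientStorage import ClientStorage
    from ZODB import DB

    storage = ClientStorage(('localhost', 8100))
    db = DB(storage)
    connection = db.open()
    root = connection.root()

    # Read-only inspection of the persistent object graph.
    for name, obj in root.items():
        print('%s: %r' % (name, obj))

    connection.close()
    db.close()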

Related

Connect to local App Engine Datastore with Apache Beam

I am new to Google App Engine and I am a little confused by the answers related to connecting to a local Datastore.
My ultimate goal is to stream data from a Google Datastore towards a BigQuery dataset, similar to https://blog.papercut.com/google-cloud-dataflow-data-migration/. I have a copy of this Datastore locally, accessible when I run a local App Engine, i.e. I can access it through an admin console when I use $[GOOGLE_SDK_PATH]/dev_appserver.py --datastore_path=./datastore.
I would like to know if it is possible to connect to this datastore using services outside of the App Engine instance, with python google-cloud-datastore or even the Apache Beam ReadFromDatastore method. If not, should I use the Datastore Emulator with the App Engine Datastore generated file?
If anyone has an idea on how to proceed, I would be more than grateful to hear it.
If it is possible, it would have to be through the Datastore Emulator, which can also serve apps other than App Engine ones. But it ultimately depends on the implementation of the libraries you intend to use - whether their underlying access methods understand the DATASTORE_EMULATOR_HOST environment variable pointing to a running datastore emulator and use that instead of the real Datastore. I guess you'll just have to give it a try.
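A hedged sketch of what that looks like with the python google-cloud-datastore client (the host, port, project id, and kind below are illustrative assumptions, not values from your setup):

    # Point the client library at a running emulator instead of the real
    # Datastore; google-cloud-datastore honors these environment variables.
    import os
    from google.cloud import datastore

    os.environ['DATASTORE_EMULATOR_HOST'] = 'localhost:8081'
    os.environ['DATASTORE_PROJECT_ID'] = 'my-local-project'

    client = datastore.Client(project='my-local-project')

    # Any query now runs against the emulator's local data.
    query = client.query(kind='Task')
    for entity in query.fetch(limit=10):
        print(entity)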
But be aware that the local storage dir's internal format used by the Datastore Emulator may differ from the one used by the development server, so make a backup of your .datastore dir before trying anything, just in case. From Local data format conversion:
Currently, the local Datastore emulator stores data in sqlite3 while the Cloud Datastore Emulator stores data as Java objects.
When dev_appserver is launched with legacy sqlite3 data, the data will be converted to Java objects. The original data is backed up with the filename {original-data-filename}.sqlitestub.

Amazon EC2 file structure / web app with separate Python backend?

I'm currently running a t2.micro instance on EC2. I have the HTML/web interface side of it working, along with a MySQL database.
The site allows users to register and stores them in the DB via a PHP script.
I want there to be an actual Python application that queries the MySQL database and returns user data, to then be used in a Python script.
What I cannot find is whether I should host this Python application as a totally separate instance or whether it can live on the same instance, in a different directory. I ultimately just need to query the database, which makes me think it must exist on the same instance.
Could someone please provide some guidance?
Let me just be clear: this is not a Python web app. The Python backend is entirely separate apart from making queries against the database.
Either approach is possible, but there are pros & cons to each.
Running separate Python app on the same server:
Pros:
Setting up local access to the database is fairly simple
Only need to handle backups, snapshots, etc. for a single instance
Cons:
Harder to scale up individual pieces if you need more memory, processing power, etc. in the future
Running the Python app on a separate server:
Pros:
Separate pieces means you can scale up & down the hardware each piece is running on, according to their individual needs
If you're using all micro instances, you get more resources to work with, without any extra costs (assuming you're still meeting all the other 'free tier eligible' criteria)
Cons:
In general, more pieces == more time spent on configuration, administration tasks, etc.
You have to open up the database to non-local access
Simplest: open up the database to access from anywhere (e.g. all remote IP addresses), and have the Python app log in via the internet
Somewhat safer, more complex: set the Python app server up with an elastic IP, open up the database to access only from that address
Much safer, more complex: set up your own virtual private cloud (VPC), and allow connections to the database only from within the VPC. You'd have to configure public access for each of the servers for whatever public traffic you'll have, presumably ports 80 and/or 443.
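Whichever layout you choose, the backend itself is just a script that connects to MySQL. A minimal sketch using the mysql-connector-python driver (the host, credentials, and table are illustrative assumptions):

    import mysql.connector

    # 'localhost' if the backend lives on the same instance; the database
    # server's address (or elastic IP) if it runs on a separate one.
    conn = mysql.connector.connect(
        host='localhost',
        user='app_user',
        password='app_password',
        database='site_db',
    )

    cursor = conn.cursor(dictionary=True)
    cursor.execute(
        "SELECT id, username, email FROM users WHERE id = %s", (42,))
    for row in cursor.fetchall():
        print(row)

    cursor.close()
    conn.close()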

Django Temp. Table - Which is the Correct Tool - Django-Redis or Redis-Py?

I have a webapp using Django 1.6. This is a simple webapp that makes API calls to a forum and keeps track of threads that are unanswered. I want to store these unanswered-thread collections in a temporary Redis hash table.
What I am confused about is whether I should be using django-redis (which itself uses redis-py) or just redis-py. I have read the django-redis documentation, and from what I can tell it is meant for using Redis to store Django sessions and other backend Django caches. For what I want to do, which is just keep a temporary table of forum threads populated by an API call, is the proper tool django-redis or redis-py?
django-redis just provides you with a redis cache backend:
django-redis is a BSD Licensed, full featured redis cache/session backend for Django.
With redis-py you can "talk" to a redis server directly; it's a Python Redis interface.
As far as I understand, the question is how you want to interact with redis - directly via that interface or through Django's cache system. If you want this data to "expire", or you want to use redis for caching other entities, or you want to store sessions in redis - use django-redis. There is also nothing wrong with using redis-py directly, or with using both.
Also see:
How can I use redis with Django?
Accelerating and Enhancing Django with Redis
Also, django-redis allows access to raw redis client: http://niwibe.github.io/django-redis/#_raw_client_access
This lets you reuse the same connection parameters and the same connection pool for both purposes ;)
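For the use case described, a direct redis-py sketch might look like this (the key name, sample data, and TTL are illustrative assumptions):

    import redis

    r = redis.StrictRedis(host='localhost', port=6379, db=0)

    # Thread id -> title pairs fetched from the forum API.
    unanswered = {'1001': 'How do I frob the widget?',
                  '1002': 'Install fails on OS X'}
    # hmset is the older redis-py API; newer versions use
    # r.hset('unanswered_threads', mapping=unanswered).
    r.hmset('unanswered_threads', unanswered)

    # Make the table temporary: drop the whole hash after an hour.
    r.expire('unanswered_threads', 3600)

    print(r.hgetall('unanswered_threads'))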

initialize GAE datastore with start data?

this is my first question on stackoverflow and I'm new to programming:
What is the right way to load data into the GAE datastore when deploying my app? This should only happen once at deployment.
In other words: How can I call methods in my code, such that these methods are only called when I deploy my app?
The GAE documentation for python2.7 says that one shouldn't call a main function, so I can't do this:
if __name__ == '__main__':
    initialize_datastore()
    main()
Create a handler that is restricted to admins only. When that handler is invoked with a simple GET request, you could have it check whether the seed data exists and, if it doesn't, insert it.
Configuring a handler to require login or administrator status.
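A minimal sketch of such a handler (webapp2 and NDB, with a hypothetical Category model and illustrative seed values; restrict its URL with login: admin in app.yaml):

    import webapp2
    from google.appengine.ext import ndb

    class Category(ndb.Model):
        name = ndb.StringProperty()

    class SeedHandler(webapp2.RequestHandler):
        def get(self):
            # Only insert the seed data if it isn't there yet.
            if Category.query().get() is None:
                ndb.put_multi(
                    [Category(name=n) for n in ('news', 'sports', 'tech')])
                self.response.write('Seed data inserted.')
            else:
                self.response.write('Seed data already present.')

    app = webapp2.WSGIApplication([('/admin/seed', SeedHandler)])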
Another option is to write a Python script that utilizes the Remote API. This would allow you to access local data sources such as a CSV file or a locally hosted database and wouldn't require you to create a potentially unwieldy handler.
Read about the Remote API in the docs.
Using the Remote API Shell - Google App Engine
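A hedged sketch of seeding through the Remote API from your own machine, following the classic pattern from the docs (the app id, model, and CSV layout are illustrative assumptions; run it with the SDK on your path so the google.appengine imports resolve):

    import csv
    import getpass

    from google.appengine.ext import ndb
    from google.appengine.ext.remote_api import remote_api_stub

    class Category(ndb.Model):
        name = ndb.StringProperty()

    def auth_func():
        return raw_input('Email: '), getpass.getpass('Password: ')

    # Connect the stub to the deployed app's /_ah/remote_api endpoint.
    remote_api_stub.ConfigureRemoteApi(
        None, '/_ah/remote_api', auth_func, 'your-app-id.appspot.com')

    # Load seed rows from a local CSV file and write them to the datastore.
    with open('categories.csv') as f:
        ndb.put_multi([Category(name=row[0]) for row in csv.reader(f)])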

Web API in Flask

I want to use Flask to create a web API for my application, but I'm having some trouble making my Flask app aware of my other objects.
I want to use Flask so I can interact with my application through HTTP requests; the whole Flask application in my case is just an external API that relies on a core application.
Let's imagine that my Flask application will have to perform database calls.
To manage database calls in my application, I use a single object that connects to the db and implements some kind of queue.
That means my core application running in the background has a reference to my db object in order to make db calls.
This is done by giving a reference to my queue object to this core application.
Now I want to be able to perform actions on the db using a flask application too.
What is the correct way to pass a reference to this Queue object to my Flask application?
If I define all my objects at module level, I have no way to interact with them afterwards, do I?
All the example of Flask applications use Flask as the core of their system and define everything in their app on module level. How do I make Flask just a part of my app?
I'm not sure what you mean by
If I define all my objects at module level, I have no way to interact with them afterwards, do I?
But no, you don't have to define your objects at the module level - that's true of your Flask instance, of blueprints, and of any object you provide. For example, you can create an AppBuilder class that creates and configures Flask instances.
For some interactions context locals are a very handy tool as well.
If you can clarify the issue I'll try to expand my answer.
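A minimal sketch of that factory approach, where the core application hands its queue object to the Flask app at creation time (create_app, db_queue, and its fetch method are hypothetical names for illustration):

    from flask import Flask, jsonify

    def create_app(db_queue):
        # Nothing lives at module level: the Flask instance is built on
        # demand and closes over the shared queue object.
        app = Flask(__name__)

        @app.route('/items/<int:item_id>')
        def get_item(item_id):
            # Delegate the database work to the core application's queue;
            # assumes fetch() returns a JSON-serializable dict.
            result = db_queue.fetch(item_id)
            return jsonify(result)

        return app

    # In the core application:
    #     app = create_app(my_db_queue)
    #     app.run(port=5000)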
