I have made a Python Flask app that I can use to manipulate a scoreboard. The app is hosted on heroku.com. The scoreboard is stored in a JSON file. At first I just kept the JSON file in the Git repository that Heroku creates for you. But then I found out that every couple of hours Heroku resets the filesystem back to your last commit, so any changes I had made to scoreboard.json would be lost.
So I came to the conclusion that I needed to use an actual database hosting site to host my scoreboard.json. I have chosen mLab for this.
What command sends a complete copy of a file stored in mLab back to the server so I can make changes to it, and what command then replaces the old file with the new one?
You're looking for a Python MongoDB driver. According to https://docs.mongodb.com/ecosystem/drivers/python/:
PyMongo is the recommended way to work with MongoDB from Python.
Check out the tutorial on using PyMongo, specifically inserting and getting documents.
That being said, you may want to consider splitting up the scoreboard data into smaller units. For example, having one document per player/team might be easier to manage.
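As a minimal sketch of the fetch-and-replace round trip with PyMongo (the connection string, database name, collection name, and document layout below are placeholders you would swap for your own mLab details):

from pymongo import MongoClient

# Hypothetical mLab connection string; substitute your own credentials.
client = MongoClient("mongodb://user:password@ds012345.mlab.com:12345/scoreboard_db")
db = client["scoreboard_db"]

# Pull the whole scoreboard document down to the server.
scoreboard = db.scoreboards.find_one({"_id": "main"}) or {"_id": "main", "scores": {}}

# Change it in memory, then write the new version back in place of the old one.
scoreboard["scores"]["alice"] = 42
db.scoreboards.replace_one({"_id": "main"}, scoreboard, upsert=True)

find_one is the closest equivalent to "send me a copy of the file" and replace_one to "swap the old file for the new one".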
Good luck!
Related
I'm working on a project that has all of its data written as Python files (code). Authenticated users can upload data through a web page, and this directly adds and changes code on the production server. This causes trouble every time I want to git pull changes from the git repo to the production server, since the code added by users directly on production is untracked.
I wonder if anyone has some other ideas. I know this is ill-designed, but it is what I inherited, and implementing a database would require a lot of effort since all the current code is built around Python files. I guess it's because the people who wrote it didn't know much about databases, and it only works because there is relatively little data.
The only two solutions I could think of are:
use a database instead of storing all data as code
git add/commit/push all changes on the production server every time a user uploads data
Details added:
There is a 'data' folder of Python files, with each file storing the information for a book. For instance, latin_stories.py has a dictionary variable text, an integer variable section, a string variable language, etc. When a user uploads a CSV file in a certain format, a program automatically adds and changes the Python files in the data folder, directly on the production server.
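For reference, a hypothetical data module in this layout might look roughly like this (the variable names follow the description above, but the contents are made up):

# data/latin_stories.py -- hypothetical example of one book's data file
text = {
    1: "Arma virumque cano, Troiae qui primus ab oris...",
    2: "Italiam, fato profugus, Laviniaque venit litora...",
}
section = 2
language = "Latin"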
I'm giving myself a project to get better with these languages; I already know a fair amount of each, it's tying them together that I need to get better at. The project is a pretty basic "sim" game: generate some animals into your profile, with login/logout. So far I've got the website side done with HTML/CSS, functioning with all the pages I currently need, all running on localhost on my desktop. Now I'm moving on to working Python and possibly some PHP into this to handle login/logout and generating a new animal into your account.
Everything I've done with Python so far has been in IDLE. I'm wondering how to link my Python file to my HTML document, like you would with CSS. Or is that not possible? If not, how do I connect the two so that Python can interact with the HTML/CSS I've created? I'm guessing I'll need MySQL for the database, but I'd like to see how far I can get with a simple localhost setup without hosting online.
If you want to set up a localhost with PHP and MySQL I can recommend XAMPP (https://www.apachefriends.org/). In order for your web app to talk to your Python scripts you will either need to use Flask or Django to create a Python web server, or use PHP to run the Python scripts. Either way, you will need to make AJAX requests to an API to get this done.
Edit: Forgot to mention this, but you will need JavaScript in order to do this
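As a rough sketch of what the Flask side could look like, assuming a hypothetical route that your page would call with an AJAX request (the endpoint name and response fields are made up):

from flask import Flask, jsonify
import random

app = Flask(__name__)

# Hypothetical route the page could call via AJAX to generate an animal.
@app.route("/api/generate_animal")
def generate_animal():
    animal = {"species": random.choice(["cat", "fox", "owl"]),
              "name": "Animal #%d" % random.randint(1, 1000)}
    return jsonify(animal)

if __name__ == "__main__":
    app.run(debug=True)

Your JavaScript would then request /api/generate_animal and insert the returned JSON into the page.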
I mostly work on backend stuff, except now in a project I need to use Python to do some computation and visualize the results on Google Maps. Think of it as, for example, computing the geographical clusters of people tweeting in New York City.
The Python program runs for about 10 seconds and then outputs one iteration of data, which is a JSON object of coordinates. I'm wondering how I should feed this data to Google Maps?
What I thought was to let Python write the data to a file and have JS poll that file every few milliseconds. However, that sounds too hacky. Just wondering, is there a better way to do it?
I'm really a newbie to JS. Please forgive my ignorance.
Thanks
The normal way an HTML page gets data from a backend service (like your coordinate generator every 10 seconds) is to poll a web service (usually, a JSON feed) for updates.
All of the dynamic Google Maps stuff happens within a browser, and that page polls a JSON endpoint, or uses something fancier like websockets to stream data into the browser window.
For the frontend, consider using jQuery, which makes polling JSON dead simple. Here are some examples.
Your "python program" should dump results into a simple database. While relational and traditional databases like MySQL or PostgreSQL should suffice, i'd encourage you to use a NoSQL database, which handles capped collections. This prevents you from having to clean old data out from a cron schedule. It additionally allows storing data in ranged buckets for some cool playback style histories.
You should then have a simple web server which can handle the JSON requests from the HTML frontend page, and simply pulls data from the MongoDB. This can be done quickly in any one of the python web frameworks like Flask, Bottle or Pyramid. You could also play with something a little sexier like node.js. The only requirement here is that a database driver exists for it.
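A minimal sketch of such an endpoint in Flask, assuming your Python program writes each iteration into a MongoDB collection (the database and collection names here are made up):

from flask import Flask, jsonify
from pymongo import MongoClient

app = Flask(__name__)
coords = MongoClient()["tweet_clusters"]["coordinates"]  # hypothetical db/collection names

# The frontend polls this endpoint every few seconds and redraws the map markers.
@app.route("/latest")
def latest():
    doc = coords.find_one(sort=[("_id", -1)]) or {}  # most recently inserted iteration
    doc.pop("_id", None)  # ObjectId isn't JSON serializable
    return jsonify(doc)

if __name__ == "__main__":
    app.run()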
Hope that gives a 10,000 foot view of what you need to do now.
I'm using Azure and the python SDK.
I'm using Azure's table service API for DB interaction.
I've created a table which contains data in Unicode (Hebrew, for example). Creating tables and setting the data in Unicode seems to work fine. I'm able to view the data in the database using Azure Storage Explorer and the data is correct.
The problem is when retrieving the data. Whenever I retrieve a specific row, retrieval works fine for Unicode data:
table_service.get_entity("some_table", "partition_key", "row_key")
However, when trying to get a number of records using a filter, an encoding exception is thrown for any row that has non-ASCII characters in it:
tasks = table_service.query_entities('some_table', "PartitionKey eq 'partition_key'")
Is this a bug in the Azure Python SDK? Is there a way to set the encoding beforehand so that it won't crash? (Azure doesn't give access to sys.setdefaultencoding, and using DEFAULT_CHARSET in settings.py doesn't work either.)
I'm using https://www.windowsazure.com/en-us/develop/python/how-to-guides/table-service/ as reference to the table service API
Any idea would be greatly appreciated.
This looks like a bug in the Python library to me. I whipped up a quick fix and submitted a pull request on GitHub: https://github.com/WindowsAzure/azure-sdk-for-python/pull/59.
As a workaround for now, feel free to clone my repo (remembering to check out the dev branch) and install it via pip install <path-to-repo>/src.
Caveat: I haven't tested my fix very thoroughly, so you may want to wait for the Microsoft folks to take a look at it.
I have a script which scans an email inbox for specific emails. That part's working well and I'm able to acquire the data I'm interested in. I'd now like to take that data and add it to a Django app which will be used to display the information.
I can run the script on a CRON job to periodically grab new information, but how do I then get that data into the Django app?
The Django server is running on a Linux box under Apache / FastCGI if that makes a difference.
[Edit] In response to Srikar's question, "When you are saying 'get that data into the Django app' what exactly do you mean?"...
The Django app will be responsible for storing the data in a convenient form so that it can then be displayed via a series of views. So the app will include a model with suitable members to store the incoming data. I'm just unsure how you hook into Django to make new instances of those model objects and tell Django to store them.
I think Celery is what you are looking for.
You can write a custom admin command to load data according to your needs and run that command through a cron job. Refer to Writing custom commands.
You can also try the existing loaddata command, but it loads data from fixtures added to your app directory.
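As a rough sketch of such a command (the app name, command name, and model are hypothetical, and the email-parsing step is stubbed out):

# myapp/management/commands/import_emails.py -- hypothetical app and command names
from django.core.management.base import BaseCommand
from myapp.models import Message  # hypothetical model


def scan_inbox():
    """Placeholder for your existing email-scanning logic."""
    return []


class Command(BaseCommand):
    help = "Scan the inbox and store new messages"

    def handle(self, *args, **options):
        for subject, body in scan_inbox():
            Message.objects.create(subject=subject, body=body)
        self.stdout.write("Import finished")

Cron would then run python manage.py import_emails on whatever schedule you like.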
I have done the same thing.
Firstly, my script was already parsing the emails and storing them in a db, so I set the db up in settings.py and used python manage.py inspectdb to create a model based on that db.
Then it's just a matter of building a view to display the information from your db.
If your script doesn't already use a db it would be simple to create a model with what information you want stored, and then force your script to write to the tables described by the model.
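For example, a minimal model for this kind of data might look like the following (the field names are hypothetical):

# myapp/models.py -- hypothetical fields for a stored email
from django.db import models

class Message(models.Model):
    sender = models.EmailField()
    subject = models.CharField(max_length=255)
    body = models.TextField()
    received_at = models.DateTimeField(auto_now_add=True)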
Forget about this being a Django app for a second. It is just a load of Python code.
What this means is, your Python script is absolutely free to import the database models you have in your Django app and use them as you would in a standard module in your project.
The only difference here, is that you may need to take care to import everything Django needs to work with those modules, whereas when a request enters through the normal web interface it would take care of that for you.
Just import Django and the required models.py / any other modules you need for it to work from your app. It is your code, not a black box. You can import it from wherever the hell you want.
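A minimal sketch of a standalone script doing this, assuming a project named myproject and an app myapp with a Message model (all names hypothetical); on newer Django versions you also have to call django.setup() before touching the models:

# import_emails.py -- hypothetical standalone script that uses the Django ORM directly
import os
import django

# Point at the project settings before importing any models.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()  # needed on newer Django versions before models can be used

from myapp.models import Message  # hypothetical app and model

Message.objects.create(subject="Hello", body="Created from outside the web interface")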
EDIT: The link from Rohan's answer to the Django docs for custom management commands is definitely the least painful way to do what I said above.
When you say "get that data into the Django app", what exactly do you mean?
I am guessing that you are using some sort of database (like MySQL). Insert whatever data you have collected from your cron job into the same tables that your Django app and its users are accessing. That way your changes are immediately reflected to users of the app, since they will be reading the data from the same tables.
Best way?
Make a view on the Django side to handle receiving the data, and have your script do an HTTP POST to a URL registered to that view.
You could also import the model and such from inside your script, but I don't think that's a very good idea.
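A minimal sketch of such a receiving view, assuming a hypothetical Message model and skipping CSRF checks since the client is a script rather than a browser:

# myapp/views.py -- hypothetical endpoint that receives the script's POST
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt
from myapp.models import Message  # hypothetical model

@csrf_exempt  # the script is not a browser, so skip the CSRF check for this view
def receive_email(request):
    if request.method == "POST":
        Message.objects.create(subject=request.POST.get("subject", ""),
                               body=request.POST.get("body", ""))
        return HttpResponse("stored")
    return HttpResponse(status=405)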
Have your script send an HTTP POST request like so, using the Requests library:
>>> import requests
>>> files = {'report.xls': open('report.xls', 'rb')}
>>> r = requests.post(url, files=files)  # url is the endpoint your server exposes
>>> r.text
Then on the receiving side you can use web.py to process the info like this:
x = web.input()
then do whatever you want with x
On the receiving side of the POST request, import web and write a function that handles the POST, for example:
def POST(self):
    x = web.input()
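Putting those pieces together, a minimal sketch of the web.py receiver might look like this (the URL path and class name are made up):

import web

urls = ("/incoming", "Incoming")  # hypothetical URL path

class Incoming:
    def POST(self):
        x = web.input()  # parsed form data from the POST request
        # do whatever you want with x here
        return "ok"

if __name__ == "__main__":
    app = web.application(urls, globals())
    app.run()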
If you don't want to use HTTP to send messages back and forth, you could just have the script write the email info to a .txt file and then have your Django app open the file and read it.
EDIT:
You could set your cron job to read the e-mails at, say, 8 AM and write them to a text file info.txt. Then in your code write something like:
import time

if time.strftime("%H") == "09":  # %H is zero-padded, so 9 AM is "09"
    with open("info.txt") as f:
        info = f.read()
That will check the file whenever the code runs between 9 AM and 10 AM. If you want it to only check once, just add the minutes to the if statement as well.