How to implement a global class in django - python

I am currently working on a networked multiplayer game using Django and Django Channels for websockets. My project is set up so that players send data to the server, which processes it in a "GameManager" class that handles all game logic and interactions between players. This works perfectly fine in my dev environment, but when I tried setting up my project for production, my global "GameManager" class does not seem to retain its data across multiple requests. I'm guessing that since I'm using gunicorn in my production environment, my Django project is running across multiple processes that each have their own instance of my classes.
My question is: how can I implement some sort of global class in Django to handle all the game logic, shared across all requests? I can't use sessions because this data needs to be shared by ALL connected clients, and I'm skeptical of a solution such as Redis because I would need to read from and write to it multiple times a second, so keeping it within Python would help me keep things running smoothly. Any help would be greatly appreciated.

Related

Accessing the same instance of a class django, python, heroku

I've been working on a website which allows users to play a game against a "Machine" player, and I decided to build it using Django 1.12 and Python 3.6 in an attempt to develop skills in this area. The game and ML algorithms run on the backend in Python, and during testing/dev this all worked fine. When pushing this to Heroku, it became apparent that the game and other classes were being instantiated correctly, but as the page refreshes to get the machine player's choice from the server, the request would go to another server which didn't have the instantiated objects. I tried using the default cache to let the player access the same instance, but I believe it might be too large. After some reading it sounds like memcached is the way forward, but I wondered whether anyone has any suggestions, or knows if there's a simpler solution?
For each request, Django creates a new HttpRequest object and loads new instances of the corresponding views. You can't share data between requests without putting it in persistent storage.
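A minimal sketch of the answer's point: state that must survive across requests (and across dynos/processes) has to live in a shared store, not in a Python object. Here `sqlite3` stands in for whatever store you choose (your database, Django's cache framework, memcached), and the two functions play the roles of two separate requests; the `machine_choice` key and the `":memory:"` connection are demo assumptions, not anything from the question.

```python
# Two "requests" sharing state through a store instead of a Python object.
# sqlite3 and the ":memory:" DB are stand-ins for the demo; a real deployment
# would use an on-disk database or a cache server reachable by every process.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS state (key TEXT PRIMARY KEY, value TEXT)")

def handle_request_one(conn):
    # First request records the machine player's choice in the shared store.
    conn.execute(
        "INSERT OR REPLACE INTO state (key, value) VALUES (?, ?)",
        ("machine_choice", "rock"),
    )
    conn.commit()

def handle_request_two(conn):
    # A later request -- possibly served by a different process -- reads it back.
    row = conn.execute(
        "SELECT value FROM state WHERE key = ?", ("machine_choice",)
    ).fetchone()
    return row[0] if row else None

handle_request_one(conn)
print(handle_request_two(conn))  # rock
```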

Python objects lose state after every request in nginx

This is really troublesome for me. I have a Telegram bot that runs on Django and Python 2.7. During development I used django-sslserver and everything worked fine. Today I deployed it using gunicorn behind nginx, and the code behaves very differently than it did on my localhost. I have tried everything I could since I had already started getting users, but all to no avail. It seems to me that most Python objects lose their state after each request, and this might be causing the problems. The library I use has a class that handles the conversation with a Telegram user, and the state of the conversation is stored in a class instance. Sometimes when new requests come in, those values have already been lost. Has anyone faced this, and is there a way to solve the problem quickly? I am in a critical situation and need a quick solution.
Gunicorn has a preforking worker model -- meaning that it launches several independent subprocesses, each of which is responsible for handling a subset of the load.
If you're relying on internal application state being consistent across all threads involved in offering your service, you'll want to turn the number of workers down to 1, to ensure that all those threads are within the same process.
Of course, this is a stopgap -- if you want to be able to scale your solution to production loads, or have multiple servers backing your application, then you'll want to modify your system to persist the relevant state to a shared store, rather than relying on content being available in-process.

How to set up a continuously running processes along with Django?

I am setting up backend for an application, with Django and MySQL.
As a part of the set up, I need to keep on fetching latest content from Facebook and Twitter Graph APIs and keep updating my database with that.
The user of my application would pull this latest available data from the database.
Now, how and where do I implement this code? Shall I put it somewhere in the Django project, and if yes, then where?
Or shall I use it as an independent script, i.e. not attached to Django in any way, and update the DB directly with that?
Also, since this would be a continuous process, I need it to run as a background task. It should not consume any resources that might be needed by the foreground tasks.
The recommended way is to use Celery. If you don't want async task handling, you can also create a custom management command and run it via cron. Both of these run with the whole project's context (e.g. whatever you defined in your settings), so you can use the Django ORM to connect to your DB, etc.

managing server application

I have a server which runs flask with python.
Now I want to make an application which can do various tasks like uploading files, updating a Redis database and various other things.
Now of course this could be done using HTML pages, but since the operation could involve lots of files, real-time input of data and other things, it might be better to make an application and manage the server from that, rather than from web pages.
Do you suggest using web pages anyway, or would you make an application for it?
And if I make an application, should I use HTTP or not?
Sorry if this is an uninformed question, but I would like to learn the best methods.
You might want to look into Flask-Script. It allows you to easily run various commands related to your Flask application, and to easily add your own commands to it. This way you can keep your administrative code within the Flask app, but not necessarily have it accessible via a web page.
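The pattern Flask-Script provides is: administrative tasks registered as named subcommands of one script, kept next to the app code instead of behind a web page. The sketch below shows that shape with only stdlib `argparse` (so it is not the Flask-Script API itself, which additionally sets up your Flask app context for you); `cleanup_uploads` and the `--days` flag are hypothetical examples.

```python
# Shape of a command-line admin tool: one script, many registered
# subcommands, no web page involved. argparse stands in for Flask-Script.
import argparse

def cleanup_uploads(args):
    # Hypothetical task body -- delete stale files, update Redis, etc.
    print(f"cleaning uploads older than {args.days} days")

def build_parser():
    parser = argparse.ArgumentParser(prog="manage.py")
    sub = parser.add_subparsers(dest="command", required=True)
    cleanup = sub.add_parser("cleanup", help="remove stale uploads")
    cleanup.add_argument("--days", type=int, default=30)
    cleanup.set_defaults(func=cleanup_uploads)
    return parser

if __name__ == "__main__":
    # Demo invocation; a real script would use parse_args() on sys.argv.
    args = build_parser().parse_args(["cleanup", "--days", "7"])
    args.func(args)  # prints: cleaning uploads older than 7 days
```

With Flask-Script the registration step is a `@manager.command` decorator instead of `set_defaults`, but the workflow (`python manage.py cleanup ...`) is the same.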

Using both Django and Websocket

My issue is as follows: I want to mount a game using only one port (80), and Django.
For my game (it's not an MMORTS game, but I must have real-time interaction) I need to use WebSockets. This means I need to use Django (backend, profiles) + WebSockets (the game itself). For the sake of this question, think of this game as being like Habbo Hotel or The Sims (I need that kind of interaction).
Question: What non-dead packages could I use for this?
If the recommended websockets package is not integrable with Django but is mountable as a separate server, I'd need to solve two additional issues:
Is there a way to bridge non-ws connections to the regular Django application handler?
Is there a way to integrate whatever I implement as the websocket handler (i.e. game logic) with the Django engine (i.e. models)?
Assume I'd use this on shared hosting supporting Python 2.7/Django 1.6.