How to create a leaderboard in Repl.it Python

I'm trying to make a game with a leaderboard, and I have the code that writes the high scores to a file, but when multiple people play the game, they each get separate leaderboards. So basically, I'm asking whether there's a way to have everyone share a file on Repl.it.

If your code is writing to local files, e.g. via open('some_file.txt'), then those files are limited to each individual user's Repl.it workspace.
If you must use files, you could use the Dropbox, Git, AWS S3, or Google Drive Python libraries to read/write external files.
However, the better solution would be to use a hosted database and connect Repl.it to it. Some popular free options include Firebase, MongoDB Atlas, and DataStax Astra; for a simple leaderboard counter, even Grafana Cloud would work.
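As a rough illustration of the hosted-database route, here is a minimal sketch using MongoDB Atlas via pymongo; the connection string, database, collection, and function names are placeholders, not part of the original answer.

import pymongo

# Hypothetical Atlas connection string; replace with your own cluster's URI.
client = pymongo.MongoClient("mongodb+srv://user:password@cluster0.example.mongodb.net")
scores = client["game"]["leaderboard"]

def submit_score(player, score):
    # Keep only each player's best score ($max only updates if the new score is higher).
    scores.update_one({"player": player}, {"$max": {"score": score}}, upsert=True)

def top_ten():
    # Highest scores first, without the internal _id field.
    return list(scores.find({}, {"_id": 0}).sort("score", -1).limit(10))

Because the scores live in Atlas rather than in the Repl's file system, every player who runs the Repl reads and writes the same leaderboard.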

Azure Confusion - Guide to Python in Azure - Where do I start?

Help!
I have a small Python script that logs in to a website, downloads a bundle of files, and then saves them to a SharePoint site for use by others. The project spans multiple files and has several required Python imports.
I'd like to move this to Azure so I can put it on a schedule to run periodically and forget about it (having the script send a notification or similar). There are also other scripts I would like to schedule.
I'm somewhat baffled about where to start on Azure. I have an Azure account with some free credit, but beyond that I'm confused about which Azure service this should be built on.
Googling is not helping; all I get is a bundle of buzzwords that don't really help.
Looking for some pointers in the right direction.
Thanks
As mentioned by @Gaurav Mantri, you can use an Azure Functions timer trigger, where the function runs on a schedule. Alternatively, for codeless automated workflows, you can opt for Azure Logic Apps, which provide many ready-made connectors (including one for SharePoint) that you wire together by constructing a workflow.
REFERENCES: Connect to SharePoint from Azure Logic Apps - MSFT Docs
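To make the timer-trigger route concrete, here is a minimal sketch using the Azure Functions Python v2 programming model; the schedule (daily at 06:00 UTC) and the function body are illustrative assumptions, not the asker's actual script.

import azure.functions as func

app = func.FunctionApp()

# NCRONTAB format: {second} {minute} {hour} {day} {month} {day-of-week}
@app.timer_trigger(schedule="0 0 6 * * *", arg_name="timer")
def scheduled_job(timer: func.TimerRequest) -> None:
    # Placeholder for the existing script: log in to the website,
    # download the files, upload them to SharePoint, send a notification.
    ...

Deployed as a Function App on the Consumption plan, this runs on schedule with no server to manage; the notification step could be a call to an email service or a Logic App.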

Store and retrieve text online/cloud using python APIs

I am writing a Python application which reads/parses a file of this kind (myalerts.ini):
value1=1
value2=3
value3=10
value4=15
Currently I store this file in the local filesystem, so if I need to change it I need physical access to that computer.
I want to move this file to the cloud so that I can change it from anywhere (another computer, or from my phone).
Then, even if the application is running on a machine I don't have physical access to, I can change the file in the cloud and the application will read the updated file.
Notes:
I am new to both Python and AWS.
I am currently running this on my local Mac/Linux machine and planning to deploy it on AWS.
There are many options!
Amazon S3: This is the simplest option. Each computer could download the file at regular intervals, or just before it runs a process. If the file is big, the app could instead check whether the file has changed before downloading (see the sketch below).
Amazon Elastic File System (EFS): If your applications are running on multiple Amazon EC2 instances, EFS provides a shared file system that can be mounted on each instance.
Amazon DynamoDB: A NoSQL database instead of a file. Much faster than parsing a file, but less convenient for updating values; you'd need to write a program to update values, e.g. from the command line.
AWS Systems Manager Parameter Store: A managed service for storing parameters. Applications (anywhere on the Internet) can request and update parameters. A great way to configure cloud-based applications!
If you are looking for minimal change and you want it accessible from anywhere on the Internet, Amazon S3 is the easiest choice.
Whichever way you go, you'll use the boto3 AWS SDK for Python.
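As a minimal sketch of the S3 option, assuming a hypothetical bucket name and the myalerts.ini layout shown in the question (configparser requires a section header, so a dummy one is prepended):

import configparser
import boto3

BUCKET = "my-config-bucket"   # hypothetical bucket name
KEY = "myalerts.ini"

s3 = boto3.client("s3")
s3.download_file(BUCKET, KEY, "/tmp/myalerts.ini")

# The sample file has no [section] header, so prepend one before parsing.
parser = configparser.ConfigParser()
with open("/tmp/myalerts.ini") as f:
    parser.read_string("[alerts]\n" + f.read())

value1 = parser.getint("alerts", "value1")   # 1

To pick up changes, each machine can re-download on an interval, or compare the object's ETag from s3.head_object(Bucket=BUCKET, Key=KEY) against the last one it saw before downloading again.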

Access a Cloud-stored File using Python 3?

I currently have a Python program which reads a local file (containing a pickled database object) and saves to that file when it's done. I'd like to branch out and use this program on multiple computers accessing the same database, but I don't want to worry about synchronizing the local database files with each other, so I've been considering cloud storage options. Does anyone know how I might store a single data file in the cloud and interact with it using Python?
I've considered something like Google Cloud Platform and similar services, but those seem to be more server-oriented whereas I just need to access a single file on my own machines.
You could install gsutil and the boto library and use those to read and write objects in Google Cloud Storage.
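That answer predates Google's current client libraries; as a minimal sketch of the same idea using the newer google-cloud-storage package (bucket and object names here are hypothetical):

import pickle
from google.cloud import storage

client = storage.Client()   # picks up your gcloud credentials
blob = client.bucket("my-db-bucket").blob("database.pkl")

# Download the pickled database, modify it, and write it back.
blob.download_to_filename("database.pkl")
with open("database.pkl", "rb") as f:
    db = pickle.load(f)

# ... modify db here ...

with open("database.pkl", "wb") as f:
    pickle.dump(db, f)
blob.upload_from_filename("database.pkl")

Note this gives you shared storage, not synchronization: if two machines save at the same time, the last upload wins.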

How to save excel file to amazon s3 from python or ruby

Is it possible to create a new excel spreadsheet file and save it to an Amazon S3 bucket without first saving to a local filesystem?
For example, I have a Ruby on Rails web application which generates Excel spreadsheets using the write_xlsx gem and saves them to the server's local file system. Internally, it looks like the gem uses Ruby's IO.copy_stream when it saves the spreadsheet. I'm not sure this will work after moving to Heroku and S3.
Has anyone done this before using Ruby or even Python?
I found this earlier question, Heroku + ephemeral filesystem + AWS S3. So, it would seem this is not possible using Heroku. Theoretically, it would be possible using a service which allows adding an Amazon EBS.
There is a dedicated Ruby gem to help you move files to Amazon S3:
https://rubygems.org/gems/aws-s3
If you want more details about the implementation, here is the Git repository. The documentation on the page is very complete and explains how to move files to S3. Hope it helps.
Once your xls file is created, the library helps you create an S3Object and store it in a bucket (which you can also create with the library):
S3Object.store('keyOfYourData', open('nameOfExcelFile.xls'), 'bucketName')
If you want more options, Amazon also provides an official gem for this purpose: https://rubygems.org/gems/aws-sdk
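Since the question also asks about Python: as a minimal sketch, an .xlsx file can be built entirely in memory and streamed to S3 without ever touching the local filesystem, here using openpyxl and boto3 with hypothetical bucket/key names:

import io
import boto3
from openpyxl import Workbook

wb = Workbook()
ws = wb.active
ws["A1"] = "hello"

# Save the workbook into an in-memory buffer instead of a local file.
buf = io.BytesIO()
wb.save(buf)
buf.seek(0)

s3 = boto3.client("s3")
s3.upload_fileobj(buf, "my-bucket", "reports/report.xlsx")

Nothing is written to local disk, which sidesteps Heroku's ephemeral filesystem entirely.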

How to create a simple virtual filesystem on google app engine blobstore using python?

There is a Java version, GAEVFS, which seems quite effective but also complicated.
There is also a Python VFS collection at http://code.google.com/p/pyfilesystem/ which is not designed for GAE.
Google locked down its file system and made tempfile unusable, which needs a workaround.
I want to build a simple VFS with the GAE blobstore/file API to emulate Linux-style directory/file/owner/permission behavior.
Is it possible? What are the most fundamental classes, attributes and methods I should implement?
Thanks in advance!
If you are trying to implement a file system on GAE, prepare for a world of pain. You can't write files on GAE, and even if you move to Managed VMs, your disk will be wiped on restart.
You'll basically have to emulate a file system interface and translate it to Datastore entities and relations (a minimal sketch follows below).
It can be done, but you're going to have to build it from the ground up.
If you want a cloud file system, I suggest you consider Google Cloud Storage, which actually lets you handle files and implements most of the features of a basic file system.
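As a rough sketch of the emulation approach, assuming the legacy GAE Python runtime and its ndb library; the model, fields, and helper functions are illustrative, not a complete VFS:

from google.appengine.ext import ndb

class VfsNode(ndb.Model):
    # One entity per file or directory, looked up by absolute path.
    path = ndb.StringProperty(required=True)    # e.g. '/home/alice/notes.txt'
    is_dir = ndb.BooleanProperty(default=False)
    owner = ndb.StringProperty()
    mode = ndb.IntegerProperty(default=0o644)   # Unix-style permission bits
    content = ndb.BlobProperty()                # file bytes; unused for dirs

def read_file(path):
    node = VfsNode.query(VfsNode.path == path).get()
    if node is None or node.is_dir:
        raise IOError("no such file: %s" % path)
    return node.content

def list_dir(path):
    # Prefix scan: every child path sorts between '/dir/' and '/dir/' + u'\ufffd'.
    prefix = path.rstrip("/") + "/"
    query = VfsNode.query(VfsNode.path >= prefix, VfsNode.path < prefix + u"\ufffd")
    return [node.path for node in query]

Ownership and permissions then become explicit checks of owner and mode before each operation, which, as the answer says, you have to build from the ground up.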
