I'd like to know how to upload values stored in my application's database to a Google Spreadsheet.
Objective:
Connecting to a Google Spreadsheet and automatically filling in a chart in the admin area with the uploaded values.
I've been looking through the docs, and it seems I have to use the Bulk Loader.
Is this the only way? If so, how do I configure the handler when the target spreadsheet is at a given link?
Could someone write a script that accesses the Google Spreadsheet and passes it the values of a Model?
Model:
class User(db.Model):
    photo = db.BlobProperty()
    name = db.StringProperty(required=True)
    surname = db.StringProperty(required=True)
    address = db.PostalAddressProperty(required=True)
    phone = db.PhoneNumberProperty(required=True)
The Bulk Loader has nothing to do with interacting with a Google Docs Spreadsheet. It is used for adding records to your application's datastore.
To manipulate a Google Spreadsheet, you'll need to use the Google Spreadsheet API, which you could easily find on your own using Google.
No one here is going to write this code for you; this is not a free code-writing service. If you write some code that doesn't work and need help figuring out why, edit your question to include the code, along with a thorough description of what isn't working and, if you have any idea, why.
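That said, the general shape of such an upload can be sketched. The snippet below is illustrative only, using the third-party gspread library (an assumption — gspread, the sheet key, and the service-account file are not from the question; at the time, the gdata Spreadsheets API would have been used instead):

```python
def user_to_row(user):
    """Flatten one User record (a plain dict here, for illustration) into a row of cells."""
    return [user["name"], user["surname"], user["address"], user["phone"]]

def upload_users(sheet_key, users, creds="service_account.json"):
    """Append one row per user to the first worksheet of the given spreadsheet.

    Sketch only: the sheet key and credentials file are hypothetical.
    """
    import gspread  # third-party: pip install gspread
    ws = gspread.service_account(filename=creds).open_by_key(sheet_key).sheet1
    ws.append_rows([user_to_row(u) for u in users])
```

In a GAE handler you would build the `users` list from a datastore query before calling `upload_users`.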
Related
I'm trying to access a blob uploaded to a Google Cloud Storage bucket via the official Python client (google-cloud-storage).
I'm trying to retrieve the owner of the file (the one who uploaded it), but I'm not finding anything useful online.
I've tried using the client with something like:
storage.Client(project='project').get_bucket('bucket').get_blob('blob')
But the blob properties like "owner" are empty!
So I tried using a Cloud Function and accessing the event and context.
The Google documentation (https://cloud.google.com/storage/docs/json_api/v1/objects#resource) describes the structure of an event, and it seems to include the owner property. But when I print it or try to access it, I get an error because it is not set.
Can someone help me? I just need to have the user email... thanks in advance!
EDIT:
It doesn't seem to be a permissions error, because I get the correct results when testing the API from the Google site: https://cloud.google.com/storage/docs/json_api/v1/objects/get?apix_params=%7B%22bucket%22%3A%22w3-dp-prod-technical-accounts%22%2C%22object%22%3A%22datagovernance.pw%22%2C%22projection%22%3A%22full%22%7D
By default, owner and ACL are not fetched by get_blob. You will have to explicitly fetch this info:
from google.cloud import storage

client = storage.Client(project='project')
blob = client.get_bucket('bucket').get_blob('blob')
blob.reload(projection='full')
Note that if you use uniform bucket-level access, owner doesn't have any meaning and will be unset even with the above change.
EDIT: this is actually not the most efficient option because it makes an extra unnecessary call to GCS. The most efficient option is:
blob = storage.Blob('blob', client.bucket('bucket'))
blob.reload(projection='full', client=client)
I'm developing an application that saves an acquired value to a Google spreadsheet. I would like to know if I can do the following via the Google Sheets API from Python:
Check whether a spreadsheet with a given name exists
Get a list of spreadsheets matching a given name or ID
Thanks in advance!
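Both checks can be sketched with the third-party gspread library (an assumption; the question doesn't name a client library):

```python
def titles_matching(titles, name):
    """Pure helper: which of the given spreadsheet titles equal a name."""
    return [t for t in titles if t == name]

def spreadsheet_exists(name):
    """True if the authorized account can open a spreadsheet with this title.

    Sketch only: assumes service-account credentials are configured for gspread.
    """
    import gspread  # third-party: pip install gspread
    gc = gspread.service_account()
    # openall(title) returns every spreadsheet the account can see with that title
    return len(gc.openall(name)) > 0
```

Listing by ID is simpler still: `gc.open_by_key(spreadsheet_id)` raises an exception if no such sheet is accessible.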
There's an existing google sheet that I need to read from with a Python script. Is there any way I can enable the Google Sheets or Drive API for an existing spreadsheet (so I can download the necessary credentials for a Python app)?
All the online guides tell you how to do this for a New Project (like this one or this one) but not an existing sheet. Thanks!
If you look at the Google Developers quickstart, you can see the constants
# The ID and range of a sample spreadsheet.
SAMPLE_SPREADSHEET_ID = '1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgvE2upms'
SAMPLE_RANGE_NAME = 'Class Data!A2:E'
If you look at your sheet's URL, you'll see it has its own ID. You can replace the sample spreadsheet ID in the code with the ID from your own URL, and change the range you're querying to suit your application.
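For example, the ID can be pulled out of a sheet's URL with a small helper (illustrative only; the regex assumes the standard `/spreadsheets/d/<id>/` URL shape):

```python
import re

def extract_spreadsheet_id(url):
    """Return the ID segment of a docs.google.com/spreadsheets URL, or None."""
    m = re.search(r"/spreadsheets/d/([A-Za-z0-9_-]+)", url)
    return m.group(1) if m else None

url = "https://docs.google.com/spreadsheets/d/1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgvE2upms/edit#gid=0"
print(extract_spreadsheet_id(url))  # 1BxiMVs0XRA5nFMdKvBdBZjgmUUqptlbs74OgvE2upms
```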
I have a database in Google Firebase that has streaming sensor data. I have a Shiny app that needs to read this data and map the sensors and their values.
I am trying to pull the data from Firebase into R, but I couldn't find any package that does this. The app is currently running on locally downloaded data.
I found the FireData package, but have no idea how it works.
I know that you can pull data from Firebase with Python, but I don't know enough Python to do so; I would be willing to code it in R with rPython if necessary.
I have:
- The Firebase project link
- The username
- The password
Has anyone tried Firebase and R / Shiny in the past?
I hope my question is clear enough.
The basics to get started with the R package fireData are as follows. First, make sure you have set up a Firebase account on GCP (Google Cloud Platform). Once there, set up a new project and give it a name.
Now that you have a project, select the option on the overview page that says "Add Firebase to your web app". It will give you all the credential information you need.
One way of dealing with this kind of information in R is to add it to an .Renviron file so that you do not need to share it with your code (for example if it goes to github). There is a good description about how to manage .Renviron files in the Efficient R Programming Book.
API_KEY=AIzaSyBxxxxxxxxxxxxxxxxxxxLwX1sCBsFA
AUTH_DOMAIN=stackoverflow-1c4d6.firebaseapp.com
DATABASE_URL=https://stackoverflow-1c4d6.firebaseio.com
PROJECT_ID=stackoverflow-1c4d6
This will be available to your R session after you restart R (if you have made any changes).
So now you can try it out. But first, change the rules of your Firebase database to allow anyone to read and write (so that these examples work).
Now you can run the following examples
library(fireData)
api_key <- Sys.getenv("API_KEY")
db_url <- Sys.getenv("DATABASE_URL")
project_id <- Sys.getenv("PROJECT_ID")
project_domain <- Sys.getenv("AUTH_DOMAIN")
upload(x = mtcars, projectURL = db_url, directory = "new")
The upload function returns the name of the document it saved, which you can then use to download it.
> upload(x = mtcars, projectURL = db_url, directory = "main")
[1] "main/-L3ObwzQltt8IKjBVgpm"
The dataframe (or vector of values) you uploaded will be available immediately in your Firebase database console under that name, so you can verify that everything went as expected.
Now, for instance, if the name that was returned was main/-L3ObwzQltt8IKjBVgpm, you can download it as follows.
download(projectURL = db_url, fileName = "main/-L3ObwzQltt8IKjBVgpm")
You can require authentication once you have created users. For example, you can create users like so (they appear in your Firebase console).
createUser(projectAPI = api_key, email = "test@email.com", password = "test123")
You can then get their user information and token.
registered_user <- auth(api_key, email = "test@email.com", password = "test123")
And then use the idToken that is returned to access the files.
download(projectURL = db_url, fileName = "main/-L3ObwzQltt8IKjBVgpm",
secretKey = api_key,
token = registered_user$idToken)
Listmates:
I am designing a google app engine (python) app to automate law office documents.
I plan on using GAE, Google Docs, and Google Drive to create and store the finished documents. My plan is to have case information (client name, case number, etc.) entered and retrieved using GAE web forms and the Google datastore. Then I will let the user create a motion or other document by inserting the form data into a template.
The completed document can be further customized by the user, emailed, printed, and/or stored in a Google Drive folder.
I found information on how to create a web page that can be printed. However, I am looking for information on how to create an actual Google Doc and insert the form data into that document or template.
Can someone point me to a GAE tutorial of any type that steps me through how to do this?
There is currently no API to create Google Docs directly, except for:
1) making a Google Apps Script service, which does have access to the Docs API.
2) creating a ".doc", then uploading it and converting it to a Google Doc.
Option 1 is best, but an Apps Script service has some limitations, such as quotas. If you are only creating dozens or hundreds of documents per day, you will be fine on quotas. I've done it this way for something similar to your case.
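Option 2 can be sketched with the Drive v3 API via google-api-python-client (an assumption; the file path, title, and credentials handling below are illustrative): uploading a .docx while requesting the Google Docs MIME type makes Drive convert it on the way in.

```python
def gdoc_metadata(title):
    """Request body telling Drive to convert the upload into a native Google Doc."""
    return {"name": title, "mimeType": "application/vnd.google-apps.document"}

def upload_as_gdoc(path, title):
    """Upload a local .docx and have Drive convert it. Sketch only:
    assumes default application credentials are already configured."""
    from googleapiclient.discovery import build       # pip install google-api-python-client
    from googleapiclient.http import MediaFileUpload
    drive = build("drive", "v3")
    media = MediaFileUpload(
        path,
        mimetype="application/vnd.openxmlformats-officedocument.wordprocessingml.document",
    )
    return drive.files().create(
        body=gdoc_metadata(title), media_body=media, fields="id"
    ).execute()
```

The returned dict contains the new document's Drive file ID, which you can store alongside the case record.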