I am trying to get a calendar event so I can then update it, as the documentation shows.
My assumption was that, for an event with JSON like this:
{
    'kind': 'calendar#event',
    'etag': '"32...000"',
    'id': '6oo....jsa',
    'status': 'confirmed',
    'htmlLink': 'https://www.google.com/calendar/event?eid=Nm...YjhAZw',
    'created': '2021-02-25T20:13:18.000Z',
    'updated': '2021-02-28T01:21:43.762Z',
    'summary': 'code',
    'creator': {
        'email': 'example@gmail.com'
    },
    'organizer': {
        'email': 'abc@group.calendar.google.com',
        'displayName': 'Website',
        'self': True
    },
    'start': {
        'dateTime': '2021-02-27T23:30:00-05:00'
    },
    'end': {
        'dateTime': '2021-02-28T00:00:00-05:00'
    },
    'iCalUID': 'xyz@google.com',
    'sequence': 27,
    'reminders': {
        'useDefault': True
    },
    'eventType': 'default'
}
the calendarId would be the iCalUID and the eventId would be the id (or even the id at the end of the event URL). However, trying both of those interchangeably from the Google API platform keeps giving:
{
    "error": {
        "errors": [
            {
                "domain": "global",
                "reason": "notFound",
                "message": "Not Found"
            }
        ],
        "code": 404,
        "message": "Not Found"
    }
}
What should I do differently?
My ultimate goal is to modify an event's summary, startTime and endTime for any event, given a calendarId and eventId; hence my code:
def modifying():
    credentials = google.oauth2.credentials.Credentials(
        **flask.session['credentials'])
    flask.session['credentials'] = credentials_to_dict(credentials)
    service = build('calendar', 'v3', credentials=credentials)
    event = {
        'summary': 'modified',
        'location': '',
        'description': '',
        'start': {
            'dateTime': '2021-02-27T17:09:25.993663-05:00',
            'timeZone': 'America/Los_Angeles',
        },
        'end': {
            'dateTime': '2021-02-27T21:09:25.993663-05:00',
            'timeZone': 'America/Los_Angeles',
        }
    }
    try:
        events_result = service.calendarList().list().execute()
        events = events_result.get('items', [])
        Ids = [item['id'] for item in events]  # do this so it can search across all calendars & not just 'primary'
        service.events().update(calendarId=Ids, eventId='6oo....jsa', body=event).execute()
        return 200
    except Exception as err:
        print(err)
        return {"message": "Server under maintenance"}
P.S: I have already tried all the solutions in the SO post as shown.
First off, you are doing a calendarList.list:
events_result = service.calendarList().list().execute()
which returns the calendars on the user's calendar list.
The response from that is a list of calendar resources.
You then appear to be trying to update a static event id '6oo....jsa' on the first calendar that is returned by the calendar list response:
service.events().update(calendarId=Ids, eventId='6oo....jsa', body=event).execute()
How exactly do you know that this static event id is even part of that calendar? That is why you are getting a 404: it does not exist on that calendar.
Why not do an events.list to get a list of the events on the calendar? Or, even better, if you don't mind getting an error, just do an events.get on the calendar for your static event id.
BTW, events.list won't return a calendar id in the response, as it's assumed that you already know the calendar id since you used it to get the events in the first place.
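For example, a rough sketch of that check (assuming service is already built as in your code, and TARGET_EVENT_ID is a placeholder for the id you copied from the event resource):
from googleapiclient.errors import HttpError

calendars = service.calendarList().list().execute().get('items', [])
for cal in calendars:
    try:
        # events.get raises an HttpError (404) if the event is not on this calendar
        ev = service.events().get(calendarId=cal['id'], eventId=TARGET_EVENT_ID).execute()
        print('Found event on calendar:', cal['id'])
        break
    except HttpError:
        continue
Once you know which calendar actually holds the event, pass that single calendar id (not the whole list) to events.update.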
Solution
To update an event with Python you need to use the update method of the calendar service:
service.events().update(calendarId=calendarId, eventId=eventId, body=body).execute()
The request body must contain the start and end dates of the event.
How to update an event given its summary
List all the events on a specific calendar (primary for the default calendar)
Check the summary of all the events until you find one that matches your search
Keep the id, start and end dates
summarySearch = 'old summary'
events = service.events().list(calendarId=calendarId).execute()
for ev in events['items']:
    try:
        print(ev['summary'])
        if ev['summary'] == summarySearch:
            eventId = ev['id']
            startDate = ev['start']['date']
            endDate = ev['end']['date']
    except KeyError:
        # the event has no summary field
        print('empty summary')
Define the new summary (and all the fields to update)
Define the request body. It can easily be constructed with the "Try this API" panel
Update the event
#startDate = '2021-02-16' # uncomment to overwrite
#endDate = '2021-02-18' # uncomment to overwrite
summary = 'new summary'
body = {
'start': {'date': startDate},
'end': {'date': endDate},
'summary': summary
}
result = service.events().update(calendarId=calendarId, eventId=eventId, body=body).execute()
Print the result to check that the event has been updated successfully
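For instance (a trivial check, assuming result is the response from the update call above):
print(result['updated'])   # timestamp of the last modification
print(result['summary'])   # should now be 'new summary'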
Some notes
I keep the start and end dates because they are mandatory when updating an event. If you want to overwrite them, that is okay and you don't have to store them beforehand, but I wanted to give a solution that fits more situations.
Reference
Events: update
Events: list
EDIT: This has been identified as a bug here.
I am trying to create a Google Meet along with a new calendar event. However, for some reason the returned event does not include any conferenceData, not even one with a failed status.
Here is my code. I have omitted the authentication step as I do not get an authentication error.
def generateMeet(emails=None, fake=False):
    if emails is None:
        emails = []
    if fake:
        return "https://www.google.com", get_random_string(10)
    now = datetime.utcnow().isoformat() + 'Z'  # 'Z' indicates UTC time
    inonehour = (datetime.utcnow() + timedelta(hours=1)).isoformat() + 'Z'
    event = {
        'summary': 'Orb Meeting',
        'start': {
            'dateTime': now,
            'timeZone': 'America/New_York',
        },
        'end': {
            'dateTime': inonehour,
            'timeZone': 'America/New_York',
        },
        'sendUpdates': "none",
        'reminders': {
            'useDefault': False,
        },
        'attendees': [{'email': x} for x in emails],
        'conferenceDataVersion': 1,
        'conferenceData': {
            'createRequest': {
                'requestID': get_random_string(10),
                'conferenceSolutionKey': {
                    'type': 'hangoutsMeet'
                },
            }
        }
    }
    ret = service.events().insert(calendarId='primary', body=event).execute()
    return ret['conferenceData']['entryPoints'], ret['id']
This raises a KeyError, as conferenceData does not exist. Here is the full 'ret' object before I run the return:
{'kind': 'calendar#event', 'etag': '"3197938620273000"', 'id': '5fb6epfe93sceba9scjt1nevsk', 'status': 'confirmed',
'htmlLink': 'https://www.google.com/calendar/event?eid=NWZiNmVwZmU5M3NjZWJhOXNjanQxbmV2c2sgZm9ycmVzdG1pbG5lckBt',
'created': '2020-09-01T14:08:30.000Z', 'updated': '2020-09-01T14:08:30.162Z', 'summary': 'Orb Meeting',
'creator': {'email': '[my email]', 'self': True},
'organizer': {'email': '[my email]', 'self': True},
'start': {'dateTime': '2020-09-01T10:08:28-04:00', 'timeZone': 'America/New_York'},
'end': {'dateTime': '2020-09-01T11:08:28-04:00', 'timeZone': 'America/New_York'},
'iCalUID': '5fb6epfe93sceba9scjt1nevsk#google.com', 'sequence': 0,
'attendees': [{'email': '[my other email]', 'displayName': 'Forrest Milner', 'responseStatus': 'needsAction'}],
'reminders': {'useDefault': False}}
Can anyone tell me why the conferenceData part of my request might be dropped? I am setting the conferenceDataVersion to 1, and using a random string.
I have tried adding dummy "invitees". In this trial, I invited my second gmail account, and in other trials I have invited several dummy accounts with domain "example.com". This updates the attendees, but does not make the conference data appear.
I have also tried waiting a few minutes and then listing all my events. Even after waiting, the conference data was not filled in. When I check my calendar on the GUI (https://calendar.google.com/calendar/r) it also does not have a Google Meet attached.
Thank you for any help.
Yes, it is a bug. If it's high priority, you can 'patch' your created event with something like this:
const eventPatch = {
  conferenceData: {
    createRequest: { requestId: "yourRandomString" },
  },
};
let response = await calendar.events.patch({
  calendarId: 'primary',
  eventId: "createdEventID",
  resource: eventPatch,
  sendNotifications: true,
  conferenceDataVersion: 1,
});
Reference: https://developers.google.com/calendar/create-events#conferencing
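Since the question uses the Python client, a rough equivalent sketch (untested; created_event_id stands for the id returned by your original insert call):
import uuid

event_patch = {
    'conferenceData': {
        'createRequest': {'requestId': str(uuid.uuid4())}
    }
}
patched = service.events().patch(
    calendarId='primary',
    eventId=created_event_id,  # id of the event created earlier
    body=event_patch,
    conferenceDataVersion=1,
    sendNotifications=True
).execute()
print(patched.get('conferenceData'))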
The conferenceDataVersion is not part of the requestBody anymore; it must be passed as a parameter of your insert method.
On NodeJS it's something like this:
calendar.events.insert({
  calendarId: 'calendarIdHere',
  requestBody: yourOldRequestBodyObject,
  conferenceDataVersion: 1
});
@André Eccel is correct. Here's a working Python example of how to add a Google Meet link to your event:
import uuid

request_id = str(uuid.uuid1())
event = {'summary': "My Birthday Party",
         'description': "There will be flamingos.",
         'start': start,
         'end': end,
         'recurrence': recurrence,
         'attendees': attendees,
         'conferenceData': {'createRequest': {
             'requestId': request_id,
             "conferenceSolutionKey": {"type": "hangoutsMeet"}}}
         }
service.events().insert(calendarId=cal_id, body=event, conferenceDataVersion=1).execute()
Goal: create a Google Calendar event.
Blocker: the date formatting.
Background: I'm trying to create a meal planner that takes recipes from a given database and creates events with their names in a Google calendar. The database looks like this:
d = {'recipe_id': ['carrot salad', 'leek fritters'], 'meal_date': ['2020-05-28 22:28:01.204464+00:00', '2020-05-29 22:28:01.204464+00:00']}
df = pd.DataFrame(data=d)
The meal date is the product of two operations:
today_date = datetime.datetime.utcnow().isoformat() + 'Z'
df['menu_date'] = today_date
df['menu_date'] = pd.to_datetime(df['menu_date'])
df['meal_date'] = df['menu_date'] + df['meal'].apply(pd.offsets.Day)
Where 'meal' is just a number (1, 2, etc.) and the last command just shifts today's date by that amount.
When I use the following code to upload to Google Calendar, I get an error:
def myconverter(o):
    '''
    call the __str__ method of the datetime object that will return a string representation of the value
    ___
    shoutout: https://code-maven.com/serialize-datetime-object-as-json-in-python
    '''
    if isinstance(o, datetime.datetime):
        return o.__str__()

def add_to_calendar(df, calendar_id):
    """Shows basic usage of the Google Calendar API.
    Prints the start and name of the next 10 events on the user's calendar.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    service = build('calendar', 'v3', credentials=creds)

    # Call the Calendar API
    now = datetime.datetime.utcnow().isoformat() + 'Z'  # 'Z' indicates UTC time
    print('Adding meals to calendar')
    for i, r in df.iterrows():
        event = {
            'summary': r.name,
            'description': r.meal_period,
            'start': {
                'date': json.dumps(r.meal_date, default=myconverter),
                'timeZone': 'America/Los_Angeles'
            },
            'end': {
                'date': json.dumps(r.meal_date, default=myconverter),
                'timeZone': 'America/Los_Angeles'
            }
        }
        event = service.events().insert(calendarId=calendar_id, body=event).execute()
Running this code, I get the following error:
HttpError: <HttpError 400 when requesting https://www.googleapis.com/calendar/v3/calendars/CALENDAR_ID/events?alt=json returned "Invalid value for: Invalid format: ""2020-05-28 22:28:01.204464+00:00""">
where CALENDAR_ID is my google calendar id.
If anyone knows how to fix this date issue using python code, this would be incredibly helpful.
How about this answer?
Modification points:
If you want to use start.date and end.date as the all-day event, the format is required to be yyyy-mm-dd. In this case, timeZone is not required.
If you want to use start.dateTime and end.dateTime (a timed event rather than an all-day event), the format is required to be RFC3339 (for example, 2020-05-28T22:28:01+00:00). In this case, timeZone is required.
From above situations, when your script is modified, how about the following patterns?
Pattern 1:
In this pattern, start.date and end.date are used. For this, please modify as follows.
From:
'start': {
    'date': json.dumps(r.meal_date, default=myconverter),
    'timeZone': 'America/Los_Angeles'
},
'end': {
    'date': json.dumps(r.meal_date, default=myconverter),
    'timeZone': 'America/Los_Angeles'
}
To:
'start': {
    'date': parse(r.meal_date).strftime("%Y-%m-%d"),
},
'end': {
    'date': parse(r.meal_date).strftime("%Y-%m-%d"),
}
Pattern 2:
In this pattern, start.dateTime and end.dateTime are used. For this, please modify as follows.
From:
'start': {
    'date': json.dumps(r.meal_date, default=myconverter),
    'timeZone': 'America/Los_Angeles'
},
'end': {
    'date': json.dumps(r.meal_date, default=myconverter),
    'timeZone': 'America/Los_Angeles'
}
To:
'start': {
    'dateTime': parse(r.meal_date).isoformat(),
    'timeZone': 'America/Los_Angeles'
},
'end': {
    'dateTime': parse(r.meal_date).isoformat(),
    'timeZone': 'America/Los_Angeles'
}
Note:
In this modification, from dateutil.parser import parse is used.
From your script, I couldn't see the scope you are using. So if an error related to the scope occurs, please use https://www.googleapis.com/auth/calendar as the scope. At that time, please delete the token.pickle file and reauthorize the scope. Please be careful about this.
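For reference, a minimal sketch of those two adjustments (the SCOPES name assumes the quickstart-style constant used when authorizing in the question's code):
from dateutil.parser import parse

SCOPES = ['https://www.googleapis.com/auth/calendar']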
Reference:
Events: insert
Following the docs, there's an example of exporting all data for a specific OU:
def create_drive_ou_all_data_export(service, matter_id):
    ou_to_search = 'ou id retrieved from admin sdk'
    drive_query_options = {'includeSharedDrives': True}
    drive_query = {
        'corpus': 'DRIVE',
        'dataScope': 'ALL_DATA',
        'searchMethod': 'ORG_UNIT',
        'orgUnitInfo': {
            'org_unit_id': ou_to_search
        },
        'driveOptions': drive_query_options,
        'startTime': '2017-03-16T00:00:00Z',
        'endTime': '2017-09-23T00:00:00Z',
        'timeZone': 'Etc/GMT+2'
    }
    drive_export_options = {'includeAccessInfo': False}
    wanted_export = {
        'name': 'My first drive ou export',
        'query': drive_query,
        'exportOptions': {
            'driveOptions': drive_export_options
        }
    }
    return service.matters().exports().create(
        matterId=matter_id, body=wanted_export).execute()
However, it does not show how to export for just a given user. Is this possible? Also, where are all of the different body options for creating an export? The examples do not seem to show all of the parameters available.
You'd want to use searchMethod: ACCOUNT.
Reference Query: https://developers.google.com/vault/reference/rest/v1/Query
Reference searchmethod: https://developers.google.com/vault/reference/rest/v1/Query#SearchMethod
Reference AccountInfo: https://developers.google.com/vault/reference/rest/v1/Query#AccountInfo
drive_query = {
    'corpus': 'DRIVE',
    'dataScope': 'ALL_DATA',
    'searchMethod': 'ACCOUNT',  # This is different
    'accountInfo': {  # This is different
        'emails': ['email1@company.com', 'email2@company.com', 'email3@company.com']
    },
    'driveOptions': drive_query_options,
    'startTime': '2017-03-16T00:00:00Z',
    'endTime': '2017-09-23T00:00:00Z',
    'timeZone': 'Etc/GMT+2'
}
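To use this query, the rest of the export creation can stay the same as in the OU example. Roughly (matter_id, service and drive_query_options as in the question's function; the export name is just a placeholder):
drive_export_options = {'includeAccessInfo': False}
wanted_export = {
    'name': 'My first drive account export',
    'query': drive_query,  # the account-based query above
    'exportOptions': {
        'driveOptions': drive_export_options
    }
}
export = service.matters().exports().create(
    matterId=matter_id, body=wanted_export).execute()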
I am trying to use the Google AdWords API with the official library here: https://github.com/googleads/googleads-python-lib
I use a Manager Account on Google AdWords and want to work with my clients' accounts.
I can get all the AdWords account IDs (like 123-456-7891), but I don't know how to pass an account ID to my Google AdWords functions as a parameter.
Here's my main function :
def main(argv):
    adwords_client = adwords.AdWordsClient.LoadFromStorage(path="googleads.yaml")
    add_campaign(adwords_client)
I don't see any account ID parameter in the official samples, such as:
import datetime
import uuid

from googleads import adwords


def add_campaign(client):
    # Initialize appropriate services.
    campaign_service = client.GetService('CampaignService', version='v201809')
    budget_service = client.GetService('BudgetService', version='v201809')

    # Create a budget, which can be shared by multiple campaigns.
    budget = {
        'name': 'Interplanetary budget #%s' % uuid.uuid4(),
        'amount': {
            'microAmount': '50000000'
        },
        'deliveryMethod': 'STANDARD'
    }
    budget_operations = [{
        'operator': 'ADD',
        'operand': budget
    }]

    # Add the budget.
    budget_id = budget_service.mutate(budget_operations)['value'][0][
        'budgetId']

    # Construct operations and add campaigns.
    operations = [{
        'operator': 'ADD',
        'operand': {
            'name': 'Interplanetary Cruise #%s' % uuid.uuid4(),
            # Recommendation: Set the campaign to PAUSED when creating it to
            # stop the ads from immediately serving. Set to ENABLED once you've
            # added targeting and the ads are ready to serve.
            'status': 'PAUSED',
            'advertisingChannelType': 'SEARCH',
            'biddingStrategyConfiguration': {
                'biddingStrategyType': 'MANUAL_CPC',
            },
            'endDate': (datetime.datetime.now() +
                        datetime.timedelta(365)).strftime('%Y%m%d'),
            # Note that only the budgetId is required
            'budget': {
                'budgetId': budget_id
            },
            'networkSetting': {
                'targetGoogleSearch': 'true',
                'targetSearchNetwork': 'true',
                'targetContentNetwork': 'false',
                'targetPartnerSearchNetwork': 'false'
            },
            # Optional fields
            'startDate': (datetime.datetime.now() +
                          datetime.timedelta(1)).strftime('%Y%m%d'),
            'frequencyCap': {
                'impressions': '5',
                'timeUnit': 'DAY',
                'level': 'ADGROUP'
            },
            'settings': [
                {
                    'xsi_type': 'GeoTargetTypeSetting',
                    'positiveGeoTargetType': 'DONT_CARE',
                    'negativeGeoTargetType': 'DONT_CARE'
                }
            ]
        }
    }, {
        'operator': 'ADD',
        'operand': {
            'name': 'Interplanetary Cruise banner #%s' % uuid.uuid4(),
            'status': 'PAUSED',
            'biddingStrategyConfiguration': {
                'biddingStrategyType': 'MANUAL_CPC'
            },
            'endDate': (datetime.datetime.now() +
                        datetime.timedelta(365)).strftime('%Y%m%d'),
            # Note that only the budgetId is required
            'budget': {
                'budgetId': budget_id
            },
            'advertisingChannelType': 'DISPLAY'
        }
    }]
    campaigns = campaign_service.mutate(operations)
How can I tell the AdWords API in which account I want to add this campaign?
Thanks for your help!
OK, my bad, I missed a method in the documentation (http://googleads.github.io/googleads-python-lib/googleads.adwords.AdWordsClient-class.html#SetClientCustomerId).
# ID of your customer here
CUSTOMER_SERVICE_ID = '4852XXXXX'
# Load customer account access
client = adwords.AdWordsClient.LoadFromStorage(path="googleads.yaml")
client.SetClientCustomerId(CUSTOMER_SERVICE_ID)
The customer ID is now associated with the AdWordsClient instance "client", which can be passed as a parameter to the other functions.
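Putting it together, a minimal sketch of the adjusted main (assuming add_campaign is the sample function above; the customer id is a placeholder):
def main(argv):
    adwords_client = adwords.AdWordsClient.LoadFromStorage(path="googleads.yaml")
    # Select which client account the subsequent service calls operate on
    adwords_client.SetClientCustomerId('123-456-7891')
    add_campaign(adwords_client)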
I am trying to use the Cost Explorer API with boto3 to get the cost of EC2 snapshots. These snapshots have custom tags associated with them. What I am trying to retrieve is the cost of the snapshots that have a particular tag.
I have written the following script:
import boto3

client = boto3.client('ce')
response = client.get_cost_and_usage(
    TimePeriod={
        'Start': '2019-01-20',
        'End': '2019-01-24'
    },
    Metrics=['BLENDED_COST', 'USAGE_QUANTITY', 'UNBLENDED_COST'],
    Granularity='MONTHLY',
    Filter={
        'Dimensions': {
            'Key': 'USAGE_TYPE_GROUP',
            'Values': ['EC2: EBS - Snapshots']
        }
    }
)
This gives me the cost. But this is the total cost for the snapshot usage, i.e. for all the volumes. Is there any way to filter based on tags on the snapshot?
I tried adding the following Filter:
Filter={
    'And': [
        {
            'Dimensions': {
                'Key': 'USAGE_TYPE_GROUP',
                'Values': ['EC2: EBS - Snapshots']
            }
        },
        {
            'Tags': {
                'Key': 'test',
                'Values': ['aj']
            }
        }
    ]
}
There is 1 snapshot where I have added that tag. I checked the date range and the snapshot was created within that time range and is still available. I tried changing granularity to DAILY too.
But this always shows 0 cost.
To query the snapshots or even other services using tags, you need to activate them in the billing menu.
Refer to this link to activate the tags you need to query:
https://console.aws.amazon.com/billing/home?region=us-east-1#/preferences/tags
NOTE: Only master accounts in an organization and single accounts that are not members of an organization have access to the Cost Allocation Tags.
I hope that helps!
'Tags' can be added in your filter as follows:
response = client.get_cost_and_usage(
    TimePeriod={
        'Start': '2019-01-10',
        'End': '2019-01-15'
    },
    Metrics=['BLENDED_COST', 'USAGE_QUANTITY', 'UNBLENDED_COST'],
    Granularity='MONTHLY',
    Filter={
        'Dimensions': {
            'Key': 'USAGE_TYPE',
            'Values': ['APN1-EBS:SnapshotUsage']
        },
        'Tags': {
            'Key': 'keyName',
            'Values': [
                'keyValue',
            ]
        }
    }
)
You can find the exact usage in the boto3 cost explorer API reference.
You could also group by tag keys like this:
Filter={
    'Dimensions': {
        'Key': 'USAGE_TYPE',
        'Values': ['APN1-EBS:SnapshotUsage']
    }
},
GroupBy=[
    {
        'Type': 'DIMENSION'|'TAG',
        'Key': 'string'
    },
],
It won't filter out tags, but it will group the returned data by tag key. This will return ALL tag values matching the tag key, so it may be too broad, but you can use it to troubleshoot any additional problems.
I'd confirm that your tag values and keys all match up.
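For example, a rough troubleshooting sketch (using the same dimension filter and the 'test' tag key from your attempt; adjust the dates and metric names to whatever already works for you):
import boto3

client = boto3.client('ce')
response = client.get_cost_and_usage(
    TimePeriod={'Start': '2019-01-20', 'End': '2019-01-24'},
    Granularity='DAILY',
    Metrics=['UNBLENDED_COST'],
    Filter={
        'Dimensions': {
            'Key': 'USAGE_TYPE_GROUP',
            'Values': ['EC2: EBS - Snapshots']
        }
    },
    GroupBy=[{'Type': 'TAG', 'Key': 'test'}]
)
# Each group key looks like 'test$<value>'; an empty value means the cost was
# not attributed to that tag (e.g. the tag is not activated as a cost allocation tag).
for period in response['ResultsByTime']:
    for group in period['Groups']:
        for metric, value in group['Metrics'].items():
            print(period['TimePeriod']['Start'], group['Keys'], metric, value['Amount'])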