I want to retrieve data from Google Analytics. I have created a service account in the console, and I am using Google's Python sample code (hello_analytics_api_v3.py) to access the data.
I have copied the client_secrets.json into my folder but get this error:
SystemExit:
WARNING: Please configure OAuth 2.0
To make this sample run you will need to populate the client_secrets.json file found at:
What should I do? I am using Python 2.7.
Ensure the terminal is pointing to the same directory as your client_secrets.json file;
i.e. type pwd in the console you're using to call the script, and the output should match the directory where client_secrets.json is stored.
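If you want to check this from Python before the sample exits, a minimal sketch (only the filename client_secrets.json comes from the sample; the rest is illustrative):

    import os

    # The sample resolves client_secrets.json against the working directory,
    # so verify it is actually there before running.
    secrets_path = os.path.join(os.getcwd(), 'client_secrets.json')
    if not os.path.isfile(secrets_path):
        raise SystemExit('client_secrets.json not found in %s' % os.getcwd())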
I was having this exact issue. I deleted the credentials for my project and created new ones using the 'OAuth client ID' option. Follow step one of this page closely: https://developers.google.com/analytics/devguides/config/mgmt/v3/quickstart/installed-py
I also found a syntax error in the sample code provided by Google.
The lines:
print 'View (Profile): %s' % results.get('profileInfo').get('profileName')
print 'Total Sessions: %s' % results.get('rows')[0][0]
should read:
print ('View (Profile): %s' % (results.get('profileInfo').get('profileName')))
print ('Total Sessions: %s' % (results.get('rows')[0][0]))
(The originals are Python 2 print statements; wrapping them in parentheses makes them valid in Python 3 as well.)
At least this solved it for me. Also, make sure the client_secrets.json is in the same directory as your Python script.
In the sample code at https://developers.google.com/youtube/v3/guides/uploading_a_video the call to flow_from_clientsecrets() passes CLIENT_SECRETS_FILE as a relative path, so it is resolved against whatever directory you happen to run the script from.
To fix it, force the CLIENT_SECRETS_FILE argument to be an absolute path anchored at the script's own directory (this needs import os at the top of the file):

    def get_authenticated_service(args):
        flow = flow_from_clientsecrets(
            os.path.abspath(os.path.join(
                os.path.dirname(__file__), CLIENT_SECRETS_FILE)),
            scope=YOUTUBE_UPLOAD_SCOPE,
            message=MISSING_CLIENT_SECRETS_MESSAGE)
I received this error because I still had the square brackets around the client_id and client_secret values in client_secrets.json. Each should just be the string, with no brackets.
If you are using Windows, follow these steps:
Put your file (client_secrets.json) in the root of drive C: or D:.
In your Python file define your variable like this:
CLIENT_SECRETS_FILE = "\client_secrets.json". Python will search for the JSON file in the root of C: or D: and will find it.
I had the same problem with the Google API for YouTube and I solved it like that.
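Note that backslashes in ordinary Python string literals can be swallowed as escape sequences (e.g. "\t" becomes a tab), so "\client_secrets.json" only works because "\c" happens not to be an escape. A raw string or os.path.join is safer; a minimal sketch of both variants:

    import os

    # Raw string: backslashes are taken literally.
    CLIENT_SECRETS_FILE = r"C:\client_secrets.json"

    # Or build the path portably:
    CLIENT_SECRETS_FILE = os.path.join("C:\\", "client_secrets.json")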
I'm hitting a wall with the Python script at https://stackoverflow.com/a/60946539/1641112. I've added the script to a file called test.py3.
I'm running this command directly on my Synology drive with an admin account: python test.py3 /volume2/dir/key-basic-preferred.pdf --debug
The relevant part of the debug output:
Get(): "request = entry.cgi?api=SYNO.FileStation.Sharing&version=3&method=create&path="/volume2/dir/key-basic-preferred.pdf""
GET: "http://127.0.0.1:5000/webapi/entry.cgi?api=SYNO.FileStation.Sharing&version=3&method=create&path="/volume2/dir/key-basic-preferred.pdf"&_sid=011o6xaJaxvLsKCJ4N91278"
GET: "<Response [200]>"
ERROR: Get (entry.cgi?api=SYNO.FileStation.Sharing&version=3&method=create&path="/volume2/dir/key-basic-preferred.pdf"&_sid=011o6xaJaxvLsKCJ4N91278):
Error: 408: Unknown error
According to the API docs, error 408 means an unknown file.
However, I know that file exists. I've tried URL-encoding the path. I've tried with and without quotes. I've tried tweaking the Python code (I don't know Python) to change the quoting and to use API version 1 instead of 3, but I'm not having any luck.
Finally figured it out. The file paths are relative to File Station. So if you have a folder in File Station called "backup" and a file in "backup" called "blah.txt", the path to the file is /backup/blah.txt.
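For anyone else adapting the linked script, a minimal sketch of that request with a File Station-relative path; the base URL and API parameters are taken from the debug output above, while the session id and the use of the requests library are illustrative:

    import requests

    # The path is relative to File Station's shared-folder root,
    # not the underlying volume (so /backup/blah.txt, not /volume2/backup/blah.txt).
    params = {
        'api': 'SYNO.FileStation.Sharing',
        'version': '3',
        'method': 'create',
        'path': '/backup/blah.txt',
        '_sid': 'YOUR_SESSION_ID',  # placeholder for a real session id
    }
    resp = requests.get('http://127.0.0.1:5000/webapi/entry.cgi', params=params)
    print(resp.json())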
I'm building my Python package using an Azure DevOps pipeline, and the generated artifact is then uploaded to a feed. Everything is working fine; however, I don't like the fact that I have a .pypirc file containing the credentials for the upload sitting in my repository.
Basically I'm uploading the artifact using:
- script: 'twine upload -r imglib --config-file .pypirc dist/imglib-*.tar.gz'
Is there another way to store the credentials, preferably not in a file that anyone could edit? I read something about storing credentials in the key vault, but I don't see how to change the pipeline (the YAML file) to do this.
Thanks in advance.
EDIT:
Shayki's answer seems to be the right way; however, I'm not able to replace the placeholder in the .pypirc file using a bash command. All I get is three asterisks when I print the content of .pypirc after the replacement. For the replacement I use:

    - script: 'sed -i "s/__password__/$PYPI_CRED_MAPPED/g" .pypirc'
      displayName: 'Setting PyPI credentials'
      env:
        PYPI_CRED_MAPPED: $(pypi_cred)

The content of .pypirc is then (displayed during the build task using cat .pypirc; is there an easier way to debug the build process?):
[distutils]
Index-servers =
pypi
imglib
[imglib]
Repository = https://pkgs.dev.azure.com/XXX/_packaging/imglib/pypi/upload
username = imglib
password = ***
Does anyone know what is happening there?
EDIT 2:
I also tried to use $env:PYPI_CRED_MAPPED, but in that case only the $env part is replaced (by nothing) and all I'm left with is :PYPI_CRED_MAPPED. Also, I looked at the docs and they use the variable directly (e.g. $PYPI_CRED_MAPPED; see the bottom of the page).
EDIT 3:
The three asterisks were just a placeholder: Azure DevOps masks secret variables in log output. It worked with $PYPI_CRED_MAPPED as mentioned in EDIT 2; the build process was failing for another reason. I also tried it with the PowerShell command provided in the answer and it worked as well. So thank you for your help.
You can store the password as a secret variable, put a placeholder in the .pypirc file, and add a script step to the pipeline that replaces the placeholder with the variable.
1) In the YAML editor click on the 3 dots near the Save/Run button at the top right, then click "Variables".
2) Add a new variable (pythonCred, for example) with the password and click on the lock icon to make it secret.
3) Go to your .pypirc file and replace the password with __password__.
4) In your pipeline add a PowerShell task to put the password back:

    - powershell: |
        (Get-Content path/to/pypirc) -replace "__password__", "$env:CredPython" | Set-Content -Path path/to/pypirc
      env:
        CredPython: $(pythonCred) # you must map the variable because it is a secret variable

You can also use Azure Key Vault in the same way: download the password with the Azure Key Vault task and then update the .pypirc file.
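If you would rather do the replacement in Python (already available on the agent in a Python build pipeline), a minimal sketch equivalent to the PowerShell step; the placeholder name and the mapped environment variable match the ones used above:

    import os

    # Substitute the placeholder in .pypirc with the secret that the
    # pipeline mapped into the environment.
    with open('.pypirc') as f:
        content = f.read()
    content = content.replace('__password__', os.environ['PYPI_CRED_MAPPED'])
    with open('.pypirc', 'w') as f:
        f.write(content)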
I am using RIDE (the Robot Framework IDE) and have imported the AllureReportLibrary library in my project.
Using Set Output Dir, I am creating a directory C:/AutomationLogs/Allure, and all the Allure properties and XML files are generated in that path.
Set Output Dir C:/AutomationLogs/
Then I use the "allure serve C:\AutomationLogs\Allure" command in a command prompt to try to generate the HTML report file, but it shows the error below:
"Could not read result C:\AutomationLogs\Allure\f56f4796-d30a-47f3-a988-d17f6c4e13ca-testsuite.xml: {} com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type ru.yandex.qatools.allure.model.SeverityLevel from String "None": value not one of declared Enum instance names: [trivial, blocker, minor, normal, critical]"
The XML file "f56f4796-d30a-47f3-a988-d17f6c4e13ca-testsuite.xml" was generated by the AllureReportLibrary.
Also, the index.html file generated by the command opens afterwards and shows an empty Allure report:
Allure Report unknown
unknown - unknown (Unknown) 0 test cases NaN%
I am using the below -
Allure version - 2.4.1
Ride version - RIDE 1.5.2.1 running on Python 2.7.12.
I am new to Robot Framework and Allure. Please let me know whether I have implemented it correctly and why I am facing the above error.
-Ryan M
I'm using version 1.1.1 of the Allure adaptor for Robot Framework, where the severity is picked from the test case tags and added as a label under the test-case element of the report.
However, it seems that Allure 2.6.0 also expects a valid value for the severity attribute of the test-case element.
In order to use Allure 2 with the current reports I have altered AllureListener.py to also set the severity on the test case:

    elif tag in SEVERITIES:
        test.severity = tag
        test.labels.append(TestLabel(
            name='severity',
            value=tag
        ))
If your output.xml has severity = None for any test case, then the allure-robotframework-adaptor will give the error that you have mentioned. Creating the TestCase() object with severity='' in the start_suitesetup method of AllureListener.py will do the trick:

    def start_suitesetup(self, name, attributes):
        ....
        ....
        test = TestCase(name=name,
                        description=description,
                        start=now(),
                        attachments=[],
                        labels=[],
                        parameters=[],
                        steps=[],
                        severity='')
How to create Allure reports in Robot Framework?
First, download the Allure command-line tool, unzip it, and add the path of its bin folder to your PATH environment variable.
Link: http://repo.maven.apache.org/maven2/io/qameta/allure/allure-commandline/2.8.0/allure-commandline-2.8.0.zip
Then pip install the modules below:
pip install allure-robotframework
pip install robotframework-allurereport
In the robot file, add the library under Settings, for example:
Library    AllureReportLibrary    D:\eclipse\RobotFramework\results
Then use the below command to run the robot code:
robot --listener allure_robotframework;D:\eclipse\RobotFramework\results Example.txt
Finally, generate the HTML file with:
allure generate D:\eclipse\RobotFramework\results
Note: use the same path that you used in the previous command to generate the HTML file.
Then open the report in Mozilla Firefox. It won't work in Chrome; I don't know exactly why.
Regards,
Vijay
I'm fairly new to Flask, GAE and the use of APIs. I'm trying to build a basic web app that can connect to one of Google's APIs.
My folder structure looks like this (I've kept it to the main files):
app-webemotions:
-app.yaml
-main.py
-lib
--sentimentanalysis.py
-static
--credential.json
Everything is working except providing the JSON file for the credentials. My understanding is that there are a couple of ways to do it:
1) Setting the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of my file in app.yaml
2) Loading the file through my script (sentimentanalysis.py)
Unfortunately, I haven't been able to make either of those work.
Option 1):
In app.yaml I have the line:
env_variables:
  GOOGLE_APPLICATION_CREDENTIALS: static/key/credentials.json
I then run my code through dev_appserver.py . and get the following error:
ApplicationDefaultCredentialsError: File static/key/credentials.json (pointed by GOOGLE_APPLICATION_CREDENTIALS environment variable) does not exist!
Option 2):
I have a line of code in my script sentimentanalysis.py:
scope = ['https://www.googleapis.com/auth/cloud-platform']
credentials = ServiceAccountCredentials.from_json_keyfile_name('/static/credentials.json', scope)
And when running the code I get the following error:
raise IOError(errno.EACCES, 'file not accessible', filename)
IOError: [Errno 13] file not accessible: '/static/credentials.json'
INFO 2016-08-06 04:10:51,678 module.py:788] default: "POST /Sentiment-analysis HTTP/1.1" 500 -
Question:
So it looks like regardless of the method I'm using, I'm not able to provide the right path to the JSON file.
My question is, first, whether either of the above options is the right one, and if so, what am I doing wrong? If they are not the right options, what would you recommend?
Apologies if this has already been asked; I've tried to find an answer for a few hours now and haven't been able to crack it...
Thank you!
If you are running on Google App Engine, then your code automatically has the credentials it needs. Do not set GOOGLE_APPLICATION_CREDENTIALS and do not call .from_json_keyfile_name. Instead, call:
credentials = GoogleCredentials.get_application_default()
As shown here:
https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/bigquery/api/getting_started.py
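A minimal sketch of how that call slots into building an API client; the 'language' service name is an assumption based on the sentiment-analysis use case, and the pattern is the same for other Google APIs:

    from googleapiclient.discovery import build
    from oauth2client.client import GoogleCredentials

    # On App Engine this picks up the app's built-in service account.
    credentials = GoogleCredentials.get_application_default()
    service = build('language', 'v1', credentials=credentials)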
If you are using cmd on Windows, set the environment variable with this command:
set GOOGLE_APPLICATION_CREDENTIALS=credentials.json
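Keep in mind that a relative value like credentials.json is resolved against the working directory of the process that reads it. A quick sanity check in Python (purely illustrative):

    import os

    path = os.environ.get('GOOGLE_APPLICATION_CREDENTIALS')
    print('GOOGLE_APPLICATION_CREDENTIALS =', path)
    print('file exists:', bool(path) and os.path.isfile(path))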
I'm using the Plone CMS and am having trouble with a Python Script. I get a NameError: "global name 'open' is not defined". When I put the code in a separate Python script it works fine, and the information is being passed in correctly because I can print the query. Code is below:
#Import a standard function, and get the HTML request and response objects.
from Products.PythonScripts.standard import html_quote
request = container.REQUEST
RESPONSE = request.RESPONSE
# Insert data that was passed from the form
query=request.query
#print query
f = open("blast_query.txt","w")
for i in query:
    f.write(i)
return printed
I also have a second question: can I tell Python to open a file in a certain directory? For example, if the script is in one location (e.g. the home folder) but I want it to open a file at home/some_directory/some_directory, can that be done?
Python Scripts in Plone are restricted and have no access to the filesystem, so the open call is not available. You'll have to use an External Method or a full Python module to get full filesystem access.
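A minimal sketch of the query-writing code as an External Method; the module layout (Extensions/blast.py), target path, and function name are illustrative:

    # Extensions/blast.py -- runs unrestricted, so open() is available.
    import os

    def write_query(self, query):
        # Write the query somewhere the Zope process can write to.
        path = os.path.join('/tmp', 'blast_query.txt')
        with open(path, 'w') as f:
            for i in query:
                f.write(i)
        return 'wrote %s' % path

Register it in the ZMI as an External Method (module blast, function write_query) and call it from the Python Script in place of the open() block.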