Error running Google's Cloud Vision API Example (Face Detection) - python

I am trying to run the Face Detection Example in Google's Cloud Vision API. I am trying to run [faces.py here][1].
When I run the following:
faces.py demo-picture.jpg
below is the error I get:
ubuntu@ubuntu-VirtualBox:~/Documents/code/python-stuff/googleapis/cloudvisionapi/cloud-vision/python/face_detection$ python faces.py demo-image.jpg
Traceback (most recent call last):
File "faces.py", line 121, in <module>
main(args.input_image, args.output, args.max_results)
File "faces.py", line 98, in main
faces = detect_face(image, max_results)
File "faces.py", line 62, in detect_face
service = get_vision_service()
File "faces.py", line 35, in get_vision_service
credentials = GoogleCredentials.get_application_default()
File "/home/ubuntu/.local/lib/python2.7/site-packages/oauth2client/client.py", line 1398, in get_application_default
return GoogleCredentials._get_implicit_credentials()
File "/home/ubuntu/.local/lib/python2.7/site-packages/oauth2client/client.py", line 1388, in _get_implicit_credentials
raise ApplicationDefaultCredentialsError(ADC_HELP_MSG)
oauth2client.client.ApplicationDefaultCredentialsError: The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
ubuntu@ubuntu-VirtualBox:~/Documents/code/python-stuff/googleapis/cloudvisionapi/cloud-vision/python/face_detection$
[1]: https://github.com/GoogleCloudPlatform/cloud-vision/tree/master/python/face_detection
I guess my question is -- how do I do this:
Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials.

You need to download the service-account key; typically a JSON file.
If you have not yet created the credentials/obtained the key, follow these steps:
Go to your API manager;
Create credentials;
Choose "Service Account Key";
Select "Key Type" as JSON.
After this point, you should obtain a JSON file.
Once you obtain the key, open your ~/.bashrc and add the following line:
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/JSON
Then reload your shell:
exec bash
Now, re-run your faces.py.
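Before re-running, it can help to confirm the variable is actually visible to Python and points at a real file (a minimal sanity-check sketch; the helper name `check_adc` is mine, not part of the API):

```python
import os

def check_adc(env=None):
    """Return the ADC key path if GOOGLE_APPLICATION_CREDENTIALS is set
    and points to an existing file; raise RuntimeError otherwise."""
    env = os.environ if env is None else env
    path = env.get("GOOGLE_APPLICATION_CREDENTIALS")
    if not path:
        raise RuntimeError("GOOGLE_APPLICATION_CREDENTIALS is not set")
    if not os.path.isfile(path):
        raise RuntimeError("key file not found: %s" % path)
    return path

# check_adc()  # raises with a clear message if the export did not take effect
```

If this raises, the export either was not added to ~/.bashrc or the shell was not reloaded.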

Related

FileNotFoundError: [Errno 2] in 'service_account.py' for json-credential file when run from CMD-prompt

I have placed the credential json-file in the same directory as the python source code file (py3.10) and when I run my code from within Visual Code (run and debug) it works fine.
However, when I run the same code from the CMD-prompt (as admin in Win10) I get the following error:
C:\WINDOWS\system32>"C:\Users\USER1\AppData\Local\Programs\Python\Python310\python.exe" "C:\Users\USER1\Documents\PythonScripts\Data from BT connected Soehnle scale.py"
Traceback (most recent call last):
File "C:\Users\USER1\Documents\PythonScripts\Data from BT connected Soehnle scale.py", line 7, in <module>
creds = ServiceAccountCredentials.from_json_keyfile_name('mydata-332216-96035fc855db.json', scope)
File "C:\Users\USER1\AppData\Local\Programs\Python\Python310\lib\site-packages\oauth2client\service_account.py", line 219, in from_json_keyfile_name
with open(filename, 'r') as file_obj:
FileNotFoundError: [Errno 2] No such file or directory: 'mydata-332216-96035fc855db.json'
I tried placing the JSON in:
C:\Users\USER1\AppData\Local\Programs\Python\Python310\lib\site-packages\oauth2client\
C:\Users\USER1\AppData\Local\Programs\Python\Python310\
but it (obviously) did not help.
What is going on?
When using gspread you don't need to handle the service account auth yourself.
You have helper functions for that: gspread.service_account().
From there you simply need to place your service account file next to your code and give the relative path as follows:
client = gspread.service_account("./my-sc.json")
If you look at the gspread documentation, you'll notice that you can also place the file in the default config directory of your computer and gspread will find it.
By default:
%APPDATA%\gspread on Windows
~/.config/gspread everywhere
So once you placed your service account credentials file in the folder you can simply authenticate using:
gspread.service_account()
And it will find your creds in the above folder.
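Independent of gspread, the traceback itself shows why the relative name fails: the script is launched from C:\WINDOWS\system32, so open('mydata-332216-96035fc855db.json') is resolved against that directory rather than the script's folder (Visual Studio Code, by contrast, runs with the workspace as the working directory). A minimal sketch (the helper name is mine) that resolves the file next to the script regardless of the working directory:

```python
import os

def key_path(filename, script_file=__file__):
    """Resolve filename relative to the directory containing the script,
    so the current working directory no longer matters."""
    return os.path.join(os.path.dirname(os.path.abspath(script_file)), filename)

# creds = ServiceAccountCredentials.from_json_keyfile_name(
#     key_path('mydata-332216-96035fc855db.json'), scope)
```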

failed to use TLS with Couchbase Python SDK 3.0.x

I'm trying to use Couchbase Python SDK 3.0.x to connect to a cluster. I followed the document (https://docs.couchbase.com/python-sdk/3.0/howtos/managing-connections.html#ssl) to create the certificate file and used it with the following code:
cluster = Cluster("couchbase://10.82.xxx.xxx:18091", ClusterOptions(PasswordAuthenticator(DEST_USR, DEST_PW, cert_path="./cert.crt")))
However, when I run it, it gives the following:
$ python3 mycode.py
creating couchbase instance seriesMgmt#10.82.xxx.xxx:18091
Traceback (most recent call last):
File "scheduleCache.py", line 29, in <module>
cluster = Cluster("couchbase://"+DEST_CB_IP, ClusterOptions(PasswordAuthenticator(DEST_USR, DEST_PW, cert_path="./cert.crt")))
File "/usr/local/lib/python3.7/site-packages/couchbase/cluster.py", line 492, in __init__
super(Cluster, self).__init__(connection_string=str(self.connstr), _conntype=_LCB.LCB_TYPE_CLUSTER, **self._clusteropts)
File "/usr/local/lib/python3.7/site-packages/couchbase_core/client.py", line 141, in __init__
super(Client, self).__init__(*args, **kwargs)
couchbase.exceptions.InvalidArgumentException: <Bad/insufficient arguments provided, inner_cause='certpath' is an invalid keyword argument for this function, C Source=(src/bucket.c,1047)>
I believe it's not about the correctness of my cert file yet - somehow the SDK just doesn't want to take the parameter cert_path. Removing the parameter name doesn't help either:
cluster = Cluster("couchbase://10.82.xxx.xxx:18091", ClusterOptions(PasswordAuthenticator(DEST_USR, DEST_PW, "./cert.crt")))
BTW I can log in to https://10.82.xxx.xxx:18091 via a browser without any problem.
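No answer was recorded for this question. One plausible direction, sketched under the assumption that the 3.0.x docs' couchbases://…?certpath=… form applies (untested against a live cluster): cert_path is not accepted as a PasswordAuthenticator keyword in 3.0.x, and TLS is instead requested through the connection string itself. Note also that 18091 is the HTTPS management UI port, which is why the browser login works while the SDK connection does not; the connection string normally omits it.

```python
def tls_connstr(host, cert_path):
    # couchbases:// (note the trailing 's') requests TLS; the certificate
    # is passed as the certpath connection-string option. No port is given:
    # 18091 is the HTTPS admin UI, not the client data port.
    return "couchbases://{0}?certpath={1}".format(host, cert_path)

# cluster = Cluster(tls_connstr("10.82.xxx.xxx", "./cert.crt"),
#                   ClusterOptions(PasswordAuthenticator(DEST_USR, DEST_PW)))
```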

Logging into GCP SQL: How to Ensure PromptSession Is Imported or Otherwise Resolve

I am trying to use the Cloud Shell to update some user permissions. I am logging in using gcloud sql connect my-instance --user=root
gcloud sql connect my-instance
Whitelisting your IP for incoming connection for 5 minutes...done.
Connecting to database with SQL user [sqlserver].********************************************************************************
Python command will soon point to Python v3.7.3.
Python 2 will be sunsetting on January 1, 2020.
See https://www.python.org/doc/sunset-python-2/
Until then, you can continue using Python 2 at /usr/bin/python2, but soon
/usr/bin/python symlink will point to /usr/local/bin/python3.
To suppress this warning, create an empty ~/.cloudshell/no-python-warning file.
The command will automatically proceed in seconds or on any key.
********************************************************************************
> Password:
Traceback (most recent call last):
File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/usr/local/lib/python2.7/dist-packages/mssqlcli/main.py", line 117, in <module>
main()
File "/usr/local/lib/python2.7/dist-packages/mssqlcli/main.py", line 110, in main
run_cli_with(mssqlcli_options)
File "/usr/local/lib/python2.7/dist-packages/mssqlcli/main.py", line 43, in run_cli_with
from mssqlcli.mssql_cli import MssqlCli
File "/usr/local/lib/python2.7/dist-packages/mssqlcli/mssql_cli.py", line 18, in <module>
from prompt_toolkit.shortcuts import PromptSession, CompleteStyle
ImportError: cannot import name PromptSession
A) I have made the root user's password so insecure and easy there is no way I am mistyping it.
B) It is the third of January, so I really don't know what this Python version error is on about. I made the file but FYI ~/.cloudshell did not exist so I had to make it first. Even so, it just suppresses the version warning, the main error persists when I try to log in.
The documentation acknowledges there are a couple of other login methods using gcloud beta sql connect, but that gets me another error:
2020/01/04 18:38:41 Rlimits for file descriptors set to {&{8500 1048576}}
2020/01/04 18:38:41 invalid json file "/tmp/tmp.s38C662KKr/legacy_credentials/me@gmail.com/adc.json": open /tmp/tmp.s38C662KKr/legacy_credentials/me@gmail.com/adc.json: no such file or directory
ERROR: (gcloud.beta.sql.connect) Failed to start the Cloud SQL Proxy.
Same for alpha.
This is the first thing I have typed into Cloud Shell, so I can't imagine what could have broken PromptSession.
How can I resolve this error and log into SQL Server using Cloud Shell?
There is most likely an issue when attempting to connect from Cloud Shell (I managed to connect from a Compute Engine instance with the same command), possibly related to the Python runtime / environment variables. It has been reported here. Engineering is aware and is looking into it.

How to get p12 file working on python app engine

I'm having trouble getting identity-toolkit fully working with the Python App Engine Sandbox. The sample provided is for a non-GAE-Sandbox project.
In the sample project, gitkit-server-config.json is read from file using os.path, but this is not supported in the GAE Sandbox. To get around this I am creating a GitkitClient directly using the constructor:
gitkit_instance = gitkitclient.GitkitClient(
    client_id="123456opg.apps.googleusercontent.com",
    service_account_email="my-project@appspot.gserviceaccount.com",
    service_account_key="/path/to/my-p12file.p12",
    widget_url="http://localhost:8080/callback",
    http=None,
    project_id="my-project")
Is this the correct way to create the GitkitClient?
The issue now is when I try to do a password reset when running locally using dev_appserver.py I get the following stack trace:
File "dashboard.py", line 89, in post
oobResult = gitkit_instance.GetOobResult(self.request.POST,self.request.remote_addr)
File "identitytoolkit/gitkitclient.py", line 366, in GetOobResult
param['action'])
File "identitytoolkit/gitkitclient.py", line 435, in _BuildOobLink
code = self.rpc_helper.GetOobCode(param)
File "identitytoolkit/rpchelper.py", line 104, in GetOobCode
response = self._InvokeGitkitApi('getOobConfirmationCode', request)
File "identitytoolkit/rpchelper.py", line 210, in _InvokeGitkitApi
access_token = self._GetAccessToken()
File "identitytoolkit/rpchelper.py", line 231, in _GetAccessToken
'assertion': self._GenerateAssertion(),
File "identitytoolkit/rpchelper.py", line 259, in _GenerateAssertion
crypt.Signer.from_string(self.service_account_key),
File "oauth2client/_pure_python_crypt.py", line 183, in from_string
raise ValueError('No key could be detected.')
ValueError: No key could be detected.
I'm assuming this is a problem with the .p12 file? I double-checked service_account_key="/path/to/my-p12file.p12" and the file exists. What am I missing here?
FYI to others working on this in the future:
I could not get this working in Python. The documentation doesn't make it clear how to get this working on App Engine. In addition, dependency issues with PyCrypto made this a gcc and dependency nightmare.
I was, however, able to get this working in Go; there is a semi-working example online that will work with some modifications highlighted in the issues and pull-request pages. Good luck.
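For anyone who still wants to try the Python route: the ValueError comes from oauth2client's pure-Python signer, which can parse PEM keys but not PKCS#12. Assuming the .p12 was downloaded from the Google console (which uses the default password notasecret), one possible workaround is converting it to PEM with openssl; this sketch wraps the conversion in a small function:

```shell
# Convert a Google-issued .p12 service-account key to PEM; the
# pure-Python signer in oauth2client reads PEM but not PKCS#12.
# "notasecret" is the default password for downloaded .p12 keys.
p12_to_pem() {
    openssl pkcs12 -in "$1" -passin pass:notasecret -nodes -nocerts -out "$2"
}

# p12_to_pem my-p12file.p12 my-key.pem
```

The resulting my-key.pem can then be passed as service_account_key instead of the .p12 file.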

Unable to run access EC2 server via cron job

I am trying to get the directory structure for an FTP user remotely. I am doing this from a Python script using the [spur][1] module, and I am calling this script from a cron job. But I am getting the following error:
Traceback (most recent call last):
File "/mnt/voylla-staging/releases/20140717193920/voylla_scripts/snapdeal/GetOrders/getOrders.py", line 54, in <module>
feeds = getFeeds()
File "/mnt/voylla-staging/releases/20140717193920/voylla_scripts/snapdeal/GetOrders/getOrders.py", line 26, in getFeeds
result = shell.run(["ls", FEED_LOCATION])
File "/usr/local/lib/python2.7/dist-packages/spur/ssh.py", line 73, in run
return self.spawn(*args, **kwargs).wait_for_result()
File "/usr/local/lib/python2.7/dist-packages/spur/ssh.py", line 83, in spawn
channel = self._get_ssh_transport().open_session()
File "/usr/local/lib/python2.7/dist-packages/spur/ssh.py", line 190, in _get_ssh_transport
raise self._connection_error(error)
spur.ssh.ConnectionError: Error creating SSH connection
Original error: Authentication failed.
If I run the script manually, without cron, it runs perfectly fine!
Can someone please help?
Thanks
Relevant Code:
FTP_SERVER = "abc.example.com"
FTP_USER = "root"
FEED_LOCATION = "/home/xyz/abc"
PROCESSED_FEED_LOCATION = "/home/xyz/def"
PREFIX = "alpha"

def getFeeds():  # returns the list of feeds in FEED_LOCATION
    shell = spur.SshShell(hostname=FTP_SERVER, username=FTP_USER)
    with shell:
        result = shell.run(["ls", FEED_LOCATION])
    feeds = result.output.decode().split("\n")
    return feeds

feeds = getFeeds()
You have to specify the path of your private key file with the private_key_file option. Here is the example from the documentation.
# Use a private key
spur.SshShell(
    hostname="localhost",
    username="bob",
    private_key_file="path/to/private.key"
)
