I use Python 2.5 and informixdb. I want to connect with the database, but what are the parameters for the informixdb.connect() method?
I have
Hostname
Port
Databasename
User
Password
But what is the right order? Or how is the DSN string built?
The official documentation does not really help me.
The documentation says I can use
informixdb.connect(dsn)
but it does not explain what the DSN string should look like: which arguments are needed, and in which order. Here is a link to the documentation.
I know Python 2.5 is very old, but the database does not support Python 3.x; I have tried it.
From the documentation at https://sourceforge.net/projects/informixdb/:
To do anything useful with InformixDB one must connect to a database. This is accomplished by calling informixdb.connect:
>>> import informixdb
>>> conn = informixdb.connect('db#daniel', user='me', password='something')
>>> conn
<_informixdb.Connection object at 0xb7d08e90>
informixdb.connect takes three arguments: A dsn which identifies the database and server to connect to, as recognized by ESQL's CONNECT statement (e.g. 'database#server', 'database', '#server') plus an optional user and a corresponding password.
If the dsn doesn't include a servername the value of the environment variable INFORMIXSERVER is used. When connecting without specifying the name of the database no database will be selected. This is useful for setting up a new database from within InformixDB.
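To connect the quoted documentation to the list of items in the question, here is a small sketch of the three DSN forms (the database and server names below are placeholders). Note that the hostname and port are not part of the DSN at all: the Informix client resolves the server name through the sqlhosts file, or through the INFORMIXSERVER environment variable when the DSN omits the server.

```python
# Placeholder names; substitute your own database and server.
database = "stores_demo"
server = "ol_informix1170"

dsn_full = "%s#%s" % (database, server)  # 'database#server'
dsn_db_only = database                   # server taken from INFORMIXSERVER
dsn_server_only = "#%s" % server         # no database selected

print(dsn_full)
# conn = informixdb.connect(dsn_full, user="me", password="secret")
```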
Why not use the new "IfxPy" OpenInformix module?
https://github.com/OpenInformix/IfxPy
It has support for both 2.x and 3.x versions of Python.
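Unlike informixdb, IfxPy takes a single keyword=value connection string rather than positional host/port arguments, so the question's hostname and port fit in directly. A sketch based on the IfxPy README; every value below is a placeholder:

```python
# Hypothetical connection parameters; adjust to your server.
params = {
    "SERVER": "ids0",        # Informix server name
    "DATABASE": "testdb",
    "HOST": "127.0.0.1",
    "SERVICE": "9088",       # port
    "UID": "informix",
    "PWD": "in4mix",
}
ConStr = "".join("%s=%s;" % (k, v) for k, v in params.items())
print(ConStr)
# conn = IfxPy.connect(ConStr, "", "")  # requires: pip install IfxPy
```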
I’m using Python 3.6 and Fabric 2.4. I’m using Fabric to SSH into a server and run some commands. I need to set an environment variable for the commands being run on the remote server. The documentation indicates that something like this should work:
from fabric import task
@task(hosts=["servername"])
def do_things(c):
    c.run("command_to_execute", env={"KEY": "VALUE"})
But that doesn’t work. Something like this should also be possible:
from fabric import task
@task(hosts=["servername"])
def do_things(c):
    c.config.run.env = {"KEY": "VALUE"}
    c.run("command_to_execute")
But that doesn’t work either. I feel like I’m missing something. Can anyone help?
I was able to do it by setting inline_ssh_env=True and then explicitly setting the env variable, e.g.:
with Connection(host=hostname, user=username, inline_ssh_env=True) as c:
    c.config.run.env = {"MY_VAR": "this worked"}
    c.run('echo $MY_VAR')
As stated on the site of Fabric:
The root cause of this is typically because the SSH server runs non-interactive commands via a very limited shell call: /path/to/shell -c "command" (for example, OpenSSH). Most shells, when run this way, are not considered to be either interactive or login shells; and this then impacts which startup files get loaded.
You can read more on this page: link
So what you are trying to do won't work, and the solution is to pass the environment variable you want to set explicitly:
from fabric import task
@task(hosts=["servername"])
def do_things(c):
    c.config.run.env = {"KEY": "VALUE"}
    c.run('echo export %s >> ~/.bashrc' % 'ENV_VAR=VALUE')
    c.run('source ~/.bashrc')
    c.run('echo $ENV_VAR')  # to verify it's set or not!
    c.run("command_to_execute")
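A local sketch of why the .bashrc step above is needed: each c.run() starts a fresh, non-login shell, so the variable must live in a startup file and be sourced before the command that needs it. Here a temp file stands in for ~/.bashrc, and a plain sh -c invocation stands in for the remote shell:

```python
import os
import subprocess
import tempfile

# Write the export into a startup file (stand-in for ~/.bashrc).
rc = tempfile.NamedTemporaryFile("w", suffix=".rc", delete=False)
rc.write("export ENV_VAR=VALUE\n")
rc.close()

# Source the file, then read the variable, within one shell invocation.
result = subprocess.run(
    ["sh", "-c", ". %s && echo $ENV_VAR" % rc.name],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # → VALUE
os.unlink(rc.name)
```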
You can try that:
@task
def qa(ctx):
    ctx.config.run.env['counter'] = 22
    ctx.config.run.env['conn'] = Connection('qa_host')

@task
def sign(ctx):
    print(ctx.config.run.env['counter'])
    conn = ctx.config.run.env['conn']
    conn.run('touch mike_was_here.txt')
And run:
fab2 qa sign
When creating the Connection object, try adding inline_ssh_env=True.
Quoting the documentation:
Whether to send environment variables “inline” as prefixes in front of command strings (export VARNAME=value && mycommand here), instead of trying to submit them through the SSH protocol itself (which is the default behavior). This is necessary if the remote server has a restricted AcceptEnv setting (which is the common default).
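A local sketch of what that inline prefixing amounts to (using the stdlib subprocess module in place of an actual SSH connection): rather than asking the SSH layer to transmit the variable, the export is prepended to the command string that the limited "shell -c" invocation runs.

```python
import subprocess

# The command the user wants to run, and the same command with the
# environment variable exported inline in front of it.
command = "echo $MY_VAR"
prefixed = 'export MY_VAR="this worked" && %s' % command

# sh -c "..." mimics the limited shell call an SSH server uses.
result = subprocess.run(["sh", "-c", prefixed], capture_output=True, text=True)
print(result.stdout.strip())  # → this worked
```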
According to that part of the official doc, the connect_kwargs attribute of the Connection object is intended to replace the env dict. I use it, and it works as expected.
I am using the MySQLdb package in Python to update my database. I have a simple update command as follows:
update_query = "update user_details set `address`='%s' where `id`='%s'"
cursor.execute(update_query, (address, id1))
print(cursor._last_executed)
Here is the command executed:
update user_details set `address`='35, Chikmagalur' where `id`='242069'
The program runs fine without errors. However, the database is not updated. The same command works when I run it as an SQL query in phpMyAdmin.
Any idea what the issue could be?
this is a duplicate of ...
SQL transactions need to be committed, either explicitly or implicitly.
Either issue a commit command explicitly:
cursor._get_db().commit()
Setting the connection to autocommit when opening it is also an option.
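The commit behavior can be demonstrated without a MySQL server using the stdlib sqlite3 module, which follows the same DB-API transaction semantics as MySQLdb (table and values below mirror the question but are dummy data):

```python
import sqlite3

# In-memory database standing in for the MySQL table from the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user_details (id INTEGER, address TEXT)")
conn.execute("INSERT INTO user_details VALUES (242069, 'old address')")
conn.commit()

cur = conn.cursor()
cur.execute("UPDATE user_details SET address = ? WHERE id = ?",
            ("35, Chikmagalur", 242069))
conn.commit()  # without this, the UPDATE is rolled back when conn closes

row = cur.execute(
    "SELECT address FROM user_details WHERE id = 242069").fetchone()
print(row[0])  # → 35, Chikmagalur
```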
I'm trying to connect to Google BigQuery through the BigQuery API, using Python.
I'm following this page here:
https://cloud.google.com/bigquery/bigquery-api-quickstart
My code is as follows:
import os
import argparse

from apiclient.discovery import build
from apiclient.errors import HttpError
from oauth2client.client import GoogleCredentials

GOOGLE_APPLICATION_CREDENTIALS = './Peepl-cb1dac99bdc0.json'

def main(project_id):
    # Grab the application's default credentials from the environment.
    credentials = GoogleCredentials.get_application_default()
    print(credentials)
    # Construct the service object for interacting with the BigQuery API.
    bigquery_service = build('bigquery', 'v2', credentials=credentials)

    try:
        query_request = bigquery_service.jobs()
        query_data = {
            'query': (
                'SELECT TOP(corpus, 10) as title, '
                'COUNT(*) as unique_words '
                'FROM [publicdata:samples.shakespeare];')
        }
        query_response = query_request.query(
            projectId=project_id,
            body=query_data).execute()

        print('Query Results:')
        for row in query_response['rows']:
            print('\t'.join(field['v'] for field in row['f']))
    except HttpError as err:
        print('Error: {}'.format(err.content))
        raise err

if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('project_id', help='Your Google Cloud Project ID.')
    args = parser.parse_args()
    main(args.project_id)
However, when I run this code through the terminal, I get the following error:
oauth2client.client.ApplicationDefaultCredentialsError: The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
As you can see in the code, I've tried to set the GOOGLE_APPLICATION_CREDENTIALS as per the link in the error. However, the error persists. Does anyone know what the issue is?
Thank you in advance.
First, thanks for the code; it proved to be very useful.
I would also suggest setting the environment variable directly in your code, so you don't have to set it in every environment you work in.
You can use the following code:
import os
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "path_to_your_.json_credential_file"
I found this useful when switching between different projects that require different credentials.
I'm not sure about BigQuery, but I'm using Google Datastore for saving. If you've installed the gcloud SDK on your Mac, you can try running this command:
gcloud auth application-default login
It's looking for the environment variable in your local UNIX (or other) environment, not a variable in your python script.
You'd set that by opening up your terminal or cygwin and doing one of the following:
export GOOGLE_APPLICATION_CREDENTIALS='/path/to/your/client_secret.json'
Type that into your terminal to set the variable for just this session
Or open up your .bashrc file (in UNIX, by typing nano ~/.bashrc) and add this line underneath the user-specific aliases header, if you see it:
GOOGLE_APPLICATION_CREDENTIALS="/full/path/to/your/client_secret.json"
Then reload it by typing source ~/.bashrc and confirm that it's set by trying echo $GOOGLE_APPLICATION_CREDENTIALS. If it returns the path, you're good.
Note: oauth2client is deprecated, instead of GoogleCredentials.get_application_default() you can use google.auth.default(). Install the package first with:
pip install google-auth
In your specific example, I see you know where the JSON file is located from your code. Instead of default credentials (from environment variables), you can use a service account directly with the google.oauth2.service_account module.
credentials = google.oauth2.service_account.Credentials.from_service_account_file(
    './Peepl-cb1dac99bdc0.json',
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
You can use this credentials object just as you are currently doing, by passing it to googleapiclient.discovery.build; or, if you are using the google-cloud-bigquery library, pass it to the google.cloud.bigquery.Client constructor.
Here is a C# solution:
System.Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", @"C:\apikey.json");
string Pathsave = System.Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS");
It's looking for the environment variable. But I was able to solve this problem on Windows platform by using Application Default Credentials.
Steps that I followed:
Installed Google SDK
Then execute the gcloud init steps to specify your default credentials and default project, which you can change as and when needed. The gcloud executable can be located in the bin directory where you chose to install the Google SDK.
After successfully providing the credentials, you can check the location C:\Users\"yourusername"\AppData\Roaming\gcloud\legacy_credentials\"youremail". You can find the credentials stored there in JSON format.
It helped me to resolve the error.
Apart from using GOOGLE_APPLICATION_CREDENTIALS (which is already described in a bunch of answers) there is one more way to set generated json credentials as a default service account:
gcloud auth activate-service-account --key-file=<path to your generated json file>
That will activate a default account (and set credentials according to the provided json file) without explicitly setting GOOGLE_APPLICATION_CREDENTIALS, and it will be still activated after re-login or reboot without modifying .bashrc.
The link provided in the error message, https://developers.google.com/identity/protocols/application-default-credentials, says to set the environment variable to point at the file that contains the JSON service credentials. It looks like you set a Python variable instead. Try setting your terminal's environment variable to point at the correct file.
An alternative would be to explicitly use some other credentials when you aren't running in a GCE container, like oauth2client.client.SignedJwtAssertionCredentials, and point it directly at your client secret so you don't have to indirect through an environment variable.
There is another workaround that I don't think was mentioned here yet. The google.oauth2.service_account.Credentials object offers the from_service_account_info method (see here: https://github.com/googleapis/google-auth-library-python/blob/main/google/oauth2/service_account.py).
So you can set whatever variables you want in your environment, read them in, and pass them to the function, something like this:
import os

from google.oauth2 import service_account
from googleapiclient import discovery

your_data = {
    "type": os.environ.get('YOUR_ENV_VAR'),
    "project_id": os.environ.get('YOUR_ENV_VAR'),
    "private_key_id": os.environ.get('YOUR_ENV_VAR'),
    # ... and so on with all the required Google variables ...
}

your_credentials = service_account.Credentials.from_service_account_info(your_data, scopes=your_scopes)
service = discovery.build(api_name, api_version, credentials=your_credentials)
I basically took all the data from my Google keyfile.json and stored it in the env, then did the above. That way you never need to keep your keyfile.json anywhere near your code or, worse, upload it somewhere public. And that's basically it. Good luck!
PS: I forgot to mention this, which might help someone running into the same issues as I did. While the above should work fine in development, in some production environments the \n will not be interpreted as a newline; instead it will remain inside the private key. Put all of the above into a try statement, and if you get the error 'no key could be detected', this is most likely the problem. In that case you need to replace all \\n with \n, similar to what Sumit Agrawal suggested but the other way round. That's because some environments automatically add a \ in front of a newline indicator such as \n, in order to keep it as it is, if that makes any sense. So you basically have to undo this.
You can simply do the following for one of the lines above:
"private_key": os.environ.get('YOUR_ENV_VAR').replace('\\n', '\n'),
But again, try printing them to the log file / console to see what they actually look like. If you have any \n in the string, you know you need to clean or convert them as explained. Good luck!
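A minimal demonstration of that cleanup, using a dummy value standing in for a private key whose newlines arrived escaped (not a real key):

```python
import os

# Dummy environment value: "\\n" in source is a literal backslash + n,
# which is how the key often arrives in production environments.
os.environ["PRIVATE_KEY"] = (
    "-----BEGIN PRIVATE KEY-----\\nMIIE...\\n-----END PRIVATE KEY-----\\n"
)

raw = os.environ.get("PRIVATE_KEY")
fixed = raw.replace('\\n', '\n')  # turn escaped sequences into real newlines

print('\\n' in fixed)  # → False (literal backslash-n sequences are gone)
print('\n' in fixed)   # → True (real newlines are present)
```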
If you would like to use different credential files without setting the environmental variable, you can use the following code:
from oauth2client import service_account
from apiclient.discovery import build
import json
client_credentials = json.load(open("<path to .json credentials>"))
credentials_token = service_account._JWTAccessCredentials.from_json_keyfile_dict(client_credentials)
bigquery_service = build('bigquery', 'v2', credentials=credentials_token)
query_request = bigquery_service.jobs()
query_data = {
    'query': (
        'SELECT TOP(corpus, 10) as title, '
        'COUNT(*) as unique_words '
        'FROM [publicdata:samples.shakespeare];')
}

query_response = query_request.query(
    projectId=project_id,
    body=query_data).execute()

print('Query Results:')
for row in query_response['rows']:
    print('\t'.join(field['v'] for field in row['f']))
Export the Google credentials JSON from the command line: export GOOGLE_APPLICATION_CREDENTIALS='/path/key.json'
I hope it will work fine.
You can create a client with service account credentials using from_service_account_json():
from google.cloud import bigquery
bigqueryClient = bigquery.Client.from_service_account_json('/path/to/keyfile.json')
If there is a case where you cannot provide the credentials in a file, set
GOOGLE_APPLICATION_CREDENTIALS='\path\key.json'
As the service account is JSON and contains double-quote characters:
Replace every double quote with \"
Wrap the complete JSON in double quotes
Replace every \n with \\n (on Linux) or \\\n (on Mac)
With the above changes to the service account, if you export it as a variable, it should be recorded correctly.
Try echo %variable_name% to confirm it looks good.
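The escaping described above is exactly what json.dumps produces, so the whole round trip can be sketched with dummy data (not a real service account; the variable name SA_JSON is a placeholder):

```python
import json
import os

# Dummy service-account data with a multi-line key.
sa_info = {
    "type": "service_account",
    "private_key": "-----BEGIN PRIVATE KEY-----\nMIIE...\n-----END PRIVATE KEY-----\n",
}

# json.dumps escapes the quotes and newlines for us.
os.environ["SA_JSON"] = json.dumps(sa_info)

# Reading it back restores the original structure, \n sequences included.
loaded = json.loads(os.environ["SA_JSON"])
print(loaded["type"])
```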
In your project folder, just type:
set GOOGLE_APPLICATION_CREDENTIALS='\path\key.json'
On Windows, update the 'environmental variables for your account'.
You will most likely already have a variable called: GOOGLE_APPLICATION_CREDENTIALS
Simply update the path to /path/to/liquid-optics-xxxxxxxx.json
(you'll most likely have that file somewhere on your machine). Then refresh your environment (cmd or whatever) to pick up the change.
I want to execute SQL queries from the Python script environment in MySQL Workbench. I looked at the MySQL Workbench documentation for the grt module and found the executeScript method but I can't seem to use it to make queries.
Executing this Python code:
import grt
querystring = "select * from Purchases WHERE PurchaseAmount > 600 and PurchaseAmount < 2500"
executeScript(querystring)
produces the following error message:
Uncaught exception while executing [filepath]runquery.py:
File "[filepath]runquery.py", line 10, in <module>
executeScript(querystring)
NameError: name 'executeScript' is not defined
I don't understand what virtual grt::ListRef executeScript ( const std::string & sql ) means, so I can't format my query properly; however, the error message seems to indicate that the executeScript method doesn't exist anyway. Most documentation I look at has examples of correctly formatted function calls, but I can't seem to find any for executeScript.
All I want to do is literally run my string as an SQL query within the MySQL Workbench Python scripting environment.
Thank you!
I am new to Python and SQL so please be patient. :)
To run the executeScript function you need to interact with an sqlEditor object.
For testing, do the following on MS Windows with the example databases:
Start MySQL Workbench
connect to a local database
select sakila from SCHEMAS
start the scripting shell with Tools->Scripting Shell or (Ctrl+F3)
add a new Python script (test.py)
save the script with the content below
run the script in the scripting shell
Script content:
import grt
result = grt.root.wb.sqlEditors[0].executeScript("select * from actor limit 10;")
for col in result[0].columns:
    print col.name
To find out how to reference objects in the script, it is very easy to use the Globals Tree panel's class browser: right-click on the object and choose "Copy Path for Python".
You can run something like the following command if you need to run your script from the command line on Windows:
"C:\Program Files\MySQL\MySQL Workbench 6.1 CE\MySQLWorkbench.exe" -query "Local instance MySQL56" -run-python "execfile('c:\Users\Root\AppData\Roaming\MySQL\Workbench\scripts\script.py')" -log-to-stderr -v
The (first) problem seems to be that you use a function called executeScript() which you haven't defined or imported from anywhere. If it is in the grt module (which I am not familiar with), you have to import it as follows:
from grt import executeScript
querystring = "select * from Purchases WHERE PurchaseAmount > 600 and PurchaseAmount < 2500"
executeScript(querystring)