I'm working on a web application with Python and Google App Engine.
I tried to set the default URLFetch deadline globally as suggested in a previous thread:
https://stackoverflow.com/a/14698687/2653179
urlfetch.set_default_fetch_deadline(45)
However, it doesn't work: when I print its value in one of the handlers, urlfetch.get_default_fetch_deadline() returns None.
Here is main.py:
from google.appengine.api import users
import webapp2
import jinja2
import random
import string
import hashlib
import CQutils
import time
import os
import httpRequests
import logging
from google.appengine.api import urlfetch
urlfetch.set_default_fetch_deadline(45)
...
class Del(webapp2.RequestHandler):
    def get(self):
        id = self.request.get('id')
        ext = self.request.get('ext')
        user_id = httpRequests.advance(id, ext)
        d2 = urlfetch.get_default_fetch_deadline()
        logging.debug("value of deadline = %s", d2)
This is what prints in the log console:
DEBUG 2013-09-05 07:38:21,654 main.py:427] value of deadline = None
The function being called in httpRequests.py:
import urllib
from google.appengine.api import urlfetch

def advance(id, ext=None):
    url = "http://localhost:8080/api/" + id + "/advance"
    if ext is None:
        ext = ""
    params = urllib.urlencode({'ext': ext})
    result = urlfetch.fetch(url=url,
                            payload=params,
                            method=urlfetch.POST,
                            headers={'Content-Type': 'application/x-www-form-urlencoded'})
    if result.status_code == 200:
        return result.content
I know this is an old question, but I recently ran into the same issue.
The setting is stored in a thread-local, so if your application is threadsafe and a request is handled on a different thread than the one where you set the default deadline, the setting is lost. For me, the solution was to set the deadline at the start of every request as part of the middleware chain, as sketched below.
This is not documented; I had to look through the source to figure it out.
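Here is a minimal sketch of that approach, assuming a webapp2/WSGI application; the route list and the 45-second value are placeholders:
from google.appengine.api import urlfetch
import webapp2

def urlfetch_deadline_middleware(app, deadline=45):
    # WSGI wrapper: reset the thread-local default at the start of every request
    def middleware(environ, start_response):
        urlfetch.set_default_fetch_deadline(deadline)
        return app(environ, start_response)
    return middleware

# wrap whatever WSGI application you already have
application = urlfetch_deadline_middleware(
    webapp2.WSGIApplication([('/del', Del)]), deadline=45)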
I want to delete snapshots that are more than 10 days old in GCP using Python. I tried the program below using a filter expression, but unfortunately I ran into errors.
from datetime import datetime
from googleapiclient import discovery
import google.oauth2.credentials
from oauth2client.service_account import ServiceAccountCredentials
import sys
def get_disks(project, zone):
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        r"D:\Users\ganeshb\Desktop\Json\auth.json",
        scopes='https://www.googleapis.com/auth/compute')
    service = discovery.build('compute', 'v1', credentials=credentials)
    request = service.snapshots().list(project='xxxx',
                                       FILTER="creationTimestamp<'2021-05-31'")
    response = request.execute()
    print(response)

output = get_disks("xxxxxxxx", "europe-west1-b")
Your problem is a known Google Cloud bug.
Please read these issue trackers: 132365111 and 132676194
Solution:
Remove the filter statement and process the returned results:
from datetime import datetime
from dateutil import parser

request = service.snapshots().list(project=project)
response = request.execute()

# Watch for timezone issues here!
filter_date = '2021-05-31'
d1 = parser.parse(filter_date)

for item in response['items']:
    d2 = datetime.fromisoformat(item['creationTimestamp'])
    if d2.timestamp() < d1.timestamp():
        # Process the result here. This is a print statement stub.
        print("{} {}".format(item['name'], item['creationTimestamp']))
APIError(code=-2015): Invalid API-key, IP, or permissions for action
I keep getting the above error and I am not sure what the issue is.
I am able to call client.get_all_tickers() with no problem, but when I try to place an order or access user data (both of which require a signature), I get the error:
APIError(code=-2015): Invalid API-key, IP, or permissions for action
I think the issue has something to do with the signature. I checked that the relevant permissions are enabled, and I also tried creating a new API key, but I still get the same error.
NOTE: I am using binance.us, not binance.com, because I am located in the US and cannot make an account on binance.com.
Another idea I had was to use a VPN that places me in England so I can make an account through binance.com; maybe that will work.
import time
import datetime
import json
from time import sleep
from binance.client import Client
from binance.enums import *
import sys
import requests, json, time, hashlib
import urllib3
import logging
from urllib3 import PoolManager
from binance.exceptions import BinanceAPIException, BinanceWithdrawException
r = requests.get('https://www.binance.us/en/home')
client = Client(API_key, Secret_key, tld="us")
prices = client.get_all_tickers()

# Helper to find the index of a symbol in the ticker list
def crypto_location(sym):
    count = 0
    for i in prices:
        count += 1
        ticker = i.get('symbol')
        if ticker == sym:
            val = i.get('price')
            count = count - 1
    return count

bitcoin_location = crypto_location('BTCUSDT')
ethereum_location = crypto_location('ETHUSDT')
stable_coin_location = crypto_location('BUSDUSDT')
bitcoin_as_BUSD_location = crypto_location('BTCBUSD')

#%% Where to quickly get bitcoin price
t_min = time.localtime().tm_min
prices = client.get_all_tickers()
bitcoin_price = prices[bitcoin_location].get('price')
print(bitcoin_price)
ethereum_price = prices[ethereum_location].get('price')
print(ethereum_price)
stable_coin_price = prices[stable_coin_location].get('price')
print(stable_coin_price)
bitcoin_as_BUSD = prices[bitcoin_as_BUSD_location].get('price')
print(bitcoin_as_BUSD)

client.session.headers.update({'X-MBX-APIKEY': API_key})
client.get_account()
The error occurs at client.get_account().
I had the same problem: Binance API keys without an IP restriction expire every 90 days. I restricted the API key to my IP address and it works!
In any case, you'll find everything documented here:
https://python-binance.readthedocs.io/en/latest/index.html
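For debugging, it also helps to catch BinanceAPIException and print the code the server returns. A minimal sketch, assuming the same API_key and Secret_key variables as above:
from binance.client import Client
from binance.exceptions import BinanceAPIException

client = Client(API_key, Secret_key, tld="us")
try:
    account = client.get_account()
    print(account['balances'][:5])
except BinanceAPIException as e:
    # -2015 usually means a key/secret mismatch, a missing permission,
    # an IP restriction on the key, or the wrong tld for your account
    print(e.status_code, e.code, e.message)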
I'm using Google Cloud Composer as my Airflow. When I try to use Jinja in my HQL code, it is not translated correctly.
I know that the HiveOperator has a Jinja translator, as I'm used to it, but the DataProcHiveOperator doesn't.
I've tried to use HiveConf directly in my HQL files, but when setting those values in my partition (i.e. INSERT INTO TABLE abc PARTITION (ds = ${hiveconf:ds})), it doesn't work.
I have also added the following to my HQL file:
SET ds=to_date(current_timestamp());
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
But it didn't work, as Hive turns the formula above into a STRING.
So my idea was to combine both operators to get the Jinja translator working, but when I do that I get the following error: ERROR - submit() takes from 3 to 4 positional arguments but 5 were given.
I'm not very familiar with Python coding, so any help would be great; see the code below for the operator I'm trying to build.
Header of the Python File (please note that the file contains other Operators not mentioned in this question):
import ntpath
import os
import re
import time
import uuid
from datetime import timedelta
from airflow.contrib.hooks.gcp_dataproc_hook import DataProcHook
from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook
from airflow.exceptions import AirflowException
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults
from airflow.version import version
from googleapiclient.errors import HttpError
from airflow.utils import timezone
from airflow.utils.operator_helpers import context_to_airflow_vars
Modified DataProcHiveOperator:
class DataProcHiveOperator(BaseOperator):
    template_fields = ['query', 'variables', 'job_name', 'cluster_name', 'dataproc_jars']
    template_ext = ('.q',)
    ui_color = '#0273d4'

    @apply_defaults
    def __init__(
            self,
            query=None,
            query_uri=None,
            hiveconfs=None,
            hiveconf_jinja_translate=False,
            variables=None,
            job_name='{{task.task_id}}_{{ds_nodash}}',
            cluster_name='cluster-1',
            dataproc_hive_properties=None,
            dataproc_hive_jars=None,
            gcp_conn_id='google_cloud_default',
            delegate_to=None,
            region='global',
            job_error_states=['ERROR'],
            *args,
            **kwargs):
        super(DataProcHiveOperator, self).__init__(*args, **kwargs)
        self.gcp_conn_id = gcp_conn_id
        self.delegate_to = delegate_to
        self.query = query
        self.query_uri = query_uri
        self.hiveconfs = hiveconfs or {}
        self.hiveconf_jinja_translate = hiveconf_jinja_translate
        self.variables = variables
        self.job_name = job_name
        self.cluster_name = cluster_name
        self.dataproc_properties = dataproc_hive_properties
        self.dataproc_jars = dataproc_hive_jars
        self.region = region
        self.job_error_states = job_error_states

    def prepare_template(self):
        if self.hiveconf_jinja_translate:
            self.query_uri = re.sub(
                r"(\$\{(hiveconf:)?([ a-zA-Z0-9_]*)\})", r"{{ \g<3> }}", self.query_uri)

    def execute(self, context):
        hook = DataProcHook(gcp_conn_id=self.gcp_conn_id,
                            delegate_to=self.delegate_to)
        job = hook.create_job_template(self.task_id, self.cluster_name, "hiveJob",
                                       self.dataproc_properties)

        if self.query is None:
            job.add_query_uri(self.query_uri)
        else:
            job.add_query(self.query)

        if self.hiveconf_jinja_translate:
            self.hiveconfs = context_to_airflow_vars(context)
        else:
            self.hiveconfs.update(context_to_airflow_vars(context))

        job.add_variables(self.variables)
        job.add_jar_file_uris(self.dataproc_jars)
        job.set_job_name(self.job_name)

        job_to_submit = job.build()
        self.dataproc_job_id = job_to_submit["job"]["reference"]["jobId"]

        hook.submit(hook.project_id, job_to_submit, self.region, self.job_error_states)
I would like to be able to use Jinja templating inside my HQL code to automate partitioning in my data pipeline.
P.S.: I'll use the Jinja templating mostly for the partition datestamp.
Does anyone know what the error message means and how to solve it?
ERROR - submit() takes from 3 to 4 positional arguments but 5 were given
Thank you!
It is because of the 5th argument, job_error_states, which exists only on master and not in the current stable release (1.10.1).
Source code for 1.10.1: https://github.com/apache/incubator-airflow/blob/76a5fc4d2eb3c214ca25406f03b4a0c5d7250f71/airflow/contrib/hooks/gcp_dataproc_hook.py#L219
So remove that parameter and it should work.
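With that argument removed, the last line of execute() in the operator above becomes:
hook.submit(hook.project_id, job_to_submit, self.region)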
What I'm trying to do is upload a picture to WordPress using the wp.uploadFile XML-RPC method.
To do this in PHP, there is an example here: https://stackoverflow.com/a/8910496/1212382
I'm trying to do the same thing in Python, but I don't know how.
Anyone any ideas?
OK, the answer lies in the xmlrpclib module.
To send base64-encoded bits to WordPress from Python, you need to use xmlrpclib's Binary wrapper like so:
base64bits = xmlrpclib.Binary(file_content)
Then you just pass the base64bits variable as the 'bits' parameter in your wp.uploadFile XML-RPC request.
To be a little more exact, here's the complete Python code for how this can be done:
import xmlrpclib
import urllib2
from datetime import date
import time

def get_url_content(url):
    # download the file and return its raw bytes
    try:
        content = urllib2.urlopen(url)
        return content.read()
    except:
        print 'error! NOOOOOO!!!'

file_url = 'http://the path to your picture'

# guess the MIME type from the file extension
extension = file_url.split(".")[-1]
if extension == 'jpg':
    xfileType = 'image/jpeg'
elif extension == 'png':
    xfileType = 'image/png'
elif extension == 'bmp':
    xfileType = 'image/bmp'

file = get_url_content(file_url)
file = xmlrpclib.Binary(file)

server = xmlrpclib.Server('http://website.com/xmlrpc.php')
filename = str(date.today()) + str(time.strftime('%H:%M:%S'))

mediarray = {'name': filename + '.' + extension,
             'type': xfileType,
             'bits': file,
             'overwrite': 'false'}

xarr = ['1', 'USERHERE', 'PASSWORDHERE', mediarray]
# unpack the list so blog id, username, password and data are sent
# as four separate XML-RPC parameters, as wp.uploadFile expects
result = server.wp.uploadFile(*xarr)
print result
I like to use IPython's zope profile to inspect my Plone instance, but a few annoying permissions differences come up compared to inserting a breakpoint and hitting it with the admin user.
For example, I would like to iterate over the content objects in an unpublished testing folder. This query will return no results in the shell, but works from a breakpoint.
$ bin/instance shell
$ ipython --profile=zope
from Products.CMFPlone.utils import getToolByName
catalog = getToolByName(context, 'portal_catalog')
catalog({'path':'Plone/testing'})
Can I authenticate as admin or otherwise rejigger the permissions to fully manipulate my site from ipython?
Here's the (very dirty) code I use to manage my Plone app from the debug shell. It may require some updates depending on your versions of Zope and Plone.
from sys import stdin, stdout, exit
import base64
from thread import get_ident

from ZPublisher.HTTPRequest import HTTPRequest
from ZPublisher.HTTPResponse import HTTPResponse
from ZPublisher.BaseRequest import RequestContainer
from ZPublisher import Publish
from AccessControl import ClassSecurityInfo, getSecurityManager
from AccessControl.SecurityManagement import newSecurityManager
from AccessControl.User import UnrestrictedUser

def loginAsUnrestrictedUser():
    """Example of use:
    old_user = loginAsUnrestrictedUser()
    # Manager stuff
    loginAsUser(old_user)
    """
    current_user = getSecurityManager().getUser()
    newSecurityManager(None, UnrestrictedUser('manager', '', ['Manager'], []))
    return current_user

def loginAsUser(user):
    newSecurityManager(None, user)

def makerequest(app, stdout=stdout, query_string=None, user_pass=None):
    """Make a request suitable for CMF sites & Plone
    - user_pass = "user:pass"
    """
    # copied from Testing.makerequest
    resp = HTTPResponse(stdout=stdout)
    env = {}
    env['SERVER_NAME'] = 'lxtools.makerequest.fr'
    env['SERVER_PORT'] = '80'
    env['REQUEST_METHOD'] = 'GET'
    env['REMOTE_HOST'] = 'a.distant.host'
    env['REMOTE_ADDR'] = '77.77.77.77'
    env['HTTP_HOST'] = '127.0.0.1'
    env['HTTP_USER_AGENT'] = 'LxToolsUserAgent/1.0'
    env['HTTP_ACCEPT'] = 'image/gif, image/x-xbitmap, image/jpeg, */* '
    if user_pass:
        env['HTTP_AUTHORIZATION'] = "Basic %s" % base64.encodestring(user_pass)
    if query_string:
        p_q = query_string.split('?')
        if len(p_q) == 1:
            env['PATH_INFO'] = p_q[0]
        elif len(p_q) == 2:
            (env['PATH_INFO'], env['QUERY_STRING']) = p_q
        else:
            raise TypeError, ''
    req = HTTPRequest(stdin, env, resp)
    req['URL1'] = req['URL']  # fix for CMFQuickInstaller
    #
    # copied/hacked from the Localizer __init__ patches:
    # first put the needed values in the request
    req['HTTP_ACCEPT_CHARSET'] = 'latin-9'
    #req.other['AcceptCharset'] = AcceptCharset(req['HTTP_ACCEPT_CHARSET'])
    req['HTTP_ACCEPT_LANGUAGE'] = 'fr'
    #accept_language = AcceptLanguage(req['HTTP_ACCEPT_LANGUAGE'])
    #req.other['AcceptLanguage'] = accept_language
    # XXX For backwards compatibility
    #req.other['USER_PREF_LANGUAGES'] = accept_language
    #req.other['AcceptLanguage'] = accept_language
    #
    # Plone stuff
    #req['plone_skin'] = 'Plone Default'
    #
    # then store the request in Publish._requests with the thread id
    id = get_ident()
    if hasattr(Publish, '_requests'):
        # we do not have _requests inside ZopeTestCase
        Publish._requests[id] = req
    # add a brainless session container
    req['SESSION'] = {}
    #
    # ok, let's wrap
    return app.__of__(RequestContainer(REQUEST=req))

def debug_init(app):
    loginAsUnrestrictedUser()
    app = makerequest(app)
    return app
This lives in a wshelpers Zope product. Once the debug shell is launched, it's just a matter of:
>>> from Products.wshelpers import wsdebug
>>> app = wsdebug.debug_init(app)
>>> # now you're logged in as admin
Just use catalog.search({'path':'Plone/testing'}). It performs the same query as catalog() but does not filter the results based on the current user's permissions.
IPython's zope profile does provide a method utils.su('username') to change the current user, but it does not recognize the admin user (defined in /acl_users instead of /Plone/acl_users), and after calling it, subsequent calls to catalog() fail with AttributeError: 'module' object has no attribute 'checkPermission'.
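A minimal sketch of the unrestricted query in the same shell session, assuming the context variable from the question:
from Products.CMFPlone.utils import getToolByName

catalog = getToolByName(context, 'portal_catalog')
# ZCatalog.search bypasses the permission filtering that catalog() applies
brains = catalog.search({'path': 'Plone/testing'})
for brain in brains:
    print brain.getPath()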