I've recently been having an issue with some values returned from item.sender - attached is a screen dump.
As you can see in the lower portion of the graphic, it's not the usual form returned by item.sender, which is usually of the form:
Mailbox(name='ReCircle Recycling', email_address='aldoushicks@recirclerecycling.com', routing_type='SMTP', mailbox_type='OneOff')
Has anyone else seen this?
How do you deal with it?
i.e. even though I am using a try/except clause, this result still causes my IDE to freeze.
I re-ran the script today using exactly the same date filter and it didn't happen. So I’m forced to ask, what did happen? Why is it not occurring again?
It's weird behaviour. It could jam up a script in the future, so I'm wondering how to prevent it.
Code:
from collections import defaultdict
from datetime import datetime
import logging
from exchangelib import DELEGATE, Account, Credentials, \
    EWSDateTime, EWSTimeZone, Configuration, Message
from exchangelib.util import PrettyXmlHandler
logging.basicConfig(level=logging.DEBUG, handlers=[PrettyXmlHandler()])
gusername = ""  # deleted :/
gpassword = ""  # deleted :/
gprimary_smtp_address = "bspks@lunet.lboro.ac.uk"
em_dict = defaultdict(list)
def contactServer(pusername, ppassword, pprimary_smtp_address):
    creds = Credentials(pusername, ppassword)
    config = Configuration(server='outlook.office365.com/EWS/Exchange.asmx',
                           credentials=creds)
    return Account(
        primary_smtp_address=pprimary_smtp_address,
        autodiscover=False,
        config=config,
        access_type=DELEGATE
    )
print ("connecting to server\n")
account = contactServer(gusername, gpassword, gprimary_smtp_address)
dt_string = "2018-11-01 12:41:19+00:00"  # usually comes from the db; stringified for debugging
dt = datetime.strptime(dt_string, '%Y-%m-%d %H:%M:%S+00:00')
tz = EWSTimeZone.timezone('Europe/London')
last_datetime = tz.localize(dt)
for item in account.inbox.filter(datetime_received__gt=last_datetime):
    if isinstance(item, Message):
        em_dict["username"].append(gusername)
        em_dict["gprimary_smtp_address"].append(gprimary_smtp_address)
        em_dict["datetime_received"].append(item.datetime_received)
        em_dict["subject"].append(item.subject)
        em_dict["body"].append(item.body)
        em_dict["item_id"].append(item.item_id)
        em_dict["sender"].append(item.sender)
        # extract the email address from item.sender
        emailaddr = item.sender.email_address
        print("debug email addr: ", emailaddr)
print ("downloaded emails\n")
APIError(code=-2015): Invalid API-key, IP, or permissions for action
I keep getting the above error and I am not sure what the issue is.
I am able to access the client.get_all_tickers() command no problem, but when I try to place an order or access user_data (both of which require a signature) I get the error
APIError(code=-2015): Invalid API-key, IP, or permissions for action
I think the issue has something to do with the signature. I checked that the relevant permissions are enabled, and they are. Furthermore, I tried creating a new API key and still got the same issue.
NOTE: I am using binance.us, not binance.com, because I am located in the US, so I cannot make an account on binance.com.
Therefore, another idea I had was to use a VPN that places me in England so I can make an account through binance.com, and maybe that will work.
import time
import datetime
import json
import sys
import hashlib
import logging
import requests
import urllib3
from time import sleep
from urllib3 import PoolManager
from binance.client import Client
from binance.enums import *
from binance.exceptions import BinanceAPIException, BinanceWithdrawException
API_key = ""     # redacted
Secret_key = ""  # redacted
r = requests.get('https://www.binance.us/en/home')
client = Client(API_key, Secret_key, tld="us")
prices = client.get_all_tickers()
# Def to get the index of a symbol in the tickers list
def crypto_location(sym):
    count = 0
    for i in prices:
        count += 1
        ticker = i.get('symbol')
        if ticker == sym:
            val = i.get('price')
            count = count - 1
            return count
bitcoin_location = crypto_location('BTCUSDT')
ethereum_location = crypto_location('ETHUSDT')
stable_coin_location = crypto_location('BUSDUSDT')
bitcoin_as_BUSD_location = crypto_location('BTCBUSD')
#%% Where to quickly get bitcoin price
t_min = time.localtime().tm_min
prices = client.get_all_tickers()
bitcoin_price = prices[bitcoin_location].get('price')
print(bitcoin_price)
ethereum_price = prices[ethereum_location].get('price')
print(ethereum_price)
stable_coin_price = prices[stable_coin_location].get('price')
print(stable_coin_price)
bitcoin_as_BUSD = prices[bitcoin_as_BUSD_location].get('price')
print(bitcoin_as_BUSD)
client.session.headers.update({ 'X-MBX-APIKEY': API_key})
client.get_account()
The error occurs at client.get_account().
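To rule the client library in or out, it can help to sign one request by hand: a signed Binance call is just an HMAC-SHA256 of the urlencoded query string. A minimal sketch using only requests and the standard library, assuming the same API_key/Secret_key and the binance.us REST endpoint:
import hashlib
import hmac
import time
import urllib.parse
import requests

def signed_get(path, api_key, secret_key, **params):
    # Binance signs the urlencoded query string (timestamp included)
    # with HMAC-SHA256 of the secret key.
    params['timestamp'] = int(time.time() * 1000)
    query = urllib.parse.urlencode(params)
    sig = hmac.new(secret_key.encode(), query.encode(), hashlib.sha256).hexdigest()
    url = 'https://api.binance.us%s?%s&signature=%s' % (path, query, sig)
    return requests.get(url, headers={'X-MBX-APIKEY': api_key})

# If this also returns -2015, the key or its permissions/IP restrictions are
# the problem, not python-binance.
resp = signed_get('/api/v3/account', API_key, Secret_key)
print(resp.status_code, resp.text)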
I had the same problem. Binance API keys without any IP restriction expire every 90 days. I restricted the API key to my IP and it works!
Still, you're sure to find it all here:
https://python-binance.readthedocs.io/en/latest/index.html
I'm learning Python and figured a nice tool to start out with would be something that grabs some emails over IMAP and displays a given header. The following is basically my script:
#!/usr/bin/env python3
from imaplib import IMAP4_SSL
from netrc import netrc
from email import message_from_bytes
conn = IMAP4_SSL('imap.gmail.com')
auth = netrc().hosts['imap.gmail.com']
conn.login(auth[0], auth[2])
conn.select()
typ, data = conn.search(None, 'ALL')
i = 0
for num in reversed(data[0].split()):
    i += 1
    typ, data = conn.fetch(num, '(RFC822)')
    email = message_from_bytes(data[0][1])
    print("%i: %s" % (int(num), email.get('subject')))
    if i == 5:
        break
conn.close()
conn.logout()
The frustrating thing is that the header comes back folded, so what prints is the raw folded string from the underlying email rather than the actual value of the header. How can I get the correctly unfolded header value? I'd like to stick with core Python 3 stuff, but I'm open to external deps if I must.
Use Policy Objects to enable unfolding in the Python email package. In your script, you would have to add:
from email.policy import SMTPUTF8
to import the policy SMTPUTF8, and later use that when calling message_from_bytes:
email = message_from_bytes(data[0][1], policy=SMTPUTF8)
I tried your script with Python 3.9.5; in fact, every policy except compat32 (which is used when the policy parameter is absent) enabled unfolding.
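A quick way to see the difference without an IMAP round-trip (a standalone sketch):
from email import message_from_bytes
from email.policy import SMTPUTF8

raw = b"Subject: a folded\r\n subject line\r\n\r\nbody\r\n"
print(repr(message_from_bytes(raw).get('subject')))                   # folded (compat32 default)
print(repr(message_from_bytes(raw, policy=SMTPUTF8).get('subject')))  # 'a folded subject line'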
TL;DR: strip newlines
I'd love it if there were a simple answer to this, so if you have a better one feel free to add it. In the meantime, this sorta ghetto solution works perfectly:
#!/usr/bin/env python3
from imaplib import IMAP4_SSL
from netrc import netrc
from email import message_from_bytes
import re
conn = IMAP4_SSL('imap.gmail.com')
auth = netrc().hosts['imap.gmail.com']
conn.login(auth[0], auth[2])
conn.select()
typ, data = conn.search(None, 'ALL')
i = 0
for num in reversed(data[0].split()):
    i += 1
    typ, data = conn.fetch(num, '(RFC822)')
    email = message_from_bytes(data[0][1])
    raw_header = email.get('subject')
    header = re.sub(r'[\r\n]', '', raw_header)
    print("%i: %s" % (int(num), header))
    if i == 5:
        break
conn.close()
conn.logout()
I'm working on a web application with Python and Google App Engine.
I tried to set the default URLFetch deadline globally as suggested in a previous thread:
https://stackoverflow.com/a/14698687/2653179
urlfetch.set_default_fetch_deadline(45)
However, it doesn't work: when I print its value in one of the handlers, urlfetch.get_default_fetch_deadline() returns None.
Here is main.py:
from google.appengine.api import users
import webapp2
import jinja2
import random
import string
import hashlib
import CQutils
import time
import os
import httpRequests
import logging
from google.appengine.api import urlfetch
urlfetch.set_default_fetch_deadline(45)
...
class Del(webapp2.RequestHandler):
    def get(self):
        id = self.request.get('id')
        ext = self.request.get('ext')
        user_id = httpRequests.advance(id, ext)
        d2 = urlfetch.get_default_fetch_deadline()
        logging.debug("value of deadline = %s", d2)
Prints in the Log console:
DEBUG 2013-09-05 07:38:21,654 main.py:427] value of deadline = None
The function which is being called in httpRequests.py:
import urllib
from google.appengine.api import urlfetch

def advance(id, ext=None):
    url = "http://localhost:8080/api/" + id + "/advance"
    if ext is None:
        ext = ""
    params = urllib.urlencode({'ext': ext})
    result = urlfetch.fetch(url=url,
                            payload=params,
                            method=urlfetch.POST,
                            headers={'Content-Type': 'application/x-www-form-urlencoded'})
    if result.status_code == 200:
        return result.content
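(Incidentally, urlfetch.fetch also accepts an explicit deadline argument, which sidesteps the global default entirely; the same call with a per-request deadline would look like this:)
result = urlfetch.fetch(url=url,
                        payload=params,
                        method=urlfetch.POST,
                        headers={'Content-Type': 'application/x-www-form-urlencoded'},
                        deadline=45)  # per-request deadline, independent of the global default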
I know this is an old question, but recently ran into the issue.
The setting is placed into a thread-local, meaning that if your application is set to thread-safe and you handle a request in a different thread than the one you set the default deadline for, it can be lost. For me, the solution was to set the deadline before every request as part of the middleware chain.
This is not documented, and required looking through the source to figure it out.
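A minimal sketch of that middleware idea (illustrative names, assuming a standard WSGI stack such as the webapp2 app in main.py):
from google.appengine.api import urlfetch

class DeadlineMiddleware(object):
    """Re-apply the default fetch deadline on every request, since the value
    lives in a thread-local and may not survive across threads."""
    def __init__(self, app, deadline=45):
        self.app = app
        self.deadline = deadline

    def __call__(self, environ, start_response):
        # Runs in the thread that handles the request, so the
        # thread-local default is set where it is actually read.
        urlfetch.set_default_fetch_deadline(self.deadline)
        return self.app(environ, start_response)

# e.g. app = DeadlineMiddleware(webapp2.WSGIApplication(routes), deadline=45)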
I'm working on creating scripts using Python, MongoDB and the pymongo module to fetch certain aspects of the Twitter API and store them in a Mongo database. I've written some scripts to do different things: access the search API, access the user_timeline, and more. However, I've only just been getting to know the tools I'm working with, and it's time to go back and make things more efficient. So right now I'm working on adding functions and classes to my scripts. Here is one of my scripts without functions or classes:
#!/usr/local/bin/python
import twitter
import datetime
from datetime import date, timedelta, datetime
import pymongo
from pymongo import Connection
# Twitter handle that we are scraping mentions for
SCREEN_NAME = '@twitterapi'
# Connect to the database
connection = Connection()
db = connection.test
collection = db.twitterapi_mentions # Change the name of this collection
t = twitter.Twitter(domain='search.twitter.com')
# Fetch the information from the API
results = []
for i in range(2):
    i += 1
    response = t.search(q=SCREEN_NAME, result_type='recent', rpp=100, page=i)['results']
    results.extend(response)
# Create a document in the database for each item taken from the API
for tweet in results:
    id_str = tweet['id_str']
    twitter_id = tweet['from_user']
    tweetlink = "http://twitter.com/#!/%s/status/%s" % (twitter_id, id_str)
    created_at = datetime.strptime(tweet['created_at'], "%a, %d %b %Y %H:%M:%S +0000")
    date = created_at.date().strftime("%m/%d/%y")
    time = created_at.time().strftime("%H:%M:%S")
    text = tweet['text']
    identifier = {'id': id_str}
    entries = {'id': id_str, 'tweetlink': tweetlink, 'date': date, 'time': time, 'text': text, 'twitter_id': twitter_id}
    collection.update(identifier, entries, upsert=True)
These scripts have been working well for me, but I have to run the same script for multiple twitter handles. For instance I'll copy the same script and change the following two lines:
SCREEN_NAME = '@cocacola'
collection = db.cocacola_mentions
Thus I'm getting mentions for both @twitterapi and @cocacola. I've thought a lot about how I can make this into a function. The biggest problem that I've run into is finding a way to change the name of the collection. For instance, consider this script:
#!/usr/local/bin/python
import twitter
import datetime
from datetime import date, timedelta, datetime
import pymongo
from pymongo import Connection
def getMentions(screen_name):
    # Connect to the database
    connection = Connection()
    db = connection.test
    collection = db.screen_name # Change the name of this collection
    t = twitter.Twitter(domain='search.twitter.com')
    # Fetch the information from the API
    results = []
    for i in range(2):
        i += 1
        response = t.search(q=screen_name, result_type='recent', rpp=100, page=i)['results']
        results.extend(response)
    # Create a document in the database for each item taken from the API
    for tweet in results:
        id_str = tweet['id_str']
        twitter_id = tweet['from_user']
        tweetlink = "http://twitter.com/#!/%s/status/%s" % (twitter_id, id_str)
        created_at = datetime.strptime(tweet['created_at'], "%a, %d %b %Y %H:%M:%S +0000")
        date = created_at.date().strftime("%m/%d/%y")
        time = created_at.time().strftime("%H:%M:%S")
        text = tweet['text']
        identifier = {'id': id_str}
        entries = {'id': id_str, 'tweetlink': tweetlink, 'date': date, 'time': time, 'text': text, 'twitter_id': twitter_id}
        collection.update(identifier, entries, upsert=True)
getMentions("#twitterapi")
getMentions("#cocacola")
If I use the above script then all of the data is stored in a collection called "screen_name", but I want it to be stored under the screen name that is passed in. Ideally, I want @twitterapi mentions to be in a "twitterapi_mentions" collection and @cocacola mentions to be in a "cocacola_mentions" collection. I believe that the Collection class of pymongo might be the answer, and I've read the documentation, but I can't seem to get it to work. If you have other suggestions of how I should make this script more efficient, they would be incredibly appreciated. Otherwise, please excuse any mistakes I've made; as I said, I'm new to this.
Use getattr to retrieve the attribute by string name:
collection = getattr(db, screen_name)
I'd go with:
collection = db[screen_name]
I think it's more straightforward.
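For example, inside getMentions the collection name can be derived from the handle at runtime (a sketch; the _mentions suffix follows the naming used in the question):
def getMentions(screen_name):
    connection = Connection()
    db = connection.test
    # '@twitterapi' -> the 'twitterapi_mentions' collection; bracket access
    # resolves the name at runtime, unlike the attribute form db.screen_name
    collection = db[screen_name.lstrip('@') + '_mentions']
    # ... rest of the function unchanged ...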
I am running into a problem with the following Python script, which runs on the server and checks the commit information for a push, making sure it follows a particular syntax. I am unable to get input from the user, which is why the username and password are hard-coded. I am now also unable to get the list of commit messages that occurred before this particular push.
#!/usr/bin/python
import SOAPpy
import getpass
import datetime
import subprocess
import sys
import re
import logging
import os
def login(x, y):
    try:
        auth = soap.login(x, y)
        return auth
    except:
        sys.exit("Invalid username or password")
def getIssue(auth, issue):
    try:
        issue = soap.getIssue(auth, issue)
    except:
        sys.exit("No issue of that type found: make sure all PRs are valid Jira PRs")
def git_get_commit_msg(commit_id):
    return get_shell_cmd_output("git rev-list --pretty --max-count=1 " + commit_id)

def git_get_last_commit_id():
    return get_shell_cmd_output("git log --pretty=format:%H -1")
def getCommitText():
    commit_msg_filename = sys.argv[1]
    try:
        commit_msg_text = open(commit_msg_filename).read()
        return commit_msg_text
    except:
        sys.exit("Could not read commit message")
def git_get_array_of_commit_ids(start_id, end_id):
    output = get_shell_cmd_output("git rev-list " + start_id + ".." + end_id)
    if output == "":
        return None
    commit_id_array = output.split('\n')
    return commit_id_array
def get_shell_cmd_output(cmd):
    try:
        proc = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
        return proc.stdout.read().rstrip('\n')
    except KeyboardInterrupt:
        logging.info("... interrupted")
    except Exception, e:
        logging.error("Failed trying to execute '%s'", cmd)
def findpattern(commit_msg):
    pattern = re.compile(r"\w+-\d+")
    group = pattern.findall(commit_msg)
    print group
    found = 0
    issues = 0
    for match in group:
        auth = soap.login(jirauser, passwd)
        getIssue(auth, match)
        issues = issues + 1
        found += 1
    if found == 0:
        sys.exit("No issue patterns found.")
    print "Retrieved issues: " + str(issues)
def update():
    print sys.argv[2]
    print sys.argv[3]
    old_commit_id = sys.argv[2]
    new_commit_id = sys.argv[3]
    commit_id_array = git_get_array_of_commit_ids(old_commit_id, new_commit_id)
    for commit_id in commit_id_array:
        commit_text = git_get_commit_msg(commit_id)
        findpattern(commit_text)
soap = SOAPpy.WSDL.Proxy('some url')
# this line is for repointing the input from /dev/null
#sys.stdin = open('/dev/tty', 'r') # this fails horribly.
# ask user for input
#jirauser = raw_input("Username for jira: ")
jirauser = "username"
passwd = "987654321"
#passwd = getpass.getpass("Password for %s: " % jirauser)
login(jirauser,passwd)
#commit_msg = getCommitText()
#findpattern(commit_msg)
update()
The intended goal of this code is to check the commits made locally and parse through them for the intended pattern, as well as checking in Jira whether each referenced PR exists. It is a server-side hook that gets activated on a push to the repository.
Any tips on writing Python hooks would be appreciated. Please and thank you.
I suggest that you have a look at gitorious (http://gitorious.org/gitorious).
They use ssh to handle authentication and rights management (getting the username given by ssh).
They also have some hooks on git repositories. I guess it could help to see how they are processing git hooks using ruby.
By the time your update hook fires, the server has the new commits: the question is whether your hook will allow the ref in question to move. What information from the local (sending) repository do you want?
For the credentials issue, funnel everyone through a single user. For example, GitHub does it with the git user, which is why their SSH URLs begin with git@github.com:.... Then in ~git/.ssh/authorized_keys, associate a username with each key. Note that the following should be on a single line but is wrapped for presentation purposes.
no-agent-forwarding,no-port-forwarding,no-pty,no-X11-forwarding,
command="env myuser=gbgcoll /usr/bin/git-shell -c \"${SSH_ORIGINAL_COMMAND:-}\""
ssh-rsa AAAAB...
Now to see who's trying to do the update, your hook examines the $myuser environment variable.
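For instance, at the top of the update hook (a sketch; the variable name myuser matches the authorized_keys example above):
import os
import sys

# Identify the pusher from the variable set in authorized_keys.
pusher = os.environ.get('myuser')
if not pusher:
    sys.exit("No pusher identity set; refusing the update")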
This doesn't give you each user's Jira credentials. To solve that issue, create a dummy Jira account that has read-only access to everything, and hardcode that Jira account's credentials in your hook. This allows you to verify that a given PR exists.