Python Lambda can't detect packaged modules

I'm trying to create a Lambda function by uploading a zip file with a single .py file at the root and two folders which contain the requests lib downloaded via pip.
Running the code locally works fine. When I zip and upload the code I very often get this error:
Unable to import module 'main': No module named requests
Sometimes I do manage to fix this, but it's inconsistent and I'm not sure how I'm doing it. I'm using the following command, run from the root dir:
zip -r upload.zip *
This is how I'm importing requests:
import requests
FYI:
1. I have attempted a number of different import methods using the exact path, all of which have failed, so I wonder if that's the problem?
2. Every time this has failed and I've then been able to make it work in Lambda, it has involved a lot of fiddling with the zip command, since I thought the problem was that I was zipping the contents incorrectly and hiding them behind an extra parent folder.
Looking forward to seeing the silly mistake I've been making!
Adding code snippet:
import json      ## Built In
import requests  ## Packaged with
import sys       ## Built In

def lambda_function(event, context):
    alias = event['alias']
    message = event['message']
    input_type = event['input_type']

    if input_type == "username":
        username = alias
    elif input_type == "email":
        username = alias.split('@', 1)[0]
    elif input_type is None:
        print "input_type 'username' or 'email' required. Closing..."
        sys.exit()

    payload = {
        "text": message,
        "channel": "#" + username,
        "icon_emoji": "<an emoji>",
        "username": "<an alias>"
    }

    r = requests.post("<slackurl>", json=payload)
    print(r.status_code, r.reason)

I got some help outside the Stack Overflow loop, and this seems to work consistently:
zip -r upload.zip main.py requests requests-2.9.1.dist-info
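For anyone packaging this way, the workflow that matches that command is roughly as follows (my reconstruction, not the poster's exact steps): vendor the dependency into the project root with pip so it sits next to main.py, then zip the contents of that directory rather than the directory itself:
pip install requests -t .
zip -r upload.zip main.py requests requests-2.9.1.dist-info
The key detail is that main.py and the requests package end up at the top level of the archive. If they are nested behind an extra parent folder, Lambda cannot import main at all; if requests is missing from the top level, importing main fails with "No module named requests".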

Related

How to import manager from main without causing a circular import?

I am trying to import the manager used by the login route. This is Python, using FastAPI. I am using a router so that I can separate the main file into different pieces. I want to make sure the user is logged in before he/she can see the item page. For this, I need to get the manager from the main file by importing it. But the main file already imports this router, so I cannot import main back into it; the system would throw a circular import error.
Here is my code:
In the router's function:
@router.get("/page/items/item/{id}", response_class=HTMLResponse)
def item_page(request: Request, id: int, db: Session = Depends(get_db), user: database_files.models.User = Depends(manager)):
    cursor.execute("""SELECT * FROM ratings;""")
    reviews = cursor.fetchall()
    item = db.query(database_files.models.GamingItem).filter(database_files.models.GamingItem.id == id).first()
    return templates.TemplateResponse("item_page_gaming.html", {"request": request, "item": item, "user": user, "reviews": reviews, "title": f"XTreme Gaming - {item.name}"})
I have to import manager, right? The main file already imports this router so that it can include it in its own code, so importing from main here would be a circular import. I also know what you would suggest immediately: "don't import directly like 'from main import manager'; do 'import main' and use 'main.manager' in your dependency." But no, that does not work either, because the main file then throws an error: "Error in loading ASGI app. Could not import module 'main'". I would be very happy if you could provide a solution; it would be a huge help to me, since this problem has been stuck with me for a while (some weeks, almost months!).
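Not from the original thread, but the usual way out of this kind of cycle is to move the shared object into its own module that both main and the router import from. A minimal sketch, assuming manager is something like a fastapi-login LoginManager; the file names auth.py and routers/items.py are illustrative:

# auth.py -- holds the shared login manager; imported by both sides
from fastapi_login import LoginManager

SECRET = "change-me"  # illustrative secret
manager = LoginManager(SECRET, token_url="/auth/login")

# routers/items.py -- the router imports from auth, never from main
from fastapi import APIRouter, Depends
from auth import manager

router = APIRouter()

@router.get("/page/items/item/{id}")
def item_page(id: int, user=Depends(manager)):
    ...

# main.py -- main also imports from auth and includes the router
from fastapi import FastAPI
from auth import manager
from routers.items import router

app = FastAPI()
app.include_router(router)

Because both main.py and the router now get manager from a third module, neither has to import the other, and uvicorn can load main cleanly.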

Google Cloud Function Python installing dependency on each request

requirements.txt:
pymystem3==0.2.0
main.py:
from pymystem3 import Mystem
import json

def hello_world(request):
    request_json = request.get_json()
    if request_json and 'text' in request_json:
        text = request_json['text']
        m = Mystem()
        lemmas = m.lemmatize(text)
        return json.dumps(m.analyze(text), ensure_ascii=False, encoding='utf8', sort_keys=True, indent=2)
    else:
        return 'No text provided'
Logs in google cloud:
hello_world  032ljsdfawo  Function execution started
hello_world  032ljsdfawo  Installing mystem to /tmp/.local/bin/mystem from http://download.cdn.yandex.net/mystem/mystem-3.1-linux-64bit.tar.gz
{
insertId: "000000-a9eaa07b-d936-45d8-a186-927cc94246d6"
labels: {…}
logName: "projects/project_name/logs/cloudfunctions.googleapis.com%2Fcloud-functions"
receiveTimestamp: "2018-09-26T09:41:31.070927654Z"
resource: {…}
severity: "ERROR"
textPayload: "Installing mystem to /tmp/.local/bin/mystem from http://download.cdn.yandex.net/mystem/mystem-3.1-linux-64bit.tar.gz"
timestamp: "2018-09-26T09:41:19.595Z"
}
I'm new to Python, so it's probably a very simple mistake. Are dependencies supposed to be installed somehow before deploying the function and bundled together with main.py?
This is due to unfortunate behavior in the pymystem3 library: instead of including everything it needs in its Python package, it attempts to download additional things at runtime, specifically every time you import it.
You can see that the message in your logs is coming from the library itself.
It seems like the library tries to determine whether it has already been installed, but that check might not be working correctly. I'd recommend filing an issue on the project's issue tracker.
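One mitigation worth trying (my suggestion, not part of the answer above): construct the Mystem object once at module scope instead of inside the handler, so whatever setup work Mystem() does happens per cold start rather than per request. Whether the binary download is actually skipped still depends on pymystem3's own install check. A sketch:

import json
from pymystem3 import Mystem

# Module-level objects survive across requests on the same function instance,
# so this runs once per cold start rather than on every invocation.
_mystem = Mystem()

def hello_world(request):
    request_json = request.get_json()
    if request_json and 'text' in request_json:
        text = request_json['text']
        # Note: json.dumps() in Python 3 does not accept an encoding argument.
        return json.dumps(_mystem.analyze(text), ensure_ascii=False, sort_keys=True, indent=2)
    return 'No text provided'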

Lambda function - How to split python code across multiple files

Problem
I'm trying to split my Python code for a Lambda function across multiple files, but any attempt to import the other relative modules throws an error for the top-level module.
{
"errorMessage": "Unable to import module 'download_ga_data'"
}
What am I doing wrong? This feels like it should be super basic.
File structure layout (shown from root)
- download_ga_data.py
- [analytics]
    - google.py (contains a single class)
    - __init__.py
- [helpers]
    - main.py (contains a single class)
    - __init__.py
- {other libraries from site-packages}
Contents of download_ga_data.py
# import unicodecsv as csv
import os
# import path
from . import definitions
from analytics.google import GoogleAnalytics
from helpers.main import GoogleCloudStorageBucket

def lambda_handler(event, context):
    print("test")
This as it stands will throw the error. If I comment out the three imports after os, then it will function correctly.
How should I correctly import these two modules? I feel like I'm missing something super basic.
Environment notes
This is all built on a Lambda-mimicking Docker image and uploaded straight into S3. All the files are chmod 777 to rule out any permission errors.
OK, finally solved this. The main tip I received was to wrap my imports in a try/except block:
try:
    import definitions
    from analytics.google import GoogleAnalytics
    from cloud_helpers.main import GoogleCloudStorageBucket
except Exception as e:
    error = str(e)
    stacktrace = json.dumps(traceback.format_exc())
    message = "Exception: " + error + " Stacktrace: " + stacktrace
    err = {"message": message}
    return respond(err)

def respond(err, res=None):
    return {
        "statusCode": "400" if err else "200",
        "body": err["message"] if err else json.dumps(res),
        "headers": {"Content-Type": "application/json"},
    }
This then revealed the following error (which I irritatingly only have a screenshot of):
That error was then solved by switching to a plain import definitions.
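A runnable variant of that tip (my reconstruction, using the poster's module names) puts the try/except inside the handler, so the early return is valid and the real ImportError surfaces in the response instead of Lambda's generic "Unable to import module" message:

import json
import traceback

def respond(err, res=None):
    return {
        "statusCode": "400" if err else "200",
        "body": err["message"] if err else json.dumps(res),
        "headers": {"Content-Type": "application/json"},
    }

def lambda_handler(event, context):
    try:
        # Deferred, absolute imports: the zip root is on sys.path,
        # so "import definitions" works while "from . import definitions" does not.
        import definitions
        from analytics.google import GoogleAnalytics
        from helpers.main import GoogleCloudStorageBucket
    except Exception as e:
        message = "Exception: " + str(e) + " Stacktrace: " + json.dumps(traceback.format_exc())
        return respond({"message": message})
    return respond(None, res={"ok": True})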

Flask not writing to file

I've been meaning to log all the users that visit the site to a file.
Using Flask for the backend.
I have not been able to get Python to write to the file. I added exception handling to catch any errors raised while writing, but no exceptions are being raised.
Here is the part of the blueprint that should write to file.
from .UserDataCache import UserDataCache

udc = UserDataCache()

@main.route('/')
def index():
    s = Suggestion.query.all()
    udc.writeUsertoFile()
    return render_template('suggestions.html', suggestions=s)
Here is the UserDataCache class:
from flask import request
from datetime import datetime

class UserDataCache():
    def __init__(self):
        pass

    def writeUsertoFile(self):
        try:
            with open("userData.txt", "a") as f:
                f.write(str(datetime.now()) + " " + request.remote_addr + " " + request.url + " " + request.headers.get('User-Agent') + "\n")
        except IOError, e:
            print e
        return
I recommend using an absolute path and verifying the permissions on that file. Something like /tmp/UserData.txt or another absolute path should work. The web server's user is what needs the permission to write to the file (www-data if you're using apache2 with Ubuntu, or check your web server's conf file to verify).
As for why you're not seeing the exception you're catching: I see you're using print. If you're calling the app from a web browser, you'll need to send the error somewhere else, such as a log file, flash it to the browser, or re-raise it so it gets logged in the web server's error log.
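A sketch combining both suggestions (the /tmp path and the use of app.logger are my choices for illustration, not the answerer's code): write to an absolute path and route failures through Flask's logger instead of print, so they reach the server's error log.

from datetime import datetime
from flask import request, current_app

LOG_PATH = "/tmp/userData.txt"  # absolute path the web server's user can write to

def write_user_to_file():
    try:
        with open(LOG_PATH, "a") as f:
            f.write(" ".join([
                str(datetime.now()),
                request.remote_addr or "-",
                request.url,
                request.headers.get("User-Agent", "-"),
            ]) + "\n")
    except IOError:
        # current_app.logger output goes to the configured log handlers /
        # web server error log, unlike a bare print inside a WSGI process.
        current_app.logger.exception("Failed to write user data to %s", LOG_PATH)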
Does your Python file name begin with an uppercase letter? If so, try changing it to lowercase.
I just ran into the same problem and copied exactly the same code into two .py files. The only difference was the file name, one being 'Flask_test.py' and the other 'flask_for_test.py'. Oddly, 'Flask_test.py' works just fine except that it cannot write to any file, while 'flask_for_test.py' works perfectly.
I don't know whether the file name really affects how Python behaves, but using a lowercase file name works for me.
By the way, none of the other solutions I found worked.

Python & gdata within Django app: "POST method does not support concurrency"

I am trying to use gdata within a Django app to create a directory in my google drive account. This is the code written within my Django view:
def root(request):
    from req_info import email, password
    from gdata.docs.service import DocsService

    print "Creating folder........"
    folder_name = '2015-Q1'
    service_client = DocsService(source='spreadsheet create')
    service_client.ClientLogin(email, password)
    folder = service_client.CreateFolder(folder_name)
Authentication occurs without issue, but that last line of code triggers the following error:
Request Method: GET
Request URL: http://127.0.0.1:8000/
Django Version: 1.7.7
Exception Type: RequestError
Exception Value: {'status': 501, 'body': 'POST method does not support concurrency', 'reason': 'Not Implemented'}
I am using the following software:
Python 2.7.8
Django 1.7.7
PyCharm 4.0.5
gdata 2.0.18
google-api-python-client 1.4.0 (not sure if relevant)
[many other packages that I'm not sure are relevant]
What's frustrating is that the exact same code (see below) functions perfectly when I run it in its own, standalone file (not within a Django view).
from req_info import email, password
from gdata.docs.service import DocsService
print "Creating folder........"
folder_name = '2015-Q1'
service_client = DocsService(source='spreadsheet create')
service_client.ClientLogin(email, password)
folder = service_client.CreateFolder(folder_name)
I run this working code in the same virtual environment and the same PyCharm project as the code that produced the error. I have tried putting the code within a function in a separate file, and then having the Django view call that function, but the error persists.
I would like to get this code working within my Django app.
I don't recall whether I ever got this to work within a Django view, but because Google has since required the use of OAuth 2.0, I had to rework this code anyway. I think the error had something to do with my simultaneous use of two different packages/clients to access Google Drive.
Here is how I ended up creating the folder using the google-api-python-client package:
from google_api import get_drive_service_obj, get_file_key_if_exists, insert_folder

def create_ss():
    drive_client, credentials = get_drive_service_obj()

    # creating the folder if it does not exist
    folder_name = 'foldername'
    folder = get_file_key_if_exists(drive_client, folder_name)
    if folder:  # folder exists
        print 'Folder "' + folder_name + '" already exists.'
    else:  # folder doesn't exist
        print 'Creating folder........"' + folder_name + '".'
        folder = insert_folder(drive_client, folder_name)
After this code, I used a forked version (currently beta) of sheetsync to copy my template spreadsheet and populate the new file with my data. I then had to import sheetsync after the code above to avoid the "concurrency" error. (I could post the code involving sheetsync here too if folks want, but for now, I don't want to get too far off topic.)
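For context, a minimal sketch of what an insert_folder helper could look like with google-api-python-client and the Drive v2 API. The helper name matches the answer above, but the body is my assumption, not the poster's code; it presumes drive_client is an already-authorized Drive v2 service object.

def insert_folder(drive_client, folder_name):
    # Drive v2 represents a folder as a file with a special MIME type.
    body = {
        'title': folder_name,
        'mimeType': 'application/vnd.google-apps.folder',
    }
    return drive_client.files().insert(body=body).execute()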
