Pass command line variables between Python files. Totally stuck - python

I have what seems like a very easy problem with an easy solution just beyond my reach.
My setup:
A) Driver file (runs the test script)
B) Connection file (using Requests)
C) Parameters file
The parameters file has 6 variables with things like server IP, login, password, etc.
The Driver file has a parser which reads the parameters file and fills in the blanks:
driver.py parametersfile.csv
This works fine. However, I added a PORT variable to the parameters file which needs to be seen by B) the Connection file. This connections file is never called explicitly; it is just imported into the driver file for its connection and cookie methods.
How do I carry over the parsed variables (from sys.argv) from parametersfile.csv to the Connections file (or any other file which is used to run my script)?
Thank you stackoverflow community
Edit:
I got it to work the obvious way, by passing the arguments into the class (self.foo) of whatever module/file needed them.
My original question was along the lines of this idea:
You do something like
loadproperties(propertiesfile)
then from any other Python script you could just do
import propertyloader
which would load a set of immutable properties into the current namespace.
It seems very convenient to just do
url = propertyloader.url
instead of
class Connect(object):
    def __init__(self, host, port, password, url):
        self.url = url
        loader = requests(secure, url)
        # blah blah blah...
It seems like a headache-free way of sharing common parameters between different parts of the script.
Maybe there's still a way of doing this (extra credit question)
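For the record, that idea works in Python because a module is only executed once, on first import; later imports reuse the same module object. A minimal sketch, assuming a hypothetical propertyloader module and a simple key,value CSV layout (the module name and the key names are mine, not an existing library):
# propertyloader.py (hypothetical)
import csv
import sys

def _load(path):
    # Read key,value rows from the parameters file into a dict.
    with open(path) as f:
        return dict(row[:2] for row in csv.reader(f) if len(row) >= 2)

# Runs once, on first import; every later "import propertyloader" reuses it.
_props = _load(sys.argv[1])

url = _props.get('url')
port = _props.get('port')
server = _props.get('server')
Any other file can then do import propertyloader and read propertyloader.url, though the explicit passing shown in the answer below keeps the dependency visible.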

From the driver.py file, import the connections file as a module and pass the arguments you've parsed to the methods inside the module. Something like this:
# === inside driver.py ===
import sys
import connections

params = parseFile(sys.argv)   # get parameters from the csv file passed on the command line
connections.connect(params)    # pass them to whatever method you need to call from connections
EDIT: It sounds like you're not writing your code in a modular way. Stop thinking about your code in terms of files and instead think of it in terms of modules: bits and pieces of interchangeable code that you can use in many different places. The main flaw I see in your design (forgive me if I misunderstood) is that you're hard-coding, inside the connections file, the value you use to create connection objects. I'm guessing this is what your code looks like (or at least it captures the spirit of your code):
# connections.py
MY_SERVER = ???  # WHAT DO I SET THIS TO?

class Connection:
    def __init__(self):
        self.server = MY_SERVER

def connect():
    connection = Connection()  # create a new connection object
The above code is not designed well, since you're defining a variable MY_SERVER that shouldn't be defined there in the first place! The Connection class doesn't know or care what server it should use; it should work with any server. So where do you get the server variable? You pass it in via a constructor or a method. You could do something like this:
# connections.py
class Connection:
    def __init__(self, server):
        self.server = server

def connect(server):
    connection = Connection(server)  # create a new connection object with the server passed to the method
    return connection
With this design, the Connection object becomes much more flexible. It is basically saying "I am a connection object that can handle any server. If you want to use me, just tell me what server you want to use!".
This way, in your driver file you can first parse the server from your CSV, and then simply call connections.connect, passing it the server you want!
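A rough sketch of that driver, where parseFile and the key,value CSV layout are placeholders for whatever your parser already does, and the 'server' key is an assumed name:
# driver.py
import sys

import connections

def parseFile(path):
    # Placeholder parser: one "key,value" pair per line.
    params = {}
    with open(path) as f:
        for line in f:
            key, _, value = line.strip().partition(',')
            params[key] = value
    return params

if __name__ == '__main__':
    params = parseFile(sys.argv[1])
    connections.connect(params['server'])  # hand only the server to connections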

Related

How to store an object between script runs?

I'm using a Python script based on this Mint API to pull personal finance info every hour.
To connect to Mint I do mint = mintapi.Mint(email, password), which opens a Chrome instance via Selenium, logs into Mint, and creates an object of <class 'mintapi.api.Mint'>.
To refresh info I just need to do mint.initiate_account_refresh().
But every time I run the script, it does the whole logging in thing again.
Can I store the mint object on disk somehow so I can skip that step and just do the account refresh?
Ah the wonders of open source.
Curious, I went and looked at the mintapi you linked, to see if there was anything obvious and simple I could do to recreate the object instance without the arduous setup.
It turns out there isn't, really. :(
Here is what is called when you instantiate the Mint object:
def __init__(self, email=None, password=None):
    if email and password:
        self.login_and_get_token(email, password)
As you can see, if you don't give it a truthy email and password, it doesn't do anything. (As a side note, it should really be doing an is not None check, but whatever.)
So, we can avoid having to go through the setup process nice and easily, but now we need to find out how to fake the setup process based on previous data.
Looking at .login_and_get_token(), we see the following:
def login_and_get_token(self, email, password):
    if self.token and self.driver:
        return

    self.driver = get_web_driver(email, password)
    self.token = self.get_token()
Nice and simple, again. If it already has a token and a driver, it's done and returns early. If not, it sets up a driver and sets .token by calling .get_token().
This makes the whole process really easy to override. Simply instantiate a Mint object with no arguments like so:
mint = mintapi.Mint()
Then set the .token on it:
mint.token = 'something magical'
Now you have an object that is in an almost ready state. The problem is that it relies on self.driver for basically every method call, including your .initiate_account_refresh():
def initiate_account_refresh(self):
    self.post(
        '{}/refreshFILogins.xevent'.format(MINT_ROOT_URL),
        data={'token': self.token},
        headers=JSON_HEADER)

...

def post(self, url, **kwargs):
    return self.driver.request('POST', url, **kwargs)
This looks like it's a simple POST that we could replace with a requests.post() call, but I suspect that seeing as it's doing it through the web browser that it's relying on some manner of cookies or session storage.
If you wanted to experiment, you could subclass Mint like so:
import requests

class HeadlessMint(Mint):
    def post(self, url, **kwargs):
        return requests.post(url, **kwargs)
But my guess is that there will be more issues with this that will surface over time.
The good news is that this mintapi project looks reasonably simple, and rewriting it to not rely on a web browser doesn't look like an unreasonable project for someone with a little experience, so keep that in your back pocket.
As for pickling, I don't believe that will work, for the same reason that I don't believe subclassing will - I think the existence of the browser is important. Even if you pickle your mint instance, it will be missing its browser when you try to load it.
The simplest solution might very well be to make the script a long-running one: instead of running it every hour, you run it once, it does what it needs to, then sleeps for an hour before doing it again. This way you'd log in once at the very beginning, and it could keep that session for as long as it runs.
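A rough sketch of that long-running approach (the credentials and the one-hour interval are placeholders):
import time

import mintapi

EMAIL = 'you@example.com'       # placeholder credentials
PASSWORD = 'your-password'

mint = mintapi.Mint(EMAIL, PASSWORD)  # log in once and keep the session

while True:
    mint.initiate_account_refresh()
    time.sleep(60 * 60)  # wait an hour before refreshing again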
To store objects in Python you can use the pickle module.
Let's say you have an object mint:
import pickle

mint = Something.Somefunc()

with open('data.pickle', 'wb') as storage:
    pickle.dump(mint, storage)
The object will be saved as a sequence of binary bytes in a file named data.pickle.
To access it just use the pickle.load() function.
import pickle

with open('data.pickle', 'rb') as storage:
    mint = pickle.load(storage)

>>> mint
<class 'something' object>
NOTE:
Although it doesn't matter here, the pickle module has a flaw: it can execute arbitrary code while loading objects from a file, so don't load pickled objects that come from a third-party source.
Use the pickle library to save and load the object.
SAVE
import pickle

mint = mintapi.Mint(email, password)

with open('mint.pkl', 'wb') as output:
    pickle.dump(mint, output, pickle.HIGHEST_PROTOCOL)
LOAD
import pickle

with open('mint.pkl', 'rb') as input:
    mint = pickle.load(input)

mint.initiate_account_refresh()

Strange way to pass data between modules in Python: How does it work?

I'm supposed to work with some messy code that I didn't write myself, and amidst the mess I found two scripts that communicate in this strange fashion (via a third middleman script):
message.py, the 'middleman' script:
class m():
    pass
sender.py, who wants to send some info to the receiver:
from message import m
someCalculationResult = 1 + 2
m.result = someCalculationResult
receiver.py, who wants to print some results produced by sender.py:
from message import m
mInstance = m()
print mInstance.result
And, by magic, in the interpreter, importing sender.py then receiver.py does indeed print 3...
Now, what the hell is happening behind the scenes here? Are we storing our results in the class definition itself and recovering them via a particular instance? If so, why can't we also recover the results from the definition itself? Is there a more elegant way to pass stuff between scripts run successively in the interpreter?
Using Python 2.6.6
That is just a convoluted way to set a global.
m is a class, m.result a class attribute. Both the sender and receiver can access it directly, just as they can access m.
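To see why the instance in receiver.py is unnecessary: attribute lookup on an instance falls back to the class, so once sender.py has been imported, both of these print 3:
from message import m

print m.result     # read straight off the class
print m().result   # instance lookup falls back to the class attribute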
They could have done this too:
# sender
import message
message.result = someCalculationResult
# receiver
import message
print message.result
Here result is just a name at the top level of the message module.
It should be noted that what you are doing is not running separate scripts; you are importing modules into the same interpreter. If you ran python sender.py first, without ever importing receiver.py, and then separately ran python receiver.py without ever importing sender.py, this whole scheme wouldn't work.
There are myriad ways to pass data from one section of code to another section, too many to name here, all fitting for a different scenario and need. Threading, separate processes, separate computers all introduce different constraints on how message passing can and should take place, for example.

How to connect to a mongoDB

I'm using mongoLabs to host my database and I want to connect to it from my app.
I'm also using the Motor module in pyMongo. I'm unsure where to instantiate the connection.
For instance, I know that if the database were on the same local machine as the app I would do this:
database = motor.MotorClient().open_sync().myDatabase
The MongoLab site says to include the following URI in the driver:
mongodb://<dbuser>:<dbpassword>@ds047057.mongolab.com:47057/myDatabase
But I cannot see how to create the connection to this database.
Thanks
It looks like MotorClient takes the same arguments as MongoClient:
https://github.com/ajdavis/mongo-python-driver/blob/motor/motor/__init__.py#L782
http://api.mongodb.org/python/current/api/pymongo/mongo_client.html
Given that, you should be able to use the URI like so:
database = motor.MotorClient("mongodb://<dbuser>:<dbpassword>@ds047057.mongolab.com:47057/myDatabase").open_sync().myDatabase
You should specify connection settings for MotorClient following these manuals (MotorClient takes the same constructor arguments as MongoClient):
http://emptysquare.net/motor/pymongo/api/motor/motor_client.html#motor.MotorClient
http://emptysquare.net/motor/pymongo/api/pymongo/mongo_client.html#pymongo.mongo_client.MongoClient
"The host parameter can be a full mongodb URI, in addition to a simple
hostname. It can also be a list of hostnames or URIs. Any port
specified in the host string(s) will override the port parameter. If
multiple mongodb URIs containing database or auth information are
passed, the last database, username, and password present will be
used. For username and passwords reserved characters like ‘:’, ‘/’,
‘+’ and ‘@’ must be escaped following RFC 2396."
db = database = motor.MotorClient('mongodb://<dbuser>:<dbpassword>@ds047057.mongolab.com:47057/myDatabase').open_sync().myDatabase
The previous answers have become a bit outdated, so here is the current way according to the docs, which worked for me:
import asyncio

import motor.motor_asyncio

db = motor.motor_asyncio.AsyncIOMotorClient().database_name
https://motor.readthedocs.io/en/stable/tutorial-asyncio.html
https://github.com/mongodb/motor/blob/master/doc/tutorial-asyncio.rst
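For completeness, a small sketch of connecting with a MongoLab-style URI using the asyncio client; the user, password, host and database name are placeholders, and the test_collection round-trip is only there to confirm the connection works:
import asyncio

import motor.motor_asyncio

URI = 'mongodb://<dbuser>:<dbpassword>@ds047057.mongolab.com:47057/myDatabase'

async def main():
    client = motor.motor_asyncio.AsyncIOMotorClient(URI)
    db = client['myDatabase']
    # Write and read one document to verify the connection.
    await db.test_collection.insert_one({'hello': 'world'})
    print(await db.test_collection.find_one({'hello': 'world'}))

asyncio.run(main())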

Basic example of AMI connection to Asterisk from Python script w/Starpy

I have some years of solid experience working with Asterisk but am new to Python.
I want to connect from a Python script and receive some events. I have created a manager user with AMIUSERNAME and AMIPASSWORD as credentials and tested that it works OK. I have also installed StarPy.
Then I run the following script with the command python ami.py USERNAME PASSWORD:
import sys
from starpy import manager
f = manager.AMIFactory(sys.argv[1], sys.argv[2])
df = f.login('127.0.0.1',5038)
I am monitoring the Asterisk console and nothing happens.
Does anyone know what I am missing?
I would like to send a Ping action and wait for a Pong response.
I suppose that f.login() returns you an AMIProtocol instance that has a ping() method.
I don't know anything about starpy, so here is some vague advice:
Start Python as an interactive shell. Execute code and examine results on the spot. The help function is your friend; try help(df) after the last line of your script.
Look at the examples directory in the starpy distribution. Maybe 90% of the code you need is already there.
The following is pulled from the ami module (and a few other places) in the Asterisk Test Suite. We use starpy extensively throughout the Test Suite, so you may want to check it out for some examples. Assume that the following code resides in some class with member method login.
def login(self):
    def on_login_success(ami):
        # The nested callbacks close over self, so they take only the
        # arguments Twisted passes them.
        self.ami_factory.ping().addCallback(ping_response)
        return ami

    def on_login_error(reason):
        print "Failed to log into AMI"
        return reason

    def ping_response(ami):
        print "Got a ping response!"
        return ami

    self.ami_factory = manager.AMIFactory("user", "mysecret")
    self.ami_factory.login("127.0.0.1", 5038).addCallbacks(on_login_success, on_login_error)
Make sure as well that your manager.conf is configured properly. For the Asterisk Test Suite, we use the following:
[general]
enabled = yes
webenabled = yes
port = 5038
bindaddr = 127.0.0.1
[user]
secret = mysecret
read = system,call,log,verbose,agent,user,config,dtmf,reporting,cdr,dialplan,test
write = system,call,agent,user,config,command,reporting,originate

Why does creating a neo4j.GraphDatabase from within a Paste app cause a segfault?

The following code causes Java to segfault:
import os.path
import neo4j
from paste import httpserver, fileapp
import tempfile
from webob.dec import wsgify
from webob import Response, Request
HOST = '127.0.0.1'
PORT = 8080
class DebugApp(object):
#wsgify
def __call__(self, req):
# db = neo4j.GraphDatabase(tempfile.mkdtemp())
db = neo4j.GraphDatabase(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'data'))
return Response(body='it worked')
def main():
app = DebugApp()
httpserver.serve(app, host=HOST, port=PORT)
if __name__ == '__main__':
main()
To reproduce, first save that code into a file (say, app.py), and then run python app.py. Then try http://localhost:8080 in your browser; you should see the Java crash handler.
The top of the Java stack trace looks like this:
Stack: [0xb42e7000,0xb4ae8000], sp=0xb4ae44f0, free space=8181k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
C [_jpype.so+0x26497] JPJavaEnv::NewObjectA(_jclass*, _jmethodID*, jvalue*)+0x37
C [_jpype.so+0x3c0e8] JPMethodOverload::invokeConstructor(_jclass*, std::vector<HostRef*, std::allocator<HostRef*> >&)+0x178
C [_jpype.so+0x3a417] JPMethod::invokeConstructor(std::vector<HostRef*, std::allocator<HostRef*> >&)+0x47
C [_jpype.so+0x1beba] JPClass::newInstance(std::vector<HostRef*, std::allocator<HostRef*> >&)+0x2a
C [_jpype.so+0x67b9c] PyJPClass::newClassInstance(_object*, _object*)+0xfc
C [python+0x96822] PyEval_EvalFrameEx+0x4332
C [python+0x991e7] PyEval_EvalCodeEx+0x127
I believe that's neo4j.GraphDatabase in Python triggering JPype to go looking for EmbeddedGraphDatabase in neo4j, under Java.
Running this code in an interactive Python session doesn't segfault:
>>> import webob
>>> import app
>>> debug_app = app.DebugApp()
>>> response = debug_app(webob.Request.blank('/'))
>>> response.body
'it worked'
Presumably that's because I'm avoiding Paste altogether in that example. Perhaps this has something to do with Paste's use of threads getting in the way of neo4j? I noted a somewhat similar problem in the neo4j forums: http://neo4j-community-discussions.438527.n3.nabble.com/Neo4j-CPython-Pylons-and-threading-td942435.html
...but that only occurs on shutdown.
The issue is not with Paste per se, but with the neo4j Python bindings, which use JPype. Paste creates threads to handle incoming requests; neo4j is supposed to be thread-safe, but JPype comes with this caveat from the documentation (1):
"For the most part, python threads based on OS level threads (i.e posix threads), will work without problem. The only thing to remember is to call jpype.attachThreadToJVM() in the thread body to make the JVM usable from that thread. For threads that you do not start yourself, you can call isThreadAttachedToJVM() to check."
I couldn't find the code that does this, but I think that some of the Java code in the neo4j bindings may call attachThreadToJVM at import time. If so, when a request is handed to a worker thread by paste, and that thread then goes to fetch data from neo4j, it is crossing thread boundaries, and the JVM attachment rule may not be satisfied.
You can avoid the crash by only running import neo4j from within a single thread. In the case above, this is the callable targeted by threading.Thread.
Unfortunately, this means that even though neo4j is thread-safe, it must be constrained to a single thread when used from Python. But that's not too disappointing, considering.
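If you want to experiment anyway, one thing to try, purely a sketch based on the JPype caveat quoted above and untested here, is attaching each worker thread to the JVM before it touches neo4j, e.g. at the top of the WSGI callable from the question:
import os.path

import jpype
import neo4j
from webob import Response
from webob.dec import wsgify

class DebugApp(object):
    @wsgify
    def __call__(self, req):
        # Paste hands each request to a worker thread it created, so make
        # sure this thread is attached to the JVM before using neo4j.
        if not jpype.isThreadAttachedToJVM():
            jpype.attachThreadToJVM()
        db = neo4j.GraphDatabase(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'data'))
        return Response(body='it worked')
Whether this is enough depends on how the neo4j bindings manage the JVM internally.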
Update: the maintainers responded (2) and investigated the problem, and checked in a fix. I don't know which release of neo4j this became available in, and I can no longer find the commit in their GitHub repo (3), so this stands to be re-tested.
