I want to call methods and get/set instance variables on an instance of a given Python class from another process. All of the class methods and variables accept/return simple Python dictionaries or lists (specifically, it is the P4Python API - I can't use the Perforce C++ interop and need the option to call this from another host).
I'd like to do this via SOAP or by passing JSON back and forth. My first target is to have Mono consume the Python class. I am toying with the idea of writing my own bindings generator using Python's inspect module that would spit out C# files for my Python class.
Have I missed anything out there that already lets me do this? pywebsvcs looks quite close! Could I generate a wsdl file from this?
Does it have to be SOAP or JSON? I think something similar is quite simple with xmlrpc (which comes with Python). I use it a lot.
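For example, here is a minimal sketch of exposing a wrapper object over XML-RPC with the Python 3 standard library (the P4Wrapper class and its method are made-up placeholders, not the real P4Python API):

# Server: every public method of the registered instance becomes remotely callable.
from xmlrpc.server import SimpleXMLRPCServer

class P4Wrapper:  # hypothetical wrapper around your P4 instance
    def run_info(self):
        return {"serverAddress": "perforce:1666"}  # plain dicts/lists marshal cleanly

server = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
server.register_instance(P4Wrapper())
server.serve_forever()

# Client (this could just as well be an XML-RPC client on the Mono/C# side):
from xmlrpc.client import ServerProxy
proxy = ServerProxy("http://localhost:8000", allow_none=True)
print(proxy.run_info())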
I would like, given a python module, to monkey patch all functions, classes and attributes it defines. Simply put, I would like to log every interaction a script I do not directly control has with a module I do not directly control. I'm looking for an elegant solution that will not require prior knowledge of either the module or the code using it.
I found several high-level tools that help with wrapping, decorating, patching, etc., and I've gone over the code of some of them, but I cannot find an elegant way to create a proxy for an arbitrary module that passes interactions through as seamlessly as possible, apart from appending logic to every interaction (recording input arguments and return values, for example).
In case someone else is looking for a more complete proxy implementation:
Although there are several Python proxy solutions similar to what the OP is looking for, I could not find one that also proxies classes and arbitrary class instances, as well as automatically proxying functions' return values and arguments, which is what I needed.
I've got some code written for that purpose as part of a larger project for fully proxying/logging Python execution, and I may turn it into a separate library in the future. If anyone's interested, you can find the core of the code in a pull request. Drop me a line if you'd like this as a standalone library.
My code will automatically return wrapper/proxy objects for any proxied object's attributes, functions and classes. The purpose is to log and replay some of the code, so I've got the equivalent "replay" code and some logic to store all proxied objects to a JSON file.
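As a rough illustration of the approach (this is not the pull-request code, just a minimal sketch that logs attribute access and calls without recursively proxying return values):

import functools

class LoggingProxy:
    # Wrap any object (including a module); log attribute access and calls.
    def __init__(self, target, name):
        self._target = target
        self._name = name

    def __getattr__(self, attr):  # only called for names not found on the proxy itself
        value = getattr(self._target, attr)
        qualname = "%s.%s" % (self._name, attr)
        if callable(value):
            @functools.wraps(value)
            def wrapper(*args, **kwargs):
                result = value(*args, **kwargs)
                print("CALL %s args=%r kwargs=%r -> %r" % (qualname, args, kwargs, result))
                return result
            return wrapper
        print("GET %s -> %r" % (qualname, value))
        return value

import json
json_proxy = LoggingProxy(json, "json")
json_proxy.dumps({"a": 1})  # logged, then returns exactly what json.dumps returns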
I am using Selenium in Python to create an automated test. In this test, I am trying to select a file from a local directory. I was able to find a reference using Java but I am struggling to convert this to Python. https://sqa.stackexchange.com/questions/12851/how-can-i-work-with-file-uploads-during-a-webdriver-test
element=driver.find_element_by_id("file_browse").click()
driver.file_detector("<>")
upload=driver.find_element_by_id("<>")
keys=upload.send_keys("<>")
For the file detector function I keep getting that the object is not callable. What should the input be for it?
Thanks!
Just remove this line:
driver.file_detector("<>")
Python's remote webdriver uses LocalFileDetector() by default. That seems to be what you want, looking at the linked Java example.
If you need to override the default, you can use or subclass one of the available file detectors from selenium.webdriver.remote.file_detector.
There doesn't seem to be any documentation of how to use FileDetector, but the source code is quite short and straightforward.
from selenium.webdriver.remote.file_detector import UselessFileDetector
driver.file_detector = UselessFileDetector()
Python's idiom for setting object members is simply to use the assignment operator (=) rather than calling a setter method, as you would in Java.
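Putting it together, a remote upload typically looks something like this (the hub URL, element id and file path are placeholders, and desired_capabilities matches the older Selenium API used in the question):

from selenium import webdriver
from selenium.webdriver.remote.file_detector import LocalFileDetector

driver = webdriver.Remote(command_executor="http://selenium-hub:4444/wd/hub",
                          desired_capabilities={"browserName": "chrome"})
driver.file_detector = LocalFileDetector()  # already the default; shown only for clarity

# Send the local path straight to the <input type="file"> element; the file
# detector transfers the file to the remote node behind the scenes.
upload = driver.find_element_by_id("file_browse")
upload.send_keys("/path/to/local/file.txt")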
I have a Python program in which there's a function that should be able to accept a Python lambda and then pass it to a running C++ program.
I was thinking of using pickle.dumps to serialize the lambda into a string. Then I just pass the string to the C++ program, after which I do deserialization using boost.python and get the lambda in C++. However, I just realized that functions cannot be pickled.
I believe there should be existing solutions that I'm not aware of. Thanks in advance for any advice!
Have a look at the C API tutorial page. The link goes to the section about calling Python functions from C. It sounds like you're new to the C API, so start at the top.
Python does not distinguish between the various types of callable objects. Functions, lambdas, and objects that have a __call__ method are interchangeable as far as the C API is concerned.
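For example, on the Python side all of these can be handed to the same C-implemented function; the C code just receives a PyObject* and invokes it with something like PyObject_CallFunction:

def double(x):
    return 2 * x

class Doubler:
    def __call__(self, x):
        return 2 * x

# A plain function, a lambda and a callable instance are interchangeable callables.
for fn in (double, lambda x: 2 * x, Doubler()):
    assert fn(3) == 6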
I won't copy/paste everything here as it's very well written in the link, but basically you make a Python module in C (or C++, the API works in both). The first example shows you how. Then you can call a C function from Python, and pass it whatever you need to pass it. All Python objects are exposed to C via PyObject* pointers, and the C API provides many functions to manipulate the objects, convert to/from C datatypes, and do things like call callable objects.
Wrappers like boost.python and SWIG use this API internally.
I am curious whether there is a way to work with Avro in Python in the same way as in the Java or C++ implementations.
According to the official Avro Python documentation, I have to provide an Avro schema at runtime to encode/decode data. But is there a way to use a code generator, as in Java/C++?
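For context, the runtime-schema workflow from the official docs looks roughly like this (the schema file name is a placeholder, and the parse function is spelled avro.schema.parse in recent releases of the avro package but Parse in the old avro-python3 package):

import avro.schema
from avro.datafile import DataFileReader, DataFileWriter
from avro.io import DatumReader, DatumWriter

# The schema is loaded at runtime rather than compiled into generated classes.
schema = avro.schema.parse(open("user.avsc").read())

writer = DataFileWriter(open("users.avro", "wb"), DatumWriter(), schema)
writer.append({"name": "Alyssa", "favorite_number": 256})
writer.close()

reader = DataFileReader(open("users.avro", "rb"), DatumReader())
for user in reader:
    print(user)
reader.close()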
Update: My coworker put together a pretty good library for doing this, avro-to-python. We have been using it in production for over a year now on some pretty complex schemas.
I had to implement something like this for php: avro-to-php
pyschema is a pretty good start, but the documentation is poor. You'll need to look at the source code to see how it all works. You can use it to read Avro schemas and generate Python source code. It adds another layer of abstraction and as such slows things down a bit more.
I've asked this question a couple of times recently in the Pulsar slack channel and my belief is that no tool currently exists that can convert an Avro schema to a Python class that is compatible with the Pulsar Python client library.
The Pulsar Python client library expects the Python class to inherit from the Record class (https://github.com/apache/pulsar/blob/master/pulsar-client-cpp/python/pulsar/schema/definition.py#L57), and for every field in the Python class to inherit from the Field class (https://github.com/apache/pulsar/blob/master/pulsar-client-cpp/python/pulsar/schema/definition.py#L141), both of which are defined in the Pulsar Python client library.
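For reference, a hand-written schema class for the Pulsar Python client looks roughly like this (the class and field names are made up):

from pulsar.schema import Record, String, Integer

class ExampleRecord(Record):  # must inherit from pulsar.schema.Record
    name = String()           # each field is an instance of a pulsar.schema Field subclass
    count = Integer()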
So, an Avro to Python converter would have to import the Record class and Field class from the Python client library, and so if such a converter exists, someone in the Pulsar Slack community really should know about it.
Further, the Pulsar Python client library is missing support for Avro keywords like "doc" and "namespace", and for null default values. So even if an Avro to Python converter exists for Pulsar, it is likely that the converted Python class could not be properly consumed by the Pulsar Python client library.
I don't see any indication in the docs (which explicitly mention code generation for the Java case) of an existing Avro schema -> Python class code generator for arbitrary Python interpreters. If you're using Jython, you could use the Java code generator to make a class that you access in your Jython code.
Unlike Java and C++, the lack of code generation doesn't affect Python performance much (in the CPython case, anyway), since class instances are implemented in terms of dicts anyway (there are exceptions to this rule in a sense, but they mostly change memory usage, not the fact that dict lookup is always involved). That makes code generation largely "nice to have" syntactic sugar, not a necessary feature for development; with some effort, you could always implement a converter that writes out a class definition and evals it in Python to get a similar effect (this is how collections.namedtuple classes are defined).
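A minimal sketch of that idea (hypothetical, not tied to any existing tool) might build and exec a class body from a parsed record schema:

import json

def class_from_avro_schema(schema_json):
    # Build Python source for a plain class whose __init__ takes the record's fields,
    # then exec it, roughly the way collections.namedtuple builds its classes.
    schema = json.loads(schema_json)
    fields = [f["name"] for f in schema["fields"]]
    src = "class %s:\n" % schema["name"]
    src += "    def __init__(self, %s):\n" % ", ".join(fields)
    for f in fields:
        src += "        self.%s = %s\n" % (f, f)
    namespace = {}
    exec(src, namespace)
    return namespace[schema["name"]]

User = class_from_avro_schema(
    '{"type": "record", "name": "User", "fields": ['
    '{"name": "name", "type": "string"}, {"name": "favorite_number", "type": "int"}]}')
u = User("Alyssa", 256)
print(u.name, u.favorite_number)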
I am attempting to re-organize our test libraries for automation and nose seems really promising. My question is, what is the best strategy for passing Python objects into nose tests?
Our tests are organized in a testlib with a bunch of modules that exercise different types of request operations. Something like this:
testlib
\-testmoda
\-testmodb
\-testmodc
In some cases a test module (e.g. testmoda) is nothing but test_something1(), test_something2() functions, while in other cases we have a TestModB class in testmodb with test_anotherthing1(), test_anotherthing2() methods. The cool thing is that nose easily finds both.
Most of those test functions are request factory stuff that can easily share a single connection to our server farm. Thus we do a lot of test_something1(cnn), TestModB.test_anotherthing2(cnn), etc.
Currently we don't use nose, instead we have a hodge-podge of homegrown driver scripts with hard-coded lists of tests to execute. Each of those driver scripts creates its own connection object. Maintaining those scripts and the connection minutia is painful.
I'd like to take advantage of nose's beautiful discovery functionality for free, while passing in a connection object of my choosing.
Thanks in advance!
Rob
P.S. The connection objects are not pickle-able. :(
Could you use a factory to create the connections, then have functions like test_something1() (taking no arguments) use the factory to get a connection?
As far as I can tell, there is no easy way to simply pass custom objects to Nose.
However, as Matt pointed out there are some viable workarounds to achieve similar results.
Basically, do this:
Set up a data dictionary as a package-level global
Add custom objects to that dictionary
Create some factory functions to return those custom objects, or create new ones if none are present/suitable
Refactor the existing testlib\testmod* modules to use the factory (a rough sketch follows)
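A rough sketch of what that might look like (make_connection and ping are hypothetical placeholders for your real connection code):

# testlib/__init__.py
_shared = {}  # package-level storage for the non-picklable connection object

def get_connection():
    # Factory: reuse the shared connection if one exists, otherwise build a new one.
    if "cnn" not in _shared:
        _shared["cnn"] = make_connection()  # hypothetical constructor for your connection
    return _shared["cnn"]

# testlib/testmoda.py
from testlib import get_connection

def test_something1():
    cnn = get_connection()  # no arguments, so nose can discover and run it directly
    assert cnn.ping()       # hypothetical health check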