To preface, I'm very new to Python (about 7 days), but I'm an experienced software engineering undergrad.
I would like to send data between machines running Python scripts. The idea I had (to simplify things) was to concatenate the data (strings & ints) into a single string and do the parsing client-side.
The UDP packets send beautifully with simple strings, but when I try to send useful data, Python always complains; specifically, it won't let me concatenate tuples.
In order to parse the data on the client, I need to separate the fields with a dash character: '-'.
nodeList is a dictionary whose keys are strings and whose values are doubles.
randKey = random.choice( nodeList.keys() )
data = str(randKey) +'-'+ str(nodeList[randKey])
mySocket.sendto ( data , address )
The code above produces the following error:
TypeError: coercing to Unicode: need string or buffer, tuple found
I don't understand why it thinks I am trying to concatenate a tuple...
So my question is: how can I correct this to keep Python happy, or can someone suggest a better way of sending the data?
Thank you in advance.
I highly suggest using Google Protocol Buffers for this, as it will handle the serialization on both ends of the line. It has Python bindings (the protobuf package) that let you use it easily from your existing Python program.
Using your example code you would create a .proto file like so:
message SomeCoolMessage {
  required string key = 1;
  required double value = 2;
}
Then, after generating the Python code with the protoc compiler, you can use it like so:
randKey = random.choice(nodeList.keys())
data = SomeCoolMessage()
data.key = randKey
data.value = nodeList[randKey]
mySocket.sendto(data.SerializeToString(), address)
I'd probably use the json module to serialize the data.
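A minimal sketch of that, with made-up data (the dictionary contents and key names here are assumptions, not from the question):

```python
import json

# A hypothetical nodeList entry -- the names are assumptions.
nodeList = {'node-a': 0.75}
randKey = 'node-a'

# Serialize key and value together instead of joining them with '-'.
payload = json.dumps({'key': randKey, 'value': nodeList[randKey]}).encode('utf-8')
# mySocket.sendto(payload, address)

# Client side: one json.loads() call replaces any manual string splitting.
received = json.loads(payload.decode('utf-8'))
```

The client never has to know the field order or worry about dashes appearing inside the data.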
You need to serialize the data. Pickle does this for you out of the box, and you can ask pickle for an ASCII representation of the data instead of binary (see the docs), or you could use json (it also serializes the data for you); both are in the standard library. But really there are countless libraries that handle ALL the work of getting data from one machine to another. I'd suggest using a library.
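For instance, the pickle round trip mentioned above might look like this (the payload contents are made up):

```python
import pickle

record = {'node': 'sensor-1', 'reading': 21.5}  # hypothetical payload

# protocol=0 produces an ASCII representation, handy when binary is awkward.
wire = pickle.dumps(record, protocol=0)

# The receiving side restores the original Python object.
restored = pickle.loads(wire)
```

Keep in mind pickle is only safe when both ends trust each other; never unpickle data from an untrusted peer.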
Depending on speed, etc., there are different trade-offs among the various libraries. In the standard library you get HTTP (well, and raw sockets); that's about it. But there are others.
If raw speed is more important than anything else, ZeroMQ or Google's protocol buffers might be valid options.
For me, I usually use rpyc; it lets me be totally lazy and just call over to the other process across the network. It's usually fast enough.
Note that UDP gives no guarantee that the data will ever show up on the other side, or that it will show up in order. For your application you may not care, I don't know, but I thought I'd bring it up.
I have a text file that contains something like this:
host host_name {
# comment (optional)
hardware ethernet 01:22:85:EA:A8:5D;
fixed-address 192.168.107.210;
}
host another_host_name {
# comment (optional)
hardware ethernet 01:22:85:EA:A8:5D;
fixed-address 192.168.107.210;
}
I want my program to detect the line with 'host' and then modify the content of the block according to what I type.
When I do the following (for example with request.form.get('name') in Flask):
#random inputs
host = name2
comment = nothing
hardware = 00:00:00:00:00:00
address = 192.168.101.123
I would like to have :
host host_name {
# comment (optional)
hardware ethernet 01:22:85:EA:A8:5D;
fixed-address 192.168.107.210;
}
#after the change
host name2 {
# nothing
hardware ethernet 00:00:00:00:00:00;
fixed-address 192.168.101.123;
}
I don't have a problem with the regex itself, but rather with how to structure the program to achieve this. How can I do it?
If you start to code the way you are thinking about your problem, you will likely end up with a complete and utter mess no one can change or maintain, even if it works at first.
You have several different small tasks, and you are thinking of them "as one thing".
No. You are using Flask to provide an easy and lightweight web interface. That is fine. You already know how to get a text block from it, so you don't need to ask anything about Flask now. Nor should you put any further code in the same place as the code that gets data from the web.
Instead, just write some plain Python functions that take your textual data as parameters and then update the configuration files for you.
And while you are at it: if you can pick a special template and create a new config file each time, instead of trying to parse an existing file and update your desired values in place, then this is something you can achieve.
Parsing a "real world" config file in place and live-updating it is not an easy task. It can be so complicated that most Linux distributions skipped trying it for more than 10 years.
And then there is a further complication you don't mention: you probably want to keep any configuration you are not changing in the file. I was going to advise you to keep a template of the file and fill in your data, creating a new file on each run, but that would require you to keep all the other config data in some other format, which would basically duplicate your problem.
So, OK, your idea of getting data from the original file with regular expressions might be a go. But still, keep it separate from writing the file back. And don't think in "lines" if the file is structured in blocks.
One feasible approach would be to read the file and get the data you are interested in into a Python data structure (for example, a list of dictionaries, each one having your host_name, comment, ethernet and ip fields). Then, in a second pass with the same regex, change all those values to placeholders, so that the file contents can be filled back in by a call to the .format method, or with Flask's Jinja2 templating.
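A minimal sketch of that first step, assuming the blocks look exactly like the example in the question (a real config would need a more forgiving pattern):

```python
import re

# Sample config in the shape shown in the question.
CONFIG = """\
host host_name {
    # comment (optional)
    hardware ethernet 01:22:85:EA:A8:5D;
    fixed-address 192.168.107.210;
}
"""

# One named group per field we want to extract from each host block.
BLOCK_RE = re.compile(
    r'host\s+(?P<host_name>\S+)\s*\{\s*'
    r'#\s*(?P<comment>[^\n]*)\s*'
    r'hardware\s+ethernet\s+(?P<ethernet>[0-9A-Fa-f:]+);\s*'
    r'fixed-address\s+(?P<ip>[0-9.]+);\s*\}'
)

def parse_hosts(text):
    """Return one dictionary per host block found in the text."""
    return [m.groupdict() for m in BLOCK_RE.finditer(text)]

hosts = parse_hosts(CONFIG)
```

A separate function can then render each dictionary back into a block (via str.format or a Jinja2 template) and write the whole file out in one go.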
Separating the above into two functions will even allow you to present all the configured hosts on your web interface, so the user can edit them individually without having to type ethernet addresses by hand.
Sorry, but I won't be writing all this code for you. I hope the above helps you think about a proper approach. If you later come up with other questions, with some code from your attempts, we can help you further.
I am working on a school project in which I must measure and log Wi-Fi data (I know how to log the data, I just don't know the most efficient way to get it). I have tried using
subprocess.check_output('iwconfig', stderr=subprocess.STDOUT)
but that outputs bytes, which I really don't want to deal with (and I don't know how to either, so if that is the only option, can someone explain how to handle bytes?). Is there any other way, maybe to get it as plain text? And please do not just give me the code I need; tell me how to do it.
Thank you in advance!
You are almost there. I assume that you are using Python 3.x. iwconfig is sending you text encoded in whatever character set your terminal uses. That encoding is available as sys.stdin.encoding. So just put it together to get a string. BTW, you want a command list instead of a string.
import subprocess
import sys

raw = subprocess.check_output(['iwconfig'], stderr=subprocess.STDOUT)
data = raw.decode(sys.stdin.encoding)
I am fairly new to R, but the more I use it, the more I see how powerful it really is over SAS or SPSS. Just one of the major benefits, as I see them, is the ability to get and analyze data from the web. I imagine this is possible (and maybe even straightforward), but I am looking to parse JSON data that is publicly available on the web. I am not a programmer by any stretch, so any help and instruction you can provide will be greatly appreciated. Even if you point me to a basic working example, I can probably work through it.
RJSONIO from Omegahat is another package which provides facilities for reading and writing data in JSON format.
rjson does not use S4/S3 methods and so is not readily extensible, but it is still useful. Unfortunately, it does not use vectorised operations and so is too slow for non-trivial data. Similarly, for reading JSON data into R, it is somewhat slow and so does not scale to large data, should this be an issue.
Update (new package, 2013-12-03):
jsonlite: This package is a fork of the RJSONIO package. It builds on the parser from RJSONIO but implements a different mapping between R objects and JSON strings. The C code in this package is mostly from the RJSONIO Package, the R code has been rewritten from scratch. In addition to drop-in replacements for fromJSON and toJSON, the package has functions to serialize objects. Furthermore, the package contains a lot of unit tests to make sure that all edge cases are encoded and decoded consistently for use with dynamic data in systems and applications.
The jsonlite package is easy to use and tries to convert json into data frames.
Example:
library(jsonlite)
# URL returning badge data from the Stack Exchange API
url <- 'https://api.stackexchange.com/2.2/badges?order=desc&sort=rank&site=stackoverflow'
# read url and convert to data.frame
document <- fromJSON(txt=url)
Here is the missing example
library(rjson)
url <- 'http://someurl/data.json'
document <- fromJSON(file=url, method='C')
The fromJSON() functions in RJSONIO, rjson and jsonlite don't return a simple 2D data.frame for complex nested JSON objects.
To overcome this you can use tidyjson. It takes in JSON and always returns a data.frame. It is currently not available on CRAN; you can get it here: https://github.com/sailthru/tidyjson
Update: tidyjson is now available in cran, you can install it directly using install.packages("tidyjson")
For the record, rjson and RJSONIO do change the file type, but they don't really parse per se. For instance, I receive ugly MongoDB data in JSON format, convert it with rjson or RJSONIO, then use unlist and tons of manual correction to actually parse it into a usable matrix.
Try the code below, using RJSONIO, in the console:
library(RJSONIO)
library(RCurl)
json_file = getURL("https://raw.githubusercontent.com/isrini/SI_IS607/master/books.json")
json_file2 = RJSONIO::fromJSON(json_file)
head(json_file2)
I have to migrate data to OpenERP through XMLRPC by using TerminatOOOR.
I send a name with value "Rotule right Aurélia".
In Python the name will be encoded as: 'Rotule right Aur\xc3\xa9lia'
But in TerminatOOOR (xmlrpc client) the data is encoded with value 'Rotule middle Aur\357\277\275lia'
So in the server side, the data value is not decoded correctly and I get bad data.
TerminatOOOR is a Ruby plugin for Kettle (a Java product), and I guess it should encode data as UTF-8.
I just don't know why it happens like this.
Any help?
This issue comes from Kettle.
My program is using Kettle to get an Excel file, get the active sheet and transfer the data in that sheet to TerminateOOOR for further handling.
At the phase of reading data from the Excel file, Kettle cannot recognize the encoding and then passes bad data to TerminateOOOR.
My workaround is to manually export the Excel file to CSV before giving the data to TerminateOOOR. By doing this, I don't use Kettle's feature of mapping an Excel column name to a variable name.
First off, whenever you deal with text (and all text is bound to contain some non-US-ASCII character sooner or later), you'll be much happier doing that in Python 3.x than in the 2.x series. If Py3 is not an option, try to always use from __future__ import unicode_literals (available in Python 2.6 and 2.7).
Basically, when you send text or any other data over the wire, it will only travel in the form of bytes (octets of bits), so it has to be encoded at some point. Try to find out exactly where that encoding takes place in your tool chain; if necessary, use a debugging tool (or deploy print(repr(x)) statements) to look into the relevant variables. The other software you mention is presumably written in PHP, a language which is known to have issues with Unicode. You say that it should encode the data as UTF-8, but on the other hand, when the receiving end sees the data of an incoming RPC request, that data should already be in UTF-8; it would have to be decoded to obtain Unicode again.
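As a small illustration of where such mangling can come from: decoding UTF-8 bytes with the wrong codec reproduces exactly this kind of garbage (latin-1 here is just one plausible wrong codec):

```python
# -*- coding: utf-8 -*-
name = u'Rotule right Aurélia'

wire = name.encode('utf-8')        # what should travel over the wire
roundtrip = wire.decode('utf-8')   # correct: restores the original text

# Decoding the same bytes with the wrong codec produces mojibake:
mangled = wire.decode('latin-1')   # u'Rotule right Aur\xc3\xa9lia'
```

If some component in the chain guesses the codec instead of being told it, you get exactly this class of corruption.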
I need to transfer an array of varying length in which each element is a tuple of two integers. As an example:
path = [(1,1),(1,2)]
path = [(1,1),(1,2),(2,2)]
I am trying to use pack and unpack, however, since the array is of varying length I don't know how to create a format such that both know the format. I was trying to turn it into a single string with delimiters, such as:
msg = "1&1~1&2~"
sendMsg = pack("s",msg) or sendMsg = pack("s",str(msg))
on the receiving side:
path = unpack("s",msg)
but that just prints 1 in this case. I was also trying to send 4 integers, which send and receive fine, as long as I don't include the extra string representing the path.
sendMsg = pack("hhhh",p.direction[0],p.direction[1],p.id,p.health)
on the receive side:
x,y,id,health = unpack("hhhh",msg)
The first was for illustration, as I was trying to send the format "hhhhs", but either way the path doesn't come through properly.
I will also be looking at sending a 2D array of ints, but I can't seem to figure out how to send these more 'complex' structures across the network.
Thank you in advance for your help.
While you can use pack and unpack, I'd recommend using something like YAML or JSON to transfer your data.
Pack and unpack can lead to difficult-to-debug errors and incompatibilities if you change your interface and have different versions trying to communicate with each other.
Pickle can give security problems and the pickle format might change between Python versions.
JSON is included in the standard Python distribution since 2.6. For YAML there is PyYAML.
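A sketch with the json module for the path data from the question (note that JSON has no tuple type, so tuples come back as lists):

```python
import json

path = [(1, 1), (1, 2), (2, 2)]

wire = json.dumps(path).encode('utf-8')  # bytes, ready for sendto()

# JSON turns tuples into lists, so convert back on the receiving side:
received = [tuple(p) for p in json.loads(wire.decode('utf-8'))]
```

This handles any path length with no delimiter bookkeeping at all.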
You want some sort of serialization protocol. twisted.spread provides one such (see the Banana specification or Perspective Broker documentation). JSON or protocol buffers would be more verbose examples.
See also Comparison of data serialization formats.
If you include the message length as part of the message, then you will know how much data to read, so the entire string can be read from the network.
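One way to sketch that length-prefix idea with struct (the '!' network byte order and the 'h' short format are choices here, not requirements):

```python
import struct

path = [(1, 1), (1, 2), (2, 2)]

# Sender: prefix the payload with the number of (x, y) pairs that follow.
flat = [v for pair in path for v in pair]
msg = struct.pack('!H%dh' % len(flat), len(path), *flat)

# Receiver: read the count first, then unpack exactly that many pairs.
(count,) = struct.unpack_from('!H', msg, 0)
values = struct.unpack_from('!%dh' % (count * 2), msg, 2)
received = list(zip(values[::2], values[1::2]))
```

Because the count travels with the message, both sides agree on the format even though the path length varies.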
In any case, perhaps it would help if you posted some of the code you are using to send data across the network, or at least provided more of a description.
Take a look at xdrlib, it might help. It's part of the standard library, and:
The xdrlib module supports the External Data Representation Standard as described in RFC 1014, written by Sun Microsystems, Inc. June 1987. It supports most of the data types described in the RFC.
Are pack and unpack mandatory?
If not, you could use JSON or YAML.
Don't use pickle, because it is not secure.