How to call a python function with parameters from a batch file - python

I want to call a python function with one or more arguments using a batch file. Is this possible?
Not sure where to start with this one!
Python function code:
import serial

def my_function(port):
    # Open the given COM port at 115200 baud, then close it again
    ser = serial.Serial('COM' + port, 115200, timeout=0,
                        parity=serial.PARITY_NONE,
                        stopbits=serial.STOPBITS_ONE, rtscts=0)
    ser.close()

You have to do something like this:
import sys

if __name__ == '__main__':
    my_function(sys.argv)
EDIT
Note that the first element of sys.argv is your script's name, so if your function is declared as my_function(a, b) you have to call it like my_function(*sys.argv[1:]). You would also want to do some validation on the arguments.
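For completeness, here is a minimal sketch of how the pieces fit together. Assume the script is saved as serial_test.py and you want COM3 (the file name and port number are just examples):
serial_test.py:
import sys
import serial

def my_function(port):
    # Open the given COM port at 115200 baud, then close it again
    ser = serial.Serial('COM' + port, 115200, timeout=0,
                        parity=serial.PARITY_NONE,
                        stopbits=serial.STOPBITS_ONE, rtscts=0)
    ser.close()

if __name__ == '__main__':
    if len(sys.argv) < 2:
        sys.exit('usage: serial_test.py <port number>')
    my_function(sys.argv[1])
run.bat:
@echo off
python serial_test.py 3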

If you want richer and easier-to-use command-line handling, use click! https://click.palletsprojects.com/en/7.x/
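For instance, a minimal sketch with click, assuming the same hypothetical serial_test.py (click generates the argument parsing and usage message for you):
import click
import serial

@click.command()
@click.argument('port')
def main(port):
    # Open the given COM port at 115200 baud, then close it again
    ser = serial.Serial('COM' + port, 115200, timeout=0)
    ser.close()

if __name__ == '__main__':
    main()
The batch file then stays the same: python serial_test.py 3.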

Related

Unable to use scapy as a bridge among interfaces

I'm trying to perform a transparent MITM attack with Scapy. I've got an Ubuntu machine with two network interfaces, each connected to one machine. Those machines have addresses in the same subnet and communicate correctly when directly connected. The objective is to be totally transparent, using both interfaces with no IP address and in promiscuous mode.
The implementation I'm using is the following:
def pkt_callback(pkt):
    # Forward each sniffed frame out of the other interface
    if pkt.sniffed_on == "enp0s3":
        sendp(pkt, iface="enp0s8", verbose=0)
    else:
        sendp(pkt, iface="enp0s3", verbose=0)

def enable_bridge():
    sniff(iface=["enp0s3", "enp0s8"], prn=pkt_callback, store=0)

if __name__ == "__main__":
    conf.sniff_promisc = True
    enable_bridge()
This is not all the code, but it is the main routing part... I can see that packets arrive at both interfaces, but the machines cannot ping each other... Any idea how to make this work?
Thanks in advance.
EDIT 1:
The full implementation is here:
from scapy.all import *
from utils import interfaces, addresses
#from routing import *
from packet_filters import is_mms_packet
from attacks import performAttack
import sys
import os
import time
import datetime

def writePacketInDisk(pkt):
    wrpcap("network_logs/network-log-"
           + datetime.date.today().strftime("%Y") + "-"
           + datetime.date.today().strftime("%B") + "-"
           + datetime.date.today().strftime("%d") + ".pcap", pkt, append=True)

def pkt_callback_PLC_OPC(pkt):
    ret = True
    # if is_mms_packet(pkt):
    #     writePacketInDisk(pkt)
    #     ret = performAttack(pkt)
    return ret

def pkt_callback_OPC_PLC(pkt):
    ret = True
    # if is_mms_packet(pkt):
    #     writePacketInDisk(pkt)
    #     ret = performAttack(pkt)
    return ret

def enable_bridge():
    print "hello!!"
    bridge_and_sniff(interfaces["plc-ccb"], interfaces["opc"],
                     xfrm12=pkt_callback_PLC_OPC, xfrm21=pkt_callback_OPC_PLC,
                     count=0, store=0)
    #prn=lambda x: x.summary()
    print "bye!!"

if __name__ == "__main__":
    conf.sniff_promisc = True
    enable_bridge()
This is definitely not working... Is the code correct? Maybe my VM is too slow for this task?
This code is correct and should work. You should update to the current development version of Scapy (https://github.com/secdev/scapy/) and see whether the problem was related to an old bug that has since been fixed.
As a side note, you can directly use bridge_and_sniff("enp0s3", "enp0s8") instead of writing your own function.
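For example, a minimal sketch using the built-in helper (interface names taken from the question):
from scapy.all import bridge_and_sniff, conf

conf.sniff_promisc = True
# forward every frame between the two interfaces, printing a one-line summary of each
bridge_and_sniff("enp0s3", "enp0s8", prn=lambda pkt: pkt.summary(), store=0)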

Passing data between separately running Python scripts

If I have a python script running (with full Tkinter GUI and everything) and I want to pass the live data it is gathering (stored internally in arrays and such) to another python script, what would be the best way of doing that?
I cannot simply import script A into script B as it will create a new instance of script A, rather than accessing any variables in the already running script A.
The only way I can think of doing it is by having script A write to a file and then having script B read the data from the file. This is less than ideal, however, as script B might read the file while script A is still writing to it, getting partial or corrupted data. I am also looking for a much faster communication channel between the two programs.
EDIT:
Here are the examples as requested. I am aware why this doesn't work, but it is the basic premise of what needs to be achieved. My source code is very long and unfortunately confidential, so it will not help here. In summary, script A is running Tkinter and gathering data, while script B is views.py as part of Django, but I'm hoping this can be achieved in plain Python.
Script A
import time

i = 0

def return_data():
    return i

if __name__ == "__main__":
    while True:
        i = i + 1
        print i
        time.sleep(.01)
Script B
import time
from scriptA import return_data

if __name__ == '__main__':
    while True:
        print return_data() # from script A
        time.sleep(1)
You can use the multiprocessing module to implement a Pipe between the two modules. Then you can start one of the modules as a Process and use the Pipe to communicate with it. The best part about using pipes is that you can also pass Python objects like dicts and lists through them.
Ex:
mp2.py:
from multiprocessing import Process, Pipe
from mp1 import f

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print(parent_conn.recv()) # prints "Hello"
mp1.py:
from multiprocessing import Process, Pipe

def f(child_conn):
    msg = "Hello"
    child_conn.send(msg)
    child_conn.close()
If you want to read and modify shared data between two scripts that run separately, a good solution is to take advantage of the Python multiprocessing module and use a Pipe() or a Queue() (see the differences here). This way you get to sync the scripts and avoid problems regarding concurrency and global variables (like what happens if both scripts want to modify a variable at the same time).
As Akshay Apte said in his answer, the best part about using pipes/queues is that you can pass Python objects through them.
Also, there are methods to avoid waiting for data if none has been passed yet (queue.empty() and pipeConn.poll()).
See an example using Queue() below:
# main.py
from multiprocessing import Process, Queue
from stage1 import Stage1
from stage2 import Stage2

s1 = Stage1()
s2 = Stage2()

# S1 to S2 communication
queueS1 = Queue()  # s1.stage1() writes to queueS1
# S2 to S1 communication
queueS2 = Queue()  # s2.stage2() writes to queueS2

# start s2 as another process
s2 = Process(target=s2.stage2, args=(queueS1, queueS2))
s2.daemon = True
s2.start()  # Launch the stage2 process

s1.stage1(queueS1, queueS2) # start sending stuff from s1 to s2
s2.join() # wait till the s2 daemon finishes
# stage1.py
import time
import random

class Stage1:
    def stage1(self, queueS1, queueS2):
        print("stage1")
        lala = []
        lis = [1, 2, 3, 4, 5]
        for i in range(len(lis)):
            # to avoid unnecessary waiting
            if not queueS2.empty():
                msg = queueS2.get()  # get msg from s2
                print("! ! ! stage1 RECEIVED from s2:", msg)
                lala = [6, 7, 8] # now that a msg was received, further msgs will be different
            time.sleep(1) # work
            random.shuffle(lis)
            queueS1.put(lis + lala)
        queueS1.put('s1 is DONE')
# stage2.py
import time

class Stage2:
    def stage2(self, queueS1, queueS2):
        print("stage2")
        while True:
            msg = queueS1.get()  # wait till there is a msg from s1
            print("- - - stage2 RECEIVED from s1:", msg)
            if msg == 's1 is DONE':
                break # ends loop
            time.sleep(1) # work
            queueS2.put("update lists")
EDIT: I just found that you can use queue.get(False) to avoid blocking when receiving data. This way there's no need to check first whether the queue is empty; it raises queue.Empty if there is no data yet. This is not possible if you use pipes.
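A small sketch of that non-blocking variant, as it could be used inside stage1 (the try/except is needed because get(False) raises queue.Empty when no message is there):
from queue import Empty  # multiprocessing.Queue raises the standard queue.Empty

try:
    msg = queueS2.get(False)  # non-blocking read
    print("! ! ! stage1 RECEIVED from s2:", msg)
except Empty:
    pass  # no message yet, keep working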
You could use the pickle module to pass data between two Python programs.
import pickle

def storeData():
    # initializing data to be stored in db
    employee1 = {'key': 'Engineer', 'name': 'Harrison',
                 'age': 21, 'pay': 40000}
    employee2 = {'key': 'LeadDeveloper', 'name': 'Jack',
                 'age': 50, 'pay': 50000}
    # database
    db = {}
    db['employee1'] = employee1
    db['employee2'] = employee2
    # It's important to use binary mode
    dbfile = open('examplePickle', 'ab')
    # source, destination
    pickle.dump(db, dbfile)
    dbfile.close()

def loadData():
    # for reading, binary mode is also important
    dbfile = open('examplePickle', 'rb')
    db = pickle.load(dbfile)
    for key in db:
        print(key, '=>', db[key])
    dbfile.close()
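A quick way to try it out, assuming the writer runs before the reader:
if __name__ == '__main__':
    storeData()  # serialize the dict into the examplePickle file
    loadData()   # read it back and print each entry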
This will pass data to and from two running scripts using a TCP socket. It requires the zmq module (pip install zmq). https://zeromq.org/languages/python/
This is called client/server communication. The server waits for the client to send a request, and the client will not run if the server is not running. This approach also lets you send a request from one device (the client) to another device (the server), as long as both are on the same network: change localhost in the client to the actual IP of the server (on the server side, localhost is written as *). You can find the server's IP in its network settings; note that this may differ from the address Google reports, for example if you use IPv6 or sit behind DDoS protection. A question to the OP: does script B have to be always running, or can it be imported as a module into script A? If so, look up how to make Python modules.
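Since the answer gives no code, here is a minimal request/reply sketch with pyzmq; the port number 5555 and the message text are arbitrary choices:
server.py:
import zmq

context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://*:5555")        # the server listens on all interfaces
while True:
    msg = socket.recv_string()     # block until the client sends a request
    socket.send_string("ack: " + msg)
client.py:
import zmq

context = zmq.Context()
socket = context.socket(zmq.REQ)
socket.connect("tcp://localhost:5555")  # replace localhost with the server's IP
socket.send_string("hello")
print(socket.recv_string())             # prints "ack: hello"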
I solved the same problem using the lib Shared Memory Dict, a very simple dict implementation built on multiprocessing.shared_memory.
Source1.py
from shared_memory_dict import SharedMemoryDict
from time import sleep

smd_config = SharedMemoryDict(name='config', size=1024)

if __name__ == "__main__":
    smd_config["status"] = True
    while True:
        smd_config["status"] = not smd_config["status"]
        sleep(1)
Source2.py
from shared_memory_dict import SharedMemoryDict
from time import sleep

smd_config = SharedMemoryDict(name='config', size=1024)

if __name__ == "__main__":
    while True:
        print(smd_config["status"])
        sleep(1)

Sending data over serial in python from different functions

I normally program in C++ and C# and am trying to get accustomed to how Python works, so forgive me if this is a fairly basic question.
I'd like to open a serial connection in Python over RS-232 and send data from various functions. Here's the function I have to open a serial port and send data:
import serial

def sendData(data):
    ser = serial.Serial('/dev/ttyUSB0', 115200)
    data += "\r\n"
    ser.write(data.encode())
Pretty simple, but I'd like to move the ser = serial.Serial('/dev/ttyUSB0', 115200) line outside the function so that I can just call this function on its own.
Now in other languages I would just make the ser variable public so that other functions could access it, but I'm not sure I'm understanding how variables work in python properly yet. Ideally I'd like something like this:
def main():
    ser = serial.Serial('/dev/ttyUSB0', 115200)
    while 1:
        #misc code here
        sendData(data)
I know I could pass the ser variable through the function call and make it sendData(ser, data), but that seems unnecessary. What's the best way to do this?
Thanks!
You can use the global keyword in your main function to assign the public variable:
import serial

ser = None

def sendData(data):
    data += "\r\n"
    ser.write(data.encode())

def main():
    global ser
    ser = serial.Serial('/dev/ttyUSB0', 115200)
    while 1:
        #misc code here
        sendData(data)
Or even better, using a class:
import serial

class SerialWrapper:
    def __init__(self, device):
        self.ser = serial.Serial(device, 115200)

    def sendData(self, data):
        data += "\r\n"
        self.ser.write(data.encode())

def main():
    ser = SerialWrapper('/dev/ttyUSB0')
    while 1:
        #misc code here
        ser.sendData(data)
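As a design note, pyserial's Serial objects can also be used as context managers, which closes the port automatically; a minimal sketch (device path and payload are just examples):
import serial

def main():
    # the port is closed when the with-block exits, even on errors
    with serial.Serial('/dev/ttyUSB0', 115200) as ser:
        ser.write("some data\r\n".encode())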

How to use pyserial expect on a string from serial console, then send a character

I'm new to Python, hoping someone knows something about the issue I have. I want to control a device using a serial console. A command will be sent to reboot the device; while the device is rebooting, a string is printed. I want to catch the string and then send the character "h", which will abort the reboot. The code looks like this:
#! /bin/env python
import serial
import sys
import pexpect
from time import sleep
from serial import SerialException

ser = serial.Serial()
ser.baudrate = 9600
ser.port = "/dev/ttyUSB0"
ser.stopbits = serial.STOPBITS_ONE
ser.xonxoff = 0

try:
    ser.open()
except SerialException:
    sys.exit("Error opening port")

print "Serial port opened"
ser.write('rebootnow\r')
temp = ser.expect('press h now to abort reboot..')
if temp == 0:
    print ('Gotcha, pressing h')
    ser.sendline('h')
    print ('Reboot aborted')
else:
    print ('Ouch, got nothing')
sleep(1)
ser.close()
exit()
When I run the code, I get the error
AttributeError: 'Serial' object has no attribute 'expect'
at line
temp = ser.expect('press h now to abort reboot..')
Any ideas?
This thread is quite old, but I hope my answer can still be useful to you.
The expect methods are from pexpect, not from pyserial. You should do something like:
temp = pexpect.fdpexpect.fdspawn(ser)
temp.expect("message")
Basically, from that point onwards you should call methods on the temp object, not on the serial object. This includes the sendline() call later on.
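Putting it together, a minimal sketch of the corrected flow (port, baud rate, and prompt string taken from the question; the 30-second timeout is an arbitrary choice):
import serial
import pexpect
from pexpect import fdpexpect

ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=0)
child = fdpexpect.fdspawn(ser)  # wrap the open serial port in a pexpect object

ser.write(b"rebootnow\r")
# wait for the abort prompt, or give up after 30 seconds
index = child.expect(["press h now to abort reboot..", pexpect.TIMEOUT], timeout=30)
if index == 0:
    print("Gotcha, pressing h")
    child.sendline("h")
    print("Reboot aborted")
else:
    print("Ouch, got nothing")
ser.close()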

calling a script from daemon

I am trying to call a script from python-daemon but it's not working. This is what I am trying to do; is it correct?
I also want to pass a random argument to that script; currently I have hard-coded it.
import daemon
import time
import subprocess
import os

def interval_monitoring():
    print "Inside interval monitoring"
    while True:
        print "its working"
        # os.system("XYZ.py 5416ce0eac3d94693cf7dbd8")  # Tried this too but not working
        subprocess.Popen("XYZ.py 5416ce0eac3d94693cf7dbd8", shell=False)
        time.sleep(60)
        print "condition true"

def run():
    print daemon.__file__
    with daemon.DaemonContext():
        interval_monitoring()

if __name__ == "__main__":
    run()
If you didn't make XYZ.py executable and add #!/usr/bin/env python as its first line, you need to invoke it via python rather than directly. So your line would be something like this:
subprocess.check_output(["python", "XYZ.py", "5416ce0eac3d94693cf7dbd8"])
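Since you mentioned wanting to pass a random argument instead of the hard-coded one, a minimal sketch (the numeric range is just an example; XYZ.py is the script from the question):
import random
import subprocess

arg = str(random.randint(0, 999999))  # hypothetical random argument
output = subprocess.check_output(["python", "XYZ.py", arg])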
