Move multiple mice with Python

Note: I am open to different solutions which achieve the desired capability.
I am working on a project with many instances of the same game.
Therefore, I am sending keyboard and mouse instructions to each of these processes in parallel.
I am currently using win32ui as follows:
After finding the processes' hwnd (window handle) values (following Get HWND of each Window?), I build an hwnds_list of all the processes with a given name, e.g. [788133, 723724, ...].
I am sending instructions to each of the processes, by creating a PyCWnd object:
PyCWnd = win32ui.CreateWindowFromHandle(hwnd)
Then, say I want to press the return key, I used:
def press_return(pycwnd):
    pycwnd.SendMessage(win32con.WM_KEYDOWN, win32con.VK_RETURN, 0)
    pycwnd.SendMessage(win32con.WM_KEYUP, win32con.VK_RETURN, 0)
Then I run this in parallel with:
def press_return_par(hwnds):
    # Get the Window from handle
    pycwnd = make_pycwnd(hwnds)
    time.sleep(0.1)
    press_return(pycwnd)

num_workers = len(hwnds_list)
with Pool(num_workers) as p:
    p.map(press_return_par, hwnds_list)
So I have a good way of sending keyboard commands, and can even scroll with the mouse, but I can't work out how to do this for mouse movements.
Ideally, I'd like to say "move to (x, y) coordinates over n time". This 'ideal' method must not affect the current cursor (or it should allow a locked cursor for each process/game), as I will want to do this across ~8 instances of the game.
I've looked through the official pywin32 docs (http://timgolden.me.uk/pywin32-docs/contents.html) and other answers that look bang on, e.g. https://stackoverflow.com/a/3721198/11181287, but they use win32api.mouse_event, so I don't know how to convert them to work with the multiple pycwnd objects.
https://stackoverflow.com/a/3721053/11181287 looks close, but it doesn't seem to move the mouse; it just does the right click, although I have made some guesses for the MAKELPARAM function, which is not listed.
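For reference, here is roughly what I have been guessing at - this is my own sketch, not a confirmed approach. MAKELPARAM just packs the x and y client coordinates into the low and high words of lParam, so in principle a move/click at (x, y) could be sent to a pycwnd the same way as the keyboard messages above (whether the game actually honours injected mouse messages is exactly what I can't confirm):

import win32con

def make_lparam(x, y):
    # Python equivalent of the C MAKELPARAM macro: low word = x, high word = y
    return ((y & 0xFFFF) << 16) | (x & 0xFFFF)

def click_at(pycwnd, x, y):
    # x, y are client-area pixels of that window (an assumption on my part)
    lparam = make_lparam(x, y)
    pycwnd.SendMessage(win32con.WM_MOUSEMOVE, 0, lparam)
    pycwnd.SendMessage(win32con.WM_LBUTTONDOWN, win32con.MK_LBUTTON, lparam)
    pycwnd.SendMessage(win32con.WM_LBUTTONUP, 0, lparam)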
In addition, https://github.com/oblitum/Interception could be helpful, but I haven't found good documentation on how to apply it here.
As the game is an FPS, running multiple instances through nucleus-coop, using a VM, etc. won't be fast enough (from my current research).
PyAutoGUI offers exactly the functionality I want, at the right speed, but (as expected) I haven't been able to set it up to work with multiple mice/processes.
There could be something in sending DirectX inputs into the game (Black Ops 2)?
(I'm running Windows 10 with Python 3.7.11, and I only know Python.)

I have two possible solutions to your mice issue.
What if you used only one mouse to control all of the windows? With pyautogui you could tab into each window when necessary and control the mouse for that window. I'm not sure how efficient this would be and how fast the mouse control for each window would be, but it's still sort of a solution.
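Something like this rough, untested sketch is what I have in mind, reusing the hwnds_list from your question (the 0.05 s pause is just a guess at how long the focus switch takes):

import time
import win32gui
import pyautogui

def move_mouse_in_each(hwnds, x, y, duration=0.2):
    # Visit each game window in turn and drive the single real cursor there
    for hwnd in hwnds:
        win32gui.SetForegroundWindow(hwnd)   # bring this instance to the front
        time.sleep(0.05)                     # give Windows a moment to switch focus
        pyautogui.moveTo(x, y, duration=duration)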
OR
You could control the mouse with the keyboard.
See this article https://www.windowscentral.com/how-control-mouse-using-keyboard-windows-10
I apologize for not just commenting; unfortunately, I don't have enough reputation.

Related

Is there a way to simulate multiple mouse pointers in Python?

Firstly, I am aware of this post and this post here on Stack Overflow. However, most of the information in these posts is either severely outdated or not applicable to my use case.
I would like to know if it is possible to simulate n mouse pointers with n different instances of Python. That is, I would like to be able to run as many mouse pointers on my screen as possible. All of these mouse pointers would be controlled by the same script, but would still be doing all their work independently of each other.
Is it possible to create such an application using Python?

using mouse with pyautogui on extended screens [python]

I am using python to move the cursor along multiple screens (using extended display).
The method pyautogui.moveTo(x, y) moves the cursor to the correct position on the main screen, but does not move it outside of it. In Windows "Personalize" I set the second screen to the right of the first one, but when I call pyautogui.moveTo(2000, 400) the mouse simply moves to location (1366, 400) [the edge of the main screen].
So, how can I use python to move the cursor from one screen to another? I'll be using 4 different monitors in my project
Sorry that this is late, but since it comes up in the top Google results, I figured I'd answer for others who are ending up here.
There is currently no solution to this problem (As of 8/13/2018). It is being looked into, but not very hard.
In the meantime, I've made things work on my Windows PC with the GhostMouse freeware, which has no problem with multiple monitors.

Take all input in Python (like UAC)

Is there any way I can create a UAC-like environment in Python? I want to basically lock the workstation without actually using the Windows lock screen. The user should not be able to do anything except, say, type a password to unlock the workstation.
You cannot do this without cooperation from the operating system. Whatever you do, Ctrl-Alt-Del will allow the user to circumvent your lock.
The API call you're looking for Win32-wise is a combination of CreateDesktop and SetThreadDesktop.
In terms of the internals of Vista+ desktops, MSDN covers this, as does this blog post. This'll give you the requisite background to know what you're doing.
In terms of making it look like the UAC dialog - well, consent.exe actually takes a screenshot of the desktop and copies it to the background of the new desktop; otherwise, the desktop will be empty.
As the other answerer has pointed out - Ctrl+Alt+Delete will still work. There's no way around that - at least, not without replacing the keyboard driver, anyway.
As to how to do this in Python - it looks like pywin32 implements SetThreadDesktop etc. I'm not sure how compatible it is with Win32; if you find it doesn't work as you need, then you might need a Python extension to do it. They're not nearly as hard to write as they sound.
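As a starting point, something along these lines is what I'd expect the pywin32 version to look like - treat the exact argument lists as assumptions taken from the win32service docs rather than tested code:

import win32con
import win32service

# Create a private desktop, move this thread onto it, and make it the visible one.
lock_desk = win32service.CreateDesktop("PyLockDesktop", 0, win32con.GENERIC_ALL, None)
lock_desk.SetThreadDesktop()   # windows created by this thread now go to the new desktop
lock_desk.SwitchDesktop()      # make it the desktop the user actually sees

# ... create and run the password-prompt window here ...

# Switch back to the normal desktop once the password checks out.
default_desk = win32service.OpenDesktop("Default", 0, False, win32con.GENERIC_ALL)
default_desk.SwitchDesktop()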
You might be able to get the effect you desire using a GUI toolkit that draws a window that covers the entire screen, then do a global grab of the keyboard events. I'm not sure if it will catch something like ctrl-alt-del on windows, however.
For example, with Tkinter you can create a main window, then call the overrideredirect method to turn off all window decorations (the standard window titlebar and window borders, assuming your window manager has such things). You can query the size of the monitor, then set this window to that size. I'm not sure if this will let you overlay the OSX menubar, though. Finally, you can do a grab which will force all input to a specific window.
How effective this is depends on just how "locked out" you want the user to be. On a *nix/X11 system you can pretty much completely lock them out (so make sure you can remotely log in while testing, or you may have to forcibly reboot if your code has a bug). On windows or OSX the effectiveness might be a little less.
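A minimal Tkinter sketch of that idea (the password and the 100 ms delay before the grab are placeholders, and Ctrl-Alt-Del will still get through):

import tkinter as tk

root = tk.Tk()
root.overrideredirect(True)   # no titlebar, no borders
root.geometry("%dx%d+0+0" % (root.winfo_screenwidth(), root.winfo_screenheight()))

entry = tk.Entry(root, show="*")
entry.pack(pady=200)
entry.focus_set()

def try_unlock(event):
    if entry.get() == "letmein":   # placeholder password check
        root.destroy()

entry.bind("<Return>", try_unlock)
root.after(100, root.grab_set_global)   # route all input to this window once it is mapped
root.mainloop()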
I would try pygame, because it can lock the mouse to itself and thus keep all input to itself, but I wouldn't call this secure without much testing; Ctrl-Alt-Del probably escapes it, and I can't try it on Windows right now.
(Not very different from Bryan Oakley's answer, except with pygame.)
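Roughly like this untested sketch - Escape stands in for a real password check:

import pygame

pygame.init()
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
pygame.event.set_grab(True)      # confine the mouse (and most input) to this window
pygame.mouse.set_visible(False)

running = True
while running:
    for event in pygame.event.get():
        # Stand-in for a real unlock condition
        if event.type == pygame.KEYDOWN and event.key == pygame.K_ESCAPE:
            running = False
pygame.quit()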

Tell windows which monitor to display dialogs on

I've got a program which is using multiple monitors. The program is showing special visualizations on the second monitor. At one point, the program uses windows shell functions to send files to the recycle bin. However, when it does this, the delete confirmation dialog comes on top of my visualization. This is particularly problematic, as when the mouse is on the second monitor, my program uses mouse hooks to capture all mouse input, so the user cannot even click the confirmation dialog.
Is it possible to somehow tell Windows to only place dialog boxes on a particular display?
I'm using Python, though if I have to call C WinAPI functions, that shouldn't be a problem.
Which function are you using to send the files to the recycle bin? If you use SHFileOperation you can pass a parent HWND; perhaps make that an invisible WS_EX_TOOLWINDOW window on the other monitor.
I would expect the API, treating that window as a parent, to center the dialog relative to that window, but I haven't tried it.
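Something like this untested sketch is what I have in mind - the second-monitor coordinates, the file path, and the exact SHFileOperation tuple layout are all assumptions on my part:

import win32api
import win32con
import win32gui
from win32com.shell import shell, shellcon

# Register a throwaway class and create an invisible 1x1 tool window on the
# second monitor (the x offset of 1920 assumes a 1920-wide primary monitor).
wc = win32gui.WNDCLASS()
wc.hInstance = win32api.GetModuleHandle(None)
wc.lpszClassName = "RecycleBinParent"
wc.lpfnWndProc = win32gui.DefWindowProc
win32gui.RegisterClass(wc)

parent = win32gui.CreateWindowEx(
    win32con.WS_EX_TOOLWINDOW, wc.lpszClassName, "", win32con.WS_POPUP,
    1920 + 200, 200, 1, 1, 0, 0, wc.hInstance, None)

# Recycle a file with that window as the owner of the confirmation dialog.
shell.SHFileOperation((parent, shellcon.FO_DELETE,
                       r"C:\temp\example.txt",   # hypothetical file
                       None, shellcon.FOF_ALLOWUNDO, None, None))

win32gui.DestroyWindow(parent)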
Depending on which version of Windows you are targeting, there used to be a capability to create desk bands that 'dock' to the sides of the screen. This automatically gets factored into the area returned as rcWork by GetMonitorInfo and should prevent dialogs from overlapping that space. There might be another way to declare that a region is "in use" in a way that puts the space off-limits, but I don't know of it, so it probably doesn't exist...
The ugly and crude thing you could do is poll and move the dialog yourself, but if this is any kind of widely deployed or commercial app, that would likely cause more harm than good.

Can you auto hide frames/dialogs using wxPython?

I would like to create an application that has 3-4 frames (or windows) where each frame is attached/positioned to a side of the screen (like a task bar). When a frame is inactive I would like it to auto hide (just like the Windows task bar does; or the dock in OSX). When I move my mouse pointer to the position on the edge of the screen where the frame is hidden, I would like it to come back into focus.
The application is written in Python (using wxPython for the basic GUI aspects). Does anyone know how to do this in Python? I'm guessing it's probably OS dependent? If so, I'd like to focus on Windows first.
I don't do GUI programming very often so my apologies if this makes no sense at all.
As far as I know, there's nothing built in for this.
When the window is hidden, do you want it completely invisible or can a border of a few pixels be showing? That would be an easy way to get a mouse hover event. Otherwise you might have to use something like pyHook to get system-wide mouse events to know when to expand your window.
The events EVT_ENTER_WINDOW and EVT_LEAVE_WINDOW might also be useful here to know when the user has entered/left the window so you can expand/collapse it.
Expanding/collapsing can just be done by showing/hiding windows or resizing them. Standard window functions, nothing fancy.
By the way, you might want to use wx.ClientDisplayRect to figure out where to position your window. That will give you a rectangle of the desktop that does NOT include the task bar or any other toolbars the user has, assuming you want to avoid overlapping with those things.
Personally, I would combine the EVT_ENTER_WINDOW and EVT_LEAVE_WINDOW that FogleBird mentioned with a wx.Timer. Then whenever the frame or dialog is inactive for x seconds, you would just call its Hide() method.
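A rough sketch of that combination, assuming wxPython Phoenix (the 3-second timeout, the 200-pixel width, and the 4-pixel strip left visible for re-entry are all arbitrary choices):

import wx

HIDE_DELAY_MS = 3000   # how long the mouse must be away before collapsing

class EdgeFrame(wx.Frame):
    def __init__(self):
        super().__init__(None, style=wx.FRAME_NO_TASKBAR | wx.STAY_ON_TOP)
        area = wx.Display(0).GetClientArea()   # desktop minus taskbar, as FogleBird suggests
        self.full_rect = wx.Rect(area.x, area.y, 200, area.height)   # docked to the left edge
        self.thin_rect = wx.Rect(area.x, area.y, 4, area.height)     # strip shown while "hidden"
        self.SetSize(self.full_rect)

        self.timer = wx.Timer(self)
        self.Bind(wx.EVT_TIMER, self.on_timer)
        self.Bind(wx.EVT_ENTER_WINDOW, self.on_enter)
        self.Bind(wx.EVT_LEAVE_WINDOW, self.on_leave)

    def on_enter(self, event):
        self.timer.Stop()
        self.SetSize(self.full_rect)           # expand when the mouse arrives

    def on_leave(self, event):
        self.timer.StartOnce(HIDE_DELAY_MS)    # start the inactivity countdown

    def on_timer(self, event):
        self.SetSize(self.thin_rect)           # collapse after the timeout

app = wx.App()
EdgeFrame().Show()
app.MainLoop()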
I think you could easily just make a window the same size as the desktop, then run a while loop that updates an inactivity variable based on mouse position, and thread off a timer loop for the four inactivity variables. I'd personally design it so that when a variable counts down from 15 to 0, the corresponding window changes size and position to become a tab and shows a button to reactivate it. Lots of technical work on this one, but easily done once you figure it out.
