Python PyQt4 how to detect control press + mouse Scroll - python

I am developing an app and would like to add an option to zoom in and out. To do this, I want the program to detect when the Control key is held down while the mouse is being scrolled, and zoom accordingly (scroll up to zoom in, scroll down to zoom out). I've done a lot of searching and reading, but have only found ways to detect a Control press together with a mouse click event, not a mouse scroll event. I also found a way to detect a mouse scroll, but couldn't get it to work only while Control is pressed and ignore all other key presses.
Could anyone suggest something to help me?

It depends a bit on the structure of your application.
One way to get scroll events is to add a wheelEvent handler to your widget:
def wheelEvent(self, event):
    modifiers = QtGui.QApplication.keyboardModifiers()
    if modifiers == QtCore.Qt.ControlModifier:
        # do your processing
        pass
Another approach is to install an event filter on the component where you want to intercept the scroll events:
component.viewport().installEventFilter(self)
(you may have to install the event filter on the component itself instead of on its viewport),
where self has an eventFilter function like:
def eventFilter(self, qobject, event):
    if event.type() == QtCore.QEvent.Wheel:
        modifiers = QtGui.QApplication.keyboardModifiers()
        if modifiers == QtCore.Qt.ControlModifier:
            # do some scaling stuff
            return True
        return False
    else:
        # standard event processing
        return False
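For completeness, here is a minimal runnable sketch of the wheelEvent approach, assuming PyQt4; the ZoomLabel widget and the font-size scaling are only illustrative, your own zoom logic goes in their place:
import sys
from PyQt4 import QtCore, QtGui

class ZoomLabel(QtGui.QLabel):
    """Illustrative widget: Ctrl + wheel changes the font size."""
    def __init__(self):
        super(ZoomLabel, self).__init__('Ctrl + scroll to zoom')
        self.setAlignment(QtCore.Qt.AlignCenter)
        self._size = 12

    def wheelEvent(self, event):
        modifiers = QtGui.QApplication.keyboardModifiers()
        if modifiers == QtCore.Qt.ControlModifier:
            # scroll up (positive delta) zooms in, scroll down zooms out
            self._size = max(4, self._size + (1 if event.delta() > 0 else -1))
            font = self.font()
            font.setPointSize(self._size)
            self.setFont(font)
        else:
            # let other scroll events be handled normally
            super(ZoomLabel, self).wheelEvent(event)

if __name__ == '__main__':
    app = QtGui.QApplication(sys.argv)
    w = ZoomLabel()
    w.resize(300, 200)
    w.show()
    sys.exit(app.exec_())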
Hope this helps.

Related

PyQt5 - How to make a push button report true while it is pressed down and false when it is released

I'm trying to use PyQt5 for my robot GUI. I'm a little confused about how to use a momentary push button, which should give a true value while it is pressed down and a false value when it is released. FYI, I'm using Python for this.
I've already tried using "setCheckable" to detect its state, but that makes the button toggle (not momentary). Is there any other method I can implement?
The clicked signal is always emitted after the mouse button is released in the button rectangle. That's the common convention for a "click event" of buttons in almost any reasonable UI toolkit.
If you want to know when the button is pressed or released, then you can connect to the pressed and released signals.
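For the simple case, a minimal sketch of those two signals (PyQt5; the print handlers are just placeholders):
import sys
from PyQt5 import QtWidgets

app = QtWidgets.QApplication(sys.argv)
btn = QtWidgets.QPushButton('Hold me')
# pressed fires on mouse-button down, released fires on mouse-button up
btn.pressed.connect(lambda: print('button state: True'))
btn.released.connect(lambda: print('button state: False'))
btn.show()
sys.exit(app.exec_())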
Be aware, though, that the common convention above also considers that if the user leaves the button while the mouse button is still pressed, the button becomes "unpressed" as well, and becomes pressed again if the mouse goes back inside its geometry while the same mouse button is still pressed. This means that if you connect to those signals and the user "leaves" the button with the mouse button still pressed and gets back to it, you might get inconsistent results depending on your needs.
If you are only interested in the basic "mouse button was just pressed" and "mouse button was finally released" states, no matter whether the user released the mouse button outside of the widget, then the only option is to override the mouse button handlers and use a custom signal:
from PyQt5 import QtCore, QtWidgets

class IgnoreClickPosButton(QtWidgets.QPushButton):
    pressedState = QtCore.pyqtSignal(bool)

    def mousePressEvent(self, event):
        super().mousePressEvent(event)
        if event.button() == QtCore.Qt.LeftButton:
            self.pressedState.emit(True)

    def mouseReleaseEvent(self, event):
        super().mouseReleaseEvent(event)
        if event.button() == QtCore.Qt.LeftButton:
            self.pressedState.emit(False)

if __name__ == "__main__":
    import sys
    app = QtWidgets.QApplication(sys.argv)
    btn = IgnoreClickPosButton()
    btn.pressedState.connect(lambda p: print('pressed', p))
    btn.show()
    sys.exit(app.exec_())

Taking Single Click Event and Propagating it to Multiple Areas - python, pynput, pyautogui

Having trouble turning 1 mouse click into multiple mouse clicks. Basically what I want to do is to control multiple windows at once. I want to click on one master window and have the clicks propagate to the subsequent windows. In this snippet there are 4 windows and I track them via determining the offset between it and the master window.
I'm using python3 with pynput for the mouse listener and pyautogui for mouse control.
What I'm having a problem with is setting up the mouse listener so that it listens to my actual clicks but ignores the programmatic ones. Right now, I think it's getting stuck in an infinite loop where my initial click triggers the on_click event, which propagates the clicks, each of which triggers an additional on_click event, which propagates more clicks, and so on. When I run the code below it starts fine, and then on my first click it just heavily lags my mouse for a minute before returning to normal, with no mouse listener active anymore. My guess is that a failsafe kicks in to return it to normal.
Things I have tried:
using pynput for listener and control - this does not change the outcome
stopping the listener and creating a new one after propagated clicks have finished - bad hacky solution that still did not change the outcome
semaphore locking with _value peeking to ignore events if semaphore has already been acquired - also hacky and did not work
calling propagateActions via threading and waiting for completion before returning from on_click event - did not work
commenting out pyautogui.click() - this allows the expected behavior of moving the mouse to the subsequent locations and returning it to its initial position afterwards. Without the click, it works perfectly. With the click, it lags and the listener dies.
searching stackoverflow - this question bears a resemblance in terms of outcome, but is unanswered and is trying to achieve something different.
My snippet is below:
from pynput import mouse, keyboard
import pyautogui

pyautogui.PAUSE = 0.01
mouseListener = None
killSwitch = False

# this is just a keyboard listener for a kill switch
def on_release(key):
    if key == keyboard.Key.f1:
        global killSwitch
        print('### Kill switch activated ###')
        killSwitch = True

# on mouse release I want to propagate a click to 4 other areas
def on_click(x, y, button, pressed):
    print('{0} at {1}'.format('Pressed' if pressed else 'Released', (x, y)))
    if not pressed:
        propogateActions(x, y, button)

# propagates clicks
def propogateActions(x, y, button):
    print('propagating actions to {0} windows'.format(len(offsets)+1))
    for offset in offsets:
        pyautogui.moveTo(x+offset.x, y+offset.y)
        print('mouse moved')
        if button == mouse.Button.left:
            print('left clicking at ({0}, {1})'.format(x+offset.x, y+offset.y))
            pyautogui.click()
    pyautogui.moveTo(x, y)

# point class for ease of use
class Point():
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __repr__(self):
        return 'Point(x={0}, y={1})'.format(self.x, self.y)

# main method
def doTheThing():
    print('started')
    while not killSwitch:
        pass

# initializations and starting listeners
# offsets tracks how far the subsequent clicks are from the initial click point
offsets = [Point(50, 0), Point(50, 50), Point(0, 50)]
keyboardListener = keyboard.Listener(on_release=on_release)
mouseListener = mouse.Listener(on_click=on_click)
keyboardListener.start()
mouseListener.start()
doTheThing()
My Question:
Is there some way to listen only for "real" clicks and not programmatic clicks?
If not, can I pause the mouse listener and then restart it some way after the propagated clicks have occurred?
This is the small section of code that's relevant to the issue at hand. offsets has an initialization that sets it more appropriately, and there are other bells and whistles, but this is the section relevant to the problem. I appreciate your help.
Found the answer! Had to go a layer deeper.
Pynput has a mechanism for suppressing events that exposes the win32 data behind the click event. I ran a test of one of my clicks against a pyautogui.click() and, lo and behold, there is a difference: data.flags was set to 0 for a user click event and to 1 for a programmatic click.
That's good enough for me to filter on. This is the pertinent filter:
def win32_event_filter(msg, data):
    if data.flags:
        print('suppressing event')
        return False
I added that to my code above and changed
mouseListener = mouse.Listener(on_click=on_click)
to
mouseListener = mouse.Listener(on_click=on_click, win32_event_filter=win32_event_filter)
and it works!
My real clicks prevail, programmatic clicks are propagated, and I am not stuck in an infinite loop. Hope this helps if others run into this issue.
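For reference, here is a minimal self-contained sketch of the pieces above (a rough, Windows-only sketch; the handler bodies are placeholders, not the original code):
from pynput import mouse

def win32_event_filter(msg, data):
    # data.flags is non-zero for injected (programmatic) events,
    # so returning False keeps them from reaching on_click
    if data.flags:
        return False

def on_click(x, y, button, pressed):
    print('real click at', (x, y))

listener = mouse.Listener(on_click=on_click,
                          win32_event_filter=win32_event_filter)
listener.start()
listener.join()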

PyQt5: In a QGraphicsView, rebind ScrollHandDrag to work with middle mouse button, not just left mouse button

I'm working on a map editor, where I use a QGraphicsView to display a QGraphicsObject (the map). I have already implemented the image zoom functionality from this post, and I have tried using their implementation of pan, and have found that it works. However, I want to be able to do the pan without toggling, simply by holding down the middle mouse button and dragging from there.
I've written this code to switch between the two modes (normal and drag), and then call it on mouse(Press/Release)Events.
(in QGraphicsView)
def toggleDragMode(self):
    if self.dragMode() == QGraphicsView.ScrollHandDrag:
        self.setDragMode(QGraphicsView.NoDrag)
        self.pointerMode = PointerMode.Normal
    else:
        self.setDragMode(QGraphicsView.ScrollHandDrag)
        self.pointerMode = PointerMode.Drag

def mousePressEvent(self, event):
    if event.buttons() & Qt.MiddleButton:
        self.toggleDragMode()
    ...
    super().mousePressEvent(event)

def mouseReleaseEvent(self, event):
    if self.dragMode() == QGraphicsView.ScrollHandDrag:
        self.setDragMode(QGraphicsView.NoDrag)
        self.pointerMode = PointerMode.Normal
    super().mouseReleaseEvent(event)
(Sidenote: With the mouseReleaseEvent, I found that I couldn't use event.buttons() & Qt.MiddleButton to check if the middle button was released, so I always check and release in this case.)
However, whenever I try to middle-click and drag, no pan happens. I have confirmed that this works correctly when I use the left mouse button as the binding (i.e. if event.buttons() & Qt.LeftButton: toggleDrag()). There must be some code in QGraphicsView.mouseMoveEvent() that checks whether the left button is held AND the cursor is in drag mode. Is there a way to change the key binding at that deeper level?
EDIT: I just found this link and tried it by doing this (in QGraphicsView)
def __init__(self, *args, **kwargs):
    ...
    self.STOP_GLOBAL_SCROLLING = 1
But to no avail. Still, it's an interesting lead.
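The question is left open above, but for reference, a commonly suggested workaround (not from the original post) is to feed the view a synthetic left-button event while the middle button is held, so the built-in ScrollHandDrag logic engages; note the use of event.button() rather than event.buttons() in the release handler, since the released button is no longer part of buttons(). A rough, untested sketch:
from PyQt5.QtCore import QEvent, QPointF, Qt
from PyQt5.QtGui import QMouseEvent
from PyQt5.QtWidgets import QGraphicsView

class PanView(QGraphicsView):
    def mousePressEvent(self, event):
        if event.button() == Qt.MiddleButton:
            self.setDragMode(QGraphicsView.ScrollHandDrag)
            # re-send the press as a left-button press so ScrollHandDrag
            # (which only tracks the left button) starts panning
            fake = QMouseEvent(QEvent.MouseButtonPress, QPointF(event.pos()),
                               Qt.LeftButton, Qt.LeftButton, event.modifiers())
            super().mousePressEvent(fake)
        else:
            super().mousePressEvent(event)

    def mouseReleaseEvent(self, event):
        if event.button() == Qt.MiddleButton:
            fake = QMouseEvent(QEvent.MouseButtonRelease, QPointF(event.pos()),
                               Qt.LeftButton, Qt.LeftButton, event.modifiers())
            super().mouseReleaseEvent(fake)
            self.setDragMode(QGraphicsView.NoDrag)
        else:
            super().mouseReleaseEvent(event)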

PyQt Python - Creating right mouse click for QPushButton

I'm dynamically creating a list of QPushButtons in a vertical layout. I'm currently using the "clicked" signal for a function, but I would like to use the right mouse button for an additional function, e.g. printing the tooltip of the QPushButton the mouse is over by right-clicking.
but = QtGui.QPushButton()
but.setText(cur)
but.setCheckable(True)
but.setStyleSheet(_fromUtf8("text-align:right;background-color: rgb(50, 50, 50);"))
but.setToolTip(longName + "." + cur)
I've been looking at "QMouseEvent", "setMouseTracking" and "mousePressEvent", but I'm not sure how to use them properly to get my desired result.
I would also be open to a custom signal for a QPushButton on "right-click".
Usually, the right mouse click is connected to the context menu. With
but.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
but.customContextMenuRequested.connect(handle_right_click)
you can connect a handler, which could be used for the right mouse click.
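A minimal sketch of such a handler for the connection shown above (the handle_right_click name and the tooltip printing are only illustrative):
def handle_right_click(pos):
    # pos is the click position in the button's coordinate system
    print(but.toolTip())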
If you'd like to have a pure right mouse click action, you should extend QPushButton and override mousePressEvent(self, event) and mouseReleaseEvent(self, event). Then you need to set the button's contextMenuPolicy to QtCore.Qt.PreventContextMenu to ensure their delivery:
The widget does not feature a context menu, and in contrast to NoContextMenu, the handling is not deferred to the widget's parent. This means that all right mouse button events are guaranteed to be delivered to the widget itself through QWidget::mousePressEvent(), and QWidget::mouseReleaseEvent().
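A minimal sketch of that subclass approach, assuming PyQt4; the class name and the rightClicked signal are illustrative, not an established API:
from PyQt4 import QtCore, QtGui

class RightClickButton(QtGui.QPushButton):
    rightClicked = QtCore.pyqtSignal()

    def __init__(self, *args, **kwargs):
        super(RightClickButton, self).__init__(*args, **kwargs)
        # guarantee delivery of right-button events to the handlers below
        self.setContextMenuPolicy(QtCore.Qt.PreventContextMenu)

    def mousePressEvent(self, event):
        if event.button() == QtCore.Qt.RightButton:
            self.rightClicked.emit()
        else:
            super(RightClickButton, self).mousePressEvent(event)

    def mouseReleaseEvent(self, event):
        if event.button() != QtCore.Qt.RightButton:
            super(RightClickButton, self).mouseReleaseEvent(event)
It could then be used like but = RightClickButton(); but.rightClicked.connect(lambda: print(but.toolTip())).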
You can check the mouse buttons in your handler for the clicked signal:
but.clicked.connect(self.buttonClicked)
...
def buttonClicked(self):
    if QtGui.qApp.mouseButtons() & QtCore.Qt.RightButton:
        print(self.sender().toolTip())
The mouse button constants can be OR'd together, allowing you to test for different combinations of them. So the above code will detect that the right button is held down while the left button is clicked.
You can also do a similar thing with the keyboard modifiers:
if QtGui.qApp.keyboardModifiers() & QtCore.Qt.ControlModifier:
    print('ctrl+click')

Gtk IconView select multiple without Ctrl?

Is it possible to make Gtk IconView (in pygtk) allow selection of multiple icons without the Ctrl key being pressed?
I basically want the behaviour of Ctrl being held down even when it is not held down.
Overriding this kind of behaviour might confuse users. But if you really want to, there are two possibilities that I can see:
Either make the IconView believe Ctrl is always pressed:
def force_ctrl(iv, ev):
    ev.state |= gtk.gdk.CONTROL_MASK
iconview.connect('key-press-event', force_ctrl)
iconview.connect('button-press-event', force_ctrl)
Or you could try implementing the selection behaviour yourself, something like:
def clicked(iv, ev):
    p = iv.get_path_at_pos(int(ev.x), int(ev.y))
    if p is not None:
        if iv.path_is_selected(p):
            iv.unselect_path(p)
        else:
            iv.select_path(p)
    return True  # make the IconView ignore this click

iconview.connect('button-press-event', clicked)
