Use absolute pointer coordinates to move back to the host screen#387
Conversation
When the EI implementation gives us (the RemoteDesktop client) a device with regions, those regions represent the accessible screen area for that device. We can use those to set up the screen extents and then, from lan-mouse, calculate the target absolute coordinate within that region. Then pass the abs coordinates to the portal and let the compositor sort it out. This means we know when we're close to an edge of the screen without having to use the InputCapture barriers (which cannot be set up simultaneously with RemoteDesktop anyway). This fixes the broken return-to-host behavior after moving to a RemoteDesktop + libei portal client.

While it's possible that an EI implementation gives us multiple devices with distinct regions, that's a niche case for now and left as a future exercise.

Co-authored-by: Claude Code
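The region-to-absolute mapping described above can be sketched roughly like this. All type and field names here are hypothetical, chosen for illustration; lan-mouse's actual structs differ:

```rust
// Hypothetical sketch: accumulate captured relative motion into the
// absolute coordinate space of an EI device region. Names are made up
// for illustration and do not match lan-mouse's real code.
#[derive(Clone, Copy)]
struct Region {
    x: f64, // region origin in logical coordinates
    y: f64,
    width: f64,
    height: f64,
}

#[derive(Clone, Copy, PartialEq, Debug)]
struct AbsPos {
    x: f64,
    y: f64,
}

/// Apply one relative motion event to the current absolute position,
/// clamped to the accessible region reported by the EI device.
fn apply_relative(region: Region, pos: AbsPos, dx: f64, dy: f64) -> AbsPos {
    AbsPos {
        x: (pos.x + dx).clamp(region.x, region.x + region.width),
        y: (pos.y + dy).clamp(region.y, region.y + region.height),
    }
}
```

The clamped result is what would be handed to the portal as the absolute event; the compositor then sorts out the actual cursor placement.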
@nbolton, how is deskflow doing this?

@whot I'm a bit confused: I wasn't aware that input emulated through the remote desktop portal shouldn't be captured through the input-capture portal. I'm 99% sure this worked before (I've tested this on two GNOME desktops before!)

I think what you are missing is pointer acceleration: to make pointer motion feel as natural as possible, the general idea is to capture relative, unaccelerated pointer motion on the sending side and emulate it as physical pointer motion on the remote end, so that the receiving end can decide whether the cursor motion should be accelerated (on a desktop) or not (very important for games!).

More fundamentally, I want to enable the use of emulation backends that do not or even cannot know about the absolute cursor position (virtual-pointer, #297, and probably also macOS with a custom HID driver in the future). Frankly, in its current state, most of the emulation backends do not actually emulate unaccelerated motion in this way right now, but it's a goal I'd like to achieve in the future.

An application grabbing the pointer, as well as pointer acceleration, make it fundamentally impossible for these backends to keep track of when the device is left / the client should be released. Maybe skipping the release-barrier creation for backends that can keep track of the absolute cursor position might be an option, but I'm not sure this is even possible to do correctly for the reasons stated above.
I can answer that question :) exactly like I proposed here - the server controls the position on the remote client. That's where I got the idea from. In deskflow the clients have no information at all, the server knows that client A is next to client B but neither A nor B know that, they just get remote-controlled.
With this we still get the relative events on the inputcapture host, but we convert them ourselves to absolute coordinates on the target. Pointer acceleration isn't applied to RemoteDesktop input, so the movement should be exactly the same as it is right now. A quick check of mutter tells me that the EI events you get via the InputCapture portal are accelerated events. Unaccelerated is available, libinput provides it, but it's not used atm [see below...]

If you want to apply acceleration on the target machine, you're also facing the issue that the mouse may behave differently depending on which machine it's currently on. Mind you, that's just as valid a use-case as having only one host decide pointer acceleration, so it comes down to preference.
fwiw, the intention of that device type was to have something akin to the USB device forwarding in qemu. Its primary use case was tablets¹, where you want the tablet to be used as-is on a libei client. Of course it would work just as well for relative motion, and that would be a reasonable option for the unaccelerated data as above. Would need to extend mutter et al. for this feature though, but that's probably not even hard to do.
I'll re-test again, but I always got stuck on the remote and had to use the keyboard shortcut to release. Maybe it's a regression. From my personal side: I had just never considered the use case of RemoteDesktop input feeding into InputCapture input, so I definitely didn't pay attention to that when implementing bits.
I've updated to F44-alpha in the meantime for other reasons (on both boxes) and this is not an issue now. I don't know if it was a glitch, some temporary bug, or something broken in F43, but it definitely seems to work now. 🤷 Time to close this one then, it's not needed, sorry about wasting your time
I think it's good that we talked about this! Especially since you have some say in the rdp internals :) I hope that
keeps working! (maybe as an intended feature?) Would there be a good reason not to count rdp input for the input capture portal?
I personally believe that having the target machine emulate physical input from unaccelerated captured events is the only sane thing to do. Otherwise, it would be impossible for the remote end to give raw input over to a game.
I'd actually love to see this feature get added!
I was filing the mutter issue for this when I realised that pointer acceleration in this case won't necessarily work as you'd expect. For games that rely on raw input it's easy enough, but if you want acceleration applied you're at the wrong level with RemoteDesktop: the compositor itself does not do acceleration, libinput does this for real physical input. So while capturing/replaying raw input is possible, getting pointer accel as configured on the target machine is not (unless the client itself does accel)
Late to the conversation, but it's worth noting that Deskflow/Synergy supports both relative and absolute pointer modes on the client, which users can choose:

```cpp
void mouseMove(int32_t xAbs, int32_t yAbs) override;
void mouseRelativeMove(int32_t xRel, int32_t yRel) override;
```

Though games remain tricky, since games use a mix of ways to get mouse coordinates (e.g.

Edit: From customer support:
I'm filing this as Draft because I need some feedback whether this is the right thing to do or whether I'm missing something here.
Testing lan-mouse on two Fedora 43 hosts here, main machine in front of me and a laptop to the right. Moving the mouse to the laptop works fine but it got stuck moving back. As far as I can tell (and Claude seemed to confirm this) lan-mouse is trying to use inputcapture on the remote to detect a barrier entry. This cannot work, those two sessions are fundamentally incompatible (remote desktop input does not count as input for inputcapture purposes¹).
Which means we can't rely on barriers on the remote. But since we know the remote's dimensions and we control the events, we convert the relative coords from the mouse to absolute coordinates on the remote screen. And whenever we detect a movement off that area, we release() the inputcapture session and are back to the local screen.

Now, this works with my local setup, but maybe I'm missing something?
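As a rough sketch of that release check, assuming a single device region (all names here are hypothetical; the real event handling in lan-mouse is more involved):

```rust
// Hypothetical sketch of the return-to-host decision: apply one relative
// motion event and either forward it as absolute coordinates or release
// the inputcapture session. Names are illustrative only.
#[derive(Clone, Copy)]
struct Region { x: f64, y: f64, width: f64, height: f64 }

#[derive(Debug, PartialEq)]
enum Outcome {
    /// Still inside the remote region: forward this absolute position.
    Forward { x: f64, y: f64 },
    /// The motion leaves the region: release() the inputcapture session
    /// and return control to the local screen.
    Release,
}

fn track(region: Region, x: f64, y: f64, dx: f64, dy: f64) -> Outcome {
    let (nx, ny) = (x + dx, y + dy);
    let inside = nx >= region.x
        && nx < region.x + region.width
        && ny >= region.y
        && ny < region.y + region.height;
    if inside {
        Outcome::Forward { x: nx, y: ny }
    } else {
        Outcome::Release
    }
}
```

A motion that would cross the region edge maps to `Release`, which is what replaces the InputCapture barrier on the remote.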
AI disclaimer: written by claude code and stared long and hard at by me :)
Footnotes
this may not be spelled out anywhere and ultimately depends on the compositor anyway but I never even considered this case when spec-ing the portal :) ↩