Use absolute pointer coordinates to move back to the host screen#387

Closed
whot wants to merge 1 commit into feschber:main from whot:wip/absolute-pointer

Conversation

@whot (Contributor) commented Feb 12, 2026

I'm filing this as Draft because I need some feedback on whether this is the right thing to do or whether I'm missing something here.

Testing lan-mouse on two Fedora 43 hosts here, main machine in front of me and a laptop to the right. Moving the mouse to the laptop works fine but it got stuck moving back. As far as I can tell (and Claude seemed to confirm this), lan-mouse is trying to use InputCapture on the remote to detect a barrier entry. This cannot work; those two sessions are fundamentally incompatible (remote desktop input does not count as input for InputCapture purposes [1]).

Which means we can't rely on barriers on the remote - but since we know the remote's dimensions and we control the events we convert the relative coords from the mouse to absolute coordinates on the remote screen. And whenever we detect a movement off that area, we release() the inputcapture session and are back to the local screen.
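A minimal sketch of that bookkeeping (all types and names here are invented for illustration, not the actual lan-mouse API): accumulate the relative deltas into an absolute position on the remote screen, and release the capture as soon as the position would leave it.

```rust
// Hypothetical illustration of the approach described above; these
// names do not exist in lan-mouse itself.

/// Screen extents of the remote device, as reported by its EI regions.
struct RemoteScreen {
    width: f64,
    height: f64,
}

/// Tracks the absolute pointer position on the remote screen.
struct AbsPointer {
    x: f64,
    y: f64,
}

/// What the sender should do with one relative motion event.
enum Outcome {
    /// Emit this absolute position via the RemoteDesktop portal.
    Move { x: f64, y: f64 },
    /// The pointer would leave the remote screen: release InputCapture.
    ReleaseCapture,
}

impl AbsPointer {
    /// Apply a relative motion. If the resulting position falls off the
    /// remote screen, signal a release instead of emitting the motion.
    fn apply(&mut self, dx: f64, dy: f64, screen: &RemoteScreen) -> Outcome {
        let (nx, ny) = (self.x + dx, self.y + dy);
        if nx < 0.0 || ny < 0.0 || nx >= screen.width || ny >= screen.height {
            Outcome::ReleaseCapture
        } else {
            self.x = nx;
            self.y = ny;
            Outcome::Move { x: nx, y: ny }
        }
    }
}
```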

Now, this works with my local setup but maybe I'm missing something?

AI disclaimer: written by claude code and stared long and hard at by me :)

Footnotes

  1. this may not be spelled out anywhere and ultimately depends on the compositor anyway but I never even considered this case when spec-ing the portal :)

When the EI implementation gives us (the RemoteDesktop client) a device
with regions, those regions represent the accessible screen area for
that device. We can use those to set up the screen extents and then,
from lan-mouse, calculate the target absolute coordinate within that
region. Then pass the abs coordinates to the portal and let the
compositor sort it out.

This means we know when we're close to an edge of the screen without
having to use the InputCapture barriers (which cannot be set up
simultaneously with RemoteDesktop anyway).

This fixes the broken return-to-host behavior after moving to a
RemoteDesktop + libei portal client.

While it's possible that an EI implementation gives us multiple devices
with distinct regions, that's a niche case for now and left as a future
exercise.

Co-authored-by: Claude Code
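The region handling the commit message describes could look roughly like this (a simplified sketch; real ei regions also carry an offset into a global coordinate space and a physical scale, and none of these names come from the actual codebase):

```rust
// Hypothetical stand-in for an EI device region; invented for
// illustration only.
#[derive(Clone, Copy)]
struct Region {
    x: u32,
    y: u32,
    width: u32,
    height: u32,
}

impl Region {
    /// True if the absolute coordinate still lies on this region;
    /// leaving every region is the cue to release the capture session.
    fn contains(&self, x: f64, y: f64) -> bool {
        x >= self.x as f64
            && y >= self.y as f64
            && x < (self.x + self.width) as f64
            && y < (self.y + self.height) as f64
    }
}

/// Screen extents covered by a device's regions: the bottom-right
/// corner of the union of all region rectangles.
fn extents(regions: &[Region]) -> (u32, u32) {
    let w = regions.iter().map(|r| r.x + r.width).max().unwrap_or(0);
    let h = regions.iter().map(|r| r.y + r.height).max().unwrap_or(0);
    (w, h)
}
```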
@feschber (Owner)

@nbolton, how is deskflow doing this?

@whot I'm a bit confused: I wasn't aware that input emulated through the RemoteDesktop portal isn't captured through the InputCapture portal. I'm 99% sure this worked before (I've tested this on two GNOME desktops before!)

I think what you are missing is pointer acceleration:

To make pointer motion feel as natural as possible, the general idea is to capture relative, unaccelerated pointer motion on the sending side and emulate it as physical pointer motion on the remote end, such that the receiving end can decide whether the cursor motion should be accelerated (on a desktop) or not (very important for games!).
What this means is that emulating absolute pointer motion based on the unaccelerated relative motion received from a client is fundamentally the wrong thing to do...

More fundamentally, I want to enable the use of emulation backends that do not or even cannot know about the absolute cursor position (virtual-pointer, #297, and probably also MacOS with a custom HID driver in the future).
Allowing the receiving end to simply emulate relative mouse motion without having to care about absolute positions at all makes the implementation simpler or even possible at all in these cases...

Frankly, in its current state, most of the emulation backends do not actually emulate unaccelerated motion in this way right now, but it's a goal I'd like to achieve in the future.
I would also prefer to use a relative physical ei device (EI_DEVICE_TYPE_PHYSICAL) for libei input emulation for the same purpose, but it doesn't seem to be advertised by any EI server implementation as of right now.

An application grabbing the pointer, as well as pointer acceleration make it fundamentally impossible for these backends to keep track of when the device is left / the client should be released.

Maybe skipping the release-barrier creation for backends that can keep track of the absolute cursor position might be an option, but I'm not sure this is even possible to do correctly, for the reasons stated above.

@whot (Contributor, Author) commented Feb 13, 2026

@nbolton, how is deskflow doing this?

I can answer that question :) exactly like I proposed here - the server controls the position on the remote client. That's where I got the idea from. In deskflow the clients have no information at all, the server knows that client A is next to client B but neither A nor B know that, they just get remote-controlled.

I think, what you are missing is pointer acceleration:

With this we still get the relative events on the InputCapture host, but we convert them ourselves to absolute coordinates on the target. Pointer acceleration isn't applied to RemoteDesktop input, so the movement should be exactly the same as it is right now.

A quick check of mutter tells me that the EI events you get via the InputCapture portal are accelerated events. Unaccelerated is available, libinput provides it, but it's not used atm [see below...]

If you want to apply acceleration on the target machine you're also facing the issue that the mouse may behave differently depending on which machine it's currently on. Mind you, that's just as valid a use-case as having only one host decide pointer acceleration so it comes down to preference.

I would also prefer to use a relative physical ei device (EI_DEVICE_TYPE_PHYSICAL) for libei input emulation

fwiw, the intention of that device type was to have something akin to the USB device forwarding in qemu. Its primary use-case was tablets [1] where you want the tablet to be used as-is on a libei client. Of course it would work just as well for relative motion, and that would be a reasonable option for the unaccelerated data as above. Would need to extend mutter et al. for this feature, but that's probably not even hard to do.

I'm 99% sure this worked before (I've tested this on two GNOME desktops before!)

I'll re-test again but I always got stuck on the remote and had to use the keyboard shortcut to release. Maybe it's a regression. From my personal side: I had just never considered the use-case for RemoteDesktop input to feed into InputCapture input so I definitely didn't pay attention to that when implementing bits.

Footnotes

  1. we don't actually have tablets in libei yet but that's another story

@whot (Contributor, Author) commented Feb 13, 2026

I'll re-test again but I always got stuck on the remote and had to use the keyboard shortcut to release.

I've updated to F44-alpha in the meantime for other reasons (on both boxes) and this is not an issue now. I don't know if it was a glitch or some temporary bug or something broken in F43 but it definitely seems to work now.

🤷 time to close this one then, it's not needed, sorry about wasting your time

@whot closed this Feb 13, 2026
@feschber (Owner)

I'll re-test again but I always got stuck on the remote and had to use the keyboard shortcut to release.

I've updated to F44-alpha in the meantime for other reasons (on both boxes) and this is not an issue now. I don't know if it was a glitch or some temporary bug or something broken in F43 but it definitely seems to work now.

🤷 time to close this one then, it's not needed, sorry about wasting your time

I think it's good that we talked about this! Especially since you have some say in the rdp internals :)

I hope that

(remote desktop input does not count as input for inputcapture purposes)

keeps working! (maybe as an intended feature?) Would there be a good reason not to count rdp input for the input capture portal?

@feschber (Owner)

If you want to apply acceleration on the target machine you're also facing the issue that the mouse may behave differently depending on which machine it's currently on. Mind you, that's just as valid a use-case as having only one host decide pointer acceleration so it comes down to preference.

I personally believe that having the target machine emulate physical input from unaccelerated captured events is the only sane thing to do.

Otherwise, it would be impossible for the remote end to give raw input over to a game.
I thought about this very extensively when designing the protocol, and I believe what barrier / input-leap / deskflow do breaks use cases like gaming which rely on raw pointer input (unless pointer acceleration is disabled everywhere).

Would need to extend mutter et al. for this feature though, but that's probably not even hard to do.

I'd actually love to see this feature get added!

@whot (Contributor, Author) commented Feb 16, 2026

I was filing the mutter issue for this when I realised that pointer acceleration in this case won't necessarily work as you'd expect. For games that rely on raw input it's easy enough, but if you want acceleration applied you're on the wrong level with RemoteDesktop: the compositor itself does not do acceleration, libinput does this for real physical input.

So while capturing/replaying raw input is possible, getting pointer accel as configured on the target machine is not (unless the client itself does accel)

@nbolton (Contributor) commented Feb 16, 2026

@nbolton, how is deskflow doing this?

Late to the conversation, but it's worth noting that Deskflow/Synergy supports both relative and absolute pointer modes on the client which users can choose.

  void mouseMove(int32_t xAbs, int32_t yAbs) override;          // absolute mode
  void mouseRelativeMove(int32_t xRel, int32_t yRel) override;  // relative mode

To make pointer motion feel as natural as possible, the general idea is to capture relative, unaccelerated pointer motion on the sending side and emulate it as physical pointer motion on the remote end, such that the receiving end can decide whether the cursor motion should be accelerated (on a desktop) or not (very important for games!).

I personally believe that having the target machine emulate physical input from unaccelerated captured events is the only sane thing to do. Otherwise, it would be impossible for the remote end to give raw input over to a game.

Though games remain tricky, since they use a mix of ways to get mouse coordinates (e.g. GetCursorPos vs GetRawInputData). We found (customer anecdote) that switching between relative and absolute doesn't help with game compatibility on clients. (edit: added to my comment below)

Edit: From customer support:

Depends on the game bug, but these are the settings that can affect mouse movements in game:

  • Relative mouse moves
  • Foreground window setting
  • Lock to screen

The current workaround is to:

  1. Disable relative mouse moves
  2. Disable the foreground window setting
  3. Use lock to screen to lock cursor to client (if playing game on client)

However, this does not work for all games; Minecraft does not work with the Synergy client.
