Goal
Why this matters
Prerequisites
Avalonia turns OS-specific events into a three-stage pipeline (InputManager.ProcessInput).
The stages are, roughly:
- Raw input: platform backends produce RawInputEventArgs (mouse, touch, pen, keyboard, gamepad). Each IRenderRoot has devices that call Device.ProcessRawEvent.
- Pre-processing: observers on InputManager.Instance?.PreProcess can inspect or cancel raw events before routing. Use this sparingly for diagnostics, not business logic.
- Routed events: devices translate raw input into routed events on the target elements (PointerPressedEvent, KeyDownEvent, TextInputMethodClientRequestedEvent).

Because the input manager lives in AvaloniaLocator, you can temporarily subscribe:
using IDisposable? sub = InputManager.Instance?
    .PreProcess.Subscribe(raw => _log.Debug("Raw input {Device} {Type}", raw.Device, raw.GetType().Name));
Remember to dispose subscriptions; the pipeline never terminates while the app runs.
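A common pattern is to scope the subscription to the control's lifetime. Here is a minimal sketch, reusing the _log field from the snippet above; the _rawInputSub field name is illustrative:

private IDisposable? _rawInputSub;

protected override void OnAttachedToVisualTree(VisualTreeAttachmentEventArgs e)
{
    base.OnAttachedToVisualTree(e);
    _rawInputSub = InputManager.Instance?
        .PreProcess.Subscribe(raw => _log.Debug("Raw input {Device}", raw.Device));
}

protected override void OnDetachedFromVisualTree(VisualTreeAttachmentEventArgs e)
{
    base.OnDetachedFromVisualTree(e);
    // Dispose when leaving the tree so the observer does not outlive the control.
    _rawInputSub?.Dispose();
    _rawInputSub = null;
}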
InputElement exposes pointer events (bubble strategy by default).
| Event | Trigger | Key data |
|---|---|---|
| PointerEntered / PointerExited | Pointer crosses hit-test boundary | Pointer.Type, KeyModifiers, Pointer.IsPrimary |
| PointerPressed | Button/contact press | PointerUpdateKind, PointerPointProperties, ClickCount in PointerPressedEventArgs |
| PointerMoved | Pointer moves while inside or captured | GetPosition, GetIntermediatePoints |
| PointerWheelChanged | Mouse wheel / precision scroll | Vector delta, PointerPoint.Properties |
| PointerReleased | Button/contact release | Pointer.IsPrimary, Pointer.Captured |
| PointerCaptureLost | Capture re-routed, element removed, or pointer disposed | PointerCaptureLostEventArgs.Pointer |
Event routing is tunable:
protected override void OnInitialized()
{
    base.OnInitialized();

    AddHandler(PointerPressedEvent, OnPreviewPressed, handledEventsToo: true);
    AddHandler(PointerPressedEvent, OnPressed, routingStrategies: RoutingStrategies.Tunnel | RoutingStrategies.Bubble);
}
Use tunnel handlers (RoutingStrategies.Tunnel) for global shortcuts (e.g., closing flyouts). Keep bubbling logic per control.
- e.GetPosition(this) projects coordinates into any visual's space; pass null for top-level coordinates.
- e.GetIntermediatePoints(this) yields historical samples, which is crucial for smoothing freehand ink.
- PointerPoint.Properties exposes pressure, tilt, contact rectangles, and button states. Always verify availability (check Pointer.Type == PointerType.Pen before reading pressure).
- Capturing sends subsequent input to an element regardless of pointer location, which is vital for drags.
protected override void OnPointerPressed(PointerPressedEventArgs e)
{
    if (e.Pointer.Type == PointerType.Touch)
    {
        e.Pointer.Capture(this);
        _dragStart = e.GetPosition(this);
        e.Handled = true;
    }
}

protected override void OnPointerReleased(PointerReleasedEventArgs e)
{
    if (ReferenceEquals(e.Pointer.Captured, this))
    {
        e.Pointer.Capture(null);
        CompleteDrag(e.GetPosition(this));
        e.Handled = true;
    }
}
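It also pays to treat lost capture as cancellation. A minimal sketch, assuming a hypothetical CancelDrag() cleanup helper:

protected override void OnPointerCaptureLost(PointerCaptureLostEventArgs e)
{
    base.OnPointerCaptureLost(e);
    // Another control stole capture or the element left the tree; abandon the drag.
    CancelDrag();
}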
Key rules:
- Always release capture (e.Pointer.Capture(null)) on completion or cancellation.
- Handle PointerCaptureLost; it fires if the element leaves the tree or another control steals capture.
- While another element holds capture, you will not receive PointerMoved events until capture returns.
- For gestures that span a hierarchy (Control → Window), consider e.Pointer.Capture(this) in the top-level to avoid anomalies when children are removed mid-gesture.

Avalonia assigns unique IDs per contact (Pointer.Id) and marks a primary contact (Pointer.IsPrimary). Keep per-pointer state in a dictionary:
private readonly Dictionary<int, PointerTracker> _active = new();

protected override void OnPointerPressed(PointerPressedEventArgs e)
{
    _active[e.Pointer.Id] = new PointerTracker(e.Pointer.Type, e.GetPosition(this));
    UpdateManipulation();
}

protected override void OnPointerReleased(PointerReleasedEventArgs e)
{
    _active.Remove(e.Pointer.Id);
    UpdateManipulation();
}
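UpdateManipulation can then derive gesture data from whatever contacts are active. A minimal sketch, assuming the hypothetical PointerTracker exposes the Position it was constructed with and that _pinchBaseline is a field you reset when contacts change:

private double _pinchBaseline; // distance recorded when the second contact arrived (assumed field)

private void UpdateManipulation()
{
    if (_active.Count < 2)
        return;

    // Distance between the first two contacts drives a simple pinch factor.
    var contacts = _active.Values.Take(2).ToArray();
    double dx = contacts[1].Position.X - contacts[0].Position.X;
    double dy = contacts[1].Position.Y - contacts[0].Position.Y;
    double distance = Math.Sqrt(dx * dx + dy * dy);

    if (_pinchBaseline == 0)
        _pinchBaseline = distance;

    double scale = distance / _pinchBaseline;
    // Apply 'scale' to a transform or view-model property here.
}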
Pen-specific data lives in PointerPoint.Properties:
var sample = e.GetCurrentPoint(this);
float pressure = sample.Properties.Pressure; // 0-1
bool isEraser = sample.Properties.IsEraser;
Touch sends a contact rectangle (ContactRect) you can use for palm rejection or handle-size aware UI.
Two gesture models coexist:
- Event-based gestures in Avalonia.Input.Gestures (Tapped, DoubleTapped, RightTapped). Attach with Gestures.AddDoubleTappedHandler or AddHandler.
- Gesture recognizers (InputElement.GestureRecognizers) for continuous gestures (pinch, pull-to-refresh, scroll).

To attach built-in recognizers:
GestureRecognizers.Add(new PinchGestureRecognizer
{
    // Your subclasses can expose properties via styled setters
});
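Recognizers report progress by raising routed gesture events on the control they are attached to. A minimal sketch of listening for pinch updates; the _zoom field is an assumption:

// PinchEventArgs carries the current scale factor of the gesture.
AddHandler(Gestures.PinchEvent, (_, e) => _zoom = e.Scale);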
Creating your own recognizer lets you coordinate multiple pointers and maintain internal state:
public class PressAndHoldRecognizer : GestureRecognizer
{
    public static readonly RoutedEvent<RoutedEventArgs> PressAndHoldEvent =
        RoutedEvent.Register<InputElement, RoutedEventArgs>(
            nameof(PressAndHoldEvent), RoutingStrategies.Bubble);

    public TimeSpan Threshold { get; set; } = TimeSpan.FromMilliseconds(600);

    private CancellationTokenSource? _hold;
    private Point _pressOrigin;

    protected override async void PointerPressed(PointerPressedEventArgs e)
    {
        if (Target is not Visual visual)
            return;

        _pressOrigin = e.GetPosition(visual);
        Capture(e.Pointer);
        _hold = new CancellationTokenSource();

        try
        {
            await Task.Delay(Threshold, _hold.Token);
            Target?.RaiseEvent(new RoutedEventArgs(PressAndHoldEvent));
        }
        catch (TaskCanceledException)
        {
            // Swallow cancellation when pointer moves or releases early.
        }
    }

    protected override void PointerMoved(PointerEventArgs e)
    {
        if (Target is not Visual visual || _hold is null || _hold.IsCancellationRequested)
            return;

        var current = e.GetPosition(visual);
        if ((current - _pressOrigin).Length > 8)
            _hold.Cancel();
    }

    protected override void PointerReleased(PointerReleasedEventArgs e) => _hold?.Cancel();

    protected override void PointerCaptureLost(IPointer pointer) => _hold?.Cancel();
}
Register the routed event (PressAndHoldEvent) on your control and listen just like other events. Note the call to Capture(e.Pointer) which also calls PreventGestureRecognition() to stop competing recognizers.
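A minimal sketch of wiring it up on a host control; ShowContextMenu is a hypothetical handler:

GestureRecognizers.Add(new PressAndHoldRecognizer { Threshold = TimeSpan.FromMilliseconds(800) });
AddHandler(PressAndHoldRecognizer.PressAndHoldEvent, (_, _) => ShowContextMenu());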
Avalonia exposes higher-level manipulation data through gesture recognizers so you do not have to rebuild velocity tracking yourself.
- ScrollGestureRecognizer raises ScrollGestureEventArgs with linear deltas and velocities, ideal for kinetic scrolling or canvas panning.
- PinchGestureRecognizer produces PinchEventArgs that report scale, rotation, and centroid changes for zoom surfaces.
- PullGestureRecognizer tracks displacement against a threshold (PullGestureRecognizer.TriggerDistance) so you can drive pull-to-refresh visuals without reimplementing spring physics.
- Internally a VelocityTracker computes momentum; you can hook GestureRecognizer.Completed to project inertia with your own easing.

Handle the gesture's routed events on the host control when you need the raw data:
var scroll = new ScrollGestureRecognizer
{
    CanVerticallyScroll = true,
    CanHorizontallyScroll = true
};
GestureRecognizers.Add(scroll);

// The recognizer raises routed events on the control it is attached to.
AddHandler(Gestures.ScrollGestureEvent, (_, e) => _viewport += e.Delta);
AddHandler(Gestures.ScrollGestureEndedEvent, (_, _) => StartInertiaAnimation()); // StartInertiaAnimation is a placeholder helper
Manipulation events coexist with pointer events. Mark the gesture event as handled when you consume it so the default scroll viewer does not fight your logic. For custom behaviors (elastic edges, snap points), tune ScrollGestureRecognizer.IsContinuous, ScrollGestureRecognizer.CanHorizontallyScroll, and ScrollGestureRecognizer.CanVerticallyScroll to match your layout.
Strategies for common scenarios:
- Drag handles: capture the pointer in a Thumb, raise a routed DragDelta event, and update layout in response. Release capture in PointerReleased and PointerCaptureLost.
- Freehand drawing: use GetIntermediatePoints for smooth curves, and throttle invalidation with DispatcherTimer to keep the UI responsive.
- Zoomable surfaces: use PinchGestureRecognizer for zoom. Combine with MatrixTransform on the content.
- Pull-to-refresh: use PullGestureRecognizer with PullDirection to recognise deflection and expose progress to the view model.
- Hover intent: PointerEntered kicks off a timer, PointerExited cancels it; inspect e.GetCurrentPoint(this).Properties.PointerUpdateKind to ignore quick flicks (see the sketch after this list).
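A minimal hover-intent sketch, assuming a 400 ms delay and a hypothetical ShowPreview handler with the EventHandler signature:

private readonly DispatcherTimer _hoverTimer = new() { Interval = TimeSpan.FromMilliseconds(400) };

protected override void OnPointerEntered(PointerEventArgs e)
{
    base.OnPointerEntered(e);
    _hoverTimer.Tick += ShowPreview; // stop the timer inside ShowPreview so it fires once
    _hoverTimer.Start();
}

protected override void OnPointerExited(PointerEventArgs e)
{
    base.OnPointerExited(e);
    _hoverTimer.Stop();
    _hoverTimer.Tick -= ShowPreview;
}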
Platform differences worth noting:
- Touch contacts report PointerType.Touch. Guard pen-specific paths behind Pointer.Type == PointerType.Pen because Linux/X11 backends can omit advanced pen properties.
- On backends without real pressure data, PointerPoint.Properties.Pressure may always be 1.0 (see the guard sketch below).
- Tizen requires the http://tizen.org/privilege/haptic privilege before you can trigger haptics from pull or press gestures.
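A small guard for pressure-driven rendering inside a pointer handler; BaseThickness is a hypothetical constant:

var props = e.GetCurrentPoint(this).Properties;
double thickness = e.Pointer.Type == PointerType.Pen && props.Pressure > 0
    ? BaseThickness * props.Pressure
    : BaseThickness; // fall back when the backend reports no usable pressure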
Avalonia's focus engine is pluggable.
- TopLevel exposes a FocusManager (via (this.GetVisualRoot() as IInputRoot)?.FocusManager) that drives tab order (TabIndex, IsTabStop).
- IKeyboardNavigationHandler orchestrates directional navigation; register your own implementation before building the app, e.g. AvaloniaLocator.CurrentMutable.Bind<IKeyboardNavigationHandler>().ToSingleton<CustomHandler>().
- XYFocus attached properties override directional targets for gamepad/remote scenarios:

<StackPanel
    input:XYFocus.Up="{Binding ElementName=SearchBox}"
    input:XYFocus.NavigationModes="Keyboard,Gamepad" />
Key bindings complement commands without requiring specific controls:
KeyBindings.Add(new KeyBinding
{
    Gesture = new KeyGesture(Key.N, KeyModifiers.Control | KeyModifiers.Shift),
    Command = ViewModel.NewNoteCommand
});
HotKeyManager registers window-wide hotkeys:
HotKeyManager.SetHotKey(this, KeyGesture.Parse("F2"));
Ensure the target control implements ICommandSource or IClickableControl; Avalonia wires the gesture into the containing TopLevel and executes the command or raises Click.
Ensure focus cues remain visible: pass NavigationMethod.Tab when moving focus programmatically (for example, control.Focus(NavigationMethod.Tab)) so keyboard users see a focus adorner.
When Avalonia detects non-keyboard key devices, it sets KeyDeviceType on key events. Use FocusManager.GetFocusManager(this)?.Focus(elem, NavigationMethod.Directional, modifiers) to respect D-Pad navigation.
Configure XY focus per visual:
| Property | Purpose |
|---|---|
| XYFocus.Up/Down/Left/Right | Explicit neighbours when layout is irregular |
| XYFocus.NavigationModes | Enable keyboard, gamepad, remote individually |
| XYFocus.LeftNavigationStrategy | Choose default algorithm (closest edge, projection, navigation axis) |
For dense grids (e.g., TV apps), set XYFocus.NavigationModes="Gamepad,Remote" and assign explicit neighbours to avoid diagonal jumps. Pair with KeyBindings for shortcuts like Back or Menu buttons on controllers (map gamepad keys via key modifiers on the key event).
Where hardware exposes haptic feedback (mobile, TV remotes), query the platform implementation with TopLevel.PlatformImpl?.TryGetFeature<TFeature>(). Some backends surface rumble/vibration helpers; when none are available, fall back gracefully so keyboard-only users are not blocked.
Text input flows through InputMethod, TextInputMethodClient, and TextInputOptions.
- TextInputOptions attached properties describe the desired on-screen keyboard UI.
- TextInputMethodClient adapts a text view to IMEs (caret rectangle, surrounding text, reconversion).
- The InputMethod.IsInputMethodEnabled attached property (InputMethod.SetIsInputMethodEnabled) lets you disable the IME for password fields.

Set options in XAML:
<TextBox
Text=""
input:TextInputOptions.ContentType="Email"
input:TextInputOptions.ReturnKeyType="Send"
input:TextInputOptions.ShowSuggestions="True"
input:TextInputOptions.IsSensitive="False" />
When you implement custom text surfaces (code editors, chat bubbles):
- Implement TextInputMethodClient to expose the text range, caret rect, and surrounding text.
- Handle TextInputMethodClientRequested in your control to supply the client (see the sketch below).
- Call InputMethod.SetIsInputMethodEnabled(this, true) and update the client's TextViewVisual so IME windows track the caret.
- Raise TextInputMethodClient.CursorRectangleChanged so the backend updates composition windows.

Remember to honor TextInputOptions.IsSensitive; set it when editing secrets so onboard keyboards hide predictions.
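A minimal sketch of supplying a client when the platform requests one, assuming the event args expose a settable Client; EditorInputClient is a hypothetical TextInputMethodClient implementation and _imeClient a cached field:

AddHandler(InputElement.TextInputMethodClientRequestedEvent, (_, e) =>
{
    // Hand the IME a client that describes this control's text surface.
    e.Client = _imeClient ??= new EditorInputClient(this);
});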
Advanced interactions must fall back to keyboard and automation:
- Provide keyboard equivalents (KeyBindings, buttons) for pointer-only gestures.
- Surface the same actions as commands or routed events (e.g., CopyRequested) so automation peers can invoke them.
- Keep automation metadata up to date (AutomationProperties.ControlType, AutomationProperties.IsControlElement) when capture changes visual state.
- Respect FocusManager decisions; never suppress focus adorners merely because a pointer started the interaction.
- Use InputMethod.SetIsInputMethodEnabled and TextInputOptions to support assistive text input (switch control, dictation).

Create a playground that exercises every surface:
- Scaffold the app with dotnet new avalonia.mvvm -n InputLab. Add a CanvasView control hosting drawing, a side panel for logs, and a bottom toolbar.
- Render ink strokes with DrawingContext.DrawGeometry. Display pressure as stroke thickness.
- Attach the PressAndHoldRecognizer (above) to show context commands after 600 ms. Hook the resulting routed event to toggle a radial menu.
- Add PinchGestureRecognizer and ScrollGestureRecognizer to pan/zoom the canvas. Update a MatrixTransform as gesture delta arrives.
- Add KeyBindings for Ctrl+Z, Ctrl+Shift+Z, and arrow-key panning. Update XYFocus properties so the D-Pad moves between toolbar buttons.
- Inspect KeyDeviceType in KeyDown to confirm Avalonia recognises a connected controller as Gamepad.
- Add a TextBox with TextInputOptions.ReturnKeyType="Send", plus a custom MentionTextBox implementing TextInputMethodClient to surface inline completions.
- Subscribe to InputManager.Instance?.Process and log pointer ID, update kind, and capture target into a side list for debugging.
- Document findings in README (which gestures compete, how capture behaves on focus loss) so the team can adjust default UX.
Troubleshooting tips:
- Check that IsHitTestVisible is true and that no transparent sibling intercepts input. For overlays, set IsHitTestVisible="False".
- Release capture in PointerCaptureLost and when the control unloads. Wrap capture in try/finally on operations that may throw.
- Call e.PreventGestureRecognition() when manual pointer logic should trump recognizers, or avoid attaching recognizers to nested elements.
- Use Visual.PointToScreen when working across popups; pointer positions are per-visual, not global.
- Record (this.GetVisualRoot() as IInputRoot)?.FocusManager?.GetFocusedElement() before capture and restore it when the operation completes to preserve keyboard flow.
- Update TextInputMethodClient.TextViewVisual whenever layout changes; failing to do so leaves composition windows floating in the old position.

Relevant Avalonia source files:
- Pointer.cs
- PointerEventArgs.cs, PointerPoint.cs
- GestureRecognizer.cs, Gestures.cs
- ScrollGestureRecognizer.cs, PinchGestureRecognizer.cs, PullGestureRecognizer.cs
- IKeyboardNavigationHandler.cs, XYFocus.Properties.cs
- KeyEventArgs.cs, KeyDeviceType.cs, TouchDevice.cs, PenDevice.cs
- TextInputOptions.cs, TextInputMethodManager.cs
- InputManager.cs

Check your understanding:
- Which PointerPointProperties matter for pen input and how do you guard against unsupported platforms?
- When do you need to implement a TextInputMethodClient in your control?
- When would you replace the default IKeyboardNavigationHandler?
What's next