Goal
- Understand routed events and InputElement, and how gesture recognizers, commands, and keyboard navigation fit together.
- Learn the HotKeyManager and pointer capture APIs.

Why this matters
Prerequisites
- Familiarity with MVVM and an INotifyPropertyChanged view model.

Avalonia input pieces live under:
- Avalonia.Interactivity defines RoutedEvent, event descriptors, and routing strategies.
- InputElement (inherits Interactive → Visual → Animatable) exposes the focus, input, and command helpers that every control inherits.
- Avalonia.Base/Input provides Pointer, KeyboardDevice, KeyGesture, and PointerPoint.
- GestureRecognizers translate raw pointer data into tap, scroll, and drag behaviors.
- HotKeyManager walks the visual tree to resolve KeyGestures against ICommand targets.

Event flow:
- Raw device input arrives as routed events (PointerPressed, KeyDown). Each is registered as a RoutedEvent with a routing strategy (tunnel, bubble, direct).
- InputElement hosts the event metadata, raising class handlers and instance handlers.
- Gesture recognizers and helper events turn raw pointer input into higher-level notifications (Tapped, DoubleTapped, PointerPressedEventArgs).
- Command sources (Button.Command, KeyBinding, InputGesture) execute ICommand implementations and update CanExecute.

Creating custom events uses the static registration helpers:
public static readonly RoutedEvent<RoutedEventArgs> DragStartedEvent =
    RoutedEvent.Register<Control, RoutedEventArgs>(
        nameof(DragStarted),
        RoutingStrategies.Bubble);

public event EventHandler<RoutedEventArgs> DragStarted
{
    add => AddHandler(DragStartedEvent, value);
    remove => RemoveHandler(DragStartedEvent, value);
}
RoutingStrategies live in RoutedEvent.cs; an event's registration declares whether it travels from root to leaf (tunnel), leaf to root (bubble), or fires only on the source (direct), and AddHandler lets each handler choose which of those routes it listens on.
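To observe both phases on an existing event, subscribe with AddHandler from a window's constructor. A minimal sketch, assuming a MainWindow code-behind; the handler names are illustrative:

public MainWindow()
{
    InitializeComponent();

    // Tunnel: runs while the event travels from the root down to the source control.
    AddHandler(InputElement.PointerPressedEvent, OnPreviewPointerPressed,
        RoutingStrategies.Tunnel);

    // Bubble: runs on the way back up; handledEventsToo also delivers already-handled events.
    AddHandler(InputElement.PointerPressedEvent, OnPointerPressedBubbled,
        RoutingStrategies.Bubble, handledEventsToo: true);
}

private void OnPreviewPointerPressed(object? sender, PointerPressedEventArgs e)
{
    // Setting e.Handled here stops the event before descendants see it.
}

private void OnPointerPressedBubbled(object? sender, PointerPressedEventArgs e)
{
    // Runs after the source control; useful for logging the route.
}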
dotnet new avalonia.mvvm -o InputPlayground
cd InputPlayground
MainWindowViewModel exposes commands and state. Add CommunityToolkit.Mvvm or implement your own AsyncRelayCommand to simplify asynchronous logic. Hotkeys are attached in XAML using HotKeyManager.HotKey, keeping the view model free of UI dependencies.
using System;
using System.Threading.Tasks;
using System.Windows.Input;

namespace InputPlayground.ViewModels;

public sealed class MainWindowViewModel : ViewModelBase
{
    private string _status = "Ready";
    public string Status
    {
        get => _status;
        private set => SetProperty(ref _status, value);
    }

    private string? _selectedName;
    // Backs the SelectedName bindings used by the XAML samples below.
    public string? SelectedName
    {
        get => _selectedName;
        set => SetProperty(ref _selectedName, value);
    }

    private bool _hasChanges;
    public bool HasChanges
    {
        get => _hasChanges;
        set
        {
            if (SetProperty(ref _hasChanges, value))
            {
                SaveCommand.RaiseCanExecuteChanged();
            }
        }
    }

    public RelayCommand SaveCommand { get; }
    public RelayCommand DeleteCommand { get; }
    public AsyncRelayCommand RefreshCommand { get; }

    public MainWindowViewModel()
    {
        SaveCommand = new RelayCommand(_ => Save(), _ => HasChanges);
        DeleteCommand = new RelayCommand(item => Delete(item));
        RefreshCommand = new AsyncRelayCommand(RefreshAsync, () => !IsBusy);
    }

    private bool _isBusy;
    public bool IsBusy
    {
        get => _isBusy;
        private set
        {
            if (SetProperty(ref _isBusy, value))
            {
                RefreshCommand.RaiseCanExecuteChanged();
            }
        }
    }

    private void Save()
    {
        Status = "Saved";
        HasChanges = false;
    }

    private void Delete(object? parameter)
    {
        Status = parameter is string name ? $"Deleted {name}" : "Deleted item";
        HasChanges = true;
    }

    private async Task RefreshAsync()
    {
        try
        {
            IsBusy = true;
            Status = "Refreshing...";
            await Task.Delay(1500);
            Status = "Data refreshed";
        }
        finally
        {
            IsBusy = false;
        }
    }
}
Supporting command classes (RelayCommand, AsyncRelayCommand) go in a Commands folder. You may reuse the ones from CommunityToolkit.Mvvm or ReactiveUI.
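If you implement them yourself, a minimal synchronous RelayCommand matching the constructor shapes used above could look like this (a sketch; toolkit versions add weak events and source generators):

using System;
using System.Windows.Input;

public sealed class RelayCommand : ICommand
{
    private readonly Action<object?> _execute;
    private readonly Func<object?, bool>? _canExecute;

    public RelayCommand(Action<object?> execute, Func<object?, bool>? canExecute = null)
    {
        _execute = execute;
        _canExecute = canExecute;
    }

    public bool CanExecute(object? parameter) => _canExecute?.Invoke(parameter) ?? true;

    public void Execute(object? parameter) => _execute(parameter);

    public event EventHandler? CanExecuteChanged;

    // Call this when the state behind CanExecute changes so bound buttons re-evaluate.
    public void RaiseCanExecuteChanged() => CanExecuteChanged?.Invoke(this, EventArgs.Empty);
}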
| Use command when... | Use event when... |
|---|---|
| You expose an action (Save/Delete) from view model | You need pointer coordinates, delta, or low-level control |
| You want CanExecute/disable logic | You're implementing custom gestures/drag interactions |
| The action runs from buttons, menus, shortcuts | Work is purely visual or specific to a view |
| You plan to unit test the action | Data is transient or you need immediate UI feedback |
Most real views mix both: commands for operations, events for gestures.
<StackPanel Spacing="12">
  <TextBox Watermark="Name" Text="{Binding SelectedName, Mode=TwoWay}"/>
  <StackPanel Orientation="Horizontal" Spacing="12">
    <Button Content="Save" Command="{Binding SaveCommand}"/>
    <Button Content="Refresh" Command="{Binding RefreshCommand}" IsEnabled="{Binding !IsBusy}"/>
    <Button Content="Delete" Command="{Binding DeleteCommand}"
            CommandParameter="{Binding SelectedName}"/>
  </StackPanel>
  <TextBlock Text="{Binding Status}"/>
</StackPanel>
Buttons disable automatically when SaveCommand.CanExecute returns false.
<Window ...>
  <Window.KeyBindings>
    <KeyBinding Gesture="Ctrl+S" Command="{Binding SaveCommand}"/>
    <KeyBinding Gesture="Ctrl+R" Command="{Binding RefreshCommand}"/>
    <KeyBinding Gesture="Ctrl+Delete" Command="{Binding DeleteCommand}" CommandParameter="{Binding SelectedName}"/>
  </Window.KeyBindings>
</Window>
KeyGesture parsing is handled by KeyGesture and KeyGestureConverter. For multiple gestures, add more KeyBinding entries on the relevant InputElement.
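The same binding can be added from code; a sketch assuming a Window code-behind whose DataContext is the MainWindowViewModel shown above:

using Avalonia.Input;

// Parse the gesture once and register it on this window's KeyBindings collection.
var gesture = KeyGesture.Parse("Ctrl+S");

KeyBindings.Add(new KeyBinding
{
    Gesture = gesture,
    Command = ((MainWindowViewModel)DataContext!).SaveCommand
});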
HotKeyManager attached property

KeyBinding only fires while the owning control is focused. To register window-wide hotkeys that stay active as long as a control is in the visual tree, attach a KeyGesture via HotKeyManager.HotKey:
<Window xmlns:controls="clr-namespace:Avalonia.Controls;assembly=Avalonia.Controls">
  <Button Content="Save"
          Command="{Binding SaveCommand}"
          controls:HotKeyManager.HotKey="Ctrl+Shift+S"/>
</Window>
HotKeyManager walks up to the owning TopLevel and injects a KeyBinding for you, even when the button is not focused. In code you can call HotKeyManager.SetHotKey(button, new KeyGesture(Key.S, KeyModifiers.Control | KeyModifiers.Shift));. Implementation lives in HotkeyManager.cs.
Bring Avalonia.Input into scope when assigning gestures programmatically so KeyGesture and KeyModifiers resolve.
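The programmatic form mentioned above, with the usings it needs (SaveButton is a hypothetical named button in the same window):

using Avalonia.Controls;
using Avalonia.Input;

// Equivalent to setting controls:HotKeyManager.HotKey="Ctrl+Shift+S" in XAML.
HotKeyManager.SetHotKey(SaveButton, new KeyGesture(Key.S, KeyModifiers.Control | KeyModifiers.Shift));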
Use _ to define an access key in headers (e.g., _Save). Access keys work when Alt is pressed.
<Menu>
  <MenuItem Header="_File">
    <MenuItem Header="_Save" Command="{Binding SaveCommand}" InputGesture="Ctrl+S"/>
  </MenuItem>
</Menu>
Access keys are processed via AccessKeyHandler (AccessKeyHandler.cs). Combine them with HotKeyManager to offer both menu accelerators and global commands.
Avalonia ships gesture recognizers derived from GestureRecognizer. Attach them via GestureRecognizers to translate raw pointer data into commands:
<Border Background="#1e293b" Padding="16">
  <Border.GestureRecognizers>
    <TapGestureRecognizer NumberOfTapsRequired="2" Command="{Binding DoubleTapCommand}" CommandParameter="Canvas"/>
    <ScrollGestureRecognizer CanHorizontallyScroll="True" CanVerticallyScroll="True"/>
  </Border.GestureRecognizers>
  <TextBlock Foreground="White" Text="Double-tap or scroll"/>
</Border>
Implementation: TapGestureRecognizer.cs.
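Recognizers can also be attached from code-behind. A minimal sketch, assuming a Border named dropZone defined elsewhere in the view:

using Avalonia.Input;

// Same recognizer as the XAML example, added programmatically.
dropZone.GestureRecognizers.Add(new ScrollGestureRecognizer
{
    CanHorizontallyScroll = true,
    CanVerticallyScroll = true
});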
For custom gestures (e.g., drag-to-reorder), handle PointerPressed, call e.Pointer.Capture(control) to capture input, and release on PointerReleased. Pointer capture ensures subsequent move/press events go to the capture target even if the pointer leaves its bounds. Use PointerEventArgs.GetCurrentPoint to inspect buttons, pressure, tilt, or contact rectangles for richer interactions.
private bool _isDragging;
private Point _dragStart;

private void Card_PointerPressed(object? sender, PointerPressedEventArgs e)
{
    if (sender is not Control control)
        return;

    _isDragging = true;
    // Remember where inside the card the pointer grabbed it.
    _dragStart = e.GetPosition(control);
    e.Pointer.Capture(control);
}

private void Card_PointerMoved(object? sender, PointerEventArgs e)
{
    if (_isDragging && sender is Control { Parent: Visual canvas } control)
    {
        // Position relative to the containing Canvas, minus the grab offset.
        var position = e.GetPosition(canvas) - _dragStart;
        Canvas.SetLeft(control, position.X);
        Canvas.SetTop(control, position.Y);
    }
}

private void Card_PointerReleased(object? sender, PointerReleasedEventArgs e)
{
    _isDragging = false;
    e.Pointer.Capture(null);
}
To release capture, call e.Pointer.Capture(null); IPointer.Captured tells you which element currently holds capture. See PointerDevice.cs and PointerEventArgs.cs for details.
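To react only to a specific button (or read pen pressure) before starting a drag, inspect the PointerPoint in the pressed handler. A sketch that varies the Card_PointerPressed handler above:

private void Card_PointerPressed(object? sender, PointerPressedEventArgs e)
{
    if (sender is not Control control)
        return;

    var point = e.GetCurrentPoint(control);

    // Only start dragging with the primary button; Properties also exposes pressure and tilt.
    if (!point.Properties.IsLeftButtonPressed)
        return;

    _isDragging = true;
    _dragStart = point.Position;
    e.Pointer.Capture(control);
}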
Text entry flows through TextInput events. For IME input (for example, East Asian languages), Avalonia raises TextInput together with composition events. To hook into the pipeline, subscribe to TextInput or implement ITextInputMethodClient in custom controls. Source: TextInputMethodClient.cs.
<TextBox TextInput="TextBox_TextInput"/>
private void TextBox_TextInput(object? sender, TextInputEventArgs e)
{
    Debug.WriteLine($"TextInput: {e.Text}");
}
In most MVVM apps you rely on TextBox handling IME; implement this only when creating custom text editors.
- Call Focus() to move input programmatically; InputElement.Focus() delegates to FocusManager.
- Set Focusable="False" on decorative elements so they are skipped in traversal.
- Order tab stops with TabIndex (lower numbers focus first); combine with KeyboardNavigation.TabNavigation to scope loops.
- Keep overlay content focusable (Focusable="True" + IsTabStop="True") for popups/overlays so focus returns to the invoking control when closed.
- Use TraversalRequest and KeyboardNavigationHandler to implement custom arrow-key navigation for grids or toolbars.

<StackPanel KeyboardNavigation.TabNavigation="Cycle" Spacing="8">
  <TextBox x:Name="First" Watermark="First name"/>
  <TextBox x:Name="Second" Watermark="Last name"/>
  <Button Content="Focus second" Command="{Binding FocusSecondCommand}"/>
</StackPanel>
public void FocusSecond()
{
    var second = this.FindControl<TextBox>("Second");
    second?.Focus();
}
For MVVM-safe focus changes, expose an interaction request (event or Interaction<T> from ReactiveUI) and let the view handle it. Keyboard navigation services live under IKeyboardNavigationHandler.
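One way to wire such a request without referencing controls from the view model (the member names here are illustrative, not part of the sample project):

// In the view model: expose a request instead of touching controls directly.
public event EventHandler? FocusSecondRequested;
public void RequestFocusSecond() => FocusSecondRequested?.Invoke(this, EventArgs.Empty);

// In the view's code-behind: translate the request into an actual Focus() call.
if (DataContext is MainWindowViewModel vm)
{
    vm.FocusSecondRequested += (_, _) =>
        this.FindControl<TextBox>("Second")?.Focus();
}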
Popular MVVM toolkits expose commands in slightly different flavors:

- CommunityToolkit.Mvvm: RelayCommand/AsyncRelayCommand implement ICommand and expose CanExecuteChanged. Use [RelayCommand] attributes to generate commands and wrap business logic in partial classes.
- ReactiveUI: ReactiveCommand exposes IObservable execution pipelines, throttling, and cancellation. Bind with {Binding SaveCommand} just like any other ICommand.
- Prism: DelegateCommand supports ObservesCanExecute and integrates with dependency injection lifetimes.

To unify event-heavy code paths with commands, expose interaction helpers instead of code-behind:
public Interaction<Unit, PointerPoint?> StartDragInteraction { get; } = new();

public async Task BeginDragAsync()
{
    var pointerPoint = await StartDragInteraction.Handle(Unit.Default);
    if (pointerPoint is { } point)
    {
        // Use pointer data to seed the drag operation.
    }
}
The example uses ReactiveUI.Interaction and Avalonia.Input.PointerPoint; adapt the pattern to your MVVM framework of choice.
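On the view side, a handler supplies the value. A sketch for ReactiveUI, assuming an IViewFor view with a ViewModel property and a _lastPointerArgs field saved in a PointerPressed handler (both hypothetical):

ViewModel!.StartDragInteraction.RegisterHandler(ctx =>
{
    // Produce the PointerPoint the view model asked for (or null if none is available).
    var point = _lastPointerArgs?.GetCurrentPoint(this);
    ctx.SetOutput(point);
});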
In XAML, use interaction behaviors (the Avalonia.Xaml.Behaviors package's Interaction.Behaviors with EventTriggerBehavior/InvokeCommandAction, or a toolkit EventToCommandBehavior) to connect events such as PointerPressed to ReactiveCommands without writing code-behind. This keeps event routing logic discoverable while leaving testable command logic in the view model.
Avalonia supports routed commands similar to WPF. Define a RoutedCommand (RoutedCommandLibrary.Save, etc.) and attach handlers via CommandBinding.
<Window.CommandBindings>
  <CommandBinding Command="{x:Static commands:AppCommands.Save}" Executed="Save_Executed" CanExecute="Save_CanExecute"/>
</Window.CommandBindings>

private void Save_Executed(object? sender, ExecutedRoutedEventArgs e)
{
    if (DataContext is MainWindowViewModel vm)
        vm.SaveCommand.Execute(null);
}

private void Save_CanExecute(object? sender, CanExecuteRoutedEventArgs e)
{
    e.CanExecute = (DataContext as MainWindowViewModel)?.SaveCommand.CanExecute(null) == true;
}
Routed commands bubble up the tree if not handled, allowing menu items and toolbars to share command logic.
Source: RoutedCommand.cs.
Avoid blocking the UI thread: use AsyncRelayCommand or a custom ICommand that runs a Task.
using System;
using System.Threading.Tasks;
using System.Windows.Input;

public sealed class AsyncRelayCommand : ICommand
{
    private readonly Func<Task> _execute;
    private readonly Func<bool>? _canExecute;
    private bool _isExecuting;

    public AsyncRelayCommand(Func<Task> execute, Func<bool>? canExecute = null)
    {
        _execute = execute;
        _canExecute = canExecute;
    }

    public bool CanExecute(object? parameter) => !_isExecuting && (_canExecute?.Invoke() ?? true);

    public async void Execute(object? parameter)
    {
        if (!CanExecute(parameter))
            return;

        try
        {
            _isExecuting = true;
            RaiseCanExecuteChanged();
            await _execute();
        }
        finally
        {
            _isExecuting = false;
            RaiseCanExecuteChanged();
        }
    }

    public event EventHandler? CanExecuteChanged;

    public void RaiseCanExecuteChanged() => CanExecuteChanged?.Invoke(this, EventArgs.Empty);
}
The DevTools (F12) Events tab lets you monitor events (PointerPressed, KeyDown). Select an element and toggle the events you want to watch.
Enable input logging:
AppBuilder.Configure<App>()
    .UsePlatformDetect()
    .LogToTrace(LogEventLevel.Debug, new[] { LogArea.Input })
    .StartWithClassicDesktopLifetime(args);
LogArea.Input (source: LogArea.cs) emits detailed input information.
Practice

- Add AddHandler subscriptions for PointerPressedEvent/KeyDownEvent, display the bubbling order, and compare it to the DevTools Events tab.
- Register a Ctrl+Shift+S gesture with HotKeyManager.HotKey (in XAML or via HotKeyManager.SetHotKey), then toggle the button's IsEnabled state and confirm CanExecute updates propagate.
- Use PointerPoint.Properties to track left vs right button drags.
- Pair ReactiveCommand or a toolkit AsyncRelayCommand with a drag Interaction<T> so the view model decides when async work starts.
- Set KeyboardNavigation.TabNavigation="Cycle" on a popup and verify focus returns to the launcher when it closes.

Key source files

- RoutedEvent.cs, RoutingStrategies
- ButtonBase.Command, MenuItem.Command, KeyBinding
- KeyGesture.cs, HotkeyManager.cs
- InputElement.cs, GestureRecognizer.cs
- FocusManager.cs, IKeyboardNavigationHandler
- TextInputMethodClient.cs

Check yourself

- When would you use a KeyBinding versus registering a gesture with HotKeyManager?
- What PointerPoint data is available during drag initiation and why does it matter?
- How do you connect a pointer event to a ReactiveCommand or toolkit command without code-behind?

What's next