Accessibility Barriers

Physical Flexibility, Dexterity, and Precision Independence

The User Reality: For users with limited arm mobility, static UI elements and fixed menus represent a fundamental barrier to interaction. When a user cannot extend their arm to reach the edges of a display, they are forced to rely on wrist articulation alone, which cannot generate the angles or distances required for precise control, such as aiming or launching projectiles. Fine motor actions like "pinching," "twisting," or finding a specific button without looking are often impossible, while repeated inputs (button-mashing) and "quick time events" are unusable for users with motor challenges. Yet inputs are often limited to those designed by the developer, with no alternative methods provided, leaving users with no "Plan B" if they cannot perform a specific gesture.

Research:

The Openality Standard

  • Constraints:
    • Interaction must not depend on specific wrist angles or complex limb rotations to aim or select targets.
    • Interaction must not rely on precise finger placement (pinching), rapid repetition (button-mashing), or "blind" button finding.
  • Requirements:
    • Replace demanding physical actions (pulling, twisting) with abstract, low-effort inputs (thumbstick movement, button presses).
    • Provide alternative, multi-modal options for key mechanics (e.g., gaze-assisted aiming) to accommodate varying physical capabilities.
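The multi-modal requirement above can be sketched as a logical action that accepts any of several bound input modes, so no single gesture is ever mandatory. This is an illustrative sketch only; the `Action` class and the mode names are assumptions, not an existing API:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Action:
    """One logical action ("throw") backed by interchangeable input modes."""
    name: str
    handlers: dict[str, Callable[[], bool]] = field(default_factory=dict)

    def bind(self, mode: str, check: Callable[[], bool]) -> None:
        # Register an alternative way to trigger this action.
        self.handlers[mode] = check

    def triggered(self) -> bool:
        # Any one mode is sufficient -- the user always has a "Plan B".
        return any(check() for check in self.handlers.values())

throw = Action("throw")
throw.bind("gesture", lambda: False)     # physical pull-back gesture (unavailable)
throw.bind("thumbstick", lambda: True)   # low-effort thumbstick alternative
throw.bind("gaze_dwell", lambda: False)  # gaze-assisted alternative

assert throw.triggered()  # the thumbstick alternative fires the action
```

The key design point is that bindings are additive: adding an accessible mode never removes the original gesture for users who prefer it.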

Core Behaviours

  • Gesture Abstraction - Complex physical interactions should be abstracted into simple inputs, allowing users to perform the same actions without needing to mimic the physical motion.
  • Dynamic Anchoring - The interface should be capable of "summoning" itself to the user’s immediate reach zone, eliminating the need for physical traversal to access important UI elements.
  • Gaze-Dwell Activation - Users should be able to activate UI elements by looking at them for a short period, providing an alternative to physical button presses for selection.
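The gaze-dwell behaviour above amounts to a per-frame timer that fires once the gaze has rested on an element long enough, and resets on look-away. A minimal sketch, with hypothetical names (`GazeDwellButton`, `update`):

```python
class GazeDwellButton:
    """Activates after the gaze rests on it for `dwell_time` seconds."""

    def __init__(self, dwell_time: float = 1.0):
        self.dwell_time = dwell_time
        self._gazed_for = 0.0
        self.activated = False

    def update(self, is_gazed_at: bool, dt: float) -> None:
        # Call once per frame with the gaze state and frame delta-time.
        if is_gazed_at:
            self._gazed_for += dt
            if self._gazed_for >= self.dwell_time:
                self.activated = True
        else:
            # Looking away resets the timer, so brief glances
            # never cause accidental activations.
            self._gazed_for = 0.0

button = GazeDwellButton(dwell_time=0.5)
for _ in range(60):          # one second of steady gaze at 60 fps
    button.update(True, 1 / 60)
assert button.activated
```

In practice the dwell time would be user-configurable, since the comfortable duration varies widely between users.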

Primary Interaction Patterns

  • The Magic Slingshot - A thumbstick-based input method that remaps complex physical gestures (e.g., pulling back to throw) to simple thumbstick movements, enabling users with limited mobility to perform the same actions without physical strain.
  • The Gaze Cursor - A gaze-assisted aiming system that allows users to target objects by looking at them, reducing reliance on precise hand movements.
  • The UI Summoner - A system that allows users to summon menus and UI elements to their immediate reach zone with a simple gesture or button press, eliminating the need for physical traversal to access important interfaces.
  • The Auto Grip System - Automatically maintains a "grip" state without requiring sustained button pressure, allowing users to grab and hold objects with a single tap.
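As one illustration of the Magic Slingshot remap, the pull-back gesture can be replaced by a thumbstick deflection: pull direction sets the launch angle and pull magnitude sets the power. The function below is a hypothetical sketch, assuming thumbstick axes normalised to [-1, 1]:

```python
import math

def slingshot_from_thumbstick(x: float, y: float,
                              max_power: float = 20.0) -> tuple[float, float]:
    """Return (angle_degrees, power) for a launch opposite the pull."""
    magnitude = min(math.hypot(x, y), 1.0)  # clamp to unit deflection
    # Launch opposite the pull direction, as with a physical slingshot.
    angle = math.degrees(math.atan2(-y, -x))
    return angle, magnitude * max_power

# Pulling the stick straight down launches straight up at full power.
angle, power = slingshot_from_thumbstick(0.0, -1.0)
assert abs(angle - 90.0) < 1e-9 and power == 20.0
```

Because the mapping uses only direction and magnitude, it requires no wrist rotation, sustained pressure, or rapid repetition, satisfying the constraints listed under the Openality Standard.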