UX Framework

Multimodal Consistency

Core Behaviour

Logic

Visual, auditory, and haptic feedback must be synchronised to avoid contradictory information.

Addresses

The "Split-Attention Effect," where mismatched sensory cues (e.g., audio not matching visual events) trigger distress and increase cognitive effort.

Implementation Specification

Cues are "redundant and aligned" - for example, a button glows, clicks, and vibrates simultaneously - to reinforce a single interaction.
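The "redundant and aligned" rule can be made concrete by bundling all three channels behind one trigger, so no channel can fire without the others. This is a minimal sketch, not part of the framework itself; the `FeedbackBundle` name and the callback-per-channel design are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class FeedbackBundle:
    """Groups the visual, auditory, and haptic cues for one interaction
    so they are always dispatched together. (Hypothetical helper.)"""
    visual: Callable[[], None]
    audio: Callable[[], None]
    haptic: Callable[[], None]

    def fire(self) -> None:
        # Dispatch every channel in the same update step, so a lagging
        # channel can never contradict the others.
        self.visual()
        self.audio()
        self.haptic()

# Example: a button press that glows, clicks, and vibrates as one event.
button_press = FeedbackBundle(
    visual=lambda: print("glow"),
    audio=lambda: print("click"),
    haptic=lambda: print("vibrate"),
)
```

Calling `button_press.fire()` is then the only way any of the three cues can occur, which is what keeps them redundant and aligned by construction.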

Interaction Patterns

The Auto Grip System applies Multimodal Consistency by synchronising visual, auditory, and haptic feedback when users interact with objects, so each grab registers as a single, unambiguous event rather than a set of competing cues.

The Magic Slingshot synchronises visual, auditory, and haptic feedback as users pull back and release, so the draw and launch read as one cohesive action and sensory overload is minimised.

The UI Summoner can apply Multimodal Consistency by pairing the interface's appearance with matching auditory and haptic cues, so summoning reads as one seamless, intuitive interaction and sensory confusion is reduced.

Accessibility Barriers

Sensory Regulation & Environmental Control
