# Interface Interaction

Different methods of controlling an interface

@Metadata {
@PageColor(blue)
purpose: link,
label: "General Knowledge")
}

@Comment {
There are universal ways human beings
experience the world. All people have
motivations and build relationships. We all
have abilities and limits to those abilities.
Everyone experiences exclusion as they
interact with our designs. On the other hand,
a solution that works well for someone
who’s blind might also benefit any person
driving a car. Inclusive design works across
a spectrum of related abilities, connecting
different people in similar circumstances.

Designing for people with permanent
disabilities can seem like a significant
constraint, but the resulting designs can
actually benefit a much larger number of
people. For example, closed captioning was
created for the hard-of-hearing community,
but captioning has many other benefits,
such as following video in a crowded
airport or teaching children how to read.

Similarly, high-contrast screen settings
were initially made to benefit people with
vision impairments. But today, many people
benefit from high-contrast settings when
they use a device in bright sunlight. The
same is true for remote controls, automatic
door openers, audiobooks, email, and much
more. Designing with constraints in mind is
simply designing well.
In the [**Interface Perception**](<doc:InterfacePerception>) article we learned that a person perceives information through multiple receptive channels, and that its interpretation is shaped by possible limitations of those channels.

### Interfaces are just inputs and outputs
Moreover, we know that user interfaces not only present data but also receive it. Accepting input is an essential task of any computer: it is what makes the device operable, and operating a computer is what lets its users control computational processes.

Stripped of the sophisticated language, this simply means that a computer is not only a transmitter but also a receiver.

### Touchscreens implement both
Turning to the subject of our interest: people operate modern mobile devices by interacting with their touchscreens. Any interface displayed on the screen consists of elements that are not only informational but also available to be activated.

### Interactive elements of a user interface
Interactive elements of a user interface are called controls. But what if a person cannot operate a smartphone by precisely navigating to a visible control on the screen and then activating it?

## Physical limitations

The definition of controls above implies several pitfalls.

### Visual recognition
First of all, the controls available on the screen can only be distinguished visually, which is a problem: not everyone is able to see, to recognise what they see, or to look at the screen at all.

@Links(visualStyle: detailedGrid) {
- <doc:ColorsAndShapes>
}

### Precise navigation
Then there is activating those controls. To operate a mobile interface, one has to press a particular area of the touchscreen with a finger, which is also a problem.

Not everyone has fingers. Moreover, having fingers doesn't guarantee that their owner is able to use them with the required dexterity.

Regardless of their placement on the screen, graphic controls are still tiny, because mobile devices are tiny; otherwise they wouldn't be mobile.

### Responsiveness
Then there is responsiveness: even the slightest touch on the screen is received as input.

All of that makes operating a touchscreen an unbearable ordeal for people who lack fine motor skills or reliable visual recognition. Shaking hands (tremor), immobile or absent fingers, and blindness all prevent a person from using on-screen controls. So how are people supposed to complete the task of using a smartphone if they simply can't?

## Item selection
Here we are. Navigating to a control and activating it, taken as one integral process, is called item selection. Both selection methods below work with the same set of elements on the screen, but they differ in how the user reaches the control they want to activate.

### Direct selection
Complex graphic user interfaces were originally designed to be used with a pointing device. A pointing device is a mouse, a touchpad, a trackball, a stylus pen, an eye tracker: anything that provides physical input to move a pointer, the symbol on the screen that displays the user's current position in the interface.

Pointing devices allow direct selection: a "free" focusing on an element on the screen by navigating straight to it, with no iteration over other elements.

> Note: Strictly speaking, the iteration still happens, but the process is so fast and subconscious that it is incomparable to explicit iteration.

### Indirect selection
On the other side of operating computers is indirect selection. Indirect selection does not require precise physical aiming: this method steps through every element available on the screen, so the user never has to aim for one. The user either waits for the automatic selection frame to iterate to the item they want to activate, or moves focus to it manually if they are able to perform the focusing action.
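
To make the mechanics tangible, here is a toy Swift sketch of the idea, not any real assistive API: it models the automatic selection frame as plain iteration over a list of elements, with hypothetical element names.

```swift
// A toy model of indirect selection: focus advances through a flat list
// of elements, so activation never requires aiming at the screen.
// Real assistive software walks the platform's accessibility tree instead.
struct SelectionFrame {
    let elements: [String]          // e.g. labels taken from the screen
    private(set) var focusIndex = 0

    // The selection frame moves to the next element,
    // wrapping around after the last one.
    mutating func advance() {
        focusIndex = (focusIndex + 1) % elements.count
    }

    // Activation acts on whatever is currently focused.
    func activate() {
        print("Activated \(elements[focusIndex])")
    }
}

var frame = SelectionFrame(elements: ["Menu", "Margherita", "Pay"])
frame.advance()   // focus moves from "Menu" to "Margherita"
frame.activate()  // prints "Activated Margherita"
```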

### Interface element hierarchy
To implement indirect selection, assistive software takes the elements available on the screen and iterates through them in a particular order.

### Accessibility Tree
For iOS devices, the order is taken from the Accessibility Tree, which represents a hierarchical structure of the accessible elements on the screen. The iteration goes top to bottom, and left to right or right to left depending on the language of the interface.
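
As an illustration, here is a minimal UIKit sketch (the view and its contents are hypothetical) of taking control over that order: the `accessibilityElements` array defines the sequence in which assistive technologies iterate a container's children, independently of the visual layout.

```swift
import UIKit

// A hypothetical checkout view whose iteration order is set explicitly.
final class CheckoutView: UIView {
    private let dishLabel = UILabel()
    private let priceLabel = UILabel()
    private let payButton = UIButton(type: .system)

    override init(frame: CGRect) {
        super.init(frame: frame)
        dishLabel.text = "Margherita"
        priceLabel.text = "$12"
        payButton.setTitle("Pay", for: .normal)
        [dishLabel, priceLabel, payButton].forEach(addSubview)

        // Dish name first, then price, then the action:
        // this is the order indirect selection will follow.
        accessibilityElements = [dishLabel, priceLabel, payButton]
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```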

## Supporting indirect selection
And this is the most important thing we have to consider when designing an accessible interface. To enable people to use your application with indirect selection, you have to ensure that all controls are accessible and navigable. It is also crucial to be aware of the order of elements in your interface, because indirect selection is essentially what VoiceOver is: screen readers read elements in order, so structure them so that the semantic model is comprehensible without having to perceive the interface layout visually.
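
A minimal SwiftUI sketch of the same principle, assuming a hypothetical order screen and strings: every control carries a label, and sort priorities nudge the reading order so the semantic model survives without the visual layout.

```swift
import SwiftUI

struct OrderView: View {
    var body: some View {
        VStack {
            Text("Margherita")
                .accessibilitySortPriority(2)   // announced first
            Text("$12")
                .accessibilitySortPriority(1)   // announced second
            Button("Pay") {
                // place the order
            }
            .accessibilityLabel("Pay 12 dollars") // explicit label for the control
            .accessibilitySortPriority(0)         // announced last
        }
        // Treat the stack as one container whose children are
        // iterated in the priority order defined above.
        .accessibilityElement(children: .contain)
    }
}
```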

To learn how to support indirect selection, and thus enable the users of the assistive technologies that rely on it, take a look at the following articles:

@Links(visualStyle: detailedGrid) {
- <doc:VoiceOver>
- <doc:SwitchControl>
- <doc:Navigation>
- <doc:OnScreen-Navigation>
- <doc:Between-ScreensNavigation>
- <doc:AccessibilityTree>
}

### Have fun!

## See Also
- <doc:InterfacePerception>
- <doc:AccessibleUI>
- <doc:AccessibleUX>
# Interface Perception

A quick dip back into the primordial soup to understand how everyone experiences the same world differently

Albinism often comes with **congenital visual disability**
}

> Important: People who **were born** without the ability to use a particular sense, for example those who are *congenitally* completely **blind** or **deaf**, have no notion of visual or auditory forms of information in the way sighted or hearing people perceive them. **Their cognitive *image* of the world is different.**

### Obtained impairments
Being born with a **fully functional sensory system does not guarantee the integrity of perception forever**. Typical perception may be altered **temporarily** or **permanently** during life as a result of **changes within the receptive or processing organs**.


@Image(source: magnifying-glass, alt: "") {
It is natural for people to experience **vision loss** from **aging-related degradation of sensory organs**
}
}

## Ok. How does all this information help me create accessible products?
The *most* important point to take from this page (after the fact that no one is safe from losing the ability to sense, of course) is that **the meaning behind any content is the same regardless of the form in which it is perceived**.

@Image(source: placeholder-image, alt: "")


For example, a mobile application implementing pizza delivery solves the same task **with** and **without** a graphic interface **available to be perceived**. The **experience** of placing an order will differ, but the **mental model** of the process is the same for everyone, because it was *designed* that way.

As *accessibility experts*, our **goal** is to create interfaces that are **perceivable in different conditions**, i.e. whose **content** is as independent as possible of its **form**.
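
As a small illustration of content staying independent of form, here is a hedged SwiftUI sketch (the view, symbol, and strings are hypothetical): the visual form of the action is an icon, but its meaning is carried by the accessibility attributes, so a screen reader user builds the same mental model of the action.

```swift
import SwiftUI

struct OrderButton: View {
    var body: some View {
        Button {
            // place the order
        } label: {
            // The purely visual form of the action.
            Image(systemName: "takeoutbag.and.cup.and.straw")
        }
        // The meaning of the action, independent of the icon.
        .accessibilityLabel("Order pizza")
        .accessibilityHint("Places your delivery order")
    }
}
```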

## See Also
- <doc:VoiceOver>
- <doc:AccessibleUI>
- <doc:AccessibleUX>
- <doc:ColorsAndShapes>
- <doc:AlternativeDescription>

