UWP and the evolution of touch development

How is programming for touch on the Universal Windows Platform (UWP) different from mouse and keyboard development in Windows Forms? This post will cover some of the subtle differences between the two and how to use the most advanced tools for building smooth touch experiences.

Fig 1. Command prompt

The ways we communicate with our computers have gone through many changes over the years. For a long time, interactions were governed by the keyboard. Then the graphical user interface (GUI) came along, which not only introduced the mouse, but drastically altered how home screens and apps looked. The success of Windows 95 and Mac OS cemented that user experience, popularized personal computing and changed our computing landscape.



Fig 2. Graphical user interface in Windows Forms

While natural user interfaces (NUI) have been around for a long time, the transition to touch interfaces has happened both more gradually and more quietly. In part, this is because it occurred hand-in-hand with the growth of smartphone-based mobile computing, and we have a tendency not to see our phones as centers of computing power, but rather as accessories or appliances. In part, though, the smooth transition occurred because the creators of development tools such as Visual Studio made a conscious effort to protect developers from these changes by making touch and mouse (or tap and click) appear to be the same thing. As a consequence, we moved from a mouse-centric development world to a touch-friendly development world without really noticing the tectonic shift that occurred beneath our feet.

What does touch-friendly mean?


When touch-enabled tablets first arrived for Windows, you pretty much just used your finger as if it were a mouse to manipulate applications designed for the mouse. It was a clumsy experience: buttons tended to be too small for fingers, and the visual affordances designed for mouse interactions were hidden beneath them. Even if your device supported touch, the apps running on it continued to be mouse-centric. Similarly, early mobile devices came with a stylus because a stylus could manipulate those tiny buttons.



Fig 3. Natural user interface in Windows Phone

Phone apps mark the transition from mouse-centric to touch-friendly development. Phone apps weren’t just touch-first, of course; they were touch-only. They forced changes in the design, layout and controls used in apps to make interaction easier for smartphone users, and those changes were eventually incorporated into UWP. While UWP supports both touch and mouse interactions, as well as a variety of other inputs, when you develop apps and controls for it you should think of it first as a touch interface and add mouse interactions second.

There are many overlaps between touch gestures and mouse interactions that make this easier. For instance, when you tap on a UWP button or click on it with the mouse, both the Tapped event and the Click event are raised. Similarly, the underlying pointer events do not differentiate between touch and mouse. Given the amount of effort that has gone into blurring the difference between touch and mouse in UWP, it is worth asking ourselves: when is a click not a click?

The age-old question: Click or Tapped?


The Button control is an interesting island of GUI behavior in a touch-first world. If you double-click on a button in the Visual Studio designer, Visual Studio will wire up a Click event handler for you in XAML and in code-behind. This is there for backwards compatibility since, as pointed out above, touch and mouse interactions will both raise Click as well as Tapped events.

In order to develop in a touch-first way, however, you should handle the Tapped event rather than the Click event. This becomes important when you need to distinguish Tapped from other touch events like DoubleTapped and Holding (the latter cannot even be emulated with a mouse).



<Button Content="My Button"
        Click="Button_Click"
        Tapped="Button_Tapped"
        RightTapped="Button_RightTapped"
        DoubleTapped="Button_DoubleTapped"
        Holding="Button_Holding"
        ManipulationMode="All"
        ManipulationStarted="Button_ManipulationStarted"
        ManipulationCompleted="Button_ManipulationCompleted" />
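The code-behind for those handlers might look something like the following sketch. The page class name and the handler bodies are illustrative assumptions; only the signatures are dictated by the events themselves.

using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Input;

// Sketch of the code-behind for the handlers wired up in the XAML above.
// The page name and the handler bodies are placeholders for illustration.
public sealed partial class MainPage : Page
{
    public MainPage()
    {
        this.InitializeComponent();
    }

    // Raised for both mouse clicks and touch taps (kept for backwards compatibility).
    private void Button_Click(object sender, RoutedEventArgs e) { }

    // The touch-first counterpart of Click.
    private void Button_Tapped(object sender, TappedRoutedEventArgs e) { }

    // Right mouse button, pen barrel button, or a touch press-and-hold release.
    private void Button_RightTapped(object sender, RightTappedRoutedEventArgs e) { }

    private void Button_DoubleTapped(object sender, DoubleTappedRoutedEventArgs e) { }

    // Holding is raised for touch and pen, but never for the mouse.
    private void Button_Holding(object sender, HoldingRoutedEventArgs e)
    {
        if (e.HoldingState == Windows.UI.Input.HoldingState.Started)
        {
            // Show a context flyout or start a press-and-hold action here.
        }
    }

    private void Button_ManipulationStarted(object sender, ManipulationStartedRoutedEventArgs e) { }

    private void Button_ManipulationCompleted(object sender, ManipulationCompletedRoutedEventArgs e) { }
}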



Once you have confirmed that your app works well for touch, you should add interactions for the mouse and other input vectors. For instance, you can capture clicks of the right mouse button by handling the oddly named RightTapped event.



<VisualStateManager.VisualStateGroups>
    <VisualStateGroup x:Name="CommonStates">
        <VisualState x:Name="Normal" />
        <VisualState x:Name="PointerOver" />
        <VisualState x:Name="Pressed" />
    </VisualStateGroup>
</VisualStateManager.VisualStateGroups>



More importantly, you should also provide visual states for the hover (PointerOver) and Pressed conditions. In the touch world, these are not particularly useful, since your fingers will typically occlude the state animations. Visual affordances and feedback can be extremely useful, however, for mouse, pen and touchpad interactions that do not block the user’s view.
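If you build your own templated control, those states are typically driven from pointer events in code. The following is a minimal sketch using a hypothetical control class; it is not the built-in Button template, just an illustration of how VisualStateManager.GoToState maps pointer activity onto the states declared above.

using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Input;

// Hypothetical templated control that walks through the Normal, PointerOver
// and Pressed states in response to pointer events.
public class HoverAwareButton : Control
{
    public HoverAwareButton()
    {
        DefaultStyleKey = typeof(HoverAwareButton);
    }

    protected override void OnPointerEntered(PointerRoutedEventArgs e)
    {
        base.OnPointerEntered(e);
        VisualStateManager.GoToState(this, "PointerOver", true);
    }

    protected override void OnPointerPressed(PointerRoutedEventArgs e)
    {
        base.OnPointerPressed(e);
        VisualStateManager.GoToState(this, "Pressed", true);
    }

    protected override void OnPointerReleased(PointerRoutedEventArgs e)
    {
        base.OnPointerReleased(e);
        VisualStateManager.GoToState(this, "PointerOver", true);
    }

    protected override void OnPointerExited(PointerRoutedEventArgs e)
    {
        base.OnPointerExited(e);
        VisualStateManager.GoToState(this, "Normal", true);
    }
}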

Swiping right: online and offline gestures


Touch gestures come in two varieties. There are direct manipulations of objects, which happen immediately as you move one or more fingers across the screen; these are sometimes also known as online gestures. The other kind of gesture is only recognized once the user completes a series of steps, such as dragging an object from its starting position past a required offset (a swipe, for example). These gestures are easy to identify because they involve thresholds: if the user doesn’t move the object far enough, the gesture fails a distance threshold; if the movement isn’t completed within a set amount of time, it fails a time threshold. These are also known as offline gestures, as the sketch below illustrates.
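As a rough illustration of the offline idea, a swipe can be recognized by checking thresholds when a manipulation completes. The element name, handler name and threshold values below are assumptions made for the sake of the sketch; they are not part of the platform.

// Sketch: recognize a horizontal swipe only if it passes a distance threshold
// and a velocity (time) threshold. Assumes an element with
// ManipulationMode="TranslateX" and this handler attached to its
// ManipulationCompleted event.
private void SwipeTarget_ManipulationCompleted(object sender, ManipulationCompletedRoutedEventArgs e)
{
    const double DistanceThreshold = 150;  // device-independent pixels (assumed value)
    const double VelocityThreshold = 0.4;  // pixels per millisecond (assumed value)

    double dx = e.Cumulative.Translation.X;  // total horizontal travel
    double vx = e.Velocities.Linear.X;       // horizontal velocity at release

    if (Math.Abs(dx) > DistanceThreshold && Math.Abs(vx) > VelocityThreshold)
    {
        bool swipedRight = dx > 0;
        // The offline gesture succeeded; react to the swipe here.
    }
}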



<Canvas ManipulationMode="None" Margin="0,12,0,0" MinHeight="400">
    <Border x:Name="manipulateMe"
            ManipulationMode="All"
            ManipulationDelta="ManipulateMe_ManipulationDelta"
            Background="LightGray" Height="200" Width="200"
            HorizontalAlignment="Left"
            VerticalAlignment="Top" />
</Canvas>



Direct manipulations can be implemented in XAML by taking advantage of the various UIElement manipulation events. While the ManipulationStarting event can be useful, the only event you really need to handle is ManipulationDelta. You will also want to set the ManipulationMode property, which determines what sorts of manipulation are allowed, e.g. Scale, Rotate, TranslateY, TranslateX; setting it to All allows any kind of direct manipulation on the target element.
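If you would rather configure this from code than from XAML, ManipulationModes is a flags enum, so the allowed manipulations can be combined. A small sketch, assuming the same manipulateMe element as above; the helper that follows then sets up the transforms that the manipulation events will update.

// Allow translation and rotation, but not scaling, on the target element.
manipulateMe.ManipulationMode = ManipulationModes.TranslateX
                              | ManipulationModes.TranslateY
                              | ManipulationModes.Rotate;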



private TransformGroup transforms;
private MatrixTransform previousTransform;
private CompositeTransform deltaTransform;

private void InitManipulationTransforms()
{
    transforms = new TransformGroup();
    previousTransform = new MatrixTransform() { Matrix = Matrix.Identity };
    deltaTransform = new CompositeTransform();

    transforms.Children.Add(previousTransform);
    transforms.Children.Add(deltaTransform);

    // Set the render transform on the element being manipulated
    manipulateMe.RenderTransform = transforms;
}



To move and rotate the target UIElement you need to keep track of the transforms being applied to it.



void ManipulateMe_ManipulationDelta(object sender, ManipulationDeltaRoutedEventArgs e)
{
    previousTransform.Matrix = transforms.Value;

    // Get center point for rotation
    Point center = previousTransform.TransformPoint(new Point(e.Position.X, e.Position.Y));
    deltaTransform.CenterX = center.X;
    deltaTransform.CenterY = center.Y;

    // Look at the Delta property of the ManipulationDeltaRoutedEventArgs to retrieve
    // the rotation, scale, X, and Y changes
    deltaTransform.Rotation = e.Delta.Rotation;
    deltaTransform.TranslateX = e.Delta.Translation.X;
    deltaTransform.TranslateY = e.Delta.Translation.Y;
}



Then, every time the ManipulationDelta event indicates that the user is trying to move the element, you update the element with the new information.
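To tie the pieces together, the transforms need to be initialized before the first manipulation arrives. A minimal sketch, assuming the page class name:

public MainPage()
{
    this.InitializeComponent();

    // Build the transform group up front; the ManipulationDelta handler
    // itself is already attached in the XAML above.
    InitManipulationTransforms();
}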

Making touch manipulations smooth as butter


Today’s users expect touch-driven content to be smooth and immediate. Handling ManipulationDelta events can’t completely guarantee this, because they are processed on the UI thread. When heavy processing is going on there, users may experience lag in response to their touch gestures.

To get the best touch experience, it is important to move touch processing off of the UI thread by using the InteractionTracker class. InteractionTracker was introduced with the Windows.UI.Composition APIs and allows interactions and visual feedback to be handled at a much lower level than previous UI programming models in UWP. In order to drive composition animations with the InteractionTracker, you need to associate it with a VisualInteractionSource. In the sample code below, the source is a backing visual for the root element of the page.



_tracker = InteractionTracker.Create(_compositor);

var interactionSource = VisualInteractionSource.Create(viewportVisual);

interactionSource.PositionXSourceMode = InteractionSourceMode.EnabledWithInertia;
interactionSource.PositionYSourceMode = InteractionSourceMode.EnabledWithInertia;

_tracker.InteractionSources.Add(interactionSource);

// bind the InteractionTracker outputs to the contentVisual.
var positionExpression = _compositor.CreateExpressionAnimation("-tracker.Position");
positionExpression.SetReferenceParameter("tracker", _tracker);
contentVisual.StartAnimation("Offset", positionExpression);
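The backing visuals themselves (viewportVisual and contentVisual above) are typically obtained with ElementCompositionPreview.GetElementVisual. To actually hand touch input over to the tracker, pointer presses on the corresponding XAML element are redirected to the VisualInteractionSource. The element name (root) and the _interactionSource field in this sketch are assumptions.

// Redirect touch input to the composition system so the InteractionTracker
// can process the manipulation off the UI thread. Attach this handler to the
// PointerPressed event of the element whose backing visual was passed to
// VisualInteractionSource.Create above.
private void Root_PointerPressed(object sender, PointerRoutedEventArgs e)
{
    if (e.Pointer.PointerDeviceType == Windows.Devices.Input.PointerDeviceType.Touch)
    {
        // Mouse and pen input can stay on the UI thread; only touch is redirected.
        _interactionSource.TryRedirectForManipulation(e.GetCurrentPoint(root));
    }
}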



For an even more sophisticated implementation of low-latency touch input, take a look at the Windows composition samples on GitHub, which include further InteractionTracker-driven input scenarios.

Closing the circle


It’s taken a long time to move from DOS prompts to the natural user interface. One of the peculiarities of all these user experience changes is that, rather than supplanting one another, the newer interaction models simply supplement the older ones. The keyboard has never gone away, and, like a hipster vinyl revival, the enduring popularity of command-line shells such as PowerShell on Windows suggests that command prompts won’t either. For more information, see the official UWP documentation on touch interactions.


