This morning I watched Jensen Harris, Director of Program Management for the Windows User Experience, discuss 8 traits of great Metro style apps at BUILD2011 (above). It’s a compelling 90-minute presentation and, as Gruber points out, required viewing for anyone interested in the Metro interface. One thing is clear: Harris’s team has put a lot of thought into Metro and believes in it.
Of course, Harris demonstrated a developer build on carefully chosen hardware. Features and functions demonstrated on the 13th, when the video was recorded, may or may not appear in the final release (or in the same way). Still, I got excited about two features born of the team’s examination of “the language of touch.”
At 29'11", Harris notes that the mouse and touch are fundamentally different input methods. “Touch is just different than a mouse cursor,” he said, and he’s right. A slide asked, “Do we have a language of touch?” His answer is that a touch language is being written, but on the back of old knowledge:
“We realized early on in working on Windows 8 that pretty much what everyone was doing was moving the mouse cursor around with their finger. We originally had a command line interface which we all understood, and then we moved to GUI at some point, and that had its own set of paradigms: click and double-click and right-click. But, the new world of touch was evolving to be just the same as GUI. We map right-click to press-and-hold for instance…but we really weren’t taking advantage of touch.”
Harris then demonstrated how legacy mouse behaviors persist in existing touch-based OSs, to their detriment. For example, he held a tablet running Windows 8 with a number of apps installed. Each app is represented by a panel (or tile), which, for the sake of argument, you can compare to an iOS app icon. Like icons, panels are arranged across several screens.
To move a panel from the first screen to the last, Harris grabbed the panel, dragged it to the edge of the screen and waited while screen after screen of panels slowly passed by until he reached the end. “That’s the same thing you do with a mouse cursor,” he said. “Pick it up and it moves.”
Next, he picked up the panel with one finger, held it in place and quickly swiped between screens with another finger. It was fast, clever and certainly a “touch-centric” way to move that tile.
Later in the demo, Harris showed a way to deal with long lists. Instead of panning with swipe after swipe, he used a reverse pinch (Microsoft calls it semantic zoom) to resize the list itself so that it fits comfortably on-screen. Again, the gesture replaces mouse behavior with a touch-centric approach. I especially liked the behavior in the calendar app.
The whole session is worth watching. Windows 8 is very young, but Microsoft seems determined to shake the Etch-A-Sketch hard and start over. Good for them.