Issue 1: Touch is not omnipresent

What makes the touch interface so compelling on the iPhone and on quality copycats such as the Palm Pre is that touch gestures are a fundamental part of the operating system and the applications. Just as using a mouse is fundamental and universal in Windows and Mac OS X, touch gestures are universal on the iPhone, Palm Pre, and so on. This means the user interfaces are designed with touch at the core, and they typically work intuitively as you put your finger to the screen.
I expected the same level of universality in the new Windows 7, given that Microsoft has trumpeted its Surface research for years now, but it simply does not exist. And Mac OS X does no better, despite Apple's pioneering use of touch in the mobile context.
What you get instead is the mapping of mouse functions to the touchscreen or trackpad, so that, in essence, your finger becomes a mouse. (Tapping your finger acts like clicking a mouse button.) There's nothing wrong with that approach, but you already have a mouse, so why switch to your finger? That's where the real issue of touch on today's desktops comes up: There's just not that much you can do with touch.
First off, the finger rarely makes a better mouse than a mouse -- it's harder to be precise with a finger on the touchscreens used in the Dell and HP systems. (The Mac trackpad is a bit more precise, but still no mouse.)
Second, the gestures that Windows 7 and Mac OS X Snow Leopard support beyond basic mouse actions are few: rotate, scroll, zoom in, and zoom out. What's more, even these gestures are rarely available. You can tap (click) and scroll with your fingers universally in the OS and in apps, but beyond that, touch support gets very dicey very fast.
Windows 7 does allow you to zoom in and out of folder views and assign touch shortcuts (called "flicks") for common actions such as copy, paste, and undo; these shortcuts work across the OS and applications. (Mac OS X doesn't do either.)
Of course, most of the touch capabilities are not new to Windows 7; they're simply the pen-input capabilities Windows has long had, now working via a touchscreen. But Windows 7 does add a few touch-specific gestures: It copies the iPhone's pinch and expand gestures for zooming, as well as the iPhone's rotation gesture. And it adds a unique two-finger gesture for opening a contextual menu (hold one finger on the object and tap a second finger near it).
But for touch gestures to work in applications, the software developer usually has to explicitly add touch support. And very few developers have done so, even though Microsoft made the touch SDK available to all developers a year ago.