Several Adobe Creative Suite 4 apps, such as Illustrator and InDesign, support some Mac OS X gestures such as rotate -- but the Windows versions do not. In Windows, Internet Explorer 8 supports zooming via gestures; on the Mac, Safari 4 does, too. But Firefox 3.5 doesn't support gesture-based zoom on either OS. And even within IE8 and Safari, the touch gesture support doesn't extend to Flash, PDF, and other objects that may be embedded in a Web page.
In Windows 7, one touch gesture that developers can get "for free" is zooming, since Windows 7 maps the pinch and zoom touch gestures to scrollwheel zooming; that means an app that has been enabled for zooming via a mouse's scrollwheel (such as IE8) is also enabled for touch zoom. Likewise, if an app has been enabled for horizontal scrolling, it's automatically enabled to support Windows 7's new two-finger swipe gesture for horizontal scrolling.
But until all apps are designed to support touch gestures, and the OS makes more use of them (as the iPhone OS does), it's simply easier to stick with the mouse because you know it works everywhere.
Issue 2: PC UIs aren't finger-friendly
In using a Dell Studio One desktop and an HP TouchSmart desktop -- whose touchscreens, based on NextWindow's technology, are quite responsive -- I found another limitation to the adoption of touch technology in its current guise: The Windows UI really isn't touch-friendly. A fingertip is a lot bigger than a mouse pointer or pen tip, so it's not as adept at making fine movements.
Also, on a touchscreen, your hand and arm obscure your view of where your fingertip actually is, making it hard to touch the intended radio button, close box, slider, or what-have-you. It doesn't help that these elements are often small. And there's no tactile feel to substitute for the lost visual feedback.
But the issues of using touch gestures go beyond the visibility and size of UI controls. The way the controls work is often not finger-friendly. Take as an example Windows 7's wireless LAN setup. It has some big buttons to select a desired network, so it's natural to just press the desired one. And sometimes that works, but often these visual buttons are really the equivalent of radio buttons -- item selectors -- and you then have to tap the Next button. That's not the kind of direct manipulation that touch assumes. When you work with something with your hands, the manipulation is direct. But most apps are designed for interaction with keyboards and mice, and aren't so direct (to prevent accidental selections and the like, since it's really easy to move a mouse unintentionally).
The result is that using touch is often an awkward process. Unlike iPhone apps, Windows and Mac OS X apps weren't designed for touch, and neither the OSes nor the apps adjust themselves to this input method.