Some very interesting research into touch UI from Microsoft Research, University of Toronto, and the good folks at Mitsubishi (MERL's been doing some great work) illustrates how to improve the precision and efficacy of touch screen computing. This isn't strictly a technology problem (touch screens are pretty accurate) - it's a human factors problem (an affordance issue).
I wrote about this a while ago - the mouse is pretty accurate, but one of the significant reasons I think it succeeded as an "intuitive" input device was that it created an interface paradigm that allowed "intent".
Touch screens allow us to create programmable input devices (the hardware becomes "soft" - the rest is just wiring) - I don't think it's tactility that makes them intriguing.
While the article posits that they solve the "fat fingering" problem by allowing the interaction to happen "above" your fingers - that is, you can touch the front *and* back of the screen - I'll posit that it's actually the recapturing of *intent* in the interaction flow that makes the difference here.
Judge for yourself:
In any case, pretty cool.