Observation: One of the great things about the mouse as an input device is the idea of "intent" - that your cursor indicates your locus of attention during interaction (roll-over states and tool-tips being trivial expressions of this).
You get a simpler version of this semantic with the Blackberry trackball (and the Blackberry jog wheel before it), and a crappier version with things like navigating a TV EPG (guide) with a remote control, or menus with the arrow keys on your cell phone.
Touch interaction systems, like the iPhone, lack that model completely - just like most older (read: hardware-only) consumer electronics UIs (think VCR or DVD player).
In some cases, that really doesn't matter much... and in other cases, the directness of interaction provides a far better paradigm... but it suggests the question: is "intent" a semantic that will disappear for touch UIs? Or is it a temporarily "lost" item, like tactile feedback - just a gap to be crossed?