> The mouse thing will be an issue. Touch devices currently offer
> nowhere near the precision positioning required for (say) drawing
> schematics and laying out PC boards. [...] When that problem is
> solved, then [...]. But there is no progress currently being made to
> solve it, nothing on the drawing boards to address it (and no, the
> free-space gesture stuff doesn't even get close)...not even any IDEAS
> on how to address it.
Ideas there are. Heck, I've got two, or maybe three, and that's with
maybe a minute of thought. (One: zoom in with a snap-to grid
corresponding to whatever precision is needed; two: displace the touch
point from the active point so that the user can see precisely what is
being addressed; three: scale down, moving the hotspot by less than the
point-of-touch - a bit like zoom, only reversed. Probably some
combination would be best.)
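
For concreteness, here's a rough sketch of how those three ideas might
combine. The offset, the scale factor, and the grid pitch are made-up
numbers, and nothing here comes from any real touchscreen driver; it
just maps a raw finger position to the point actually being addressed:

/* Sketch of the three ideas combined: a working point displaced from
 * the finger (two), finger motion scaled down (three), and a snap-to
 * grid at the required precision (one).  All constants are invented
 * for illustration. */
#include <stdio.h>

#define CURSOR_OFFSET_Y 40  /* working point sits 40 units above the finger */
#define MOTION_SCALE     4  /* finger moves 4 units per unit of working point */
#define GRID_PITCH       5  /* snap-to grid pitch, in drawing units */

struct point { int x, y; };

/* Map the current finger position to the point being addressed,
 * given where the finger first came down. */
static struct point touch_to_hotspot(struct point touch, struct point touch_start)
{
    struct point start, p;

    /* idea two: the working point starts a fixed distance above the
     * finger, so the finger never hides it */
    start.x = touch_start.x;
    start.y = touch_start.y - CURSOR_OFFSET_Y;

    /* idea three: since the finger came down, the working point moves
     * by only a fraction of the finger's motion */
    p.x = start.x + (touch.x - touch_start.x) / MOTION_SCALE;
    p.y = start.y + (touch.y - touch_start.y) / MOTION_SCALE;

    /* idea one: snap to the precision grid (nearest multiple; this
     * rounding is only right for non-negative coordinates) */
    p.x = ((p.x + GRID_PITCH / 2) / GRID_PITCH) * GRID_PITCH;
    p.y = ((p.y + GRID_PITCH / 2) / GRID_PITCH) * GRID_PITCH;

    return p;
}

int main(void)
{
    struct point down = { 200, 300 };  /* where the finger came down */
    struct point now  = { 223, 341 };  /* where the finger is after a drag */
    struct point hot  = touch_to_hotspot(now, down);

    printf("addressing (%d,%d)\n", hot.x, hot.y);
    return 0;
}

With those made-up numbers, a 23-by-41-unit finger drag moves the
addressed point by only 5 by 10 units before snapping - which is the
kind of precision gain the schematic/board-layout case wants.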
Progress currently being made? What reason do you have to think you
would necessarily be aware of any such progress? For all you know I
could be making such progress right now. (I'm not, but all it would
take would be for me to get documented touchscreen hardware - something
you wouldn't necessarily hear about.)
What I really think it will take is for someone to completely redesign
the software - schematic capture and board layout, in your example -
for a touchscreen interface, rather than trying to perver^Wadapt
mouse-based software for touchscreen use.
/~\ The ASCII                     Mouse
\ / Ribbon Campaign
 X  Against HTML                  mouse at rodents-montreal.org
/ \ Email!       7D C8 61 52 5D E7 2D 39 4E F1 31 3E E8 B3 27 4B