On Jun 4, 2013, at 2:16 PM, Mouse <mouse at rodents-montreal.org> wrote:
> There are UI tasks keyboards and mice suck at, but people put up
> with them because that's what they've got. (Indeed, I've seen
> cases where people use mice when a keyboard would work better, or
> vice versa, for reasons I can only guess at.) Back in the '80s, I
> had the good fortune to get to use a tablet input device -
> basically, think touchscreen, but input-only, and with a (wired)
> pen-like device rather than a fingertip selecting the active input
> point; alternatively, think laptop touchpad "mouse" grown up to a
> size on the order of one foot square, with the pen-style device
> needed because at that scale you can't really help resting your
> hand/arm on the thing. I've sometimes wanted to get myself one for
> tasks it does better than a mouse-and-keyboard for.
Lots and lots of webcomic artists use them (though the pens are not
usually wired anymore). There are even ones like the Wacom Cintiq,
which is a drawing tablet with a high-color-fidelity screen built
in; it's expensive, but if you are a digital artist, it can be
invaluable.

> I've also got an electronic piano which can be turned into a MIDI
> keyboard (basically, the keyboard and synth can be decoupled); I've
> considered how I might be able to use it instead of, or in addition
> to, my existing input devices, and what sorts of tasks it might
> work better than them at - besides music input, of course.

My guitar direct-inject box (Line6 POD), which emulates the amp and
speakers as well as the effects, takes MIDI input exclusively for
control purposes. Likewise, there are lots of MIDI input devices
that are just arrays of buttons and dials, built purely to control
auxiliary functions of other MIDI devices like rack-mounted synths
or effects units.
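
A control surface like that mostly just emits MIDI Control Change
(and occasionally Program Change) messages, and driving a box like
the POD from software is the same few bytes. A minimal sketch,
assuming Python with the mido library; the port name below is
hypothetical (list real ones with mido.get_output_names()):

    import mido

    # Open a MIDI output port. The name here is made up; use
    # mido.get_output_names() to see what's actually attached.
    out = mido.open_output('Line6 POD')

    # A dial on a control surface sends a stream of Control Change
    # messages as you turn it; CC 7 is conventionally channel volume.
    out.send(mido.Message('control_change', channel=0, control=7, value=64))

    # Program Change selects a patch - e.g. a different amp model.
    out.send(mido.Message('program_change', channel=0, program=12))

Each of those is three bytes or fewer on the wire, which is why a
dumb array of buttons and dials is all such a controller needs to be.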

> As for touchscreens, I don't know. Like most computer people, I've
> become so heavily accustomed to the mouse-and-keyboard paradigm
> that I have trouble breaking out of that mold. Perhaps I should
> try to find documented touchscreen hardware and play with it for a
> bit, see what I can come up with for input paradigms - part of the
> issue, I suspect, with redesigning things like schematic capture
> for touchscreen use is that API designers' minds, and thus the
> APIs, are still designed from a mouse-and-keyboard mindset, making
> it difficult to truly exploit a touchscreen's capabilities.

I think the biggest problem with touchscreens when it comes to CAD
is that fingers are so much bigger than pixels. When you're using
a mouse, you are moving a pixel-precise cursor through a device
directly attached to your hand; you are not trying to pick a single
pixel with a fingertip the size of a 20x20 square of them. That's
the biggest reason I think CAD will never catch on with touch
interfaces until we either come up with a GOOD way of making
pixel-precise selections with a finger, or a GOOD way of removing
that requirement. Snap-to-grid on most CAD programs still
generally implies a grid much smaller than a fingertip.
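
To put rough numbers on that: a fingertip contact patch is on the
order of 20x20 pixels, so a touch is only unambiguous if at most one
snap point falls under it. Here is a minimal sketch of that test in
Python; the grid pitch and finger size are illustrative assumptions,
not measurements:

    import math

    GRID = 10    # grid pitch in pixels (a fairly fine CAD grid)
    FINGER = 20  # fingertip contact patch in pixels (rough figure)

    def snap(x, y, grid=GRID):
        """Snap a coordinate to the nearest grid intersection."""
        return (round(x / grid) * grid, round(y / grid) * grid)

    def lines_under(c, half, grid):
        """Count grid lines crossing the interval [c - half, c + half]."""
        return math.floor((c + half) / grid) - math.ceil((c - half) / grid) + 1

    def ambiguous(x, y, grid=GRID, finger=FINGER):
        """True if more than one grid point lies under the fingertip."""
        half = finger / 2
        return lines_under(x, half, grid) * lines_under(y, half, grid) > 1

With those numbers every touch covers at least four candidate grid
points, so ambiguous() is always true - which is exactly the problem.
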
Using a stylus is, of course, an option. That's how lots of CAD
programs worked in the '80s with light pens. Some schematic entry
programs that originated in that era (I'm specifically thinking of
Mentor Design Architect, though there are likely others) still
retain their gesture commands even in a mouse-based interface, and
I found those gestures generally more useful and easier to remember
than the thousands of hotkeys Altium uses to do the same things.
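
Those gestures are mostly single strokes classified by direction, so
the recognition itself is nearly trivial. Here is a sketch of the
idea in Python, using math-convention coordinates (y pointing up);
the direction-to-command table is invented for illustration and is
not Design Architect's actual binding:

    import math

    # Invented bindings, purely illustrative.
    COMMANDS = {'E': 'add-wire', 'W': 'delete', 'N': 'zoom-in', 'S': 'zoom-out'}

    def classify(start, end):
        """Quantize a single stroke to one of four compass directions."""
        dx, dy = end[0] - start[0], end[1] - start[1]
        angle = math.degrees(math.atan2(dy, dx)) % 360
        return ('E', 'N', 'W', 'S')[int(((angle + 45) % 360) // 90)]

    def command_for(stroke):
        """Look up the command for a stroke given as a list of (x, y) points."""
        return COMMANDS.get(classify(stroke[0], stroke[-1]))

    print(command_for([(0, 0), (40, 3)]))   # quick drag right -> 'add-wire'

Even four or eight directions give a usable command set, and muscle
memory for "flick right" holds up better than recalling which of a
thousand hotkeys does what.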

Also, I can't see holding my arm up to a vertical touchscreen for
6 to 8 hours at a go; it would get tired. Likewise, I can't see
working hunched over a screen laid flat like a table for that long,
and the diagonal compromise sucks a little both ways. So there are
significant ergonomic considerations keeping something like CAD off
tablets as well.

- Dave