On 10/10/2011 10:03 AM, Liam Proven wrote:
> But here we are, in 2011, stuck with X11 on many systems. Better
> imaging is one thing Steve Jobs got right, in NEXTSTEP and OS X.
...which is something that, in my opinion, makes the whole world work
better. I feel that the OS X graphics system, while capable of beautiful
rendering, is a huge step backward in functionality. I mean, c'mon, a
non-networked windowing system, post-2000? What is this, Windows? WTF?! I
will stick with X11!
> I think it had its place at one time, but it's not really got one any more.
Not for you, apparently! ;) Others hold different opinions, though.
> CPU, RAM and disk are all very, very cheap now. Local processing and
> rendering cost effectively nothing and the network stuff is just in
> the way - for the deployment of 99% of Unix boxes now (Linux and Mac
> PCs, iOS and Android phones and slates, a lot of routers and a
> relatively small number of servers), a networkable GUI is not merely
> irrelevant, it's baggage.
For end-users on cell phones, of course. And again, irrelevant to
you, perhaps. I and many others use it heavily.
> What is expensive is *state* - maintaining configuration files,
> patching, updating and so on. Currently the trendy kids are playing
> with whole-system virtualisation for this. It's laughably inefficient
> but it costs lots for all the licences - which of course the
> proprietary-S/W vendors *love*. I expect it will go away in time and
> we'll end up with something like Solaris Zones or FreeBSD jails, or
> some other OS-level virtualisation solution. Or perhaps even a
> return to network-booting workstations, but with lots of local caching
> this time.
This addresses a different problem. People use X today for different
reasons than they did twenty years ago, just like pretty much everything
else.
> X was about sharing CPU and rendering power over the network, and that
> is 1980s thinking. There is just no need for it any more.
You don't need it, but that doesn't mean nobody else does. You are
correct, of course, about its early reason for being, but as you know,
things evolve; it's used to great benefit today for reasons other than
wimpy computers.
As a quick case in point, I have a hugely powerful computer a few
hundred feet away, and a hugely powerful computer in front of me. This
morning, I am editing a large file on the far-away one. I choose to do
it via emacs over X11, because it's easier and faster. It's a trivial
example, but sometimes those are the best kinds. People do stuff like
this all the time. Both of the companies I'm working for work this way
company-wide.
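For anyone unfamiliar with the workflow, a session like the one described
above typically looks something like this (the hostname and file path here
are made up for illustration, not the actual machines mentioned):

```shell
# Hypothetical hostname and path - illustration only.
# "ssh -X" tunnels the X11 protocol over the encrypted connection and sets
# DISPLAY on the remote side, so the emacs process runs on the far machine
# but its window appears on the local screen.
ssh -X bighost.example.com emacs /data/large-file.txt
```

No VNC-style screen scraping is involved; the remote application speaks the
X protocol directly to the local display server.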
> It's a waste of bandwidth.
Hardly! It only uses network bandwidth if it's getting USED over a
network, and if that's happening, it's being used because it's needed,
which makes it most assuredly NOT a waste of bandwidth. When used
locally, with server and clients on the same system, it uses no network
bandwidth at all. Not "a little", but NONE.
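The local-versus-network distinction is visible in how X clients interpret
the DISPLAY variable. A minimal sketch (the helper function name is mine,
purely illustrative, not part of any X tooling):

```shell
# Sketch: classify how an X client would connect for a given DISPLAY value.
# A leading ":" (e.g. ":0") means a local Unix-domain socket, conventionally
# /tmp/.X11-unix/X<n> - no network traffic at all. A "host:<n>" value means
# a TCP connection to port 6000+<n> on that host.
x_transport() {
  case "$1" in
    :*)  echo "unix-socket" ;;   # server and clients on the same machine
    *:*) echo "tcp" ;;           # X protocol carried over the network
    *)   echo "unknown" ;;
  esac
}

x_transport ":0"            # prints "unix-socket"
x_transport "farhost:0.0"   # prints "tcp"
```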
> Even Ubuntu is seriously looking into getting rid of X11 and moving
> to Wayland, with networking relegated to a legacy-compatibility
> submodule.
Yes, I'm aware. Don't misinterpret the reasons for that, though.
The Ubuntu decision makers have gotten really big heads about how wildly
popular their distribution has become, and they're pulling the whole "we
own the world now, so let's change some stuff for the sake of changing
it" crap. They did the same thing with Unity, which has a userbase
penetration of essentially zero (pretty much everyone who learns it can
be turned off turns it off immediately), forcing it down people's throats
by making it the default in Ubuntu 11. Fortunately you can change it at
the login screen, which every Ubuntu 11 user I know of has done.
This is another example of the Ubuntu group flirting with
unilaterally deciding how we should use our machines - the same sort of
behavior that drove me to dump MacOS X as my primary desktop platform
after years of loving it.
That said, though, I understand the logic behind what they're doing.
Ubuntu is competing with both Windows and OS X, and in every case I've
seen they're doing so very successfully. They're targeting primarily
nontechnical end-users. Even as an OS for highly technical users such
as myself, however, it offers the aesthetic beauty and ease of use of an
end-user-oriented platform without taking away any of the power - a line
MacOS X started crossing a few releases ago, which is why I dumped it.
In other words, X11 isn't going anywhere anytime soon. It has stood
the test of time on hundreds of platforms for decades. One widely
contested suggestion from the maintainers of one distribution on
one platform isn't going to change that.
And please don't think I'm pooh-poohing Ubuntu; I just switched TO
Ubuntu 11 as my main desktop platform a couple of months ago - a
monumental personal decision which I take with the same level of
seriousness and consideration as switching spouses.
(I'm sorry that got so long! I wanted to clearly explain my position.)
> I have been installing, maintaining and using Unix machines since 1988
> - probably a newbie around here - and I have never *once* used X over
> a network. For remote-control, it's rdesktop/mstsc for Windows
> machines and something involving VNC for everything else.
Our usage patterns are clearly very different. (and there's nothing
at all wrong with that of course) I've been doing the same for about as
long, and I've used X over networks daily since then. First for the
reasons you mentioned above, distributed computing power, and now (from
maybe 1994 on) for convenience. I use it to great advantage, and I
don't see myself NOT using it anytime soon. VNC and related
technologies work of course, but I consider them to be gigantic kludges.
> All IMHO, natch...
Mine too.
-Dave
--
Dave McGuire
New Kensington, PA