It was thus said that the Great Lance Lyon once stated:
> And to drag this back on topic, in my role (Genesys engineer), when I'm
> hiring people that piece of paper that states they are a certified Genesys
> professional has great importance - I would not hire someone if (a) they
> didn't have it, or (b) they weren't embarked on the certification process
> without some indication that they would come out the other end with the
> relevant qualifications. There are times when a piece of paper is extremely
> important to some roles because it shows dedication to achieving the
> relevant learning for a particular role.
This reminds me of the time around 1998 when companies were trying to hire
Java programmers with five years of experience. That would be a neat trick,
since in 1998, there were only a few people *in the world* who could claim
five years of experience with Java (and they all worked for Sun).
This also reminds me of the time a friend of mine (mid-to-late 90s) was
trying to hire some programmers. Several were brought into the office for
an interview (his company developed and sold X server software, so the
position was for a C developer with some hardware experience). Only *one*,
out of perhaps half a dozen candidates, could even implement a simple
program to read a list of numbers into a linked list. All the candidates
had that damnable piece of paper called a "degree" from a "university".
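
For the record, the task is about as simple as C gets; something along
these lines (the exact requirements are my assumption) would have passed:

#include <stdio.h>
#include <stdlib.h>

typedef struct node
{
  struct node *next;
  int          value;
} node;

int main(void)
{
  node  *head = NULL;
  node **tail = &head;  /* append via pointer-to-pointer, keeping input order */
  int    value;

  while (scanf("%d",&value) == 1)
  {
    node *n = malloc(sizeof(node));
    if (n == NULL) { perror("malloc"); return EXIT_FAILURE; }
    n->value = value;
    n->next  = NULL;
    *tail    = n;
    tail     = &n->next;
  }

  for (node *n = head ; n != NULL ; n = n->next)  /* prove we read it all */
    printf("%d\n",n->value);

  return EXIT_SUCCESS;
}

Fifteen minutes of work, tops, for anyone who actually programs in C.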
> Going backwards - we may all love and be passionate about our old machines
> and may have learnt skills to keep our hobby alive, but those skills (ie
> soldering, coding in a dead language etc) don't nec. translate into
> something useful in today's computer related job market and people should
> not be put down or denigrated because they don't have them.
I beg to differ. Racter [1] was a *very* interesting piece of software,
referring to both the program and the compiler used to generate the
program. It has pattern-matching subroutine calls (similar to functional
languages that pattern match on parameters), but it goes beyond that to
mind-blowing proportions.
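
To give a rough feel for the idea (this is my own illustration; Racter's
actual mechanism is only sketchily documented [1]): a call is dispatched by
matching the input against a table of patterns, rather than by naming a
single routine:

#include <stdio.h>
#include <string.h>

/* My illustration of pattern-dispatched calls, NOT Racter's actual
 * mechanism: the first entry whose pattern matches the input "wins",
 * much like pattern matching on parameters in a functional language. */

typedef struct
{
  const char  *pattern;
  const char *(*handler)(const char *);
} rule;

static const char *greet(const char *in) { (void)in; return "Hello yourself."; }
static const char *deny (const char *in) { (void)in; return "I never said that."; }
static const char *other(const char *in) { (void)in; return "Go on."; }

static const rule rules[] =
{
  { "hello" , greet },
  { "you"   , deny  },
  { ""      , other },  /* empty pattern matches anything - the default case */
};

const char *dispatch(const char *input)
{
  for (size_t i = 0 ; i < sizeof(rules)/sizeof(rules[0]) ; i++)
    if (strstr(input,rules[i].pattern) != NULL)
      return (*rules[i].handler)(input);
  return NULL;  /* unreachable - the default rule always matches */
}

int main(void)
{
  printf("%s\n",dispatch("hello there"));
  printf("%s\n",dispatch("you said so"));
  printf("%s\n",dispatch("something else"));
  return 0;
}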
Cornerstone was also an interesting IDE/compiler: human-readable
identifiers actually pointed to internal identifiers. Change the name of a
variable in one location, and *all* other locations in the source code
would be updated with the new name. I think Eclipse can do similar stuff,
but it doesn't have quite the same indirection that Cornerstone did.
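
A minimal sketch of that indirection, assuming (my assumption, not
Cornerstone's documented internals) a symbol table keyed by stable internal
IDs:

#include <stdio.h>
#include <string.h>

/* Sketch of Cornerstone-style indirection (my reconstruction, not its
 * actual internals): the source stores stable numeric IDs, and the
 * human-readable name lives in exactly one place - the symbol table.
 * Rename the symbol table entry, and every use shows the new name. */

#define MAXNAME 32

static char symtab[][MAXNAME] =  /* index = internal identifier */
{
  "count",   /* ID 0 */
  "total",   /* ID 1 */
};

/* the "source code" references symbols only by ID */
static const int source_ids[] = { 0, 1, 0, 0, 1 };

static void display(void)
{
  for (size_t i = 0 ; i < sizeof(source_ids)/sizeof(source_ids[0]) ; i++)
    printf("%s ",symtab[source_ids[i]]);
  putchar('\n');
}

int main(void)
{
  display();                                 /* count total count count total */
  strncpy(symtab[0],"num_items",MAXNAME-1);  /* one rename ...                */
  display();                                 /* ... updates every occurrence  */
  return 0;
}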
The TREE-META system [2] (late 60s/very early 70s) is also something worth
looking into. It's a mix of lex (for tokenizing) and yacc (for parsing),
with the ability to manipulate the abstract syntax tree being built, all in
the same file. Its predecessor, META-II, was a self-compiling lex/yacc-style
program [3][4].
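
Actual META-II/TREE-META notation won't fit here, but the core idea of
lexing, parsing and tree-building interleaved in one place looks roughly
like this in C (my own illustration, not TREE-META syntax):

#include <stdio.h>
#include <stdlib.h>

/* The flavor of TREE-META in plain C (my illustration): tokenizing,
 * grammar rules, and AST construction all live together.
 * Grammar: expr = term {'+' term} ; term = digit {'*' digit}          */

typedef struct node
{
  char         op;      /* '+','*' for interior nodes, 'n' for numbers */
  int          value;
  struct node *left;
  struct node *right;
} node;

static const char *p;  /* cursor into the input - the "lexer" state */

static node *mknode(char op,int value,node *l,node *r)
{
  node *n = malloc(sizeof(node));
  n->op = op; n->value = value; n->left = l; n->right = r;
  return n;
}

static node *term(void)
{
  node *n = mknode('n',*p++ - '0',NULL,NULL);  /* digit token */
  while (*p == '*') { p++; n = mknode('*',0,n,mknode('n',*p++ - '0',NULL,NULL)); }
  return n;
}

static node *expr(void)
{
  node *n = term();
  while (*p == '+') { p++; n = mknode('+',0,n,term()); }
  return n;
}

static void show(node *n)  /* walk the tree, printing prefix notation */
{
  if (n->op == 'n') { printf("%d ",n->value); return; }
  printf("%c ",n->op); show(n->left); show(n->right);
}

int main(void)
{
  p = "1+2*3";
  show(expr());   /* prints: + 1 * 2 3 */
  putchar('\n');
  return 0;
}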
Heck, I've had my mind blown just reading about the VAX CALLS and CALLG
instructions, which have had a profound influence on how I view
programming, one I'm still playing around with.
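
The bit that got me is CALLG: the argument list is plain data in memory, a
count followed by the arguments, so it can be built and stored apart from
any call site (CALLS, by contrast, pushes the arguments onto the stack). A
C analogue of that idea (my sketch, not actual VAX semantics):

#include <stdio.h>

/* CALLG-style calling in C (my sketch): the argument list is just data -
 * a count followed by the arguments - so argument frames can be built
 * once, stored anywhere, and handed to a procedure later. */

typedef struct
{
  int  count;
  long arg[4];
} arglist;

static long sum(const arglist *ap)
{
  long total = 0;
  for (int i = 0 ; i < ap->count ; i++)
    total += ap->arg[i];
  return total;
}

int main(void)
{
  /* argument frames built ahead of time, like CALLG's in-memory list */
  arglist a = { 3, { 10, 20, 30 } };
  arglist b = { 2, { 1, 2 } };

  printf("%ld\n",sum(&a));  /* 60 */
  printf("%ld\n",sum(&b));  /* 3  */
  return 0;
}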
Going backwards *can* translate into something useful for today's
computer-related job market. I've just been fortunate enough to see it.
-spc (Oh, and the current trend of VMs? IBM was doing that in the 60s)
[1] There are only two pages I'm aware of that have *any* technical
information on the compiler. They are:
http://www.robotwisdom.com/ai/racterfaq.html
http://boston.conman.org/2008/06/18.2
[2] http://en.wikipedia.org/wiki/TREE-META
And I would very much like a reference to the bit about
"TREE-META was the last of a line of metacompilers, starting with
META II, right before subsequent versions were designated classified
technology by the U.S. military and government agencies."
[3] Yes, it's a type of chicken-and-egg problem, but I was able to compile
and run the code with 240 lines of Lua and 500 lines of C. The
META-II source code itself is only 27 lines of META-II code. It's
pretty much a replacement for lex and yacc in less than 800 lines of
code.
[4] The Mother Of All Demos was written in META-II. And it's only now,
42 years later, that we are duplicating the Mother Of All Demos.