Of Bytes and Balusters
I can just picture the scene, some tens of thousands of years ago,
when some enterprising person invented a code of symbols to record that which until then had only been spoken. The tribal elders began worrying that no one would remember anything anymore, and as it turned out, they were right. There's no longer
any great call
in society for memorization of epics like Beowulf.
Electronic calculators (and their precursors, slide rules) have had a similar effect on grunt arithmetic. Addition and subtraction
(and their extensions, multiplication and division) are still taught to the point of fluency because they are useful daily and, once learned, relatively easy to execute.
I remember being taught how to extract square roots manually, but why anyone should still have done so, even back then, when someone had already worked them out once and put the answers in tables, was beyond reason. There are times that bring changes so fundamental that to avoid learning new ways is like walking upstairs on your hands instead of taking an elevator.
You can convince yourself you're maintaining the purity of your creativity by not sullying yourself with the new technology, and
you might even
be right. At least one columnist
I know of prides himself on writing his column in fountain pen on yellow pads. Of course, then there are 20 gazillion others who, in switching to word processors,
have left their typewriters
so far behind, the Guam Civil Service can't find them.
What do you suppose made their employers (who bought the machinery) do such a thing? Do you suppose there could really be such advantage in the switch that they did it gladly? You betcha. There
is no doubt that
fountain pens are cheaper than typewriters are cheaper than dedicated word processors are cheaper than general purpose computers. You and I, however, are godawful expensive.
Where do you think our productivity would be if we were still laying down ink on cloth? Mark well that I mean productivity to be that which allows us to think about our designs
instead of spending
our time rendering them. Pencil on flimsy is great for concept, but the time comes when you have to go with your best thinking and commit to Mylar.
This generally works much better when you commit to silicon instead. It means you don't have to quit thinking when you start production. Not only that, it's faster. No matter what smoking pencil awards you may be famous for, no matter how the speed of your graphite melts the Mylar upon which you work, you
cannot go faster by hand than by machine. John Henry was the last to do so, he did so only once, and the steam
drill didn't even
have an on-board computer.
Understand, CAD "operators" are an endangered species, as inkers and draftsmen were. Those who draw must design. Those who design must not forget how to draw.
Of course, there are caveats. An inker's productivity would have declined rather than increased had he tried to use a mechanical lead holder to lay down ink on Mylar. If you try to draw with a CAD system the way you now draw manually, the same thing will happen
to you. It probably took you six months to a year to become proficient enough with manual drawing implements to be considered trainable by a design firm when you first started down this road; expect a comparable apprenticeship with CAD.
The first rule of CAD productivity is never to draw anything twice, and many times, not even once. Things like plan view doors and windows are better drawn and broken out
in the wall
by the program after it asks you where to put the door, how wide it is, and which way it should swing. This can complicate your life a little because these things are usually aftermarket add-ons to CAD programs, and some implementations are better than others.
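A door routine like the one just described can be sketched in a few lines. Everything here — the `Door` record, `place_door`, the foot units — is invented for illustration; no actual CAD package's interface looks quite like this, but the shape of the idea is the same: answer three questions, and the program breaks the opening out of the wall for you.

```python
# Hypothetical sketch of a parametric door routine: given where the door
# goes, how wide it is, and which way it swings, break the opening out of
# the wall in one operation.  All names and units are invented here.
from dataclasses import dataclass

@dataclass
class Door:
    position: float   # distance along the wall to the opening, in feet
    width: float      # rough opening width, in feet
    swing: str        # "left" or "right"

def place_door(wall_length: float, door: Door):
    """Return the wall segments that remain after the opening is broken out."""
    start = door.position
    end = door.position + door.width
    if start < 0 or end > wall_length:
        raise ValueError("door does not fit in the wall")
    # Two remaining wall segments flank the opening.
    return [(0.0, start), (end, wall_length)]

segments = place_door(20.0, Door(position=8.0, width=3.0, swing="left"))
print(segments)  # [(0.0, 8.0), (11.0, 20.0)]
```

The point of such a routine is exactly the first rule above: the door gets "drawn" zero times by you, once by the machine, and the wall heals itself if you move it.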
You've got to be prepared to invest some time before anything to do with computers becomes productive, cute Macintosh ads to the contrary notwithstanding. Once you've made that decision, you'll discover that any terrors you might have had begin to evaporate.
Electricians and Mere Mortals
If it controls lights from two locations, why is it a
three-way switch? Why do you use a four-way switch for every location after the first two? Why can't electricians talk like the rest of us?
The somewhat confusing terms used in the preceding paragraph are only so to us because we tend to think of the devices
by what they do. I
mean, if you
switch something from two locations, you should use two-way switches while something switched from three locations would call for three-way switching, right? Not really. By those lights, lights switched from five locations would call for
five-way switches. If that were true, we'd require different types of switches to control lights
on an upper
level as opposed to those used on grade level.
I think you get the point. In point of fact, three- and four-way switches are so named for how many wires connect to them and for how the devices alter the circuit paths.
Although it takes a pair of three-way switches to control lights from two locations, for each location after the first two, a four-way switch is required. For ease of wiring, the three-way switches should go at the two extreme
locations with the four-way switches between.
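The arrangement can be modeled in a few lines of code. A three-way switch at each end selects one of two "traveler" wires; each four-way switch between them either passes the travelers straight through or crosses them; the light is on exactly when the path is continuous. The function name and the 0/1 encoding of switch positions are mine, not electrician's notation.

```python
# Toy model of multi-location switching: three-way switches at the two
# ends, four-way switches between.  Flipping ANY switch toggles the light.
def light_is_on(first_3way: int, four_ways: list, last_3way: int) -> bool:
    """Each position is 0 or 1 (for a four-way: 0 = straight, 1 = crossed)."""
    traveler = first_3way            # which traveler wire is live
    for crossed in four_ways:
        traveler ^= crossed          # a crossed four-way swaps the travelers
    return traveler == last_3way     # the last three-way must pick the live one

# Five locations: three-ways at the ends, three four-ways in the middle.
fours = [0, 0, 0]
assert light_is_on(0, fours, 0)      # continuous path: light on
fours[1] ^= 1                        # flip one of the middle switches
assert not light_is_on(0, fours, 0)  # path broken: light off
```

Notice that the model makes the naming point directly: the four-way switch has exactly one job, swapping the two travelers, which is why one more is needed for each location past the first two.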
Of course, I don't know if you'd find a situation in the real world where you'd want to switch from five locations as shown that are close enough together for three-way
and four-way
switches to make sense. The current
has to go through each of the switches in the circuit, and the voltage drop could become substantial under such circumstances. That's one of the reasons relays came to be.
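To see why the drop matters, here is a back-of-envelope sketch. The contact resistance, wire gauge, run length, and load current below are all assumed figures, chosen only to show the shape of the calculation, not measurements from any real circuit.

```python
# Back-of-envelope for the voltage-drop point: every switch contact and
# every foot of traveler wire adds series resistance.  All figures below
# are assumptions for illustration.
CONTACT_OHMS = 0.05          # assumed resistance per switch's contacts
WIRE_OHMS_PER_FT = 0.00253   # roughly 14 AWG copper
RUN_FT = 200.0               # assumed total run threading five switch boxes
LOAD_AMPS = 5.0              # assumed lighting load

n_switches = 5               # two three-ways plus three four-ways
series_ohms = n_switches * CONTACT_OHMS + RUN_FT * WIRE_OHMS_PER_FT
drop_volts = LOAD_AMPS * series_ohms
print(f"{drop_volts:.2f} V dropped before the light")
```

A relay sidesteps the problem: the wall switches then carry only a small coil current over the long, meandering run, while the load takes one short, heavy path.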