Me, then Mistral:
What do people think the computers of the future will really be like? Do people agree that the Federation computer technology we see on screen is just a smokescreen concealing the Real Thing? How does one go about programming them? -- I've never believed the prodding-with-fancy-screwdrivers line.
Well, this is just a thought, but ... one of the early super-computers (Cray II, perhaps?) was built in a circle, because the limiting speed was how fast electricity could travel along its wires, and the circle allowed the wires to be shorter. Suppose that Federation computers have a similar problem of limiting speed that can be overcome to a certain extent by building specialized modular circuits - maybe tiny atomic-level computers in themselves - for different types of processing. The bit with the fancy screwdrivers then becomes reconfiguring the manner in which the modules interconnect.
[Not playing the game, but... ] I'd always seen the fancy screwdrivers bit as evidence that the scriptwriters &c hadn't a clue how one might interact with the computers of the future.
Surely with the system you describe, which pretty much exists at present methinks, one could have some form of control at a level we'd recognise as software? (Rather than screwdrivers?)
I'm totally stuck with something I'm writing because, being totally computer illiterate myself, I simply can't envisage how someone who's presumably quite good with the things, such as Avon, might think about what we'd call programming. So, any further thoughts very welcome...
Tavia
Tavia mused:
Surely with the system you describe, which pretty much exists at present methinks, one could have some form of control at a level we'd recognise as software? (Rather than screwdrivers?)
Not necessarily. There are three popular styles of programming right now (there's a rough sketch of each, in code, after the list):
1. Procedural or Imperative. You give the computer a sequence of instructions, and it follows them, one by one. Almost all programming everywhere is done in this manner (about 99.9%, at a guess). C, Basic, COBOL, FORTRAN, Pascal, etc. are all procedural languages.
2. Functional. The problem to be solved is expressed in terms of functions that have input parameters and a result. They're expressed in special languages, which tend to be bastardised somewhat so that they can affect the "real world" (i.e. have side effects). This is good for a certain class of problem, and bad for most others. Lisp, Miranda, Glide, etc. are functional languages.
3. Declarative. You define the "world" as a series of predicates and facts, and then pose new predicates and facts to the computer, and it answers true or false, based on what you've already told it. If you've defined X, and X=>Y, and asked it "Y?", it'd say "yes". You could also ask "X=>?" and it'd say "Y". Good for artificial intelligence and expert systems. Pretty much unpredictable for anything else. Prolog is the classic language of this style.
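If concrete examples help, here's roughly how the same flavour of thing looks in each style, sketched in Python (my choice of language, not anything from the languages named above); the "declarative" part fakes a tiny Prolog-ish fact base just to show the idea:

    # A rough sketch of the three styles, in Python (language choice is mine).

    numbers = [1, 2, 3, 4, 5, 6]

    # 1. Procedural/imperative: spell out the steps, one by one.
    squares = []
    for n in numbers:
        if n % 2 == 0:
            squares.append(n * n)
    print(squares)                     # [4, 16, 36]

    # 2. Functional: express the result as functions applied to inputs,
    #    with no step-by-step fiddling with state.
    squares = list(map(lambda n: n * n, filter(lambda n: n % 2 == 0, numbers)))
    print(squares)                     # [4, 16, 36]

    # 3. Declarative (a toy imitation of the Prolog idea): state facts and
    #    rules, then ask questions and get yes/no back.
    facts = {"X"}                      # we've told it X
    rules = {"X": "Y"}                 # and that X => Y

    def holds(q):
        # q holds if it's a known fact, or follows from one via a rule
        return q in facts or any(holds(p) and rules.get(p) == q for p in rules)

    print(holds("Y"))                  # True  -- "yes"
    print(holds("Z"))                  # False -- "no"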
But all computers, right now, are imperative beasts at their heart, regardless of the veneer at their programming level. The up-and-coming quantum computer approach will require something different again at the programming level, and that turns out to be very hard to express.
I'm totally stuck with something I'm writing because, being totally computer illiterate myself, I simply can't envisage how someone who's presumably quite good with the things, such as Avon, might think about what we'd call programming. So, any further thoughts very welcome...
I have two examples I like to use for this. (a) How programmers need to think. (b) Why they get it wrong more often than you'd expect.
First off, shuffle a deck of cards. Deal out five cards in a row, face down. Then sort them into order (doesn't matter what order, as long as you can say whether the cards are properly ordered, or not). But there are two rules you must follow when sorting the cards:
1. You can pick any two cards, turn them face up, and then turn them back.
2. You can pick any two cards, while face down, and exchange them.
No other operations are allowed.
To sort your five cards like this is fairly easy. To sort ten gets tricky, because you've got to remember the position and value of more cards (you can keep looking at them and swapping for as long as you like, and you have to, as you forget more and more). To sort the whole deck... by this point, you need either a great memory, the willingness to take ages while trusting to luck, or a system you've worked out in advance. Like this:
Compare cards one and two. Swap if necessary. Compare cards one and three. Swap if necessary. Compare cards one and four. Swap if ... etc. When you get to the far end, go back to the start, and repeat. One and two, one and three, and so on. Keep repeating, until you don't actually need to swap any cards with position one. Card one is now sorted.
Compare cards two and three. Swap if necessary. Compare cards two and four. Swap if necessary. ...and so on, until finally, you compare the last-but-one and last card, and don't need to swap them. You're done.
This system is called an algorithm. Specifically, it's a simple exchange sort (a close cousin of the Bubble sort), and it's horrendously slow, but it's bullet-proof. Programmers spend their days working out how to express simple operations in even simpler steps, given that they don't know the specifics of the situation the operation will be used in.
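Written out for a computer, the card-sorting procedure above looks something like this (a sketch in Python, my choice of language, with the cards reduced to plain numbers; the only operations used are the two allowed by the rules):

    import random

    def peek(cards, i, j):
        """Rule 1: turn two cards face up, look at them, turn them back."""
        return cards[i], cards[j]

    def swap(cards, i, j):
        """Rule 2: exchange two face-down cards."""
        cards[i], cards[j] = cards[j], cards[i]

    def sort_cards(cards):
        n = len(cards)
        for fixed in range(n - 1):             # the position being settled
            for other in range(fixed + 1, n):  # compare it with every later card
                a, b = peek(cards, fixed, other)
                if a > b:                      # out of order?
                    swap(cards, fixed, other)  # then exchange them
        # one pass per position turns out to be enough, so the "keep
        # repeating until no swaps are needed" check isn't needed here

    deck = list(range(1, 11))                  # ten "cards", numbered 1 to 10
    random.shuffle(deck)
    sort_cards(deck)
    print(deck)                                # [1, 2, 3, ..., 10]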
So, why do they get it wrong so often? Well, consider this problem: supposing you wanted to tell the computer to (a) remember when "now" is, and then some time later (b) tell you how much time has gone by, since you asked it to do so. Say, number of seconds.
At the basic level, this is easy. Computers count time pretty well, and it's just a subtraction. But suppose your programmer didn't just count seconds, but instead remembered the date and time, and later took it away from the current date and time? You'd get the same answer, right? Well, no.
How about leap years? Has the algorithm for determining whether it's a leap year been implemented correctly? How about leap seconds? Do you have daylight savings, and has that boundary been crossed? Are you in a country with sensible daylight savings, or is it decided on a whim by Parliament? Have you moved into a different timezone? Are you expressing the date/time in local time, GMT, or one of the UTC variants? What if one of the times is noon, or midnight? How's that expressed? Have you crossed the International Date Line? When *really* long gaps have happened, have you overflowed your internal representation of a date? When you first remembered the date, were you using a different calendar from the one you've got now? If so, was it a clean change-over, or a gradual one? Or did one of the years get deformed? Are you Y2K compliant? Are you computing the difference at a fine enough resolution for the reason you want it? If so, are you computing it from values recorded at a fine enough resolution?
In other words, simple problems often aren't, for all cases; but for the majority of cases, the awkward bits don't matter, and hence don't come to mind. Programmers often find people doing things with their programs that they didn't expect. Sometimes this is the user's fault, and sometimes (usually) it's the programmer's. Insufficient specification is the problem. But if you looked at all the potential flaws, the number of checks you'd have to write would be huge, and you could make a mistake defining any one of them (the leap-year one being the classic), in which case your program would still fail sometimes, on exactly those cases. That's just life.
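In present-day terms, the difference between "just count seconds" and "remember the dates and subtract" looks roughly like this (another Python sketch; the calls are real Python, but the framing is only there to illustrate the point):

    import time
    from datetime import datetime

    # (a) Count elapsed seconds directly, never looking at the calendar.
    #     Immune to timezones, daylight savings and most of the questions above.
    start = time.monotonic()
    time.sleep(2)
    print(round(time.monotonic() - start))     # ~2

    # (b) Remember the date/time and subtract it from the current date/time.
    #     Usually the same answer -- unless something on that list has happened
    #     in between (clock reset, DST change, timezone move, ...).
    then = datetime.now()
    time.sleep(2)
    elapsed = datetime.now() - then
    print(round(elapsed.total_seconds()))      # ~2, with luck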
ObB7: some programming in Avon's time would be the same: gates are supposed to be *locked*, dammit; if they open, someone's breaking in. Sound the alarm. (This doesn't happen in the series, but it should.) Others would be completely different: Orac, Zen, and the teleportation equipment are all "problems" of immense size - consider how long the card-sorting rules would be, written out, and then consider writing similar rules to model Zen's behaviour, including understanding English. Therefore, for Avon to poke around inside and understand things, there must be some higher-level paradigm that expresses things at more useful conceptual levels. Windows programmers don't draw big squares on the screen; they ask for a Window, and the underlying system draws 'em the same for everyone. Perhaps there are similar abstractions for ship control systems, AI, the huge data transfers required to encode a living human as a transmission....
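(Just to make the gates example concrete: in a present-day imperative style, with every name invented for the purpose, the whole rule is a few lines; it's the Zen-sized problems that need the higher-level paradigm.)

    GATE_LOCKED = True                  # invented: the ship tracks this state

    def sound_alarm(message):
        print(message)                  # stand-in for whatever the ship does

    def on_gate_sensor(gate_id, now_open):
        """Invented handler, called whenever a gate sensor reports a change."""
        if now_open and GATE_LOCKED:
            # a locked gate has opened: that's a break-in, not a malfunction
            sound_alarm("Intruder alert: gate %s forced open" % gate_id)

    on_gate_sensor("hold-3", now_open=True)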
steve
Chuffing heck, Steve! I'm impressed! But let's face it, Season 4 was broadcast in 1981, the first year in which it was even feasible to think of computers as "small box that sits on desktop and is owned by almost everyone (for their sins)" rather than "giant boxes attended to by a priesthood." Hence the assumption of the scriptwriters (who probably used typewriters) that anyone who could fix computers would have to be clever enough to fix anything else (e.g., the Ortega).
-(Y) PS--a lot of Human Resources computer systems had to be debugged before September 9, 1999 because the string 9/9/99 meant something dire.
Dana Shilling wrote:
1981, the first year in which it was even feasible to think of computers as "small box that sits on desktop and is owned by almost everyone (for their sins)" rather than "giant boxes attended to by a priesthood."
The boxes themselves might no longer be giant but just down from my desk at work is a huge room, full of 7ft high racks of numerous boxes that go "whirr" and blink their pretty lights with a "cat in the woolshop" tangle of cables sticking out the back. And yes, they're still attended to by a priesthood. Only the priesthood's security badges let them into that room. There's even a Buckfast in there (codename for one of the modules).
ObB7: So what exactly is behind Zen's screen? A little man with a megaphone or something less organic? And how come Orac never gets an "Error 404. File Not Found" message when he looks things up on the g(alaxy)ww.
Kat W