Alison wrote:
> One can easily imagine the corpus of a software system becoming not just unwieldy but literally too complex for anyone or any AI to begin to encompass or understand.
This routinely happens. Normally it's thought of as happening to very large systems - all those things that get installed at police stations, airports, and government offices, paid for by taxpayers, and binned when they're three years overdue - but it happens to much smaller systems too.
No one holds a system in mind in its entirety. Instead, when making changes, a programmer considers only the relevant subsystem. It's then easy to break something elsewhere through unforeseen interactions, much as a user breaks something by using it in a way nobody expected.
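A toy example of the sort of thing I mean (the names and the scenario are invented):

    import datetime

    # A maintainer tweaks a shared helper with only the report
    # subsystem in mind...
    def format_date(d):
        # was: return d.isoformat()   # "2025-01-31", sorts chronologically
        return d.strftime("%d/%m/%Y") # prettier on the printed reports

    # ...but the archive subsystem, three modules away, uses the same
    # helper to build filenames that are supposed to sort by date:
    filename = "backup-" + format_date(datetime.date(2025, 1, 31)) + ".tar"
    print(filename)  # backup-31/01/2025.tar - no longer sortable, and "/"
                     # isn't even a legal filename character on most systems

Nothing in the report code hinted that this could break anything.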
> I think you can posit a possible future where welding bits and bobs together (or using screwdrivers) could be a realistic programming activity (this isn't in the Vinge book, I'm making it up).
This happens now. PC hardware goes together like Lego, and there are assumptions made about how the components will be used. Plug'n'Play, Intel Inside, Designed for Microsoft Windows, and other such slogans are all signs of this approach. Note that the hardware generally works perfectly, and the software fails abysmally. That's because the hardware is doing a much simpler job, and one that's clearly specified. Software always screws *something* up.
One could imagine this happening as a way of keeping out viruses and unknown side-effects. The individual components of the software would be literally sealed systems, with inputs and outputs defined and locked perhaps decades or centuries earlier. These sub-programmes would use up massive amounts of storage space, but it would be just too complex to open them up and try to simplify them, and processing speeds would be so fast that it could never be cost- or time-effective to do so. Instead they would simply be mass-produced en bloc, like engine components.
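In code terms, I imagine something like this - purely a sketch, every name invented - where the interface is the only thing anyone is allowed to see:

    from typing import Protocol

    # The sealed component exposes nothing but an interface fixed long ago.
    class NavComponent(Protocol):
        def run(self, telemetry: bytes) -> bytes: ...

    class LegacyNav:
        """Sealed decades ago. Enormous inside; nobody ever looks."""
        def run(self, telemetry: bytes) -> bytes:
            # Stand-in for megabytes of ancient, unexamined logic.
            return telemetry[::-1]

    def wire(component: NavComponent, feed: bytes) -> bytes:
        # Assembling a system is just plugging outputs into inputs,
        # like bolting on engine components.
        return component.run(feed)

    print(wire(LegacyNav(), b"raw telemetry"))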
Alas, this is a holy grail of programming, and unlikely ever to be reached. Why? Complex problems require complex solutions. Off-the-peg solutions only solve off-the-peg problems, and just lead to unexpected failures, because the system can't adapt as readily as the user can.
There's a standard software problem here: how big should a reusable component be? Components have defined interfaces and connect to other components, like atoms and molecules. Small, simple ones can be connected in many ways, but you have to manage all the connections yourself. Much bigger ones have far fewer connections, but there are only so many ways you can put them together before you have to create new ones out of whole cloth. It's the difference between making a body from atoms and making one from a torso, head, and limbs: you can make many different people with the atoms, but the head is the same, and you've only got one. And it makes a lousy foot... What constitutes the software equivalent of DNA is an ongoing research issue.
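To put the atoms-versus-limbs point in code (a toy sketch, all names invented):

    from collections import Counter

    # Small components: connect them any way you like, but every joint
    # is yours to manage (and to get wrong).
    def clean(text):
        return text.strip().lower()

    def tokenise(text):
        return text.split()

    def count(tokens):
        return Counter(tokens)

    freqs = count(tokenise(clean("  The cat sat on the mat  ")))

    # One big component: a single connection and much less wiring - but
    # if you need anything it doesn't do (character frequencies, say),
    # you're back to making a new one out of whole cloth.
    def word_frequencies(text):
        return count(tokenise(clean(text)))

    freqs = word_frequencies("  The cat sat on the mat  ")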
Avon, and any super-technician, would be distinguished by his/her familiarity with huge numbers of these components, and by an understanding of how they interact to make useful programmes. A subtle programmer would understand how installing slightly different versions of sub-routine blocs would subtly alter the operation of the overall system.
> And that's what he's doing with his screwdriver.
Maybe. Something like quantum programming requires a different way of thinking about problems, and that'd be a special skill. Plausibly, the screwdriver is a device for downloading new instructions or data into independent subcomponents. Alternatively, Avon might be a hardware type of guy - the equivalent of the PC heads who love to overclock their processors to make them go faster than the recommended speeds.
It might even be something much simpler. There's a *huge* difference between the person who writes a database application - the definition of the data, how it's retrieved, what operations are available to the user - and the person who administers the database system. The first guy knows that he wants to look up this person by name and get their address. The second knows that there are N thousand such lookups per second, and though he doesn't care why, he knows he has to make sure the system configuration makes that particular operation go fast - so it goes on a disk with caches, indices, high-bandwidth data paths, and so on.
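The same lookup, seen from both sides (a toy sketch using Python's built-in sqlite3; the schema is invented):

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE people (name TEXT, address TEXT)")
    db.execute("INSERT INTO people VALUES ('Avon', 'Liberator, flight deck')")

    # The application writer's view: look this person up, get the address.
    row = db.execute(
        "SELECT address FROM people WHERE name = ?", ("Avon",)
    ).fetchone()

    # The administrator's view: N thousand of those per second, so the
    # lookup mustn't scan the whole table. An index on name makes it a
    # fast keyed retrieval - and the application code never changes.
    db.execute("CREATE INDEX idx_people_name ON people (name)")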
steve