
Wednesday, 11 April 2018


This is an edited version of a letter that was published in the London Review of Books Vol. 39, No. 11, 1 June 2017.

Driving speed is easily controlled by self-funding radar cameras and fines; in contrast, MP3 music sharing is unstoppable.

Every technology sits somewhere on a continuum of controllability that can be adumbrated by another two of its extremes: nuclear energy and genetic engineering. If I want to build a nuclear power station then I will need a big field to put it in, copious supplies of cooling water and a few billion quid. Such requirements mean that others can exert control over my project. Nuclear energy is highly controllable. If, by contrast, I want to genetically engineer night-scented stock to make it glow in the dark so it attracts more pollinators, I could do so in my kitchen with equipment that I could build myself. Genetic engineering is uncontrollable.

We may debate controllable technologies before they are introduced with some hope that the debate will lead to more-or-less sensible regulation (if it is needed).

But it is pointless, or worse, damaging, to debate an uncontrollable technology before its introduction. Every technology starts as an idea in one person's mind, and the responsibility for uncontrollable technologies lies entirely with their inventors. They alone decide whether or not to release a given technology, because if they put the idea up for debate, its uncontrollability means that people can implement it anyway, regardless of the debate's conclusions. (Note in passing that, all other things being equal, an uncontrollable technology will have greater Darwinian fitness than a controllable one when it comes to its being reproduced: nothing can stop it spreading.)

In my own case I classify technologies I invent as broadly beneficial or damaging. The former I release online, open-source. The latter I don’t even write down (these include a couple of weapons systems at the uncontrollable end of the continuum); they will die with me.

I may be mistaken in my classification, with consequences we may regret. Other inventors may act differently: we may regret that too. But we shouldn't make the mistake of indulging in (necessarily endless) discussion of what to do about a technology if it is uncontrollable. The amount of debate that we devote to a technology should, inter alia, be proportional to how controllable it is.

Technological changes have unforeseen and occasionally negative social and political consequences. This is inevitable when something powerful impinges on things that are relatively weak, like regulation; the same applies to the benefits. Fortunately the vast majority of people are well intentioned, and technology amplifies the majority along with its complementary minority. Much happens faster and more spectacularly, but the ratio of more good to less bad stays about the same.
