Machines

IN A DECISION OF the NSW Civil and Administrative Tribunal, the senior member was asked to rule—amongst other things—on whether a particular kind of rifle that the applicant wanted to import into Australia and own was 'of a kind that is designed or adapted for military purposes'. Unless you're particularly interested in the arcane details of specific 20thC weapons, which I'm not,1 it's an interesting judgement for the sheer taxonomic argument that must have gone on between the two parties, over what, precisely, about this object set it in either a prohibited or a permissible category.

No, said the Commissioner of Police, you may not import it, it's obviously a military weapon, since it's a derivative of a design original to the Soviet Army. But wait, said the applicant, this one's a product for the commercial market, specifically the pre-1996 Australian market, stripped of its most gruesome military bits, and deliberately altered in appearance so that no military buyer would want it. No, said the Commissioner of Police's witness, that's not how it works; it's the function of the thing, its 'ability to discharge ammunition' that counts, and this is clearly able to kill a lot of people. But, continued the applicant, 'disturbing' the Senior Member of the Tribunal, the same is true of all centre-fire rifles, having been originally designed for military purposes—and that point was where the ontology kicked in. Platonists may kindly exit the room now, leave your essentialism on the table.

We are all of us surrounded by such objects: originally designed or brought into being for military purposes, since commercialised. Some of them are incredibly useful, and our society wouldn't work without them: communications satellites, jet engines, encryption. Others, like synthetic textiles, are byproducts of war economies, with an original military purpose now completely illegible in the commodity—you wouldn't know that Nylon was so important to the Second World War just by wearing clothes. Wars have tended to be times when advances in medical science, including psychology, take place. Even games: every nerd lining up to role-play their warrior or sorcerer against monsters in a dungeon owes their rulebook to 19thC Prussian officers who wanted to engage bloodlessly in war-games, theorising the whole field of human conflict down to dice rolls and rules on maps. The distinction between things for 'military purposes' and for 'non-military purposes' is always arbitrary and, because it's in the nature of war itself to engage technologies adaptively, in a competitive cycle, it's always changing, bringing existing and new kinds of things into its category—the violent purpose makes the definition. As Manuel de Landa wrote, 'when synthetic intelligence does make its appearance on the planet, there will already be a predatory role awaiting it'.2 But purpose is so slippery, and so socially constructed!

Computers and computer networks are the most obvious example. You are reading this on a device with an incredibly sophisticated processor, the information delivered to you electronically on a network spanning the globe. The first owes its lineage to machines meant to do the calculus for bombs' and shells' trajectories, and later, to calculate theoretical atomic explosions. The second, the internet, is a technology original to the United States' military-industrial complex's desire to communicate resiliently during a nuclear war. Both have outgrown their original 'military purposes', you might think—or have they? My own country is part of the Five Eyes alliance of intelligence sharing, a sovereignty-busting agreement for mutual spying, which adapts completely civilian functions—social networks, phone conversations—for security and surveillance ends. The technologies of networked computing seem to lend themselves to aggressive adaptation, used by bad-faith actors to recruit, disinform, confuse, incite. Nobody in 2020 can still maintain that technologies like Facebook and Twitter have entirely civilian, benign functions. At the time of writing, most bafflingly of all, the TikTok video app used by teenagers seems to be under a cloud of suspicion as a vector for Chinese spying.

As a citizen, I'm glad that Mr Bankowski wasn't allowed to import his KS-30; those kinds of so-obviously destructive machines have no place in any kind of society I'd want to live in. That's a political consensus in Australia, though, not a functional one inherent to the device; it was confirmed in 1996. We mitigate the violent purpose by restricting and regulating its potential machines, whatever they are. And if we can do that for the most grossly offensive and dangerous machines, like self-loading rifles, the possibility exists that we can do it for other machines, and other systems: the Americans right now are looking at their militarised, hypermacho police forces in exactly this way. What other machines might we decide, as a community, to turn into ploughshares?


  1. Well, in this case, anyway. Since it's not a vehicle you can drive, sail, or fly... 

  2. De Landa, Manuel. War in the Age of Intelligent Machines. Zone Books, New York, 1991. 
