We have earlier seen some quotes from the book The Golem: What You Should Know About Science. There are two companion volumes to this book: The Golem at Large: What You Should Know about Technology and Dr. Golem: How to Think about Medicine. This series of books by Harry Collins and Trevor Pinch provides us with examples from these fields which are most of the time ‘uncontested’. For example, in the first volume they discuss the famous 1919 experimental confirmation of Einstein’s predictions in general relativity by Eddington. This experiment is told as a matter-of-fact anecdote in physics, where petty borders of nationalism could not stop physics and physicists. But as the book shows, in spite of scanty or almost no positive evidence, Eddington “concluded” that the predictions were true. This they term the “experimenters’ regress”.
The experimenter’s regress occurs when scientists cannot decide what the outcome of an experiment should be and therefore cannot use the outcome as a criterion of whether the experiment worked or not.
– The Golem at Large, p. 106
In The Golem at Large they present us with many examples of this from the field of technology. One of the examples is the Challenger accident, which Feynman made famous with his dramatic O-ring demonstration during the commission hearings. In such cases they call the “experimenter’s regress” the “technologist’s regress”.
Recently I read about an episode in India (all further quotes are from the same link) which fits very well with these examples. It concerns baggage scanning machines installed at Indian airports. They were bought at 2 crore rupees per unit in 2010. But in August 2011 they failed tests on the very tasks they were supposed to perform.
The scanners are called in-line baggage inspection systems as they scan bags that go into the cargo hold of the aircraft after passengers check in and hand over their luggage to the airline. They use x-ray imaging and “automatic intelligence” to verify the contents of bags and determine whether they include explosives.
One would think this would be as straightforward as it gets: either the scanner detects explosives present in the baggage or it does not. But it is not as simple as it seems. When the tests were done, the testers found that the machines failed.
During the tests, security sources said that a technological specification committee of officials from the IB, RAW, SPG, NSG, BCAS and the civil aviation ministry passed bags containing 500 gm of six kinds of explosives, including PETN and ammonium nitrate, as well as IEDs through these systems. The scanners did not flag any of these bags as suspicious, the sources said.
So after this “failure” the companies which supplied these machines were asked to improve them or to share the software so they could be recalibrated. But the companies, and interestingly the Airports Authority of India (AAI), said that the testing methods were at fault. If explosives were passed through and the machines did not detect them, how can the companies say that the testing methods were not working?
The machines work on the so-called 70:30 principle.
“Though it works on a 70:30 principle, if there is an explosive in the 70 per cent, it will throw up the image of each and every bag that has dangerous substances. We would like to emphasise that the systems supplied and installed by our company at Indian airports are of state-of-the-art technology and are fully compliant with current standards.”
The 70:30 principle refers to the “automatic intelligence” used by Smiths Detection machines to clear 70 per cent of the baggage and reject the rest, according to the Airports Authority of India (AAI). “The machines reject 30 per cent of the baggage, the images of which are then sent to the screener. These systems have automatic intelligence capability and have been tested against a wide range of substances considered dangerous for aircraft. The details and specifications are never disclosed, or else terrorists would understand the software,”
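To make the 70:30 principle concrete: the machine’s “automatic intelligence” clears roughly 70 per cent of bags on its own and routes the remaining 30 per cent to a human screener. A minimal sketch of such a two-stage pipeline is below; the function name, threshold, and scores are all hypothetical illustrations, not the vendor’s actual (undisclosed) algorithm:

```python
# Hypothetical sketch of a two-stage "70:30" screening pipeline.
# A bag whose suspicion score falls below CLEAR_THRESHOLD is
# auto-cleared; anything at or above it goes to a human screener.

CLEAR_THRESHOLD = 0.3  # hypothetical value, tuned so ~70% of bags auto-clear

def screen_bag(suspicion_score: float) -> str:
    """Route a bag based on the scanner's suspicion score (0.0 to 1.0)."""
    if suspicion_score < CLEAR_THRESHOLD:
        return "auto-cleared"            # the ~70% the machine handles alone
    return "sent to human screener"      # the ~30% flagged for manual review

# Example batch of scanner scores (made-up data)
scores = [0.05, 0.12, 0.45, 0.08, 0.91, 0.22, 0.28, 0.10, 0.02, 0.76]
results = [screen_bag(s) for s in scores]
cleared = results.count("auto-cleared")
print(f"{cleared}/{len(scores)} bags auto-cleared")  # prints "7/10 bags auto-cleared"
```

The sketch also shows where the dispute lies: when a test bag with explosives is auto-cleared, there is no agreed way to tell whether the detection logic is broken or the test itself was flawed.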
But if the machines are doing the job anyway, why not do it 100 per cent? And tellingly, they are not sharing the software, which is the standard stance of proprietary software companies. This is a case where people realize that they are merely users of the software in question. The argument that “terrorists would understand the software” does not hold: terrorists do not need to understand it if the machine is going to reject a whole lot of bags anyway. And in any case, if there are bugs or holes in the software, a thousand eyes will repair them much faster than a few. The companies further say that
“The technology or physics is that x-ray based system can’t detect explosives, it is only approximate detection of dangerous substances,”
Why the AAI is siding with the companies (they are rather defending them) is something worth pondering.
AAI officials say, “The problem could be due to the sheer ignorance of officers who lacked the skills to test for explosives.”
With no unanimity over what the test results mean, the case truly presents us with a “technologist’s regress.”