Explosives or Not

We have earlier seen some quotes from the book The Golem: What You Should Know About Science. There are two companion volumes to this book: The Golem Unleashed: What You Should Know about Technology and Dr. Golem: How to Think about Medicine. This series of books by Harry Collins and Trevor Pinch provides us with examples from these fields which are most of the time taken as ‘uncontested’. For example, in the first volume they discuss the famous 1919 experimental confirmation of Einstein’s predictions of general relativity by Eddington. This experiment is told as a matter-of-fact anecdote in physics, in which petty borders of nationalism could not stop physics and physicists. But in the book, as they show, in spite of scanty or almost no positive evidence, Eddington “concluded” that the predictions were true. This they term the “experimenter’s regress”.

The experimenter’s regress occurs when scientists cannot decide what the outcome of an experiment should be and therefore cannot use the outcome as a criterion of whether the experiment worked or not.
The Golem Unleashed, p. 106

In The Golem Unleashed they present us with many examples of this from the field of technology. One of the examples is the Challenger accident, which Feynman made famous with his O-ring demonstration during the Rogers Commission hearings. In this case they term the “experimenter’s regress” the “technologist’s regress”.
Recently I read about an episode in India (all further quotes are from the same link) which fits in very well with these examples. This is regarding baggage scanning machines installed at Indian airports. They were bought at 2 crore rupees per unit in 2010. But in August 2011 they failed tests of the very tasks they were supposed to perform.

The scanners are called in-line baggage inspection systems as they scan bags that go into the cargo hold of the aircraft after passengers check in and hand over their luggage to the airline. They use x-ray imaging and “automatic intelligence” to verify the contents of bags and determine whether they include explosives.

Now one would think that this would be as easy as it gets. Either the scanner detects explosives present in the baggage or it does not. But it is not as simple as it seems. When the tests were done, the testers found that the machines failed.

During the tests, security sources said that a technological specification committee of officials from the IB, RAW, SPG, NSG, BCAS and the civil aviation ministry passed bags containing 500 gm of six kinds of explosives, including PETN and ammonium nitrate, as well as IEDs through these systems. The scanners did not flag any of these bags as suspicious, the sources said.

So after this “failure” the companies which supplied these machines were asked to improve them, or to share the software so the machines could be recalibrated. But the companies, and interestingly the Airports Authority of India (AAI), said that the testing methods were at fault. Now if the explosives were passed through and the machines did not detect them, how can the companies say that the testing methods were not working?
The machines work on the so-called 70:30 principle.

“Though it works on a 70:30 principle, if there is an explosive in the 70 per cent, it will throw up the image of each and every bag that has dangerous substances. We would like to emphasise that the systems supplied and installed by our company at Indian airports are of state-of-the-art technology and are fully compliant with current standards.”
The 70:30 principle refers to the “automatic intelligence” used by Smiths Detection machines to clear 70 per cent of the baggage and reject the rest, according to the Airports Authority of India (AAI). “The machines reject 30 per cent of the baggage, the images of which are then sent to the screener. These systems have automatic intelligence capability and have been tested against a wide range of substances considered dangerous for aircraft. The details and specifications are never disclosed, or else terrorists would understand the software,”
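The 70:30 principle described above is, in essence, a two-stage screen: the machine automatically clears a majority of bags and refers the rest to a human screener. A minimal sketch of that logic follows; the suspicion score, the threshold value and the function name are entirely hypothetical, since the vendor discloses no details of the actual algorithm:

```python
def screen_bag(suspicion_score, threshold=0.3):
    """Two-stage screening sketch: bags scoring below the threshold
    are auto-cleared by the machine; the rest go to a human screener.

    suspicion_score: hypothetical value in [0, 1] from the x-ray imaging.
    threshold: hypothetical cut-off chosen so roughly 70% of ordinary
    bags fall below it (the "70" in 70:30).
    """
    if suspicion_score < threshold:
        return "cleared"           # handled by "automatic intelligence"
    return "sent to screener"      # image reviewed by a human operator
```

The trouble, of course, is that the whole scheme stands or falls on whether a bag with explosives actually receives a high score, which is precisely what the tests called into question.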

But if the machines are doing the job anyway, why not do it 100 per cent? And the funny thing is that they are not sharing the software, which is the standard agenda of proprietary software companies. This is a case where people realise that they are just Users of the software in question. The argument that “terrorists would understand the software” does not hold: they don’t need to, if the machine is going to reject a whole lot of bags anyway. And in any case, if there are bugs or holes in the software, a thousand eyes will repair them much faster than a few. The companies further say that

“The technology or physics is that x-ray based system can’t detect explosives, it is only approximate detection of dangerous substances,”

Why the AAI is siding with (indeed, defending) the companies is something worth pondering.

AAI officials say, “The problem could be due to the sheer ignorance of officers who lacked the skills to test for explosives.”

Still with no unanimity in the testing results, the case truly presents us with a “technologist’s regress.”

Science And Certainty

Science is not about certainty. Science is about finding the most reliable way of thinking, at the present level of knowledge. Science is extremely reliable; it’s not certain. In fact, not only it’s not certain, but it’s the lack of certainty that grounds it. Scientific ideas are credible not because they are sure, but because they are the ones that have survived all the possible past critiques, and they are the most credible because they were put on the table for everybody’s criticism.
The very expression ‘scientifically proven’ is a contradiction in terms. There is nothing that is scientifically proven. The core of science is the deep awareness that we have wrong ideas, we have prejudices. We have ingrained prejudices. In our conceptual structure for grasping reality there might be something not appropriate, something we may have to revise to understand better. So at any moment, we have a vision of reality that is effective, it’s good, it’s the best we have found so far. It’s the most credible we have found so far, it’s mostly correct.
via | Edge

This is something that I think separates science from religion. Religion is about absolutes, trust in an absolute God. And this difference should also be taught to students of science.

In Denial of Fukushima

The arrogance and jingoism exhibited by the nuclear lobby in India is well known. Even in the face of the Fukushima disaster, the people in the DAE remain adamant that there is no alternative to nuclear energy, that it is safe from accidents, and that even if an accident happens at all, they will be ready to contain it. The optimism they have regarding issues of safety with radioactive materials and nuclear reactors is something a person with a good understanding of science would not share. Too much reliance on the idea that “nothing can go wrong” is what leads to the horrible consequences of not understanding the Golem. And the statements by the DAE junta do exactly this. The very ideas that the reactors are completely safe, that they are different from those present in Japan, and that we can contain the damage are what need to be questioned.
A nice article in Tehelka makes the point clearer. Here are some lines from the same:

Fukushima also demonstrated unambiguously that communities living near nuclear facilities would be the worst affected in the event of an accident, a lesson that hasn’t been lost on the local populations in Koodankulam and Jaitapur. At the other end of the spectrum was the reaction of the people associated with nuclear establishments, who vociferously argued that it was essential to persist with nuclear power — not surprising, since it conforms to their self-interest.

Whatever the experts at the DAE may be saying, the images that people at large are seeing are those of desolate landscapes, ruined buildings, poisoned farmlands and inaccessible homes. The very idea that nuclear power can solve all the issues of power in India is questionable. Let us say we construct ten more such plants: where will the power be used? Who will get priority over the power? The villages near which the power plants stand, or the metro cities whose demands for power, and its abuse, are ever increasing? Just think about how many electrical appliances you have, and how many you could do without.

On 15 March 2011, NPCIL Chairman SK Jain trivialised what was going on in Japan saying, “There is no nuclear accident or incident in Fukushima… It is a well-planned emergency preparedness programme… (that) the nuclear operators of the Tokyo Electric Power Company are carrying out to contain the residual heat after the plants had an automatic shutdown following a major earthquake.” Such denial would be laughable but when the person thus opining is in charge of India’s power reactor fleet, it ceases to be amusing.
In September 2011, for example, the DAE Secretary claimed: “We are prepared to handle an event like Fukushima.” This assertion is belied by the Secretary, Ministry of Health and Family Welfare, who testified to the Parliamentary Standing Committee in 2010 that it was “nowhere (near) meeting an eventuality that may arise out of nuclear and radiological emergencies”.
On more than one occasion, the DAE Secretary has made assertions that the probability of a nuclear accident in India is zero. In November 2011, for example, he stated that the probability was “one in infinity”. The public image sought to be created is one of great confidence in safety. Is such confidence justified?
The first point to note is that the very statement that the likelihood of an accident is zero is scientifically untenable; every nuclear reactor has a finite, albeit small, probability of undergoing a catastrophic failure.
A second question: is the confidence on the part of officials about the zero probability of accidents good for safety? This is not a question about technology but about organisations. … Safety scholar James Reason once noted: “If an organisation is convinced that it has achieved a safe culture, it almost certainly has not.” The DAE and its attendant institutions appear to be convinced not just that they have a safe culture, but that the hazardous technologies they operate are incapable of undergoing accidents. This is not conducive to safety.
What the Koodankulam protest tells us is that these populations are not consenting to be subject to this risk. They deserve to be listened to, not dismissed as stooges of foreign funding. That is an insult to the intellects and minds of millions of people and to democracy itself.
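The first point quoted above, that every reactor carries a small but finite probability of catastrophic failure, can be illustrated with elementary probability. The sketch below uses purely hypothetical numbers (no official failure rate is being claimed), assuming independent reactor-years:

```python
def prob_at_least_one_accident(p_per_reactor_year, reactor_years):
    """Probability of at least one accident over a total operating history,
    assuming each reactor-year is independent with the same probability p:
    P(at least one) = 1 - P(no accident in any reactor-year).
    """
    return 1 - (1 - p_per_reactor_year) ** reactor_years

# Hypothetical illustration: a 1-in-10,000 chance per reactor-year,
# over 10 reactors running for 40 years (400 reactor-years in total).
risk = prob_at_least_one_accident(1e-4, 400)
print(risk)  # roughly 0.039, i.e. about a 4% chance of at least one accident
```

Even a probability as tiny as one in ten thousand per reactor-year compounds over a fleet's lifetime into something far from negligible, which is exactly why "zero" or "one in infinity" is not a scientifically tenable answer.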

The Demarcation Problem

What is the demarcation problem?
I want to discuss an acute problem which philosophers of science have to face. The question itself is quite simple. You don’t have to be a genius to understand the question, but the answer to this question is far from simple.
The question put simply would read something like this:
What is the difference between science and non-science?
Or
What is science?
If you ask this question to a school-going kid, you will probably get a good, clear-cut answer: Physics, Chemistry and Biology are sciences [perhaps mathematics too?]. And perhaps this is the view not only of school-going kids but also of their teachers, and of practising scientists as well.
Most lay people are afraid of science and scientists. The very idea of science is mystical, and scientists are seen as worshippers of nature itself. This is the common image which is also portrayed in the media [so is it popular, or is it the other way round?]. In the movies scientists are [if they are not the protagonists] shown as nearly causing the end of the world, or as having no heart except for the subject of their study. This is the label of evil genius which has been put on them. The list of examples would be endless, but a few of my own favourites are as under:
Uma Thurman as Poison Ivy in Batman and Robin

And Mike Myers as Dr. Evil in the Austin Powers series

It can easily be seen that public opinion about science is not what can be called good. Another thing to add here: if, in general, we see the attribute ‘scientific’ attached to anything, then that thing has to be rational, logical and something that can be relied upon. Take for example the warning which every cigarette smoker reads but ignores. This warning is supposed to be ‘scientific’, so you have to take it seriously; no bullshit here, this is what scientists say. This is The Truth, with a capital T. All these concepts, which I call the traditional concepts in the Philosophy of Science [PoS hereafter], have roots in the beginning of the 20th century.
What is the point of bringing all this up in a philosophical discussion? Wait, what we will see is that the things just mentioned have very deep roots in philosophy. What we want to do is to explicate these roots.
We start our discussion with the so-called modern era of philosophy, which spans mostly the last century. In this era a group of philosophers known as the Vienna Circle presented the first dominant viewpoint, which persisted through the first half of the century.
But this will be in another post….