By Paul Ben Ishai

A synopsis

At the end of Part I, I left you in 1980. A great year! Punk was in full swing. I was 15. Cellphones hadn't been invented (they had, but in Thatcher's Britain we didn't know about them). It was a blast!! You could go out of the house for 5 hours and nobody called you or asked where you were. Oh, I forgot. The Soviets were going to nuke us, but Jimmy (Carter) would save us, and you could sneak into a pub and get served! The youth of today don't know what they are missing! But I digress, let's talk about serious stuff…


I left you with a plum. In the world of Radio Frequency radiation exposure safety levels, the choice of 10 mW/cm² as the safe limit was little more than a whim from Bell Labs. The belief was that the only damage to you from prolonged exposure would be tissue heating, and that less than 100 mW/cm² wouldn't kill you (literally: they boiled dogs alive at this level after one hour [1]). As long as exposure was limited to no more than an hour (the recommendation of H. Schwan [1]), everything should be OK.

Indeed, that was OK. Most of us were exposed only to our AM/FM radio and the TV. The ambient level of exposure was only 10⁻⁷ mW/cm², roughly one hundred-millionth of the recommended exposure level.


Actually, this was not a standard yet, just a recommendation. It became a standard in 1966. The policies, thinking and motivation that turned it into a standard were summarized in two documents in 1980, one a federal study [2] and the other a Science paper [3]. Initially, after Col. Knauf had boiled his dogs and their testes (literally), the matter had been handed over to a joint committee of the US Navy and the American Standards Association (ASA).

Source: Environmental Health Trust (https://ehtrust.org/the-times-of-israel-the-sorry-story-of-cell-phone-radiation-exposure-how-did-we-get-here-part-ii/)