By *Vasudevan Mukunth on Sep 01, 2018 07:29 am
This article is part of a weekly column called ‘Infinite in All Directions’, written by Vasudevan Mukunth, science editor.
Two quick updates: The Guardian’s science blogs network is closing and the bloggers are all going their separate ways. On the plus side, Sandhya Ramesh, the science editor over at The Print, has launched a weekly roundup of science news from around the world called ScientiFix.
(On a related note: even though only a few publications in India have science writers, it's important that we work together complementarily instead of competitively, so that we can share readers and, more importantly, function as a mosaic of science journalism departments. To this end, kudos to Sandhya for doing this.)
A fraying Musk
It feels so difficult to tear oneself away from the life and times of Elon Musk and especially the spectacle of disintegration he’s been putting on. Earlier this week, he took another jab at Vernon Unsworth, the British diver who participated in the Thai cave rescue – the same person Musk had earlier called a “pedo guy”.
In response to another user, Musk asked on Twitter why Unsworth hadn't attempted to defend himself after having been casually accused of pedophilia. It was then brought to his notice that Unsworth had sent the tech billionaire a letter saying he'd face a libel suit if he didn't set the public record straight.
My engineer friends still speak fondly of Musk and his technological achievements, whereas others feel his image is increasingly marred by an overwhelming distastefulness. His comments about Unsworth aside, Musk had also recently announced on Twitter that he planned to take Tesla Motors private, before his colleagues and other investors swiftly disabused him of the idea. Musk does seem increasingly restless, and volatile. His remarks over the last two years also suggest he might have a god complex; one can only hope he sheds it to let his inner engineer shine through.
Singularities in science
I have a hypothesis: the race for primacy in scientific research is reducing the pursuit of truths to single members or isolated groups of the scientific community, diverting large grants to a few people and causing awards to go to the leaders of teams instead of the teams themselves. Further, I suspect that these practices may or may not amount to evaluation proxies, but they do reflect a desire for them: resource-constrained evaluators flatten the evaluation process into a handful of numbers, each forced to shoulder enormous amounts of meaning, on the basis of which decisions can be made faster.
One of these numbers – parameters, rather – has to do with primacy. However, we now have ever more proof that primacy may never have been a native feature of scientific research, and that it doesn't deserve to be the fulcrum balancing scientific success and failure. Last week, such proof took the form of the story of COBRA, a sounding-rocket experiment led by the physicist Herbert Gush to study the thermal characteristics of the cosmic microwave background (CMB) radiation.
The CMB is a residue of energy left over from the Big Bang, distributed throughout the universe in the form of microwave radiation. It's the reason deep space has a temperature of 2.7 K, not 0 K: it's the universe cooling off after its explosive birth. In 1990, NASA's COBE satellite was instrumental in measuring this temperature and understanding what it told us about the distribution and other properties of the CMB. However, it was only slightly ahead of Gush's COBRA experiment in making the first definitive measurement.
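As an aside, a quick way to see why a 2.7 K residue shows up specifically as microwave radiation is Wien's displacement law, which ties a blackbody's temperature to its peak emission wavelength. A minimal sketch, not from the article (the constant is the standard CODATA value):

```python
# Wien's displacement law: peak wavelength = b / T for a blackbody at temperature T.
# For the CMB's ~2.7 K, the peak lands squarely in the microwave band.

WIEN_B = 2.897771955e-3  # Wien's displacement constant, metre-kelvin
T_CMB = 2.725            # CMB temperature today, kelvin

peak_wavelength = WIEN_B / T_CMB  # metres
print(f"Peak wavelength: {peak_wavelength * 1e3:.2f} mm")  # ~1.06 mm, i.e. microwave
```

The same law explains why hotter bodies – the Sun at ~5,800 K, say – peak at much shorter, visible wavelengths.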
Now, does the fact that Gush’s sounding rocket experiment almost scooped the COBE experiment’s results render his and his team’s efforts useless? Of course not. The COBE team went on to win a Nobel Prize whereas Gush didn’t, but at the time, should administrators not have funded Gush? That would’ve been silly. We remember COBE’s efforts because of its primacy in discovering the thermal component of the CMB radiation but Gush’s work is no less significant.
In fact, excluding the time offset (and factoring in the less-developed communication technologies of the time), Gush's results were precisely the same as COBE's: that the CMB has a thermal component. But what primacy does in this context is not help us remember that "COBE did xyz"; rather, it encourages us to forget that "Gush did xyz", and relegates his and his colleagues' efforts to the esoterica of the history of science.
Two ways to tackle this problem come to mind – feel free to discuss/correct/etc. First, science journalists should make an attempt to discard primacy as a measure of success or fame altogether and focus instead on the quality of scientific research and data being reported. More than anything else, this is about discovery (‘primary reports’ should be the gateway into an area of research, not the destination) and language (analytical instead of celebratory). Second, those evaluating scientific accomplishments for prestigious grants, awards and/or rewards should consider modifying (if necessary) their goal: is it to award people or to award good science?
My favourite example to illustrate the latter is the BICEP2 cosmic inflation debacle. In 2014, the team running the BICEP2 telescope at the South Pole reported that it had discovered evidence that the universe had undergone a rapid expansion in its infancy. Shortly after the announcement, however, many in the scientific community began to have doubts about the BICEP2 data, especially because the team had failed to rule out a confounding factor – galactic dust – that could put paid to their claims. Eventually, scientists looking for the same evidence using the Planck telescope found that, indeed, the BICEP2 data was incomplete and that the team didn't have the evidence it claimed it did.
The BICEP2 team had been led by Brian Keating. He authored a book about the whole affair, Losing the Nobel Prize, published earlier in 2018. In it, he writes that the BICEP2 team had reached out to the Planck team asking for data that could help Keating's colleagues plug the holes in their analysis. According to Keating, the Planck team refused, either because it didn't have the data (which it didn't communicate) or because it wanted to scoop BICEP2. Keating – in pursuit of a Nobel Prize himself – ultimately decided to go ahead and announce his team's results to scoop the Planck team.
If only the prize had had a tradition of awarding all those who helped produce good research instead of going after the first producers alone… (although by no means should this be construed as support for the Nobel Prize’s questionable preeminence).
Absolute hot
There's only one absolute zero but there are multiple absolute 'hots', depending on the temperature at which various theories of physics break down. This is an interesting conception because, while absolute zero is very well-defined and perfectly understood, absolute hot stands for the exact opposite not in a physical sense but in an epistemological one: it is the temperature beyond which the object of study resembles something not understood at all. According to the short Wikipedia article on the subject, there are two well-known absolute hots:
- Planck temperature – when the force of gravity becomes as strong as the other fundamental forces, leading to a system describable only by theories of quantum gravity, which don’t exist yet
- Hagedorn temperature – when the system’s energy becomes so large that, instead of heating up further, it begins to produce more hadrons (particles made up of quarks and gluons, like protons and neutrons) or melts into a quark-gluon plasma
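To put a number on the first of these: the Planck temperature follows from combining four fundamental constants alone. A minimal sketch, not from the article (constant values are CODATA figures):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3/(kg*s^2)
K_B = 1.380649e-23      # Boltzmann constant, J/K

# Planck temperature: T_P = sqrt(hbar * c^5 / (G * k_B^2))
t_planck = math.sqrt(HBAR * C**5 / (G * K_B**2))
print(f"Planck temperature: {t_planck:.3e} K")  # ~1.417e32 K
```

Some 10^32 kelvin – the regime the universe is thought to have occupied within its first instant, and where quantum gravity would be needed to say anything at all.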
A physicist-friend suggested the example of a black hole. Thermodynamics stipulates that there is an upper limit to the amount of energy that can be packed into a given volume of space-time. So if you keep heating this volume even after it has breached its energy threshold, then it will transform into a black hole (by the rules of general relativity). For this system, its absolute hot will have been reached, and from the epistemological point of view, we don’t know the microscopic structure of black holes. So there.
However, it seems not all physical systems behave this way, i.e. become something unrecognisable beyond their absolute hot. Quantum thermodynamics describes some such systems as having negative temperatures on the kelvin scale. You are probably thinking this means simply colder than absolute zero – a forbidden state in classical thermodynamics – but that is not it. There seems to be a paradox here, but it is more a cognitive illusion: it comes undone when you acknowledge the difference between energy and entropy.
The energy of a system is the theoretical maximum capacity it has to perform work. The entropy of the system is the amount of energy that cannot be used to do work, also interpreted as a degree of disorderliness. When a ‘conventional’ system is heated, its energy and entropy both increase. In a system with negative temperature, heating increases its energy while bringing its entropy down. In other words, such a system becomes more energetic and is able to dedicate a larger fraction of that energy to work as it gets hotter.
Such a system is believed to exist only when it can access quantum phenomena. More fundamentally, such a system is possible only if the number of high-energy states it can occupy is limited. Classical systems – anything you can observe in your daily life, such as a pot of tea – can be heated to as high a temperature as needed. But in the quantum realm – just as general relativity says a black hole is born when a system’s energy density becomes so high that space-time wraps around it – systems of elementary particles are often allowed to possess only certain energies. As a result, even if such a system is heated beyond its absolute hot, its energy can’t change, or at least there will be nothing to show for it.
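The textbook illustration of this is a two-level system: the Boltzmann ratio of the two populations fixes the temperature, and inverting the populations flips the sign of the implied temperature. A minimal sketch, not from the article (the energy gap and populations are illustrative numbers):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def two_level_temperature(n_ground, n_excited, gap_joules):
    """Temperature implied by the Boltzmann ratio
    n_excited / n_ground = exp(-gap / (k_B * T)), solved for T."""
    return gap_joules / (K_B * math.log(n_ground / n_excited))

gap = 1e-21  # energy gap between the two levels, joules (illustrative)

# Normal population: ground state more occupied -> positive temperature.
print(two_level_temperature(0.8, 0.2, gap))  # about +52 K

# Population inversion: excited state more occupied -> negative temperature.
print(two_level_temperature(0.2, 0.8, gap))  # about -52 K
```

The inverted state is not "colder than absolute zero"; it holds more energy than any positive-temperature state of the same system, which is only possible because the energy spectrum has a ceiling.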
While it was a monumentally drab subject in college, thermodynamics – as I have learnt since – can be endlessly fascinating the same way, say, the study of financial instruments can illuminate the pulse of capitalism. This is because thermodynamics – as in the study of heat, energy and entropy – encapsulates the physical pulse of the natural universe. You simply need to go where its laws take you to piece together many things about reality.
Of course, a thermodynamic view of the world may not always be the most useful way to study it. At the same time, there will almost always be a way to translate some theory of the world into thermodynamic equivalents. In that sense, the laws and rules of thermodynamics allow its practitioners to speak a kind of universal language, the way Douglas Adams’s Babel fish does.
The most famous example of this in the popular conception of scientific research is the work of Stephen Hawking. Together with Jacob Bekenstein and others, Hawking used thermodynamic calculations to show (on paper) that black holes were mortal and in fact emitted radiation out into the universe, instead of sucking everything in. He also found that the total entropy contained inside a black hole – its overall disorderliness – was proportional to the area of its event horizon. This was in the 1970s, but the idea that there are opportunities to understand the insides of a black hole by observing its outsides is as profound today as it was then.
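That entropy-area relationship can be written down directly as the Bekenstein-Hawking formula. A minimal sketch, not from the article, using CODATA constants and an illustrative solar-mass black hole:

```python
import math

# Physical constants (SI)
G = 6.67430e-11         # gravitational constant
C = 2.99792458e8        # speed of light
HBAR = 1.054571817e-34  # reduced Planck constant
K_B = 1.380649e-23      # Boltzmann constant

def bekenstein_hawking_entropy(mass_kg):
    """Entropy of a Schwarzschild black hole:
    S = k_B * c^3 * A / (4 * G * hbar), where A is the horizon area."""
    r_s = 2 * G * mass_kg / C**2   # Schwarzschild radius
    area = 4 * math.pi * r_s**2    # event-horizon area
    return K_B * C**3 * area / (4 * G * HBAR)

M_SUN = 1.989e30  # one solar mass, kg
print(bekenstein_hawking_entropy(M_SUN))  # ~1.45e54 J/K
```

Because the Schwarzschild radius grows linearly with mass, the entropy grows as mass squared – with the horizon's area, not the hole's volume, which is exactly the surprise the paragraph above describes.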
(Disclosure: This column is composed in bits and pieces, and sometimes I use some of those pieces for my blog – such as the section that just concluded – before they’re published here.)
Preprints are useful: scientists
On August 29, the journal Nature published three pieces of correspondence responding to the anti-preprints article by Tom Sheldon a few weeks ago. I was delighted to see so many scientists coming forward to speak in favour of preprints and the benefits they bring to the research community – and to find I wasn’t the only one screaming myself hoarse that Sheldon’s views, IMHO, were simply misguided. Excerpts from the correspondence:
A responsible journalist consults multiple independent sources to verify research findings. This critical evaluation is not contingent on the research having been peer reviewed. Preprints provide early and unrestricted dissemination of research outputs, so journalists can often peruse expert feedback when considering a story. And most preprint servers either label preprints as ‘not peer reviewed’ or have editorial ‘sanity checks’ in place to prevent the posting of junk science.
Wherever they hear about a story, journalists are under the same obligation as scientists to critically review the work they intend to communicate to readers. When journalists try to secure independent expert opinions, they should indicate whether and how preprint manuscripts have been screened — in keeping with disclaimers on some preprint servers. And scientists can impede the spread of low-quality information by publicly commenting on preprints and peer-reviewed papers, giving readers an insight into the scientific community’s reaction to a work.
Restricting when or how preprints are released risks suppressing science communication without any clear advantage to the public. When scientists and journalists follow fundamental principles for reporting research results — such as ensuring that publications are rigorously sourced and fact-checked — preprints pose no greater risk to the public’s understanding of science than do peer-reviewed articles.