Science blog

News, anecdotes and pictures from across science and engineering at UCL

From summer interns to cubesats in space

By Charlotte E Choudhry, on 7 October 2016

Many students send emails requesting summer internships at UCL Physics & Astronomy, but one particularly caught Dr Anasuya Aruliah’s eye. Vishal Ray was a second-year Aerospace Engineering undergraduate at the prestigious Indian Institute of Technology Bombay (IIT Bombay). He belonged to a student team building their own miniature satellite. Vishal was just the student that Dr Aruliah needed for her new direction of research: satellite drag. After a successful application to the International Students Dean’s Summer Student Scholarships, he was awarded a two-month internship in summer 2015.
Dr Aruliah’s group, the Atmospheric Physics Laboratory (APL), is a subgroup of the Astrophysics Group. It has a long history of researching the upper atmosphere using a global circulation atmospheric model. They also operate a network of Fabry-Perot Interferometers (FPIs) in Arctic Scandinavia to observe the aurora. The Earth’s atmosphere is like an onion skin, with the troposphere (the domain of weather forecasters), stratosphere and mesosphere as layers on top of each other. The thermosphere is the final layer of the Earth’s atmosphere, covering the altitude region between 90 and 400 km. Low Earth Orbit (LEO) satellites occupy the top of the thermosphere, and rely on upper atmospheric models to predict their orbits.

Recently the APL group found a discrepancy between measurements of thermospheric winds calculated from Doppler shifts of airglow photons, and winds determined from atmospheric drag on the Challenging Minisatellite Payload (CHAMP) satellite. This is an important puzzle to solve because satellite drag measurements are put into atmospheric models to bring them as close to reality as possible. If the ground and satellite measurements do not agree, then which is correct?

The IIT miniature satellite, commonly called a cubesat, is a single cube, only 30 cm in length, width and breadth, and weighs only 10 kg – about as much as a few bags of sugar. Their cubesat is called Pratham. This fits perfectly with UCL’s involvement in the European Union FP7-funded QB50 project, in which fifty cubesats carrying miniaturised sensors will be launched nearly simultaneously. This is an international collaboration involving many universities, academic institutes and the space industry. It is an unprecedented science operation, with potential for future Space Weather monitoring campaigns. The QB50 cubesats will be carried by rocket into the upper thermosphere, and will fall to Earth in decaying orbits while sampling regions of the thermosphere and ionosphere that were previously poorly understood owing to the lack of detailed measurements.

“The simplicity and low cost of cubesats has spurred much excitement and creativity amongst young (and old) engineers and scientists over the last few years. There are new frontiers being opened by this miniaturised space technology,” said Dr Aruliah.

The UCL Mullard Space Science Laboratory (MSSL) designed and built one of the three key sensors, the Ion Neutral Mass Spectrometer, which will be carried on several of the cubesats, as well as its own cubesat called UCLSat. The QB50 cubesats are scheduled for launch in three batches over the winter of 2016-2017: two batches from a Ukrainian-Russian Dnepr rocket, and a third from the International Space Station. During Vishal’s internship at UCL he met with the MSSL cubesat and sensor team, led by Mr Dhiren Kataria and Dr Rob Wicks, and with Dr Stuart Grey in the UCL Department of Civil, Environmental & Geomatic Engineering.

Dhiren Kataria holding the UCLSat designed and built at MSSL

Vishal used his experience to write several sophisticated computer programs to calculate drag coefficients from simulations of a cubesat orbiting in our 3-dimensional atmospheric model called CMAT2. This work was subsequently built upon by Dr Aruliah’s 4th year project student, Jennifer Hall. Jennifer wrote her own programs to derive and compare satellite drag coefficients from CMAT2 simulations and EISCAT radar measurements. Jennifer’s project won the UCL Physics & Astronomy Tessela prize for best use of computer technology in a 4th year project.
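
For readers curious about the physics behind those programs, here is a minimal sketch in Python of how a drag coefficient can be backed out of a measured along-track deceleration using the standard drag equation. It is not taken from the CMAT2 code or the students’ programs, and every number in it is an illustrative assumption rather than a mission value.

```python
# Minimal sketch (not the CMAT2 or student code): invert the standard drag
# relation a_drag = 0.5 * Cd * (A / m) * rho * v^2 to estimate Cd.
# All inputs below are illustrative assumptions, not mission values.

def drag_coefficient(deceleration, density, rel_speed, area, mass):
    """Return Cd given an along-track deceleration (m/s^2), neutral density
    from an atmospheric model (kg/m^3), speed relative to the co-rotating
    atmosphere (m/s), cross-sectional area (m^2) and mass (kg)."""
    return 2.0 * mass * deceleration / (density * area * rel_speed ** 2)

cd = drag_coefficient(
    deceleration=5.6e-7,   # m/s^2, assumed drag deceleration
    density=1.0e-12,       # kg/m^3, a plausible order of magnitude for the upper thermosphere
    rel_speed=7.5e3,       # m/s, approximate LEO orbital speed
    area=0.3 * 0.3,        # m^2, one face of a 30 cm cube
    mass=10.0,             # kg, the quoted cubesat mass
)
print(f"Inferred drag coefficient: {cd:.2f}")   # ~2.2 with these inputs
```

In practice the difficulty lies in the inputs rather than the algebra: the neutral density comes from a model such as CMAT2, and the deceleration has to be extracted from orbit data, which is where most of the effort in this kind of work goes.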

Pratham team at the Indian Institute of Technology Bombay. Vishal Ray is 2nd from the right in the top row.

One year on, after a busy 3rd year of studies, Vishal has written up his summer project as a journal paper, and “Pratham” was scheduled for launch at 0530 UTC on 26 September 2016. The IIT Bombay student team installed their cubesat on the launch vehicle PSLV C-35 on the remote island of Sriharikota in South India. Vishal said that he “…had goosebumps when we actually placed the satellite on the launch vehicle module and completed the testing for one last time!”. “Pratham” was successfully launched and will measure the total electron count from 800 km altitude in a Sun-synchronous orbit. MSSL were the first to receive Pratham’s beacon signal, which the students were incredibly excited to hear. You can hear the cubesat from 4:20 onwards as it passes within range of the detector at MSSL. The signal is deciphered as “Pratham IIT Bombay Student Satellite”. The accompanying image is of Theo Brochant De Viliers (MSSL) beside the MSSL receiver.

https://soundcloud.com/uclsound/signals-from-the-pratham-cubesat-satelllite

The prospect of finally being launched is very exciting, with both projects having been nearly 10 years in the making. Once launched, the missions will move from the technical challenges of developing miniature sensor devices to the scientific challenges of collecting, analysing and interpreting the measurements. The rewards will be great: from the new technologies surrounding cubesats, to the training of future space scientists and engineers, to the benefits for the Space Weather community.

The LHC is back in operation at record energy

By Oli Usher, on 3 June 2015

After two years of repairs and upgrades, the Large Hadron Collider (LHC) is back in operation – and UCL scientists are at the heart of the action. Engineers at CERN confirmed today that the beams of protons that circle in the 27km tunnel near Geneva are stable, and scientific data is once again being collected.

The ATLAS experiment is made of concentric rings of detectors (the particle beam passes through the centre), as seen here during shutdown in 2008. Credit: CERN/Claudia Marcelloni De Oliveria (CERN licence)

UCL scientists have been closely involved in the design, construction and operation of ATLAS, one of the giant detectors that track the high-energy particle collisions in the LHC, so this is an important milestone for the university’s High Energy Physics group.

Until now, the LHC has not been operating at full power. The faults that led to its shutdown shortly after it was inaugurated in 2008 meant that it could only be used to accelerate particles to around half the energy it was designed for.

* * *

Einstein’s equation E=mc2 states that energy and matter are interchangeable.

Atom bombs, famously, create vast amounts of energy by destroying small amounts of matter.

Particle accelerators like the LHC do the opposite, pumping vast amounts of energy into tiny particles, making them move at close to the speed of light. When they collide together, some of that energy is converted into extra matter, in the form of new particles flung out from the site of the collision. The greater that energy, the heavier the particles that can be generated.
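
To give a rough sense of the numbers involved, here is a back-of-the-envelope sketch (not part of the original article) comparing a proton’s rest energy, via E=mc2, with the energy the LHC gives each proton; the 6.5 TeV figure is the nominal per-proton beam energy for the 2015 run and is quoted here only for comparison.

```python
# Back-of-the-envelope illustration of E = mc^2 for a single proton.
m_proton = 1.6726e-27     # proton mass in kg
c = 2.99792458e8          # speed of light in m/s
eV = 1.602176634e-19      # joules per electronvolt

rest_energy_GeV = m_proton * c ** 2 / eV / 1e9
print(f"Proton rest energy: {rest_energy_GeV:.3f} GeV")   # roughly 0.938 GeV

beam_energy_GeV = 6500.0  # 6.5 TeV per proton, the nominal 2015 beam energy
print(f"Beam energy is about {beam_energy_GeV / rest_energy_GeV:.0f} times the rest energy")
```

In other words, almost all of the energy available in each collision comes from the protons’ motion rather than their mass, which is why particles far heavier than the proton can emerge from the debris.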

Physicists describe the world around us using the ‘standard model’ of particle physics, a handful of particles that can explain the properties and makeup of all the matter and energy we see around us.

The last of those particles to be detected in the lab was the Higgs boson, discovered at CERN in 2012 shortly before the upgrade began.

This doesn’t mean there is nothing left to discover, though. Scientists, including CERN’s director, have begun speaking of a tantalising ‘new physics’ – whole uncharted areas of science that are currently unknown, but which might be explored with higher-energy collisions and the heavier particles they generate.

One of these areas could be a solution to the riddle of dark matter.

Dark matter can be detected by astronomers (as seen in this Hubble image of a galaxy cluster), but it has not been spotted on Earth, and is known to not be made out of any of the particles in the standard model. Credit: NASA/ESA (CC-BY)

Dark matter has been detected in distant galaxies, thanks to its gravitational effects, but astronomers have determined that it cannot be made of any of the particles described in the standard model.

ATLAS, the building-sized instrument that UCL participates in, played a key role in the Higgs boson discovery and will play a starring role in the future work of the LHC, where it could help explain how dark matter relates to the standard model of particle physics.

* * *

The LHC consists of two 27km-long pipes that use powerful magnets to accelerate beams of protons in opposite directions.

Inside ATLAS, the two beams are brought together on a collision course. The beams are not continuous: the protons come in pulses (“bunches”) about 10cm long and the width of a human hair, each containing around a hundred billion particles. When these bunches cross each other, protons collide and new particles cascade out through the concentric rings of detectors that make up the ATLAS experiment.

One of the detections made by ATLAS today. This picture is a cross-section of the instrument, with each concentric ring detecting particles’ location or energy, and the particles’ tracks (shown as multi-coloured curved lines) inferred from this data. Credit: CERN (CC BY SA)

Scientists can then trace the path that the particles took – and determine their energy, mass and electrical charges. And from those, they can infer the processes that take place in each proton-proton collision.

During the two-year LHC shutdown, the ATLAS scientists also made several improvements to their detector, most notably the installation of an extra ring of detectors close to the beam pipe, making it more precise than ever before. Part of the team’s work now that the LHC is running again is to ensure that this is all properly calibrated and working as expected.

When it runs at full capacity, ATLAS detects 40 million particle collisions every second, far more than could ever be studied.

Part of the challenge is to discard the unimportant data so that scientists can focus on what’s important. One of the major contributions UCL scientists have made to ATLAS is to the design and operation of the hardware and software algorithms used to discard trivial events in real time and select only the interesting ones – reducing 40 million collisions per second down to a far more manageable 1,000 that are recorded offline.
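
The idea can be pictured with a toy sketch in Python. This is purely an illustration of threshold-based selection, not the real ATLAS trigger hardware or algorithms: generate a large number of mostly unremarkable events, apply a cheap test to each one, and keep only the rare ones that pass.

```python
# Toy illustration of real-time event selection: discard the overwhelming
# majority of events and keep only rare, high-energy ones.
# This is NOT the ATLAS trigger, just the general idea of online filtering.
import random

def toy_trigger(summed_energy, threshold=500.0):
    """A cheap yes/no decision applied to every simulated event."""
    return summed_energy > threshold

random.seed(0)
n_events = 1_000_000
# Most simulated events have modest "summed energy"; a tiny fraction are extreme.
energies = [random.expovariate(1 / 50.0) for _ in range(n_events)]
kept = [e for e in energies if toy_trigger(e)]

print(f"Kept {len(kept)} of {n_events} events (fraction {len(kept) / n_events:.1e})")
```

The real system has to make its decisions almost instantaneously, in custom hardware and fast software, and uses far more sophisticated criteria than a single energy threshold.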

Another challenge is the simple matter of timing and synchronisation.

With millions of events per second, and everything moving at close to the speed of light, untangling the data from different collisions is challenging. ATLAS is still detecting particles ejected by one collision while another is already taking place.

Particles from multiple events cascade through the detectors at one time, and synchronising them is not straightforward. Credit: CERN (CC-BY-SA)

UCL scientists played key roles in developing the electronics that ensure that the data is accurately recorded and readouts from the different components of ATLAS are all kept properly synchronised.

* * *

So what’s next for the LHC?

It’s very hard to say – and that’s what is so exciting about particle physics today.

The standard model is now complete. What comes next could be a radical departure, revealing areas of physics never explained before.

Equally, there might just be further confirmation of the dramatic discoveries of the past few decades, giving more precision and certainty to the standard model.

In either case, there is now a chance to explore physics without the constraint of theoretical preconceptions – an unusual and liberating place for a physicist to be.

With thanks to Prof Nikos Konstantinidis for help with this article

The art of unseeing

By Oli Usher, on 16 February 2015

While astronomers expend a lot of effort trying to see things better – building ever more powerful telescopes that can detect even the faintest, most distant objects – they are occasionally faced with the opposite problem: how to unsee things that they don’t want to see.

A group of scientists led by UCL’s Emma Chapman is working on methods to solve this problem for a new radiotelescope that is currently under development. In so doing, they could help give us our first pictures of a crucial early phase in cosmic history.

How to avoid seeing things you don’t want to see is a particular problem for cosmologists – the scientists who study the most distant parts of the Universe. Many of the objects and phenomena that interest them are faint, and lie hidden beyond billions of light years of gas, dust and galaxies. To make matters even more difficult, telescopes are flawed too – the data from them is not perfectly clean, which is manageable when you’re looking at relatively bright and clear objects – but a serious problem when you’re looking at the faint signature of something very far away.

Observing these phenomena is much like looking at a distant mountain range through a combination of a filthy window, a chain-link fence, some rain, clouds… and a scratched pair of glasses. In other words, you’re unlikely to see very much at all, unless you can somehow find a way to filter out all the things in the foreground.

***

The Square Kilometre Array (SKA) is a new radiotelescope, soon to begin construction in South Africa and Australia. The SKA will use hundreds of thousands of interconnected radio telescopes spread across Africa and Australia to monitor the sky in unprecedented detail and survey it thousands of times faster than any current system.

Artist’s impression of the South African site of the Square Kilometer Array. Credit: SKA Organisation (CC BY)

One of its objectives is to make the first direct observations of a brief phase of a few hundred million years in cosmic history known as the ‘era of reionisation’. This technical term conceals something quite dramatic: a profound and relatively sudden transformation of the whole Universe, which led to the space between galaxies being fully transparent to light as it is today.


Asteroid’s close encounter with Earth – the UCL view

By Oli Usher, on 2 February 2015

2004 BL86

Last week, asteroid 2004 BL86 passed near Earth. The ball of rock, a little over 300 metres across, passed 3.1 lunar distances from Earth.

This is far enough not to be of any serious concern – but it is closer than any other known asteroid will come to us until 2027. If an asteroid like 2004 BL86 were to hit Earth, we could expect widespread destruction – the famous Barringer Crater in Arizona was gouged out by an object just 50 metres across.

During its close approach, UCL’s observatory spotted the asteroid and snapped the picture above: a series of 30 second exposures separated by 9 second gaps. The asteroid can be seen moving rapidly against the background stars as the telescope was programmed to track the movement of the stars.

Reprogramming the telescope to hold the asteroid in its sights creates the image below – with the stars appearing as streaks instead.

2004 BL86

This video, featuring a series of observations of the asteroid made at the observatory over the night of 26-27 January, shows both types of observation, including a long shot tracking the asteroid across the sky.

Images by Steve Fossey, Theo Schlichter and Ian Howarth.

Links

High resolution image

Mauna Kea diary

By ucapowe, on 7 January 2015

Amidst the indescribable stress that is writing up my PhD, there is a massive silver lining. I’m currently writing this from 2,800m (that’s about 9,200 feet), half way up the Mauna Kea volcano in Hawaii. I say volcano; it hasn’t actually erupted for several thousand years and (the reason that I’m here) it has billions of pounds’ worth of massive telescopes on top of it.

The peak of Mauna Kea, with Subaru, Keck 1, Keck 2 and NASA IRTF telescopes. Photo: Alan L (CC BY)

During the second year of my PhD, my supervisors and I, whilst looking at some data everyone had assumed was empty, discovered the first molecule in space containing a noble gas. Those of you who know anything about chemistry will know this is really weird. Noble gases are so named because they’re noble: they don’t mix with the other elements.

However, in the remnant of a star that exploded around 1,000 years ago, the conditions were right for exactly that to happen. This is a massive deal and needs following up quickly – which, as well as taking over a substantial chunk of my time over the past year, has now brought me to Hawaii.

The Crab Nebula – where UCL researchers discovered argon hydride molecules. Photo credit: NASA/ESA/Hester/Loll/Barlow

Last night was my first trip up to the summit, where I spent several hours at NASA’s Infrared Telescope Facility. It’s an odd experience being that high up. Everything needs to be a little bit slower. There are perpetual reminders that you are somewhere not normal, from the warning signs to the bottles of oxygen placed liberally around the control room.

People don’t function so well that high up.

Our trip was mostly to acclimatise to the 4,200m altitude and get used to the instruments we will be using. This is a good thing, because while Mauna Kea has 350 clear nights a year, last night was not one of them.

Last night there was a storm. The drive up to the summit was a pretty hairy experience with squalls of wind and rain. Thankfully it wasn’t me doing the driving.

The only work that needed doing last night was calibration set up, for which we didn’t need to be able to actually see stars. Just as well, as there’s no way that we could have.

This weather system should have passed by tomorrow so we’ll be free to do science.

NASA Infrared Telescope Facility. Photo: Afshin Darian (CC BY)

* * *

Night 2

Tonight’s drive up was much clearer. As well as stars, we could see the top of a thunderstorm out over the Pacific and the orange glow from a neighbouring volcano (a nice reminder that although it hasn’t erupted for several thousand years, Mauna Kea is not actually extinct).

Clearer… until we got to the summit.

Inside the NASA Infrared Telescope Facility. Photo: Patrick Owen

Having prepared and calibrated everything and chosen our first standard star to use as a check, we were ready to observe – at which point the Telescope Operator said "no".

Apparently it’s 100% humidity outside and it is lovely and misty.

We had a slight tease at about midnight: we got as far as opening the telescope dome and finding our standard star. Alas, just as we started taking actual measurements the humidity shot back up and we had to close the dome.

Night 3

There’s a massive difference when we get up to the summit tonight! Stars! I can see stars – not particularly brightly, which is mostly to do with the lack of oxygen at this altitude meaning my eyes aren’t working as well as they should – but stars!

NASA Infrared Telescope Facility at night. Photo: NASA

If I can see stars, the telescope can see stars. After some changing of instruments and refilling of coolants (no mean feat at that altitude) we were finally ready to get started. We found and observed our standard star without much of an issue.

Then we started looking for the little "knots" of gas we are observing in the Crab Nebula. This took us a while longer than planned, but we got there and lined everything up on the instrument so we could get the data we need. Nothing. Tried again. Nothing. By this time it was about 3am, and the combination of the hour and the lack of oxygen made this all rather difficult to cope with.

The Crab Nebula is full of knots and filaments of gas. Photo: NASA/ESA/Hester/Loll/Barlow

We set the telescope up for a two-hour run to see if we could get anything at all. Other than some cosmic rays (really not what we’re looking for at all) we got… nothing. Frustration and worry ensued about our calculations and whether what we were doing was right. I paced. Lots. As the sun came up I went outside to get some fresh air (and see the telescopes – I’d only been up here in the dark until now), before heading down to the base camp for some fried food and sleep.

The telescope at dawn, with crescent moon. Photo: Patrick Owen

Night 4

I woke up this "morning" to an email telling me that there had been something wrong with the instrument. Good news, as it means we’re probably going to get some good data this evening. Bad news, because it was a really simple fix that, had we known about it, would have allowed us to get some good data last night too.

Ah well, onwards and upwards. After another slog to get set up and find a new, brighter standard star, we got observing.

Final night lucky: at about half past three, we finally realised we’d found what we were looking for! Massive amounts of relief all round. We still had to finish the run and get as much data as we could before the sun came up, but we got some.

It’ll take several weeks of processing the data followed by several more weeks of analysis before we know exactly what we have.

That can wait until after I’ve finished writing my thesis.

Sunset from base camp. Photo: Patrick Owen

Patrick Owen is a PhD student in UCL Physics & Astronomy, and has recently returned from observing in Hawaii

UCL stars on ‘The Sky at Night’

By Oli Usher, on 15 December 2014

December’s Sky at Night was practically a UCL full house.

As well as Maggie Aderin-Pocock (UCL Physics & Astronomy) presenting, the programme featured UCL astronomers Serena Viti and Steve Fossey, UCL chemists Andrea Sella and Stephen Price, and was filmed at UCL’s observatory.

Well worth a watch – viewers in the UK can watch the programme again on BBC iPlayer until 14 January 2015.

The show will also be repeated on BBC Four at the following times: 7.30pm on 18 December, and 2am on 19 December.

Do neutrinos have mass? Anatomy of a scientific debate

By Oli Usher, on 7 August 2014

Do neutrinos have mass? And if so, how much? This apparently simple question has no simple answer and has been the subject of debate, controversy and confusion in the world of physics in recent years.

Neutrinos are subatomic particles created during certain types of nuclear reactions, including those that power the Sun. Although the Sun churns out neutrinos in unimaginably large numbers – around 80 octillion (8 followed by 28 zeroes) pass through the Earth every second – they are very hard to detect. Totally unaffected by electromagnetism, they are invisible and pass through matter unimpeded. They only interact gravitationally and, on the scale of atomic nuclei, through the weak nuclear force.

The first detection of a neutrino in a bubble chamber, in 1970. Photo credit: Argonne National Laboratory (public domain)

The framework of theories scientists use to explain and describe the world of subatomic particles, known as the standard model of particle physics, predicts that neutrinos, like photons, should have no mass. However experimental studies detecting solar neutrinos in recent years contradict this, suggesting that neutrinos do have mass.

So when the data show that a key element of your theoretical framework is wrong, what do you do? You could assume that the data are incorrect, or that the theory is wrong. The consensus among physicists is that the standard model of particle physics is incomplete – but identifying what is missing from it is a complex issue.

Cosmologists are currently trying to get to the bottom of this question, sometimes proposing quite radical solutions. Boris Leistedt and Hiranya Peiris, two UCL researchers, have recently ruled out one of these eye-catching theories. The question won’t be settled until new data from a range of physics and cosmology experiments come in a few years’ time. (Among these is the Dark Energy Survey, in which UCL is closely involved.)

The Dark Energy Camera on the Blanco telescope in Chile will give new data on the structure and distribution of galaxy clusters in the universe. Photo credit: Reidar Hahn/Fermilab (All rights reserved)

But even before the new evidence comes in, the debate surrounding these claims and counterclaims casts light on how scientific theories develop.

***

The evidence that neutrinos have small (but non-zero) mass is now quite compelling, and scientists are in broad agreement that the standard model needs to be modified or extended to fit this new data. But how much mass they have and how much our theories need to evolve are still open questions.

Finding the neutrino’s mass isn’t simply of academic interest. The mass of the neutrino is intimately tied up not only with the evolution of the standard model, but with our understanding of key issues in cosmology including the formation of galaxies, the way galaxies are scattered through the universe and the behaviour of the Big Bang.

Some recent studies of cosmological data have made waves this year, proposing a relatively high mass for the neutrino (of around twice the mass of previous estimates). These studies suggested that apparent discrepancies between several large datasets (including temperature fluctuations in the cosmic microwave background, the statistical distribution of galaxies through the universe, and X-ray detections of galaxy clusters) could be explained if neutrinos were heavier than previously assumed.

If the neutrino has a large mass, then several major datasets, including the Cosmic Microwave Background as observed by the WMAP spacecraft (above), can be reconciled with each other. But just because it is mathematically plausible doesn’t necessarily mean it’s true. Photo credit: NASA (public domain)

Statistically, these studies are quite compelling as they manage to reconcile apparently incompatible data, as well as addressing the key problem of the standard model by ascribing a mass to the neutrino.

But the results, despite tidying up loose ends, have not met with everyone’s approval. In particular, the UCL team think the evidence for neutrinos having such high masses is specious. They have recently published their thoughts on the subject.

They look at this apparent evidence for high-mass neutrinos from two different angles:

  • First, if the neutrino is indeed heavy, what would that imply about the universe around us (and does it fit with what we see)?
  • Second, if it is not, what else could explain the discrepancies in the data?

On both fronts, they think the evidence points firmly at a significantly lighter neutrino, in line with previous estimates.

***

Assuming that neutrinos have a relatively large mass fits experimental datasets quite well, but, Leistedt and Peiris argue, it is disproved by looking at a broader range of evidence.

The large-scale structure of the universe is dominated by gravity, much of it caused by an exotic (and invisible) type of matter known as dark matter. This governs the distribution, size, shape and motion of galaxies on unimaginably large scales, helping galaxies form, and clustering them together.

This map shows the location of galaxies within about 500 million light years of the Milky Way. Even on this scale – a tiny proportion of the universe – they are clearly bunched together in clusters and filaments. Photo credit: Richard Powell (CC-BY-SA 2.5)

A high mass for the neutrino would upset this balance, and in particular would inhibit the formation of galaxies like the one we live in, meaning there would be fewer visible in the sky. It would also mean a less-structured universe on the scale of galaxy clusters and galaxy filaments. This is because very light neutrinos travel very quickly and pass through galaxies and clusters without any significant interaction, leaving them to form under the action of gravity. If neutrinos had higher mass (and hence lower velocities), they would interact far more, diffusing and scattering through galaxies and clusters, changing the way they collapse. In effect they would smear out the distribution of galaxies.

The distribution of galaxies through space seen by astronomers is extremely uneven, with filaments and sheets of galaxies surrounding huge voids, forming a vast cosmic web that fills the known universe. But the web is not consistent with what we would see if the neutrino had a high mass: structure would be washed out, voids would be larger, filaments thinner and galaxy clusters smaller.

Clustering of galaxies, as seen in this Hubble picture, would be much less pronounced if the neutrino had a large mass. Photo credit: NASA, ESA, HST Frontier Fields (public domain)

Moreover, the properties of the cosmic microwave background (the afterglow of the Big Bang) also undermine the idea of a high-mass neutrino.

So, Peiris and Leistedt say, the models that propose a large mass for the neutrinos, which appear to fit the numbers quite well, turn out not to fit well at all with the individual data sets. The apparent agreement between them, they argue, is not much more than a statistical trick.

As an aside, some experiments have proposed higher-still masses for the neutrino, several times greater than even these controversial calculations. With neutrinos of that sort of mass, it is questionable whether galaxies would have been able to form at all and the universe would have been a dramatically different place.

***

Which brings us to the second question: if the discrepancies between different datasets can’t be explained by a heavier-than-expected neutrino, what does explain them?

Leistedt and Peiris think that this can be answered quite simply: actually most of the data broadly do agree with each other. It is the observations of how common galaxy clusters are in the universe which are out of line. And these are known to be the least robust of them all.

The unreliability, they argue, comes from multiple angles, including the difficulty of proving you have representative sets of galaxy data, the difficulty of knowing whether there is a selection bias in the data (e.g. large galaxy clusters being overrepresented), the difficulty of estimating cluster masses through gravitational lensing, and the extensive modelling required, which involves some degree of educated guesswork.

Gravitational lensing – the bending of the light from distant galaxies – can be used to estimate the mass of galaxy clusters (as the extent of the bending is directly proportional to the amount of mass present). Lensed light from distant galaxies is visible in this closeup of the Hubble picture above as streaks and arcs of light, most obviously the large diagonal streak of bluish light in the right-hand side of the image. But it is fraught with difficulties. Photo credit: NASA, ESA, HST frontier fields (public domain)

Strip this unreliable data out, and it is far from obvious that there even is an anomaly that needs to be explained – and the previous, lighter estimates of neutrino mass look far more plausible once more.

A crucial test of this assumption will come with the Dark Energy Survey, which will bring with it far more robust data on galaxy clusters, alongside measurements of the distribution and gravitational lensing of galaxies, which will cross-check this data. This should, hopefully, settle the controversy. The survey’s five year observing run began last year, with early data expected late in 2015. This early data should be enough to settle the issue, ahead of the final data release in 2017.

***

On one hand, Peiris and Leistedt’s refutation of the neutrino having a large mass seems to bring us back to square one. We still don’t have a terribly clear idea of what the neutrino’s mass is. We still have a hole in the standard model, because even if it is small, the neutrino does have mass. And we have just contradicted some research that appeared to reconcile some of the available facts.

But the practice of science is often like this, with bold predictions, competing claims and imperfect evidence.

And we’re not – quite – back where we started: a plausible theory has been ruled out, we now have a clear hypothesis about why the data had discrepancies, and there will soon be, in the form of the Dark Energy Survey, a tool to test this hypothesis.

The rival teams, at least for now, are sticking by their guns. Leistedt and Peiris think the Dark Energy Survey will prove them right.

Time will tell.

What’s the chance?

By Oli Usher, on 23 June 2014

Montage of two dreidels and a die. Photo: O. Usher (UCL MAPS)

What are the chances of a fair die landing on each of its faces? One in six, of course. But unfair dice are another matter entirely: nobody has ever come up with a complete mathematical explanation of their probabilities.

Two UCL PhD students from the UCL Department of Physics & Astronomy, George Pender and Martin Uhrin, have taken a step closer to this, coming up with a theory that can explain the behaviour of biased two-dimensional (four-sided) dice and similar objects such as dreidels (a type of spinning top).
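
To picture what those probabilities mean in practice, here is a short Monte Carlo sketch. It is purely illustrative: the weights for the biased die below are made up, whereas the point of Pender and Uhrin’s work is to derive such probabilities from the physics of the spinning object rather than assume them.

```python
# Illustrative Monte Carlo: empirical face frequencies for a fair and a biased
# four-sided die. The biased weights are invented, not taken from the research.
import random
from collections import Counter

def roll_frequencies(weights, n_rolls=100_000):
    """Roll a weighted four-sided die n_rolls times and return face frequencies."""
    faces = [1, 2, 3, 4]
    rolls = random.choices(faces, weights=weights, k=n_rolls)
    counts = Counter(rolls)
    return {face: round(counts[face] / n_rolls, 3) for face in faces}

random.seed(0)
print("Fair:  ", roll_frequencies([1, 1, 1, 1]))   # each face close to 0.25
print("Biased:", roll_frequencies([4, 3, 2, 1]))   # close to 0.4, 0.3, 0.2, 0.1
```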

This picture shows a fair spinning dreidel (top left), a biased one (top right) and a fair die.

Links

High resolution images