First ever images of black hole: How this landmark achievement came about.

Apart from the basic science insights EHT has already produced, the technologies and algorithms developed for this research are likely to find many other applications

The release of the first ever images of a black hole on April 10 marked the culmination of an enormous collaborative effort that lasted several years. It involved a pool of 200-odd scientists spread across 13 institutions and drawn from multiple disciplines. In some ways, the most impressive contributions came from the mathematicians and computer scientists who patched the data together to create the images.


This was a very impressive feat. The black hole in question is 55 million light years away and, huge as it is, spans only a minuscule patch of the sky. “Zooming” in on it required a telescope powerful enough to read a newspaper on a Paris newsstand while sitting in New York. A single telescope that powerful (capable of resolving features just 20 micro-arcseconds across) would have to be about the size of the Earth itself.
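That resolution figure can be sanity-checked with the standard diffraction-limit rule of thumb, resolution ≈ wavelength / aperture size. The wavelength and baseline values in this sketch are illustrative assumptions (the EHT observes millimetre radio waves, and an Earth-sized "dish" has a baseline of roughly Earth's diameter), not numbers taken from the article:

```python
import math

# Rough diffraction limit: angular resolution ~ wavelength / baseline.
# Both values below are assumptions for illustration.
wavelength_m = 1.3e-3   # millimetre-band observing wavelength, metres
baseline_m = 1.27e7     # roughly Earth's diameter, metres

theta_rad = wavelength_m / baseline_m              # resolution in radians
uas_per_rad = 180 / math.pi * 3600 * 1e6           # radians -> micro-arcseconds
theta_uas = theta_rad * uas_per_rad

print(f"{theta_uas:.0f} micro-arcseconds")         # on the order of 20 micro-arcseconds
```

An Earth-sized baseline at millimetre wavelengths lands right at the roughly 20 micro-arcsecond scale the article quotes, which is why the EHT needed telescopes spread across the whole planet.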

The Event Horizon Telescope (EHT) actually consists of eight radio telescopes spread across four continents, including Antarctica. The method of combining data from widely separated telescopes to create a single image is called Very Long Baseline Interferometry (VLBI). The data was synchronised using atomic clocks. Even VLBI could not produce complete images of an object that small and distant. It was like a jigsaw puzzle with many pieces missing, and very clever algorithms were required to fill in the missing details.

The measurements derived by EHT tell us that the target, the black hole at the centre of the Virgo A galaxy (also known as M87 or NGC 4486), is a “supermassive” black hole. It has a mass some 6.5 billion times that of the Sun and a diameter of roughly 40 billion km, several times the size of Neptune's orbit around the Sun. This is the black disc in the centre of the image. The orange ring around the black disc is light from superhot matter travelling at great speed just outside the event horizon.
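The quoted size follows directly from the mass via the Schwarzschild radius formula, r_s = 2GM/c². This is just a quick sanity check of that arithmetic, not part of the EHT analysis:

```python
# Sanity check: size of a black hole of 6.5 billion solar masses.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # mass of the Sun, kg

M = 6.5e9 * M_sun                 # mass reported for M87's black hole
r_s = 2 * G * M / c ** 2          # Schwarzschild radius, metres

diameter_billion_km = 2 * r_s / 1e3 / 1e9   # metres -> km -> billions of km
print(f"diameter ~ {diameter_billion_km:.0f} billion km")
```

The result comes out close to the roughly 40 billion km figure, many times wider than Neptune's orbit (about 9 billion km across).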

When a star runs out of fuel for its nuclear reactions, the matter within collapses and is tightly packed together. It may become a black hole: an object so massive and dense that not even light can escape its gravitational pull. It is, literally, invisible. As black holes swallow more matter, they become more massive. The boundary beyond which the gravitational pull becomes too strong for light to escape is called the “event horizon”.

Black holes were predicted by the General Theory of Relativity, postulated by Albert Einstein in 1915. Think of space as a trampoline, with the stars like heavy balls rolling on it. The trampoline will bend near the balls, creating “gravity wells” that attract other objects. Gravity warps space and time in the same way. A sufficiently massive object creates a gravity well that is impossible to escape.

Various scientists have made other predictions about black holes. Subrahmanyan Chandrasekhar calculated the minimum mass a star must have to become a black hole. Other predictions, about black hole radiation and the conservation of information, have been made by Stephen Hawking and Roger Penrose.

Black holes are usually detected by inference, when astronomers observe the movements of stars attracted by them. LIGO (the Laser Interferometer Gravitational-Wave Observatory) was the first to directly detect black holes, identifying a black hole merger via disturbances in gravitational waves in 2016. That was, in itself, a validation of the General Theory of Relativity. The EHT shows that the event horizon matches Einstein's predictions closely, which is yet another validation.

In April 2017, after long preparation, the EHT array of radio telescopes collected data for a week on two specific supermassive black holes. They captured carefully synchronised measurements of radio waves at a wavelength of about 1.3 mm (a frequency of around 230 GHz). These data were then stitched together, and the missing bits inferred, to create the image.

This very challenging task took two years. Four different teams of computer scientists began developing the requisite algorithms and assembling the hardware back in 2015. There were over 5,000 terabytes (5 billion megabytes) of data, which had to be physically transported on special hard disks from the telescopes to the computing centres.
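The hard-disk logistics make sense once you estimate how long a network transfer of that much data would take. The 1 Gbit/s link speed below is a hypothetical figure chosen purely for illustration; only the data volume comes from the article:

```python
# Why the EHT data travelled on hard disks rather than over the internet.
data_bytes = 5_000e12           # ~5,000 terabytes of recorded data
link_bits_per_s = 1e9           # hypothetical 1 Gbit/s network link

seconds = data_bytes * 8 / link_bits_per_s   # bytes -> bits, then divide by rate
days = seconds / 86_400
print(f"{days:.0f} days")                    # over a year at this assumed speed
```

At that assumed speed the transfer would take well over a year, so flying disks from remote sites (including the South Pole, where shipments wait for the Antarctic summer) was the faster option.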

Creating the images meant synchronising the data, and filling in blanks through algorithms running on super computers. Four different teams working independently helped ensure that possible biases were eliminated in image processing.

Dr Katie Bouman of MIT’s Haystack Observatory explains, “We didn’t know what a black hole looked like. So we used commonly available images to build algorithms. It’s like asking forensic artists around the world to draw a face from the same description. If you get similar faces, you know that you have eliminated bias.”

The EHT also captured data from Sagittarius A*, a supermassive black hole at the centre of our own galaxy. The data was more blurred since this is closer and “moves” more relative to Earth. It has not yet been deciphered. The EHT is not only processing that data; it is adding more telescopes to the array. It will probe black holes in two different frequencies at the next stage.

This will add more information, which should give more insights into how black holes behave. Apart from the basic science insights EHT has already produced, the technologies and the algorithms developed for this research are likely to find many other applications. (Source: The Business Standard)