On Wednesday, astronomers announced that they had taken the first-ever picture of a black hole – and the internet could not handle it. No, this isn't about black hole Shrek memes or hot takes on how the image of an object 55 million light-years away was "so blurry." We mean that the internet literally could not handle the amount of data collected by the eight telescopes on five continents that make up the Event Horizon Telescope experiment, which captured this image of the black hole at the center of the galaxy Messier 87.
Instead, the huge amount of data collected by the radio antennas had to be flown on aircraft to central data centers where it could be cleaned and analyzed. The image of M87's black hole was not only a massive achievement of human ingenuity and understanding, confirming several theories about black holes, but also a Herculean feat of data storage and management.
"We had 5 petabytes of data recorded over seven days in April 2017," Dan Marrone, Ph.D., a professor of astronomy at the University of Arizona who specializes in data storage for the US side of the EHT experiment, told reporters on Wednesday.
"It amounts to more than half a ton of hard drives. Five Petabytes is a Lot of data: equivalent to 5,000 years of MP3 files. "
Here's why and how this image required the data equivalent of 1.39 billion copies of Lil Nas X's "Old Town Road."
Eight Synchronized Telescopes
The EHT experiment used a technique called very long baseline interferometry, in which the eight telescopes, observing simultaneously, essentially transformed the Earth into a single, rotating telescope dish. Each of these telescopes recorded the raw incoming radio signals as tons of data.
In other words, it's as if eight people took videos of the same distant phenomenon from different angles, then combined all their footage to produce one much clearer video. In this scenario, though, the object was extremely far away and the telescopes were very far apart.
The advantage of this long baseline between the telescopes is that the Earth's rotation let the eight stations view the black hole from many different angles as the planet turned.
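The payoff of that Earth-sized baseline follows from the standard diffraction limit, θ ≈ λ/D. A rough sketch, assuming the EHT's ~1.3 mm observing wavelength and the Earth's diameter as the baseline:

```python
import math

wavelength = 1.3e-3  # meters (~230 GHz, the EHT's observing band)
baseline = 1.27e7    # meters, roughly the diameter of the Earth

theta_rad = wavelength / baseline                 # diffraction-limited resolution
theta_uas = math.degrees(theta_rad) * 3600 * 1e6  # convert radians to microarcseconds
print(f"~{theta_uas:.0f} microarcseconds")  # ~21 microarcseconds
```

That is sharp enough to resolve the roughly 40-microarcsecond shadow of M87's black hole, something no single dish on Earth could manage.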
Cleaning Up the Data
After all 1,000 pounds of hard drives were filled with these 5 petabytes of raw data, they were loaded onto planes and flown to two centralized "correlators" in Massachusetts and Germany.
"The fastest way to do this is not over the Internet, it's actually getting them on planes," Marrone said. "There is no internet that can compete with 5 petabytes of data on an airplane."
In addition to this challenge, scientists had to wait until the summer to send the disks from the South Pole telescope, as the images were taken during the Antarctic winter.
The correlators then began to synchronize all the data from the telescopes. That means supercomputers took the raw observational data gathered by each telescope and used atomic clock information to stitch it together, creating a seamless record of the wavefront of the black hole's light as it arrived at Earth.
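The heart of what a correlator does can be shown in miniature: given two noisy recordings of the same signal made with an unknown relative delay, cross-correlation finds the lag at which they line up. A toy sketch with synthetic data (nothing here is the actual EHT pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.standard_normal(1000)  # stand-in for the shared radio wavefront

true_delay = 37  # samples of offset between the two stations' recordings
station_a = signal + 0.1 * rng.standard_normal(1000)
station_b = np.roll(signal, true_delay) + 0.1 * rng.standard_normal(1000)

# Correlate at every circular lag; the peak marks the hidden delay
lags = [float(np.dot(station_a, np.roll(station_b, -k))) for k in range(1000)]
print(int(np.argmax(lags)))  # 37
```

In the real experiment, hydrogen maser atomic clocks at each station timestamp the samples, so the correlators know roughly where to look for this alignment.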
Trading Tools with Silicon Valley
Chi-Kwan Chan, Ph.D., a computational astrophysicist at the University of Arizona who worked on the M87 imaging project, tells Inverse that once the correlators had cleaned the data, the task became much more granular.
"After this step, people usually only use one job and do the calculations on them," he says. "However, my contribution was to bring cloud computing technology into collaboration so that we can start and accelerate many powerful virtual machines in the cloud for data analysis."
Chan and his team developed software that helped the EHT team further clean up the data to create the final composite image, which is only a few hundred kilobytes. He hopes the technology industry can use this software for network architecture in the future.
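Chan's approach of fanning independent analysis jobs out to many workers instead of a single machine can be sketched with Python's standard library, using a thread pool as a local stand-in for cloud virtual machines (analyze_chunk is a hypothetical placeholder, not EHT code):

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_chunk(chunk_id: int) -> int:
    """Hypothetical placeholder for one unit of data analysis."""
    return sum(i * i for i in range(1000 * (chunk_id + 1)))

# Hand 8 independent chunks to a pool of workers, the way the EHT
# collaboration handed analysis jobs to virtual machines in the cloud
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(analyze_chunk, range(8)))

print(len(results))  # 8
```

The key property, as in the cloud setup, is that the chunks are independent: any worker can take any job, so adding more workers shortens the wall-clock time.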
"In this sense, we also give something back to society," he says.
Notably, the University of Arizona computers on which Chan and his team performed black hole simulations are based on graphics processing units, which are computationally very powerful. These are the same graphics cards that are in extreme demand because they're popular with cryptocurrency miners. Just as the black hole project developed software that others can use, it also borrowed ideas from a completely different corner of computer science, all in the name of discovery.
Ironically, Chan's team used these powerful GPUs to simulate so many black holes before the M87 observation that they already knew what to expect from the real one.
"We've created a huge library of black hole images," he says. "Because we saw so many of them and saw so many possibilities, we were not surprised when we saw the real ones."