Recovering SpaceX’s Falcon 9 Ocean Landing Video – How it was done
When SpaceX launched its Falcon 9 v1.1 rocket last April, most eyes were on the vehicle’s passenger, the CRS-3 Dragon spacecraft, bound for another resupply mission to the ISS. However, for many SpaceX followers, a hugely interesting aspect of the mission was its secondary objective – to bring the first stage back from space to a soft splashdown in the ocean after stage separation. While this goal was achieved, the video footage suffered from heavy interference, leading to a huge crowdsourcing effort to restore the historic imagery.
First Stage Return:
Since the debut of the Falcon 9 v1.1, SpaceX has worked on its reusability aspirations, ranging from boostback tests of its first stages through to ocean splashdowns – albeit with lessons learned, not least on controllability as the stage heads back to Earth under large aerodynamic stresses.
An actual soft splashdown was achieved after the F9 successfully sent the Dragon spacecraft on her way to the ISS, with numerous firsts including the successful deployment of the four legs attached to the aft of the vehicle.
The landing attempt was to be supported by a NASA P-3 Orion aircraft, which would have observed the stage, and a SpaceX-chartered recovery vessel to retrieve it from the ocean after landing.
Unfortunately, heavy seas in the landing area kept the recovery ship in port, and icing conditions kept the Orion on the ground.
SpaceX improvised, sending CEO Elon Musk’s private jet out to the landing area with a small satellite dish affixed behind a window. The plane kept a safe distance from the re-entering stage, but managed to retrieve telemetry confirming that the landing had been completed successfully.
A video stream of this historic event was also captured, but unfortunately heavy interference damaged it beyond recognition. A SpaceX-hired video recovery expert managed to extract two still frames, but judged the error rate too high to allow further recovery.
On April 28, SpaceX published both the original and the partially repaired video stream on their web site, and asked the world for help.
Taking on the Challenge:
The main response came from the NASASpaceFlight.com forum, with several skilled programmers and computer experts willing to take up the challenge. The first results came surprisingly quickly.
By the 4th of May, 12 of what ultimately turned out to be 15 keyframes had been extracted. None of them showed much by way of visuals, however, as heavy damage caused decoding to fail, resulting in mostly gray images.
This changed with the introduction of a special version of FFmpeg, a video decoding tool, modified for the project by one of its authors.
Rather than showing a gray image when an error was detected, this version continued to attempt to decode the image.
Applying it to the damaged frames resulted in a jumble of colored blocks, with tantalizing bits of detail scattered throughout.
Gaining good images out of this seemed possible, but would be a jigsaw puzzle of epic proportions.
Frames and Macroblocks:
To understand the challenge, it helps to know something about the MPEG-4 video compression used by SpaceX.
MPEG-4 encodes a video frame-by-frame, with each frame split into a grid of 16×16 pixel squares, known as macroblocks (MBs), which are stored in left-to-right, top-to-bottom order. For each macroblock, brightness and color information is separated, color detail is reduced, and then both are compressed and stored separately.
To further reduce the space needed, most information about a macroblock is not stored directly, but as a difference with respect to either its left or top neighbor.
While this greatly reduces the bandwidth required to transmit high-quality video, it also causes any error in a single macroblock to propagate through all subsequent macroblocks, and multiple independent errors to stack up.
As a result of this, repairing a damaged frame is like solving a jigsaw puzzle with damaged, paint-splattered pieces, whose remaining undamaged parts are scrambled until the piece is slotted into place correctly. A seemingly impossible task.
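The effect of this differential coding can be sketched with a toy example – plain Python, not actual MPEG-4, with made-up values for illustration. A row of macroblock brightness values is stored as deltas from the left neighbor; a single corrupted delta then shifts every block that follows it.

```python
def encode_row(dc_values):
    """Store each macroblock's brightness value as a delta from its
    left neighbor, as differential coding does."""
    deltas, prev = [], 128  # both sides start from a known mid-gray value
    for v in dc_values:
        deltas.append(v - prev)
        prev = v
    return deltas

def decode_row(deltas):
    """Rebuild the values by accumulating the deltas."""
    values, prev = [], 128
    for d in deltas:
        prev += d
        values.append(prev)
    return values

row = [130, 131, 133, 133, 140, 141]
deltas = encode_row(row)
assert decode_row(deltas) == row  # a clean stream round-trips exactly

deltas[1] += 50  # corrupt a single delta in transit...
# ...and every macroblock after it now decodes to the wrong value:
# [130, 181, 183, 183, 190, 191]
```

One bad value taints the rest of the chain – exactly the propagation that turned each damaged frame into a puzzle.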
We Do This SpaceX Style:
The team set to work with the new tools, which allowed them to reset the decoding process by artificially inserting a plain gray macroblock, and to then restart at an arbitrary position in the file.
This kept errors from propagating, and let them piece together undamaged macroblocks even if earlier damaged macroblocks had not yet been repaired.
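The reset trick can be illustrated with a toy delta decoder (an illustrative sketch, not the modified FFmpeg code itself): forcing the predictor back to plain gray at a chosen position stops earlier damage from propagating past that point, so later blocks recover their relative structure even before the damage itself is repaired.

```python
def decode_with_reset(deltas, reset_at=None):
    """Toy decoder for values stored as differences from the left
    neighbor. At reset_at, the predictor is forced back to plain gray
    (128), mimicking the artificially inserted gray macroblock."""
    values, prev = [], 128
    for i, d in enumerate(deltas):
        if i == reset_at:
            prev = 128  # inserted gray block: prediction restarts here
        prev += d
        values.append(prev)
    return values

true_row = [130, 131, 133, 133, 140, 141]
deltas = [2, 1, 2, 0, 7, 1]   # delta-encoded form of true_row
deltas[1] += 50               # simulate transmission damage

garbled = decode_with_reset(deltas)             # damage taints every later block
fixed = decode_with_reset(deltas, reset_at=3)   # restart decoding at block 3
# After the reset, the *relative* brightness of blocks 3..5 matches the
# original again, even though the damaged block before it is still wrong.
```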
At launch, a spray of dirty water from the flame trench had left splatters of mud on the camera.
As these appeared in the same place in different frames, they could be used to place pieces of the image in the right location, which was especially helpful in the otherwise rather featureless top half of the frames.
Meanwhile, an easy-to-use web interface was built, so that people could help the repair without having to install special software on their computers. New extensions were created to patch up damaged macroblocks, rather than just ignoring them.
A wiki was added to document the results, which, less than two weeks in, were already significant.
There was still a lot of damage, but the images now showed that the rocket had successfully deployed its landing legs, fired its Merlin engine to slow the descent, and reached the water intact.
This result did not go unnoticed, with Elon Musk himself tweeting a link to the partially repaired video, and he would later comment on the effort at the Dragon v2 reveal.
“The data came through really well, but the video was corrupted because unfortunately when you compress video, it’s hard to uncorrupt a compressed video because you actually have to figure out the compression algorithms and all these things, so we weren’t able to get very far,” said Mr. Musk (video and transcript).
“We put the video up online and then we crowd sourced the clean-up of the video and people did a really great job of fixing it. Mostly the people on the NASASpaceFlight forum were able to fix the video.”
The repaired video began showing up in SpaceX presentations.
Another ongoing effort was the development of tools for automatically repairing frames by undoing damage at the level of individual bits.
Analysis of the repairs made to the transport stream had revealed that these occurred in particular patterns, and software was created to automatically try to undo potential damage according to these patterns.
Unfortunately, it turned out to be very difficult to automatically judge whether a change was an improvement, and combined with the sheer number of possible error patterns, the results were not as good as what could be achieved manually.
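The brute-force idea can be sketched as follows – a hypothetical scorer (here a simple checksum, standing in for "does the frame decode cleanly?") and a search over single-bit flips. Even this toy shows the difficulty: a weak scorer may accept a "repair" that passes the check yet differs from the original data.

```python
from itertools import combinations

def score(candidate):
    """Toy plausibility check standing in for 'does this decode cleanly?'.
    Here: payload bytes must sum to the final checksum byte."""
    *payload, checksum = candidate
    return sum(payload) % 256 == checksum

def try_bit_flips(data, max_flips=1):
    """Try undoing damage by flipping up to max_flips bits, keeping
    the first candidate the scorer accepts."""
    bits = len(data) * 8
    for n in range(1, max_flips + 1):
        for positions in combinations(range(bits), n):
            candidate = bytearray(data)
            for p in positions:
                candidate[p // 8] ^= 1 << (p % 8)
            if score(candidate):
                return bytes(candidate)
    return None

packet = bytearray([10, 20, 30, 60])  # checksum 60 = 10 + 20 + 30
packet[1] ^= 0x04                     # simulate a single flipped bit
assert not score(packet)
repaired = try_bit_flips(bytes(packet))
assert repaired is not None and score(repaired)
# Note: the accepted 'repair' is not necessarily the original data –
# with a weak scorer, several different candidates can pass, which is
# exactly why judging improvements automatically proved so hard.
```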
Fortunately, manual work continued as well, and it was becoming clear that some of the frames had damage beyond mere macroblock errors. Furthermore, there appeared to be frames missing, and so far only so-called I-frames had been looked at, while the video consisted mostly of P-frames.
I-frames, P-frames and the Transport Stream:
In MPEG-4, I-frames are essentially independent pictures. They consist of the collection of compressed macroblocks making up the picture, and some additional information needed to decode them. Even with the macroblocks compressed, an I-frame still requires a significant amount of space.
To further reduce the size of a video, most frames are therefore stored as P-frames rather than I-frames.
In a P-frame, most macroblocks do not contain image data, but instead point to the corresponding area in the previous decoded frame. The decoder then simply copies that area across.
If a completely new object appears, such as a whitecap or flames from the engine, the corresponding macroblocks will contain the actual data just like in an I-frame.
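A heavily simplified decoder illustrates the copy-versus-data distinction (real MPEG-4 P-frames also use motion vectors and residual corrections, omitted here; the frame contents are invented for illustration):

```python
def decode_p_frame(prev_frame, p_blocks):
    """Toy P-frame decoder: each macroblock either copies its position
    from the previous decoded frame, or carries fresh image data."""
    out = []
    for i, block in enumerate(p_blocks):
        if block is None:   # 'copy' block: reuse the area from the previous frame
            out.append(prev_frame[i])
        else:               # new content, e.g. a whitecap or engine flame
            out.append(block)
    return out

i_frame = ["sky", "sky", "sea", "sea"]
p_blocks = [None, None, "flame", None]  # only one block carries new data
assert decode_p_frame(i_frame, p_blocks) == ["sky", "sky", "flame", "sea"]
```

Because so little actual data is stored, a P-frame is typically far smaller than an I-frame – but it is only as good as the frame it copies from.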
Once the encoder has encoded all the frames, it puts them into a transport stream, which is used to combine audio, video, and other information into a single file.
The landing video only contains video, but damage to the transport stream still caused frames to decode incorrectly, or even disappear completely.
Many hours were spent creating purpose-built software, and using it to clean up the transport stream. The hard work paid off though, resulting in a complete set of frames.
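A clean-up pass over a transport stream might begin with a scan like the following sketch (not the team's actual tool). It relies on two real properties of the MPEG transport stream format: packets are a fixed 188 bytes beginning with sync byte 0x47, and each sub-stream (PID) carries a 4-bit continuity counter whose gaps reveal lost or mangled packets. The toy ignores some details, such as the counter only advancing for packets that carry payload.

```python
SYNC = 0x47  # every MPEG transport stream packet starts with this byte
PKT = 188    # fixed packet length in bytes

def scan_ts(data):
    """Walk a transport stream, reporting packets whose sync byte was
    damaged and gaps in each PID's continuity counter."""
    bad_sync, cc_gaps = [], []
    last_cc = {}
    for off in range(0, len(data) - PKT + 1, PKT):
        pkt = data[off:off + PKT]
        if pkt[0] != SYNC:
            bad_sync.append(off)        # sync byte destroyed by interference
            continue
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]  # 13-bit stream identifier
        cc = pkt[3] & 0x0F                     # 4-bit continuity counter
        if pid in last_cc and cc != (last_cc[pid] + 1) % 16:
            cc_gaps.append((off, pid))  # a packet went missing here
        last_cc[pid] = cc
    return bad_sync, cc_gaps

def make_pkt(pid, cc):
    """Build a minimal dummy packet for demonstration."""
    pkt = bytearray(PKT)
    pkt[0] = SYNC
    pkt[1] = (pid >> 8) & 0x1F
    pkt[2] = pid & 0xFF
    pkt[3] = 0x10 | cc  # payload-only flag, counter in the low nibble
    return pkt

stream = make_pkt(0x100, 0) + make_pkt(0x100, 1) + make_pkt(0x100, 3)
stream[188] = 0x00  # damage the second packet's sync byte
bad, gaps = scan_ts(stream)  # finds both the bad sync and the counter jump
```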
The Home Stretch:
It was now late May, and the project had been underway for about a month.
Team members continued to contribute macroblock repairs, now for both I-frames and P-frames, and the results were being processed and uploaded to YouTube automatically.
Damaged areas in I-frames were back-filled from previous frames.
The time counter in the bottom left of the video turned out to be fairly heavily damaged in most frames, most likely because such a high-detail region requires many bits to describe, which increases the chance of some of them being damaged.
Fortunately, the shapes of the numbers could be extracted from good frames, and the correct values were reconstructed from the internal MPEG-4 timestamps, making it possible to redraw this area directly.
As with the rest of the repair, the team proceeded very carefully, wanting to be as sure as possible that the restored timestamps matched what had been sent by the rocket originally.
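Deriving the counter values from the stream's timing data is straightforward arithmetic, since MPEG presentation timestamps tick at 90 kHz. A sketch of the conversion follows; the counter format shown (MM:SS.mmm) is an assumption for illustration, not necessarily what the rocket's overlay used.

```python
def pts_to_timer(pts, clock=90_000):
    """Convert an MPEG presentation timestamp (90 kHz clock ticks)
    into an assumed MM:SS.mmm counter string."""
    seconds = pts / clock
    minutes, secs = divmod(seconds, 60)
    return f"{int(minutes):02d}:{secs:06.3f}"

# 125.5 seconds into the stream:
assert pts_to_timer(90_000 * 125 + 45_000) == "02:05.500"
```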
Work continued throughout early June, and by the 22nd, the team were ready to declare the work complete.
A short introduction sequence had been created, together with a trailer with credits, and these were glued onto the repaired video.
About two months after SpaceX had first published the damaged stream and two partially restored still frames, the restored video now clearly showed a controlled descent, leg deployment, and finally a soft touchdown on the surface of the Atlantic Ocean.
*Click here for the final version of the restored video*
*Click here to view the 1,780 post/365,000 read recovery effort thread*
(Images via the restoration process and SpaceX).