How NASA Broadcast Neil Armstrong Live from the Moon
Apollo 11’s lunar landing, and specifically Neil Armstrong’s first steps on the Moon, was arguably the biggest television event of the 20th century. Knowing the impact a live broadcast would have on the world, Deke Slayton went so far as to push NASA to include an erectable antenna on the LM so Armstrong and Buzz Aldrin wouldn’t have to wait for a tracking station to come within range before stepping outside. NASA’s live broadcast of Apollo 11’s landing was nearly a decade in the making, and it required some stunning feats of engineering.
Communicating Through Deep Space
A lot of information has to pass between a spacecraft and its supporting ground crews on any mission, including but not limited to telemetry, computer upload data, and voice communications. As early as 1962, NASA realized that the Apollo missions would demand a unique communications system. The Mercury and Gemini programs, both of which flew missions only in Earth orbit, used separate radio systems: two-way voice communications, uplinked data, and downlinked telemetry traveled over ultra high frequency (UHF) and very high frequency (VHF) systems, while tracking was handled by a C-band beacon on the spacecraft interrogated by ground-based radar. That arrangement worked on simpler missions, but Apollo would be going much farther than Earth orbit. With three men working in two spacecraft that would operate simultaneously and send down live television images, NASA needed a new way to uplink and downlink far more data.
The solution was called Unified S-band, or USB. It combined tracking, ranging, command, voice, and television data on a single S-band link through a single antenna. Voice and biomedical data were transmitted on a 1.25 MHz FM subcarrier, telemetry rode on a 1.024 MHz bi-phase modulated subcarrier, and ranging used a pseudo-random code carried on each spacecraft’s phase-modulated S-band downlink: 2287.5 MHz for the command and service module (CSM) and 2282.5 MHz for the lunar module (LM). In short, every type of information traveling between the ground and a Moon-bound spacecraft had its place. Except for the television broadcast.
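The downlink plan described above can be summarized in a short sketch. The structure below is purely illustrative (the dictionary and helper names are invented for this example); the frequencies are the ones given in the text:

```python
# Illustrative summary of the Unified S-band downlink plan. The names here
# are invented for this sketch; the figures come from the text above.

USB_DOWNLINK_MHZ = {
    "CSM": 2287.5,  # command/service module phase-modulated S-band carrier
    "LM": 2282.5,   # lunar module phase-modulated S-band carrier
}

SUBCARRIERS_MHZ = {
    "voice_biomed": 1.25,  # FM subcarrier carrying voice and biomedical data
    "telemetry": 1.024,    # bi-phase modulated telemetry subcarrier
}

def downlink_carrier(spacecraft: str) -> float:
    """Return the S-band downlink carrier frequency (MHz) for a spacecraft."""
    return USB_DOWNLINK_MHZ[spacecraft]
```

Looking up `downlink_carrier("LM")`, for instance, gives the 2282.5 MHz carrier on which the lunar module’s subcarriers rode.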
Stan Lebar and the Apollo Cameras
To free up space for a television downlink from the lunar module, NASA removed the ranging code and changed the modulation from phase to frequency. This freed up 700 kHz of bandwidth for a television downlink on the USB signal. The problem was that this wasn’t enough bandwidth for the standard video camera of the day that transmitted 525 scan lines of data at 30 frames per second at 5 MHz. Instead, NASA would need a slow-scan camera optimized for a smaller format, 320 scan lines of data at 10 frames per second that could be transmitted at just 500 kHz.
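Those numbers can be sanity-checked with a rough scaling argument. If horizontal resolution tracks the line count, required video bandwidth scales roughly as lines² × frame rate; this model and the function below are a back-of-envelope illustration, not a NASA figure:

```python
def video_bandwidth_ratio(lines_a, fps_a, lines_b, fps_b):
    """Rough bandwidth ratio between two raster video formats.

    Assumes horizontal resolution scales with the line count, so the
    required bandwidth goes as lines**2 * frame_rate. This is a
    back-of-envelope model only, not an engineering specification.
    """
    return (lines_a ** 2 * fps_a) / (lines_b ** 2 * fps_b)

# Broadcast 525 lines at 30 fps vs. slow-scan 320 lines at 10 fps:
ratio = video_bandwidth_ratio(525, 30, 320, 10)  # roughly 8.1
```

An eightfold reduction is the same order as the drop from the 5 MHz broadcast signal to the 500 kHz slow-scan downlink, which is why the slower, coarser format could fit in the freed-up bandwidth.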
With the guidelines for the camera set, NASA awarded two contracts. One went to RCA for the command module camera. Another went to Westinghouse Electric’s Aerospace Division for the lunar module camera.
From the Moon to the Family Room
The Westinghouse slow-scan Lunar Camera was designed by Stan Lebar, Program Manager of the Apollo TV Lunar Camera. It was a small, lightweight camera designed to withstand the punishing forces of launch, the subsequent sudden weightlessness, and the striking temperature differences in space. It was also simple and maneuverable enough for astronauts to use it with their bulky gloves on.
The surface camera also carried a key piece of classified technology. It would have to capture a clear image in spite of the high contrast between the bright lunar surface and the atmosphereless black sky, and Westinghouse had the answer. The company had developed a special low-light television imaging tube for the Department of Defense to use in a jungle surveillance camera during the Vietnam War, one that could find a downed pilot at night. The key was a sensitive image tube combining a variable-gain light intensifier with a secondary electron conduction (SEC) target. That SEC tube could reproduce objects in motion at low light levels without smearing the image. The DOD allowed NASA to use the top-secret technology in its lunar surface camera, though it’s likely few people who worked on the project for the space agency knew they were handling sensitive technology.
[Image: Armstrong training for the Moon]
It was this camera that captured Armstrong’s first steps on the Moon. The camera was stowed in the Modularized Equipment Stowage Assembly (MESA) in the LM’s descent stage, in the fourth storage area to the left of the LM’s ladder. The MESA released when Armstrong, standing on the lunar module’s porch, pulled a lanyard, allowing it to unfold. Though the assembly was covered with a thermal blanket, the camera’s lens poked through a hole so it could see everything going on. Inside the LM cabin, Buzz Aldrin closed a circuit breaker that turned the camera on, allowing it to capture Armstrong’s climb down the ladder and first steps on the Moon.
The signal was sent from the LM’s antenna to the tracking stations at Goldstone in California, Honeysuckle Creek near Canberra, and the Parkes Radio Observatory in New South Wales, Australia. NASA used a scan converter to adapt the image to the broadcast-standard format of 525 scan lines at the higher 30 fps rate. The tracking stations then relayed the signals over microwave links, Intelsat communications satellites, and AT&T landlines to Mission Control in Houston, from which they were broadcast to the world. The conversion process left the image significantly degraded, but it was still live footage of man’s first steps on the Moon.
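The conversion step can be sketched in code as frame repetition plus line resampling. This toy function illustrates the principle only; the converters NASA actually used reportedly worked optically, pointing a conventional broadcast camera at a monitor displaying the slow-scan picture:

```python
def convert_slow_scan(frames, src_lines=320, dst_lines=525,
                      src_fps=10, dst_fps=30):
    """Toy slow-scan-to-broadcast converter (an illustration, not NASA's).

    Repeats each incoming frame to raise the frame rate (10 -> 30 fps)
    and duplicates scan lines (nearest-neighbour) to stretch 320 lines
    to 525. Each frame is represented as a list of scan lines.
    """
    repeat = dst_fps // src_fps  # 3 broadcast frames per slow-scan frame
    converted = []
    for frame in frames:
        # Nearest-neighbour resampling from src_lines up to dst_lines.
        stretched = [frame[i * src_lines // dst_lines]
                     for i in range(dst_lines)]
        converted.extend([stretched] * repeat)
    return converted
```

Feeding in one 320-line frame yields three identical 525-line frames; duplicated lines and repeated frames are one reason the converted picture looked soft next to a native broadcast image.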