TIME this week launched TIME Immersive, a new iPhone and Android app that we’ll use to deliver groundbreaking augmented reality and virtual reality experiences. First up: the TIME Moon Landing experience, the world’s most accurate 3D re-creation of the Apollo 11 mission, which took place 50 years ago this month. Users can watch an approximately five-minute AR simulation of the Apollo 11 landing, narrated by TIME’s Jeffrey Kluger and featuring original NASA audio from the mission, then explore the surface of the moon on their own.

What makes the TIME Moon Landing hyper-accurate? At the experience’s core lies incredibly precise data meticulously collected over the last 20 years by John Knoll, the chief creative officer and visual effects supervisor at Industrial Light & Magic, a top Hollywood special effects company founded by George Lucas.

“I’m old enough to remember seeing the Apollo 11 landing live as a kid,” says Knoll, who gave his data to TIME. “That really left a big impression on me. In the years that followed, I was always fascinated with the space program.”

Knoll began collecting Apollo 11 landing data after stumbling upon a transcript of radio calls between the spacecraft and mission control. Those transcripts, he says, underscored the harrowing few minutes just before the “Eagle” lander touched down on the lunar surface, when it was running dangerously low on fuel. That moment, says Knoll, was largely glossed over in the Apollo 11 documentaries of his youth. “In reading the timestamped transcripts, this is white-knuckle time,” he says.

Knoll’s commitment to accuracy came in part from his disappointment with some Hollywood directors who pay lip service to scientific precision but abandon it in favor of what they or the studios believe is better storytelling. “I was very committed to making the re-creation as technically accurate as I could make it, in getting everything right about the motion of the spacecraft, the lighting conditions, the lunar terrain, where individual rocks and craters were,” says Knoll. “And to figure out if there were clever or sneaky ways to extract data from unlikely sources.”

To that end, Knoll relied on a handful of data sources, including NASA telemetry graphs, footage from a descent camera on the lunar module (LEM), and data from the Lunar Reconnaissance Orbiter (LRO), a probe orbiting the moon that was launched in 2009. He made up for shortcomings in the data with advanced computer vision techniques, including one process whereby the altitude of moon surface features can be estimated based on how bright or dark they appear in photographs.

“When you look at a photograph of the moon, and you see all that light and shadow, what you’re seeing is the orientation of the surface relative to the sun,” says Knoll. “If a surface is brighter, it’s because it’s inclined more towards the illuminance, and if it’s darker, it’s because it’s inclined more away. If you start on one end of an image, and if a surface is lighter than the average then it’s inclined up, so you accumulate the altitude, and if it’s darker, it’s declined, and so you decrement the altitude. By doing that, you can integrate an approximation of the terrain.”
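The process Knoll describes is a simple form of photoclinometry, or shape-from-shading. A minimal one-dimensional sketch of that idea follows; the function name and sample brightness values are illustrative, not taken from Knoll's actual pipeline:

```python
def integrate_terrain(brightness, step=1.0):
    """Estimate a relative altitude profile along one image row.

    Walks the row from one end to the other: pixels brighter than the
    row's average are treated as slopes inclined toward the sun, so the
    running altitude accumulates; darker pixels are treated as slopes
    inclined away, so it decrements. Units are arbitrary.
    """
    avg = sum(brightness) / len(brightness)
    altitude = 0.0
    profile = []
    for b in brightness:
        # Brighter than average -> slope up; darker -> slope down.
        altitude += (b - avg) * step
        profile.append(altitude)
    return profile

# A sunlit rise followed by a shadowed slope: the profile climbs,
# then descends back toward its starting level.
row = [0.5, 0.8, 0.9, 0.5, 0.2, 0.1, 0.5]
print(integrate_terrain(row))
```

A real implementation would integrate in two dimensions, correct for the sun's elevation angle, and anchor the relative profile to known altitudes (such as LRO laser altimetry), but the accumulate-and-decrement core is the same.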

Knoll hopes that the experience helps people better understand and take pride in the complexity of the Apollo project.

“I’m a big champion of science education, and people really understanding what we achieved,” says Knoll. “Those Apollo missions were great and amazing, and especially in these very divisive times, everyone regardless of their political affiliation can look back with some pride and look back at the accomplishment.”

The TIME Moon Landing experience was co-produced by TIME, John Knoll, the Smithsonian’s National Air and Space Museum and Smithsonian’s Digitization Program Office, Trigger, RYOT, and the Yahoo News XR Program. It is available within the TIME Immersive app, which you can download for iPhone in Apple’s App Store, or for Android in the Google Play Store. Look out for more TIME Immersive projects in the near future.






credits: https://time.com/5626529/apollo-11-time-app/

NASA is using HoloLens AR headsets to build its new spacecraft faster

“Just about every time, we are building something for the first time,” says Brian O’Connor, the vice president of production operations at Lockheed Martin Space.

Traditionally, aerospace organizations have relied on thousand-page paper manuals to relay instructions to their workers. In recent years, firms like Boeing and Airbus have started experimenting with augmented reality, but it’s rarely progressed beyond the testing phase. At Lockheed, at least, that’s changing. The firm’s employees are now using AR to do their jobs every single day.

This piece first appeared in our twice-weekly newsletter, Clocking In, which covers how technology is transforming the future of work.

Spacecraft technician Decker Jory uses a Microsoft HoloLens headset on a daily basis for his work on Orion, the spacecraft intended to one day sit atop the powerful—and repeatedly delayed—NASA Space Launch System. “At the start of the day, I put on the device to get accustomed to what we will be doing in the morning,” says Jory. He takes the headset off when he is ready to start drilling. For now, the longest he can wear it without it getting uncomfortable or too heavy is about three hours. So he and his team of assemblers use it to learn a task or check the directions in 15-minute increments rather than for a constant feed of instructions.

[Photo: an augmented reality view of a technician working on machinery]


In the headset, the workers can see holograms displaying models that are created through engineering design software from Scope AR. Models of parts and labels are overlaid on already assembled pieces of spacecraft. Information like torquing instructions—how to twist things—can be displayed right on top of the holes to which they are relevant, and workers can see what the finished product will look like.

The virtual models around the workers are even color-coded to the role of the person using the headset. For Jory’s team, which is currently constructing the heat shield skeleton of Orion, the new technology takes the place of a 1,500-page binder full of written work instructions.

Lockheed is expanding its use of augmented reality after seeing some dramatic effects during testing. Technicians needed far less time to get familiar with and prepare for a new task or to understand and perform processes like drilling holes and twisting fasteners.

[Photo: an augmented reality view of a technician working on machinery]


These results are prompting the organization to expand its ambitions for the headsets: one day it hopes to use them in space. Lockheed Martin’s head of emerging technologies, Shelley Peterson, says the way workers use the headsets back here on Earth gives insight into how augmented reality could help astronauts maintain the spacecraft the firm helped build. “What we want astronauts to be able to do is have maintenance capability that’s much more intuitive than going through text or drawing content,” says Peterson.

For now, these headsets still need some adjustments to increase their wearability and ease of use before they can be used in space. Creating the content the workers see is getting easier, but it still takes a lot of effort. O’Connor sees these as obstacles that can be overcome quickly, though.

“If you were to look five years down the road, I don’t think you will find an efficient manufacturing operation that doesn’t have this type of augmented reality to assist the operators,” he says.




source: https://www.technologyreview.com/s/612247/nasa-is-using-hololens-ar-headsets-to-build-its-new-spacecraft-faster/