In-Line Water Filtration: Better Hygiene, Less Expense

Water sphere behavior experiments aboard the International Space Station. Image Credit: NASA

Water, essential to sustaining life on Earth, is even more highly prized in the unforgiving realm of space travel and habitation. With space shuttle cargo costing $10,000 per pound to launch, each 8.33-pound gallon of water costs more than $83,000 to put into orbit, making Chanel No. 5 seem a bargain at $25,000 per gallon. Likewise, ample water reserves for drinking, food preparation, and bathing would take up an inordinate amount of storage space and infrastructure, which are always at a premium on a vessel or station.
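As a back-of-the-envelope check, the per-gallon launch cost follows directly from the two figures cited above:

```python
# Shuttle-era launch cost of water, using only the article's figures:
# $10,000 per pound of cargo, water at 8.33 pounds per gallon.
COST_PER_POUND = 10_000      # USD per pound of shuttle cargo
POUNDS_PER_GALLON = 8.33     # weight of one gallon of water

cost_per_gallon = COST_PER_POUND * POUNDS_PER_GALLON
print(f"Launching one gallon of water: ${cost_per_gallon:,.0f}")
# More than three times the article's $25,000-per-gallon figure for Chanel No. 5.
```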

Water rationing and recycling are an essential part of daily life and operations on the space shuttles and International Space Station. In orbit, where Earth's natural life support system is missing, the International Space Station has to provide abundant power, clean water, and breathable air at the right temperature and humidity for the duration of human habitation and with virtually no waste. The Environmental Control and Life Support System (ECLSS), under continuing development at the Marshall Space Flight Center, helps astronauts use and reuse their precious supplies of water. Future work will explore air management, thermal control, and fire suppression -- in short, all of the things that will make human habitation in space comfortable and safe.

The DentaPure waterline purification cartridge sees use in 40 percent of dental schools in the United States and is lauded for offering remarkable filtration and significant cost savings.

The ECLSS Water Recycling System (WRS) reclaims wastewater from humans and lab animals in the form of breath condensate, urine, hygiene and washing water, and other wastewater streams. On Earth, biological wastewater is physically filtered by granular soil and purified as microbes in the soil break down urea, converting it to a form that plants can absorb and use to build new tissue. Wastewater also evaporates and returns as fresh rain water -- a natural form of distillation. WRS water purification machines on the ISS mimic these processes, though without microbes and at a much smaller scale.

A NASA industry partner, Umpqua Research Company, of Myrtle Creek, Oregon, supplier of the bacterial filters used in the life support backpacks worn by space-walking astronauts, helped develop air and water purification technologies for human missions in space. To prevent back-contamination of a drinking water supply by microorganisms, Umpqua developed the microbial check valve, a flow-through cartridge containing iodinated ion exchange resin. In addition to killing microbes on contact, the resin was found to impart a residual concentration of biocidal elemental iodine to the water.

Umpqua's valve and resin system was adopted by NASA as the preferred means of disinfecting drinking water aboard U.S. spacecraft, and canisters are now used on space shuttle missions, the ISS, and for ground-based testing of closed life support technology. Iodine was selected by NASA as the disinfectant of choice because of its lower vapor pressure and reduced propensity for formation of disinfection byproducts compared to chlorine or bromine.

MRLB International Inc., of Fergus Falls, Minnesota, used Umpqua's water purification technology in the design of the DentaPure waterline purification cartridge (Spinoff 1998). Installed in the line between the filter and high-speed dental tools and other instruments, it cleans and decontaminates the water, offers easy installation on all modern dental unit waterlines, and is replaced on a weekly cycle. The product, like its NASA forebear, furnished disinfected water and maintained water purity even with "suckback," an effect caused by imperfect anti-retraction valves in dental instruments, which draws blood, saliva, and other materials from a patient's mouth into the waterline.

Various models of DentaPure now address a variety of needs and are used in dental offices and schools across the country. The technology offers remarkable filtration: it is registered to provide water at 200 CFU/ml (colony-forming units per milliliter, a standard measure of microbial concentration), well below the Centers for Disease Control and Prevention (CDC) standard of 500 CFU/ml. Untreated lines can harbor in excess of 1,000,000 CFU/ml.
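The gap between those concentrations is easier to grasp as a log reduction. A minimal sketch using only the figures quoted above (the microbiology is, of course, more involved than simple division):

```python
import math

# Microbial concentrations from the article (colony-forming units per ml):
UNTREATED = 1_000_000   # untreated dental waterlines can exceed this
CDC_LIMIT = 500         # CDC standard for dental unit water
DENTAPURE = 200         # purity level the cartridge is registered to provide

# Log10 reduction from an untreated line to DentaPure-treated water:
log_reduction = math.log10(UNTREATED / DENTAPURE)
print(f"~{log_reduction:.1f}-log reduction")            # ~3.7-log reduction
print(f"{DENTAPURE / CDC_LIMIT:.0%} of the CDC limit")  # 40% of the CDC limit
```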

Better filtration, greater capacity, and longer service intervals have also led to great savings -- the University of Maryland Dental School estimates it saves $274,000 per year courtesy of DentaPure. The DentaPure system has proven so effective that 40 percent of dental schools nationwide employ it.

The investment in water filtration for space missions continues to pay huge dividends to users and society, year after year, in technologies so woven into our lives that we use them without even thinking about them.

DentaPure® is a registered trademark of MRLB International Inc.

> Read the entire story in Spinoff 2008

Cosmic Rays Hit Space Age High

Energetic iron nuclei counted by the Cosmic Ray Isotope Spectrometer on NASA's Advanced Composition Explorer (ACE) spacecraft reveal that cosmic ray levels have jumped 19% above the previous Space Age high.

Planning a trip to Mars? Take plenty of shielding. According to sensors on NASA's ACE (Advanced Composition Explorer) spacecraft, galactic cosmic rays have just hit a Space Age high.

"In 2009, cosmic ray intensities have increased 19% beyond anything we've seen in the past 50 years," says Richard Mewaldt of Caltech. "The increase is significant, and it could mean we need to re-think how much radiation shielding astronauts take with them on deep-space missions."

The cause of the surge is solar minimum, a deep lull in solar activity that began around 2007 and continues today. Researchers have long known that cosmic rays go up when solar activity goes down. Right now solar activity is as weak as it has been in modern times, setting the stage for what Mewaldt calls "a perfect storm of cosmic rays."

An artist's concept of the heliosphere, a magnetic bubble that partially protects the solar system from cosmic rays.

"We're experiencing the deepest solar minimum in nearly a century," says Dean Pesnell of the Goddard Space Flight Center, "so it is no surprise that cosmic rays are at record levels for the Space Age."

Galactic cosmic rays come from outside the solar system. They are subatomic particles--mainly protons but also some heavy nuclei--accelerated to almost light speed by distant supernova explosions. Cosmic rays cause "air showers" of secondary particles when they hit Earth's atmosphere; they pose a health hazard to astronauts; and a single cosmic ray can disable a satellite if it hits an unlucky integrated circuit.

The sun's magnetic field is our first line of defense against these highly-charged, energetic particles. The entire solar system from Mercury to Pluto and beyond is surrounded by a bubble of solar magnetism called "the heliosphere." It springs from the sun's inner magnetic dynamo and is inflated to gargantuan proportions by the solar wind. When a cosmic ray tries to enter the solar system, it must fight through the heliosphere's outer layers; and if it makes it inside, there is a thicket of magnetic fields waiting to scatter and deflect the intruder.

"At times of low solar activity, this natural shielding is weakened, and more cosmic rays are able to reach the inner solar system," explains Pesnell.

The heliospheric current sheet is shaped like a ballerina's skirt.

Mewaldt lists three aspects of the current solar minimum that are combining to create the perfect storm:
  1. The sun's magnetic field is weak. "There has been a sharp decline in the sun's interplanetary magnetic field (IMF) down to only 4 nanoTesla (nT) from typical values of 6 to 8 nT," he says. "This record-low IMF undoubtedly contributes to the record-high cosmic ray fluxes."
  2. The solar wind is flagging. "Measurements by the Ulysses spacecraft show that solar wind pressure is at a 50-year low," he continues, "so the magnetic bubble that protects the solar system is not being inflated as much as usual." A smaller bubble gives cosmic rays a shorter shot into the solar system. Once a cosmic ray enters the solar system, it must "swim upstream" against the solar wind. Solar wind speeds have dropped to very low levels in 2008 and 2009, making it easier than usual for a cosmic ray to proceed.
  3. The current sheet is flattening. Imagine the sun wearing a ballerina's skirt as wide as the entire solar system with an electrical current flowing along the wavy folds. That is the "heliospheric current sheet," a vast transition zone where the polarity of the sun's magnetic field changes from plus (north) to minus (south). The current sheet is important because cosmic rays tend to be guided by its folds. Lately, the current sheet has been flattening itself out, allowing cosmic rays more direct access to the inner solar system.
"If the flattening continues as it has in previous solar minima, we could see cosmic ray fluxes jump all the way to 30% above previous Space Age highs," predicts Mewaldt.

Earth is in no great peril from the extra cosmic rays. The planet's atmosphere and magnetic field combine to form a formidable shield against space radiation, protecting humans on the surface. Indeed, we've weathered storms much worse than this. Hundreds of years ago, cosmic ray fluxes were at least 200% higher than they are now. Researchers know this because when cosmic rays hit the atmosphere, they produce an isotope of beryllium, 10Be, which is preserved in polar ice. By examining ice cores, it is possible to estimate cosmic ray fluxes more than a thousand years into the past. Even with the recent surge, cosmic rays today are much weaker than they have been at times in the past millennium.

"The space era has so far experienced a time of relatively low cosmic ray activity," says Mewaldt. "We may now be returning to levels typical of past centuries."

MESSENGER Spacecraft Prepares for Final Pass by Mercury

NASA's Mercury Surface, Space Environment, Geochemistry, and Ranging spacecraft known as MESSENGER will fly by Mercury for the third and final time on Sept. 29. The spacecraft will pass less than 142 miles above the planet's rocky surface for a final gravity assist that will enable it to enter Mercury's orbit in 2011.

Determining the composition of Mercury's surface is a major goal of the orbital phase of the mission. The spacecraft already has imaged more than 90 percent of the planet's surface. The spacecraft's team will activate instruments during this flyby to view specific features to uncover more information about the planet.

"This flyby will be our last close look at the equatorial regions of Mercury, and it is our final planetary gravity assist, so it is important for the entire encounter to be executed as planned," said Sean Solomon, principal investigator at the Carnegie Institution in Washington. "As enticing as these flybys have been for discovering some of Mercury's secrets, they are the hors d'oeuvres to the mission's main course -- observing Mercury from orbit for an entire year."

The spacecraft may observe how the planet interacts with conditions in interplanetary space as a result of activity on the sun. During this encounter, high spectral- and high spatial-resolution measurements will be taken again of Mercury's tenuous atmosphere and tail.

"Scans of the planet's comet-like tail will provide important clues regarding the processes that maintain the atmosphere and tail," said Noam Izenberg, the instrument's scientist at the Johns Hopkins University Applied Physics Laboratory, or APL, in Laurel, Md. "The Mercury Atmospheric and Surface Composition Spectrometer will give us a snapshot of how the distribution of sodium and calcium vary with solar and planetary conditions. In addition, we will target the north and south polar regions for detailed observations and look for several new atmospheric constituents."

As the spacecraft approaches Mercury, cameras will photograph previously unseen terrain. As the spacecraft departs, it will take high-resolution images of the southern hemisphere. Scientists expect the spacecraft's imaging system to take more than 1,500 pictures. Those images will be used to create a mosaic to complement the high-resolution, northern-hemisphere mosaic obtained during the second Mercury flyby. The first flyby took the spacecraft over the eastern hemisphere in January 2008, and the second flyby took it over the western side in October 2008.

"We are going to collect high resolution, color images of scientifically interesting targets that we identified from the second flyby," said Ralph McNutt, a project scientist at APL. "The spectrometer also will make measurements of those targets at the same time."

Two spacecraft maneuvers will improve the ability of the spacecraft's Neutron Spectrometer to detect low-energy neutrons sensitive to the abundances of iron and titanium on Mercury's surface. These two elements absorb neutrons and are critical to an understanding of how the planet and its crust formed. A combination of day and night measurements will enable scientists to test the influence that planetary surface temperature has on the neutron population. The data are important for interpreting measurements that will be made after the probe is in orbit around Mercury.

An altimeter will make a topographic profile along the instrument ground track of Mercury's surface. The data gathered will provide additional topography of Mercury's surface features for ongoing studies of the form and structure of its craters and large faults. The information also will extend scientists' equatorial view of Mercury's global shape and allow them to confirm the discovery made during the first and second flyby that Mercury's equatorial region is slightly elliptical.

The spacecraft has completed nearly three-quarters of its 4.9-billion-mile journey to enter orbit around Mercury, a journey that includes more than 15 loops around the sun. In addition to flying by Mercury, the spacecraft flew past Earth in August 2005 and Venus in October 2006 and June 2007.

The project is the seventh in NASA's Discovery Program of low-cost, scientifically focused space missions. The spacecraft was designed and built by APL. The mission also is managed and operated by APL for NASA's Science Mission Directorate in Washington.

For more information about the mission, visit:

› NASA's MESSENGER Mission Page
› Information and briefing materials on MESSENGER's third flyby

Newest Astronauts Follow the Footsteps of Pioneers at Langley

The 14 members of the Astronaut Class of '09 get a lesson on the history and purpose of the Gantry during a Thursday visit to NASA Langley.

Jack Fischer flew F-22 Raptors at Langley Air Force Base but didn't have a clue about what was going on on the other side of the fence.

Reid Wiseman flew F-18 Super Hornets at Oceana and had an idea about some of the things done at NASA's Langley Research Center, but had never been to the facility.

"My daughter's really interested in NASA," he said. "She loves the Air and Space Center. I'm not sure whether she liked that best or the ride through the (Hampton Roads) tunnel."

Wiseman's daughter is three.

Fischer, Wiseman and the rest of the Astronaut Candidate Class of '09 -- the first such class in five years -- got an introduction to Langley on Thursday during a tour of the facility, and they were particularly awed while at the Gantry. Once in the class they instantly become part of NASA history and so are aware of those who went before them, beginning on April 9, 1959, when Alan Shepard, Gus Grissom, John Glenn, Walter Schirra, Scott Carpenter, Deke Slayton and Gordon Cooper were announced as the first American astronauts.

All of them trained at Langley.

"I wasn't alive for Apollo, but I know about it," Wiseman said, obviously impressed with the Gantry. "Everybody who's ever stepped on the moon trained here."

"I want to be in there," said Fischer after viewing a video of Neil Armstrong and others training on the facility.

2009 Astronaut Candidates visit NASA Langley

It's not likely, at least for a long while, but the candidates are young -- Fischer is 35, Wiseman 33 -- so "Who knows?" said Duane Ross, the manager of Astronaut Candidate Selection and Training at Johnson Space Center in Houston. "If they're around long enough, they could be part of Constellation."

For now, they're part of the astronaut crew that's being taught to work on the International Space Station. That training began with a bonding exercise in which they went through the Navy's Survival, Evasion, Resistance and Escape regimen in northern Maine.

From now until May 2011, they will receive 39 weeks of space station systems training, plus robotics training, extra-vehicular activity training and some flight time for those who aren't pilots. "Oh, and we're going to teach 'em Russian," Ross said.

The Russian lessons replace the 54 weeks once allowed for space shuttle training, an unnecessary skill with the shuttle being retired. The new language skills will prepare the astronauts to travel in the Russians' Soyuz to the International Space Station.

The nine members of the U.S. contingent in the class were chosen from 3,535 applicants.

"But that's just us," Ross said. "We have two Canadians and three Japanese. That's 14 people out of about 10,000."

On Thursday, the class was introduced to Langley's work in aeronautics, atmospheric science and flight test hardware, along with the National Transonic Facility, structure and materials and the NASA Engineering and Safety Center. Added to that was another walk down the agency's memory lane, to the Hangar, where the Rendezvous Docking Simulator hangs from the ceiling, a reminder of those who trained in it, and of Buzz Aldrin, who joined Armstrong on the moon during Apollo 11, and who used the facility in research for his doctoral thesis.

"That we're able to be a part of all of this is inspiring," said Fischer, an Air Force major who is giving up a pilot's first love -- flying the Raptor -- for "something bigger: being part of helping the entire world."

Wiseman, a lieutenant commander in the Navy, agreed.

"It's a new challenge," said Wiseman, who learned of his selection to the class while flying close-air support to NATO troops in Afghanistan, and who continued to fly sorties until the Navy could separate him for astronaut duty.

"I'm like a kid," he said, laughing. "I wanted to fly, and I've wanted to be an astronaut since I was a kid. So I can keep being a kid."

A particularly select kid in a new job that could keep him busy for a long time.

NASA Ice Satellite Maps Profound Polar Thinning

Satellite data shows fast ice thinning (red) along the coast of West Antarctica.

Researchers have used NASA's Ice, Cloud and Land Elevation Satellite (ICESat) to compose the most comprehensive picture of changing glaciers along the coast of the Greenland and Antarctic ice sheets.

The new elevation maps show that all latitudes of the Greenland ice sheet are affected by dynamic thinning -- the loss of ice due to accelerated ice flow to the ocean. The maps also show surprising, extensive thinning in Antarctica, affecting the ice sheet far inland. The study, led by Hamish Pritchard of the British Antarctic Survey in Cambridge, England, was published September 24 in Nature.

ICESat, launched in 2003, carries a precise laser altimetry instrument that has provided a high-density web of elevation measurements, repeated year after year, across the Greenland and Antarctic ice sheets. With such dense coverage, the research team could distinguish which changes were caused by fast-flowing ice and which had other causes, such as melt.

The maps confirm that the profound ice sheet thinning of recent years stems from fast-flowing glaciers that empty into the sea. This was particularly the case in West Antarctica, where the Pine Island Glacier was found to be thinning between 2003 and 2007 by as much as 6 meters per year. In Greenland, fast-flowing glaciers were shown to thin by an average of nearly 0.9 meters per year.
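Taken at face value, and assuming the rates held roughly constant over the 2003-2007 observation window (an illustrative assumption, not a claim from the study), the cumulative thinning works out to:

```python
# Cumulative thinning implied by the rates above over the 2003-2007
# ICESat window (~4 years), assuming constant rates for illustration.
YEARS = 4
pine_island_rate = 6.0   # m/yr, peak rate reported for Pine Island Glacier
greenland_rate = 0.9     # m/yr, average for fast-flowing Greenland glaciers

print(f"Pine Island: up to {pine_island_rate * YEARS:.0f} m of thinning")
print(f"Greenland fast-flow average: ~{greenland_rate * YEARS:.1f} m")
```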

Related Links

› British Antarctic Survey press release
› Extensive Dynamic Thinning on the Margins of Greenland and Antarctic Ice Sheets

Floundering El Niños Make for Fickle Forecasts

The 1997-1998 El Niño, seen here in this Sept. 20, 1997, image from the NASA/French Space Agency Topex/Poseidon satellite, was one of the most powerful El Niños of the past 100 years.

Since May 2009, the tropical Pacific Ocean has switched from a cool pattern of ocean circulation known as La Niña to her warmer sibling, El Niño. This cyclical warming of the ocean waters in the central and eastern tropical Pacific generally occurs every three to seven years, and is linked with changes in the strength of the trade winds. El Niño can affect weather worldwide, including the Atlantic hurricane season, Asian monsoon season and northern hemisphere winter storm season. But while scientists agree that El Niño is back, there's less consensus about its future strength.

One of the characteristics that signal a developing El Niño is a change in average sea surface height compared to normal sea level. The NASA/French Space Agency Jason-1 and Ocean Surface Topography Mission/Jason-2 satellites continuously observe these changes in average sea surface height, producing near-global maps of the ocean's surface topography every 10 days.

Recent sea-level height data from the Jason-1 and Ocean Surface Topography Mission/Jason-2 satellites show that most of the equatorial Pacific is near normal (depicted in the images as green). The exceptions are the central and eastern equatorial Pacific, which exhibit areas of higher-than-normal sea surface heights (warmer-than-normal sea surface temperatures) at 180 and 110 degrees west longitude.

The latest image from Jason-2 reflects a 10-day data cycle centered on September 17, 2009. It shows a series of warm "bumps" along the equator, denoted in the image by a black line. Known as Kelvin waves, these pools of warm water were triggered when the normally westward-blowing trade winds weakened in late July and again in early September, sending them sliding eastward from the western Pacific toward the Americas. The Kelvin waves are 5 to 10 centimeters (2 to 4 inches) high, a few hundred kilometers wide, and a few degrees warmer than surrounding waters. Traveling east at about 3 meters per second (nearly 7 miles per hour), they are expected to reach the coast of Peru in October.
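A rough travel-time calculation is consistent with the October arrival estimate. The ~7,000 km distance below is an assumed figure for the remaining stretch to Peru, not one given in the article:

```python
# Rough travel-time check for the Kelvin waves described above.
# Speed from the article: ~3 m/s. The basin-crossing distance is an
# assumption for illustration (~7,000 km from the central Pacific to Peru).
SPEED_M_S = 3.0
DISTANCE_KM = 7_000          # assumed remaining distance, not from the article

seconds = DISTANCE_KM * 1_000 / SPEED_M_S
days = seconds / 86_400
print(f"~{days:.0f} days to reach the coast of Peru")
```

About a month of travel time, which squares with mid-September waves arriving off Peru in October.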

Yet this year's El Niño is so far dwarfed by the "macho" El Niño of 1997-1998, which brought devastating floods to California and severe drought to Indonesia, Australia and the Philippines. As the September 20, 1997, image from the NASA/French Space Agency Topex/Poseidon satellite shows, the size and intensity of the 1997-1998 event were much greater by this time of year. That leads some scientists, such as Bill Patzert, an oceanographer and climatologist at NASA's Jet Propulsion Laboratory, Pasadena, Calif., to question whether this El Niño will intensify enough to deliver the dramatic impacts seen during that last intense El Niño in 1997-1998.

"For the past few months, the trade winds have weakened somewhat, but whether the new Kelvin waves traveling eastward across the Pacific will be adequate to pump this El Niño up enough to reinvigorate it and deliver any real impacts remains uncertain," Patzert says.

Patzert notes that it is important to remember that not all El Niños are created equal. "Some El Niños are show stoppers, but most are mild to modest, with minimal to mixed impacts," he says. He notes that since 1998, there have been three mild to moderate El Niños: in 2002-2003, 2004-2005 and 2006-2007.

None of these events delivered the heart-thumping impacts of the monster El Niño of 1997-1998. During the winter of 1997-1998, Southern California was soaked with nearly 79 centimeters (more than 31 inches) of rain (twice Los Angeles' normal annual rainfall amount of about 38.5 centimeters, or 15.14 inches). In addition, there was heavy snowpack in the Sierra Nevada and Rocky Mountains. In comparison, during the past four winters, Los Angeles has averaged only 24.6 centimeters (9.7 inches) of rain (64 percent of normal), and snowpacks have been stingy.
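Those percentages can be verified directly from the rainfall totals quoted above:

```python
# Checking the Los Angeles rainfall comparisons (values in centimeters).
NORMAL_LA = 38.5        # normal annual Los Angeles rainfall
EL_NINO_9798 = 79.0     # winter of 1997-1998
RECENT_AVG = 24.6       # average of the past four winters

print(f"1997-98 winter: {EL_NINO_9798 / NORMAL_LA:.1f}x normal")   # ~2.1x
print(f"Recent average: {RECENT_AVG / NORMAL_LA:.0%} of normal")   # 64% of normal
```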

In fact, Patzert notes that this El Niño bears many similarities to the 2006-2007 El Niño event. During that winter, much of the American Southwest experienced record-breaking drought, and Los Angeles had its driest winter in recorded history.

So what will El Niño 2009-2010 hold in store for the world this coming winter? In spite of the uncertainties, experienced climate forecasters around the world will continue to monitor the Pacific closely for further signs of El Niño development and will give it their best shot.

"Unless present El Niño conditions intensify, I believe this El Niño is too weak to have a major influence on many weather patterns," he says. "A macho El Niño like that of 1997-1998 is off the board, but I'm hoping for a relaxation in the tropical trade winds and a surprise strengthening of El Niño that could result in a shift in winter storm patterns over the United States. If the trade winds decrease, the ocean waters will continue to warm and spread eastward, strengthening the El Niño. That scenario could bring atmospheric patterns that will deliver much-needed rainfall to the southwestern United States this winter. If not, the dice seem to be loaded for below-normal snowpacks and another drier-than-normal winter."

Still, Patzert remains hopeful. "Don't give up on this El Niño," he added. "He might make a late break and put his spin on this fall and winter's weather systems."


NASA Spacecraft Sees Ice on Mars Exposed by Meteor Impacts

The High Resolution Imaging Science Experiment camera on NASA's Mars Reconnaissance Orbiter took these images of a fresh, 6-meter-wide (20-foot-wide) crater on Mars on Oct. 18, 2008 (left), and on Jan. 14, 2009.

NASA's Mars Reconnaissance Orbiter has revealed frozen water hiding just below the surface of mid-latitude Mars. The spacecraft's observations were obtained from orbit after meteorites excavated fresh craters on the Red Planet.

Scientists controlling instruments on the orbiter found bright ice exposed at five Martian sites with new craters that range in depth from approximately half a meter to 2.5 meters (1.5 feet to 8 feet). The craters did not exist in earlier images of the same sites. Some of the craters show a thin layer of bright ice atop darker underlying material. The bright patches darkened in the weeks following initial observations, as the freshly exposed ice vaporized into the thin Martian atmosphere. One of the new craters had a bright patch of material large enough for one of the orbiter's instruments to confirm it is water-ice.

The finds indicate water-ice occurs beneath Mars' surface halfway between the north pole and the equator, a lower latitude than the Martian climate was expected to allow.

"This ice is a relic of a more humid climate from perhaps just several thousand years ago," said Shane Byrne of the University of Arizona, Tucson.

Byrne is a member of the team operating the orbiter's High Resolution Imaging Science Experiment, or HiRISE camera, which captured the unprecedented images. Byrne and 17 co-authors report the findings in the Sept. 25 edition of the journal Science.

"We now know we can use new impact sites as probes to look for ice in the shallow subsurface," said Megan Kennedy of Malin Space Science Systems in San Diego, a co-author of the paper and member of the team operating the orbiter's Context Camera.

During a typical week, the Context Camera returns more than 200 images of Mars that cover a total area greater than California. The camera team examines each image, sometimes finding dark spots that fresh, small craters make in terrain covered with dust. Checking earlier photos of the same areas can confirm a feature is new. The team has found more than 100 fresh impact sites, mostly closer to the equator than the ones that revealed ice.

An image from the camera on Aug. 10, 2008, showed apparent cratering that occurred after an image of the same ground was taken 67 days earlier. The opportunity to study such a fresh impact site prompted a look by the orbiter's higher resolution camera on Sept. 12, 2008, confirming a cluster of small craters.

"Something unusual jumped out," Byrne said. "We observed bright material at the bottoms of the craters with a very distinct color. It looked a lot like ice."

The bright material at that site did not cover enough area for a spectrometer instrument on the orbiter to determine its composition. However, a Sept. 18, 2008, image of a different mid-latitude site showed a crater that had not existed eight months earlier. This crater had a larger area of bright material.

"We were excited about it, so we did a quick-turnaround observation," said co-author Kim Seelos of Johns Hopkins University Applied Physics Laboratory in Laurel, Md. "Everyone thought it was water-ice, but it was important to get the spectrum for confirmation."

Mars Reconnaissance Orbiter Project Scientist Rich Zurek, of NASA's Jet Propulsion Laboratory, Pasadena, Calif., said, "This mission is designed to facilitate coordination and quick response by the science teams. That makes it possible to detect and understand rapidly changing features."

The ice exposed by fresh impacts suggests that NASA's Viking Lander 2, digging into mid-latitude Mars in 1976, might have struck ice if it had dug 10 centimeters (4 inches) deeper. The Viking 2 mission, which consisted of an orbiter and a lander, launched in September 1975 and became one of the first two space probes to land successfully on the Martian surface. The Viking 1 and 2 landers characterized the structure and composition of the atmosphere and surface. They also conducted on-the-spot biological tests for life on another planet.

NASA's Jet Propulsion Laboratory in Pasadena manages the Mars Reconnaissance Orbiter for NASA's Science Mission Directorate in Washington. Lockheed Martin Space Systems in Denver built the spacecraft. The Context Camera was built and is operated by Malin Space Science Systems. The University of Arizona operates the HiRISE camera, which Ball Aerospace & Technologies Corp., in Boulder, Colo., built. The Johns Hopkins University Applied Physics Laboratory led the effort to build the Compact Reconnaissance Imaging Spectrometer and operates it in coordination with an international team of researchers.

To view images of the craters and learn more about the Mars Reconnaissance Orbiter, visit the mission's website.

NASA Instruments Reveal Water Molecules on Lunar Surface

Water around a fresh lunar crater
NASA scientists have discovered water molecules in the polar regions of the moon. Instruments aboard three separate spacecraft revealed water molecules in amounts that are greater than predicted, but still relatively small. Hydroxyl, a molecule consisting of one oxygen atom and one hydrogen atom, also was found in the lunar soil. The findings were published in Thursday's edition of the journal Science.

NASA's Moon Mineralogy Mapper, or M3, instrument reported the observations. M3 was carried into space on Oct. 22, 2008, aboard the Indian Space Research Organization's Chandrayaan-1 spacecraft. Data from the Visual and Infrared Mapping Spectrometer, or VIMS, on NASA's Cassini spacecraft, and the High-Resolution Infrared Imaging Spectrometer on NASA's Epoxi spacecraft contributed to confirmation of the finding. The spacecraft imaging spectrometers made it possible to map lunar water more effectively than ever before.

The confirmation of elevated water molecules and hydroxyl at these concentrations in the moon's polar regions raises new questions about its origin and effect on the mineralogy of the moon. Answers to these questions will be studied and debated for years to come.

"Water ice on the moon has been something of a holy grail for lunar scientists for a very long time," said Jim Green, director of the Planetary Science Division at NASA Headquarters in Washington. "This surprising finding has come about through the ingenuity, perseverance and international cooperation between NASA and the India Space Research Organization."

From its perch in lunar orbit, M3's state-of-the-art spectrometer measured light reflecting off the moon's surface at infrared wavelengths, splitting the spectral colors of the lunar surface into small enough bits to reveal a new level of detail in surface composition. When the M3 science team analyzed data from the instrument, they found the wavelengths of light being absorbed were consistent with the absorption patterns for water molecules and hydroxyl.

"For silicate bodies, such features are typically attributed to water and hydroxyl-bearing materials," said Carle Pieters, M3's principal investigator from Brown University, Providence, R.I. "When we say 'water on the moon,' we are not talking about lakes, oceans or even puddles. Water on the moon means molecules of water and hydroxyl that interact with molecules of rock and dust specifically in the top millimeters of the moon's surface.

The M3 team found water molecules and hydroxyl at diverse areas of the sunlit region of the moon's surface, but the water signature appeared stronger at the moon's higher latitudes. Water molecules and hydroxyl previously were suspected in data from a Cassini flyby of the moon in 1999, but the findings were not published until now.

"The data from Cassini's VIMS instrument and M3 closely agree," said Roger Clark, a U.S. Geological Survey scientist in Denver and member of both the VIMS and M3 teams. "We see both water and hydroxyl. While the abundances are not precisely known, as much as 1,000 water molecule parts-per-million could be in the lunar soil. To put that into perspective, if you harvested one ton of the top layer of the moon's surface, you could get as much as 32 ounces of water."

For additional confirmation, scientists turned to the Epoxi mission while it was flying past the moon in June 2009 on its way to a November 2010 encounter with comet Hartley 2. The spacecraft not only confirmed the VIMS and M3 findings, but also expanded on them.

"With our extended spectral range and views over the north pole, we were able to explore the distribution of both water and hydroxyl as a function of temperature, latitude, composition, and time of day," said Jessica Sunshine of the University of Maryland. Sunshine is Epoxi's deputy principal investigator and a scientist on the M3 team. "Our analysis unequivocally confirms the presence of these molecules on the moon's surface and reveals that the entire surface appears to be hydrated during at least some portion of the lunar day."

NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the M3 instrument, Cassini mission and Epoxi spacecraft for NASA's Science Mission Directorate in Washington. The Indian Space Research Organization built, launched and operated the Chandrayaan-1 spacecraft.

For additional information and images from the instruments, visit:

For more information about the Chandrayaan-1 mission, visit:

For more information about the EPOXI mission, visit:

For more information about the Cassini mission, visit:

NASA Ice Campaign Takes Flight in Antarctica

NASA's DC-8 flying laboratory takes off in Punta Arenas, Chile, during NASA's AirSAR 2004 campaign.
Early in the 20th century, a succession of adventurers and scientists pioneered the exploration of Antarctica. A century later, they're still at it, albeit with a different set of tools. This fall, a team of modern explorers will fly over Earth's southern ice-covered regions to study changes to its sea ice, ice sheets, and glaciers as part of NASA's Operation Ice Bridge.

Starting next month, NASA will fly its DC-8, a 157-foot-long airborne laboratory that can accommodate many instruments. The fall 2009 campaign is one of the few excursions to the remote continent made by the DC-8, the largest aircraft in NASA's airborne science fleet.

The plane is scheduled to leave NASA's Dryden Flight Research Center in Edwards, Calif., on October 12 and fly to Punta Arenas, Chile, where the plane, crew and researchers will be based through mid-November. For six weeks, the Ice Bridge team will traverse the Southern Ocean for up to 17 flights over West Antarctica, the Antarctic Peninsula, and coastal areas where sea ice is prevalent. Each round-trip flight lasts about 11 hours, two-thirds of that time devoted to getting to and from Antarctica.

Operation Ice Bridge is a six-year campaign of annual flights to each of Earth's polar regions. The first flights in March and April carried researchers over Greenland and the Arctic Ocean. This fall's Antarctic campaign, led by principal investigator Seelye Martin of the University of Washington, will begin the first sustained airborne research effort of its kind over the continent. Data collected by researchers will help scientists bridge the gap between NASA's Ice, Cloud and Land Elevation Satellite (ICESat) -- which is operating the last of its three lasers -- and ICESat-II, scheduled to launch in 2014.

The Ice Bridge flights will help scientists maintain the record of changes to sea ice and ice sheets that has been collected since 2003 by ICESat. The flights will lack the continent-wide coverage that can be achieved by satellite, so researchers carefully select key target locations. But the flights will also turn up new information not obtainable from orbit, such as the shape of the terrain below the ice.

"Space-based instruments like the ICESat lasers are the only way to find out where change is occurring in remote, continent-sized ice sheets like Antarctica," said Tom Wagner, cryosphere program scientist at NASA Headquarters in Washington, D.C. "But aircraft missions like Ice Bridge allow us to follow up with more detailed studies and make other measurements critical to modeling sea level rise."

Lasers and Radars

ICESat launched in January 2003 and since then, its sole instrument -- a precise laser altimeter -- has helped scientists map ice sheet elevation, calculate sea ice thickness, and monitor how both have changed.

"With ICESat, we have seen significant changes, things we wouldn't otherwise know were taking place," said Jay Zwally of NASA's Goddard Space Flight Center in Greenbelt, Md., and ICESat investigator on the mission. For example, shifts in surface elevation have previously revealed the draining and filling of lakes below Antarctica's ice.

After ICESat, scientists will rely on an airborne laser called the Airborne Topographic Mapper (ATM), developed at NASA Wallops Flight Facility in Wallops Island, Va. ATM pulses laser light in circular scans on the ground, and those pulses reflect back to the aircraft and are converted into elevation maps of the ice surface. By flying ATM over the same swath of ground covered by ICESat, researchers can compare the two data sets and calibrate them so that aircraft can continue the record keeping after the satellite data ends. They can also make more detailed elevation studies over dynamic areas, such as the Crane glacier on the Antarctic Peninsula, which sped up following the collapse of the Larsen Ice Shelf in 2002.
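The cross-calibration idea can be illustrated with a toy computation: estimate the systematic offset between co-located satellite and aircraft elevations, then remove it so the two records can be joined. This is a simplified sketch with made-up numbers, not the mission's actual processing chain:

```python
# Simplified cross-calibration sketch with hypothetical elevations
# (meters); not the actual ICESat/ATM processing.
def calibration_offset(satellite_elev, aircraft_elev):
    """Mean difference between co-located elevation measurements."""
    diffs = [s - a for s, a in zip(satellite_elev, aircraft_elev)]
    return sum(diffs) / len(diffs)

icesat_m = [102.1, 98.4, 95.0, 101.3]  # hypothetical satellite swath
atm_m = [101.9, 98.2, 94.8, 101.1]     # same spots seen from the aircraft
bias_m = calibration_offset(icesat_m, atm_m)
# Subtracting bias_m from one record aligns the two datasets, letting
# the aircraft continue the time series after the satellite data ends.
print(round(bias_m, 2))  # 0.2
```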

In addition, University of Kansas scientists will fly the Multichannel Coherent Radar Depth Sounder, which measures ice sheet thickness. It can also map the varied terrain below the ice, which is important for computer modeling of the future behavior of the ice.

The Laser Vegetation Imaging Sensor, developed at Goddard, will map large areas of sea ice and glacier zones. And a gravimeter, managed by Columbia University, will measure the shape of seawater-filled cavities at the edge of some major fast-moving glaciers. Finally, a snow radar from the University of Kansas will measure the thickness of snow on top of sea ice and glaciers, allowing researchers to differentiate between snow and ice and make more accurate thickness measurements.


The Antarctic continent may be remote, but it plays a significant role in Earth's climate system. The expanse is home to glaciers and ice sheets that hold about 90 percent of Earth's freshwater frozen -- a large potential contribution to sea level rise should all the ice melt.

How and where are Antarctica's ice sheets, glaciers, and sea ice changing? Compared to the Arctic, where sea ice has long been on the decline, sea ice in Antarctica is growing in some coastal areas. Snow and ice have been accumulating in some land regions in the east. West Antarctica and the Peninsula, however, have seen more dramatic warming and rapid ice loss.

"We don't see the same sea ice changes in Antarctica that we see in the Arctic, and the reason is that the system is more complex," said Thorsten Markus of NASA Goddard, the principal sea ice investigator for the mission. "But the fact that we don't see the same changes in Antarctica that we see in the Arctic doesn’t make it less important to study those changes. It's really important for us to understand the global climate system."

With the DC-8 limited to just a few hours over Antarctica on each flight, mission planners have carefully selected targets of current and potential rapid change.

One such target is West Antarctica's Pine Island Glacier. "That glacier is one of the great unknowns because its bed -- where the glacier contacts rock -- is below sea level," Martin said. "So if there's a surge or dramatic change, seawater could get under the glacier and we could be looking at very rapid change."

Other proposed targets along the Amundsen coast include the Thwaites, Smith, and Kohler glaciers and the Getz Ice Shelf. Researchers also intend to study the myriad glaciers and ice shelves on the Peninsula, which has been undergoing dramatic changes.

"A remarkable change is happening on the Earth, truly one of the biggest changes in environmental conditions on Earth since the end of the ice age," Wagner said. "It's not an easy thing to observe, let alone predict what might happen next. Studies like this one are key."


Operation Ice Bridge

IceBridge Twitter

Robotic Lunar lander

Marshall Space Flight Center is testing a new robotic lunar lander test bed that will aid in the development of a new generation of multi-use landers for robotic space exploration. The test article is equipped with thrusters that guide the lander; one set controls the vehicle's attitude, directing its altitude and landing. On the test lander, an additional thruster offsets the effect of Earth's gravity so that the other thrusters can operate as they would in a lunar environment. MSFC is partnered with the Johns Hopkins University Applied Physics Laboratory and the Von Braun Center for Science and Innovation for this project.

With an Eye on Locusts and Vegetation, Scientists Make a Good Tool Better

Locusts, the grasshopper-like insects of Biblical lore, are normally docile creatures that prefer solitary lives in the desert, away from other members of their species. But sometimes, when the rains come and patches of green begin to dot dry landscapes, their populations skyrocket and something extraordinary can happen. Hormonal changes, triggered by crowding, can cause the insects to change color, become more active and congregate in huge swarms capable of decimating crops.

Swarms typically contain between 40 and 80 million locusts per square kilometer. This swarm passed through Nouakchott, the largest city in Mauritania.
In the 1980s, scientists at NASA's Goddard Space Flight Center and the United Nations' Food and Agriculture Organization (FAO) teamed up to develop a monitoring system that used satellite observations and other environmental data to monitor vegetation in the deserts of Africa, the Middle East and Asia for signs that swarms may be imminent. The Desert Locust Information Service (DLIS) used the satellite-derived Normalized Difference Vegetation Index (NDVI) -- based on the ratio of red and infrared radiation reflecting off the leaves of plants -- to detect where deserts were greening the most.

Compared to previous attempts to study vegetation from space, NDVI represented a vast improvement. Scientists could determine whether plant growth was significantly more or less productive than usual over a given time period -- just what they needed to predict whether locusts were likely to swarm. The advance gave officials precious time to target worrisome locust populations with pesticides before they could swarm and take their toll on crops.
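NDVI itself is a simple quantity: the normalized difference between near-infrared and red reflectance. A minimal sketch with hypothetical reflectance values (healthy leaves reflect strongly in the near-infrared and absorb red light, while bare soil reflects both about equally):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).
    Ranges from -1 to 1; dense green vegetation scores high."""
    return (nir - red) / (nir + red)

# Hypothetical reflectance values, for illustration only:
print(round(ndvi(nir=0.50, red=0.08), 2))  # 0.72 -- healthy vegetation
print(round(ndvi(nir=0.30, red=0.25), 2))  # 0.09 -- bare soil / sparse cover
```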

Ironing Out the Wrinkles

Though state-of-the-art at the time, the system had a few shortcomings. For instance, bare soil in deserts can register an NDVI value similar to that of sparse vegetation. As a result, DLIS has occasionally issued false alarms by interpreting bare soil as vegetation growth, and has sometimes missed the development of real vegetation.

"If DLIS warns locust control teams of a risk and then it doesn't materialize, or if it misses places where vegetation and swarms may be developing, then officials could be less apt to mobilize the next time," said Pietro Ceccato, an associate research scientist at Columbia University, N.Y., who has also worked with the FAO on its locust monitoring system.

That system has evolved over the years, particularly since the arrival of the MODIS instruments on NASA's Terra and Aqua satellites, which offer a considerably better view than previous instruments. Since 2002, locust monitors at DLIS have supplemented NDVI with information from an additional channel -- the shortwave infrared -- to create composite images that better account for the differences between vegetation and bare soil.

While NDVI remains the most important tool available to monitor locusts from space, remote sensing specialists are hardly resting on their NDVI laurels. For instance, the Goddard group that helped create NDVI and FAO’s locust monitoring system continues to refine its ability to screen out extraneous data and increase image resolution.

Beyond Locusts

The impulse to refine NDVI isn't limited to locust studies. Small particles in the atmosphere (aerosols) and water vapor can make interpreting NDVI measurements difficult in some situations, explained Susan Ustin, a remote sensing expert at the University of California-Davis. Clouds, especially thin cirrus clouds, also can contaminate short-term measurements. And the color of soil can cause complications because vegetation over dark soils produces higher NDVI values than the same amount of vegetation over light soils.

As technology has advanced, scientists have attempted to overcome such problems by developing dozens of experimental indices, many of which are based upon NDVI. "It seems like a new index comes out every month," said Ustin. In fact, there are so many new indices being developed for such a variety of situations that it's sometimes difficult for researchers to agree on which are worth pursuing.

Another problem with all the new indices, said Compton Tucker, a scientist at NASA Goddard who pioneered the use of NDVI, is that many are geared toward such specific ecosystems and environments that they aren't useful globally. There's a risk of creating niche products that won’t allow researchers to see the bigger, global picture.

"Most of the new indices will never make it out of the lab," said Steve Running, a vegetation scientist at the University of Montana and member of the Intergovernmental Panel on Climate Change. "But I think that we'll eventually come up with one or two alternatives that we can use to complement NDVI."

Related Links:

› NDVI: Satellites Could Help Keep Hungry Populations Fed as Climate Changes
› Measuring Vegetation (NDVI & EVI)
› An Insect's Alter Ego
› Locusts Plague Northern and Western Africa
› Locust Watch

The Ups and Downs of Global Warming

According to the vast majority of climate scientists, the planet is heating up. Scientists have concluded that this appears to be the result of increased human emissions of greenhouse gases, especially carbon dioxide, which trap heat near the surface of Earth. However, some information sources -- blogs, websites, media articles and other voices -- highlight that the planet has been cooling since a peak in global temperature in 1998. This cooling is only part of the picture, according to a recent study that has looked at the world's temperature record over the past century or more.

In their recently published research paper entitled "Is the climate warming or cooling?", David Easterling of the U.S. National Climatic Data Center and Michael Wehner of Lawrence Berkeley National Laboratory show that naturally occurring periods of no warming or even slight cooling can easily be part of a longer-term pattern of global warming.

The world's surface air temperature change

This may sound counter-intuitive at first sight, so let's take a closer look at the data. Figure 1 shows the change in the world's air temperature averaged over all the land and ocean between 1975 and 2008. The warming is obvious -- about 0.5° C (0.9° F) during that time. However, there are plenty of periods -- 1977 to 1985 and 1981 to 1989 (see insets, Figure 1), and 1998 to 2008 -- when no warming is seen, the most recent of which some global warming skeptics say is evidence that the world is actually cooling.

What's going on? To answer this question, Easterling and Wehner pored over global temperature records dating from 1901 to 2008 and also ran computer simulations of Earth's climate looking back into the past and forward into the future. They concluded that in a climate being warmed by man-made carbon emissions, "it is possible, and indeed likely, to have a period as long as a decade or two of 'cooling' or no warming superimposed on a longer-term warming trend."

Natural Fluctuations

These temperature plateaus, or cooling spells, can be attributed to natural climate variability, explains Josh Willis, a climate scientist at NASA's Jet Propulsion Laboratory in Pasadena, Calif. and a recent recipient of the 2009 Presidential Early Career Award for Scientists and Engineers. "Natural variability refers to naturally-occurring fluctuations or events that change Earth's climate on time scales ranging from years to decades. Big volcanic eruptions, for instance, can cause cooling that lasts for several years. When a volcano erupts, it blasts dust into the upper atmosphere where it reflects sunlight and cools the planet, a bit like a natural umbrella." He goes on, "There are also all kinds of natural fluctuations that sometimes cause warming, sometimes cooling." Ocean changes, for instance, can have a big impact on the world's temperature. One example that Willis cites is the Pacific Decadal Oscillation, a pattern of warmer and cooler surface temperatures in the Pacific Ocean that can last between 10 and 30 years.

Another important example is El Niño, which is an abnormal warming of surface ocean waters in the eastern tropical Pacific that happens every three to eight years and can affect global temperatures for a year or two. Between 1997 and 1998, there was an unusually strong El Niño, and this caused 1998 to be one of the hottest years on record (Figure 1). When Easterling and Wehner dropped the 1998 temperature spike from the data altogether, and zoomed in on the readings from 1999 to 2008, they saw a strong warming trend over this period. But when the 1998 measurement is included in the data, it looks as if there is no overall warming between 1998 and 2008 at all.
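The effect of a single anomalously hot year on a short-term trend is easy to reproduce with synthetic numbers (an illustration of the statistical point only, not the actual temperature record): a steady warming of 0.02° C per year looks nearly flat over 1998-2008 once a large spike is added to 1998, and reappears when 1998 is dropped.

```python
# Synthetic illustration of "cherry-picking" (not the real record):
# a steady 0.02 C/yr warming trend, plus a large El Nino-like spike
# added to the first year, 1998.
def linear_trend(years, temps):
    """Ordinary least-squares slope, in degrees per year."""
    n = len(years)
    my, mt = sum(years) / n, sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = list(range(1998, 2009))
temps = [0.02 * (y - 1998) for y in years]
temps[0] += 0.4  # the unusually strong 1998 El Nino

with_spike = linear_trend(years, temps)             # near zero: "no warming"
without_spike = linear_trend(years[1:], temps[1:])  # recovers 0.02 C/yr
print(round(with_spike, 3), round(without_spike, 3))
```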

The authors say that it is easy to "cherry-pick" a period to reinforce a particular point of view. "Claims that global warming is not occurring that are derived from a cooling observed over short time periods ignore natural variability and are misleading."

What you have to look at, says Willis, is the long-term temperature readings that have been collected over the past century -- which is exactly what Easterling and Wehner do in their study. Over that sort of time scale, global warming becomes apparent from observations of both our atmosphere and our ocean, which are intimately linked pieces of the climate puzzle.

Sea Change

Since around the time of the Industrial Revolution (the late 18th and early 19th centuries), Earth's atmosphere has warmed by a little less than 1° C (1.8° F) (Figure 2). In turn, sea level has also risen by about 15 centimeters (6 inches) over the past 100 years -- for two reasons. First, when water warms up, it expands, in much the same way as a solid does when it heats up. As the volume of seawater increases, it causes sea level to rise. Second, global warming causes glaciers and ice sheets to melt, which adds more water to the world's ocean, again causing sea level to rise.

The world's average surface air temperature change anomaly from 1880 to the present day

"If you look at the ocean data, there has been a very clear acceleration in sea level rise," explains Willis. "At the beginning of the last century, sea level was rising by less than 1 millimeter (0.04 inches) per year; mid-century it was 2 millimeters (0.08 inches) per year and now it's 3 millimeters (0.12 inches) per year. This is directly caused by the increasing temperature of the planet."

Big Picture

As Willis explains, global warming is a long-term process. "Despite the fact it's been warmer and cooler at different times in the last 10 years, there's no part of the last 10 years that isn't warmer than the temperatures we saw 100 years ago."

Assuming our greenhouse gas emissions continue at their present levels with little reduction, existing climate forecasts suggest that our planet will warm by about 4° C (7.2° F) by the end of the 21st century. Although scientists continue to study the nuances of Earth's climate, the link between carbon emissions, global warming and sea level rise over the past century is clear. Even if our global carbon emissions began to fall tomorrow, Earth would continue to warm for some time due to the inertia of the climate system.

"In the next century it's definitely going to get warmer," Willis says. "You don't need a crystal ball or fancy climate model to say that. Just look at the sea level and temperature records from the past 100 years -- they're all going up." Likewise, Easterling and Wehner's work reminds us that understanding climate change -- one of the most important challenges we face today -- requires a long-term view. "Unlike people," says Willis, "the climate has a very long memory."

A Body of Evidence

In 2007, a scientific intergovernmental body called the Intergovernmental Panel on Climate Change (IPCC) released its Fourth Assessment Report on Climate Change, which summarizes our current understanding of climate change. The report took six years to produce, involved over 2,500 scientific expert reviewers and more than 800 authors from over 130 countries.

Some of their key findings include:

  • The warming trend over the last 50 years (about 0.13° C or 0.23° F per decade) is nearly twice that for the last 100 years.
  • The average amount of water vapor in the atmosphere has increased since at least the 1980s over land and ocean. The increase is broadly consistent with the extra water vapor that warmer air can hold.
  • Since 1961, the average temperature of the global ocean down to depths of at least 3 km (1.9 miles) has increased. The ocean has been absorbing more than 80% of the heat added to the climate system, causing seawater to expand and contributing to sea level rise.
  • Global average sea level rose on average by 1.8 mm (0.07 inches) per year from 1961 to 2003. There is high confidence that the rate of observed sea level rise increased from the 19th to the 20th century.
  • Average Arctic temperatures increased at almost twice the global average rate in the past 100 years.
  • Mountain glaciers and snow cover have declined on average in both hemispheres. Widespread decreases in glaciers and ice caps have contributed to sea level rise.
  • Long-term trends in the amount of precipitation have been observed over many large regions from 1900 to 2005.

Ames Bioengineering Scientist Establishes GREEN Team

Image of Jonathan TrentAfter earning his Ph.D. in Biological Oceanography at Scripps Institution of Oceanography, University of California at San Diego, Jonathan Trent spent six years in Europe at the Max Planck Institute for Biochemistry in Germany, the University of Copenhagen in Denmark, and the University of Paris at Orsay in France. He returned to the United States to work at the Boyer Center for Molecular Medicine at Yale Medical School for two years before establishing a biotechnology group at the Department of Energy's Argonne National Laboratory in Illinois.

In 1998, he moved to NASA Ames Research Center, where he established the Protein Nanotechnology Group. These researchers focus on building nanostructures using biomolecules from extremophiles -- organisms adapted to extreme environments, such as high temperatures, high or low pH, ionizing radiation, or saturated salts. Using these robust biomolecules, and manipulating molecular recognition and self-assembly with genetic engineering, his team has built patterned nano-particle arrays for data storage and molecular scaffolds for enhancing enzyme activities.

In addition to working at NASA, Trent was appointed Adjunct Professor in the Dept. of Biomolecular Engineering at the University of California at Santa Cruz in 2004. Two years later, he was awarded the prestigious Nano50 award for Innovation in Nanotechnology, and was elected Fellow of the California Academy of Sciences. Since then, Trent has initiated Global Research into Energy and the Environment at NASA (GREEN) with support from Google. Among other projects, Trent and the GREEN team are developing systems for producing a sustainable, carbon-neutral feedstock for the biofuels of the future. Trent's recent research and inventions are focused on methods for obtaining alternative fuels, processing municipal wastewater, and economically producing freshwater by desalination. In April 2009, he organized and led an international conference in Denmark entitled: Wind, Sea, and Algae.

From Nothing, Something: One Layer at a Time

A group of engineers working on a novel manufacturing technique at NASA's Langley Research Center in Hampton, Va., has come up with a new twist on the popular old saying about dreaming and doing: "If you can slice it, we can build it."

Electron beam freeform fabrication process
That's because layers mean everything to the environmentally friendly construction process called Electron Beam Freeform Fabrication, or EBF3, and its operation sounds like something straight out of science fiction.

"You start with a drawing of the part you want to build, you push a button, and out comes the part," said Karen Taminger, the technology lead for the Virginia-based research project that is part of NASA's Fundamental Aeronautics Program.

She admits that, on the surface, EBF3 reminds many people of a Star Trek replicator in which, for example, Captain Picard announces out loud, "Tea, Earl Grey, hot." Then there is a brief hum, a flash of light and the stimulating drink appears from a nook in the wall.

In reality, EBF3 works in a vacuum chamber, where an electron beam is focused on a constantly feeding source of metal, which is melted and then applied as called for by a drawing -- one layer at a time -- on top of a rotating surface until the part is complete.

While the options for using EBF3 are more limited than what science fiction allows, the potential for the process is no less out of this world, with promising relevance in aviation, spaceflight -- even the medical community, Taminger said.

Commercial applications for EBF3 are already known and its potential already tested, Taminger said, noting it's possible that, within a few years, some aircraft will be flying with large structural parts made by this process.

To make EBF3 work there are two key requirements: A detailed three-dimensional drawing of the object to be created must be available, and the material the object is to be made from must be compatible for use with an electron beam.

First, the drawing is needed to break up the object into layers, with each cross-section used to guide the electron beam and source of metal in reproducing the object, building it up layer by layer.

"If you take a slice through a typical truss, you can see a couple of dots in each cross-section that move as you go from layer to layer," Taminger said. "When complete, you see those moving dots actually allowed you to build a diagonal brace into the truss."

Second, the material must be compatible with the electron beam so that it can be heated by the stream of energy and briefly turned into liquid form, making aluminum an ideal material to be used, along with other metals.

A structural metal part fabricated from EBF3
In fact, the EBF3 process can handle two different sources of metal -- also called feed stock -- at the same time, either by mixing them together into a unique alloy or embedding one material inside another.

The potential use for the latter could include embedding a strand of fiber optic glass inside an aluminum part, enabling the placement of sensors in areas that were impossible before, Taminger said.

While the EBF3 equipment tested on the ground is fairly large and heavy, a smaller version was created and successfully test flown on a NASA jet that is used to provide researchers with brief periods of weightlessness. The next step is to fly a demonstration of the hardware on the International Space Station, Taminger said.

Future lunar base crews could use EBF3 to manufacture spare parts as needed, rather than rely on a supply of parts launched from Earth. Astronauts might be able to mine feed stock from the lunar soil, or even recycle used landing craft stages by melting them.

But the immediate and greatest potential for the process is in the aviation industry where major structural segments of an airliner, or casings for a jet engine, could be manufactured for about $1,000 per pound less than conventional means, Taminger said.

Environmental savings also are made possible by deploying EBF3, she added.

Normally an aircraft builder might start with a 6,000-pound block of titanium and machine it down to a 300-pound part, leaving 5,700 pounds of material to be recycled and consuming several thousand gallons of cutting fluid in the process.

"With EBF3 you can build up the same part using only 350 pounds of titanium and machine away just 50 pounds to get the part into its final configuration," Taminger said. "And the EBF3 process uses much less electricity to create the same part."

While initial parts for the aviation industry will be simple shapes, replacing parts already designed, future parts designed from scratch with the EBF3 process in mind could lead to improvements in jet engine efficiency, fuel burn rate and component lifetime.

"There's a lot of power in being able to build up your part layer by layer because you can get internal cavities and complexities that are not possible with machining from a solid block of material," Taminger said.

Watch Karen Taminger's Electron Beam Freeform Fabrication Technical Seminar →

How to Make a Planet

This artist's conception shows a lump of material in a swirling, planet-forming disk. Astronomers using NASA's Spitzer Space Telescope found evidence that a companion to a star -- either another star or a planet -- could be pushing planetary material together, as illustrated here.

Planets are born out of spinning disks of gas and dust. They can carve out lanes or gaps in the disks as they grow bigger and bigger. Scientists used Spitzer's infrared vision to study the disk around a star called LRLL 31, located about 1,000 light-years away in the IC 348 region of the constellation Perseus. Spitzer's new infrared observations reveal that the disk has both an inner and outer gap.

What's more, the data show that infrared light from the disk is changing over as little time as one week -- a very unusual occurrence. In particular, light of different wavelengths seesawed back and forth, with short-wavelength light going up when long-wavelength light went down, and vice versa.

According to astronomers, this change could be caused by a companion to the star (illustrated as a planet in this picture). As the companion spins around, its gravity would cause the wall of the inner disk to squeeze into a lump. This lump would also spin around the star, shadowing part of the outer disk. When the bright side of the lump is on the far side of the star, and facing Earth, more infrared light at shorter wavelengths should be observed (hotter material closer to the star emits shorter wavelengths of infrared light). In addition, the shadow of the lump should cause longer-wavelength infrared light from the outer disk to decrease. The opposite would be true when the lump is in front of the star and its bright side is hidden (shorter-wavelength light would go down, and longer-wavelength light up). This is precisely what Spitzer observed.

The size of the lump and the planet have been exaggerated to better illustrate the dynamics of the system.

NASA's Kennedy Space Center

Space shuttle Discovery and 747 in the mate-demate device.
Platforms from the mate-demate device at NASA's Kennedy Space Center surround space shuttle Discovery and its 747 Shuttle Carrier Aircraft. The structure is used to lift the shuttle safely on and off the aircraft. There is a similar structure at Dryden Flight Research Center in California, adjacent to Edwards Air Force Base. Discovery landed at Edwards Sept. 11 to end the STS-128 mission.

Astronaut James McDivitt, Others Inducted Into Aerospace Walk of Honor

Retired NASA Apollo program astronaut James McDivitt was inducted into the Aerospace Walk of Honor in Lancaster, Calif., on Sept. 19, 2009. McDivitt, who commanded the Gemini IV mission in 1965 and the Apollo 9 mission in 1969, was one of five former test pilots and astronauts honored at the 20th induction ceremonies.

Retired NASA Apollo program astronaut James McDivitt (right) is presented with a medal by Ron Smith, vice-mayor of the City of Lancaster, Calif., at the city's Aerospace Walk of Honor induction ceremonies Sept. 19.

McDivitt was joined at the induction ceremony by retired NASA astronaut Gordon Fullerton, Apollo 17 commander Gene Cernan, NASA Dryden acting deputy director Gwen Young and Ron Smith, vice-mayor of the City of Lancaster, Calif. Cernan was the featured speaker during the ceremony.

Following the induction ceremony, McDivitt and the group wielded shovels in front of the Lancaster Performing Arts Center to plant a commemorative moon tree. The sycamore sapling is a second-generation descendant of sycamore trees that were germinated from seeds that were flown on the Apollo 14 moon mission in 1971. This moon tree joins dozens of other trees now growing at state capitols, university campuses, and other select locations across the nation.

McDivitt commanded the first American space walk mission during Gemini IV, and later during Apollo 9, he oversaw the first tests of the Lunar Module in orbit around Earth. Joining the Air Force in 1959, he started as a student test pilot. McDivitt quickly climbed through various positions and programs before being selected as an astronaut in 1962.

A graduate of the U.S. Air Force Experimental Test Pilot School and a member of the Society of Experimental Test Pilots, he has received many awards highlighting his accomplishments, including two NASA Distinguished Service Medals, four Distinguished Flying Crosses, five Air Medals, the NASA Exceptional Service Medal, two Air Force Distinguished Service Medals, and induction into the U.S. Astronaut Hall of Fame and the International Space Hall of Fame. McDivitt now joins the 93 other honorees in the Aerospace Walk of Honor.

Retired NASA astronaut Gordon Fullerton, Vice-Mayor of Lancaster City, Calif.

Established in 1990 by the Lancaster City Council, the Aerospace Walk of Honor runs along Lancaster Boulevard through the city, where each inductee is memorialized with a granite pillar recognizing the important contributions of those who 'soared above the rest.'

Lancaster is near both Edwards Air Force Base and the NASA Dryden Flight Research Center, making it a hotbed of aviation activity. Dryden Flight Research Center has been the home of NASA's high-performance aircraft research since its founding.

› Learn more about Moon Trees
› Learn more about the Aerospace Walk of Honor →

NASA Engineers to Practice on Webb Telescope Simulator

The huge assembly standing in Northrop Grumman Corporation’s high bay looks a lot like NASA's James Webb Space Telescope, but it’s a full-scale simulator of the space telescope’s key elements.

This photograph shows simulators of the James Webb Space Telescope's optical telescope element and the sunshield's integrated validation article, mated together in Northrop Grumman's high bay in Space Park.

Engineers are using the simulator, consisting of the telescope's primary backplane assembly and the sunshield's integrated validation article, to develop the Webb Telescope's hardware design. In addition, technicians are using it to gain experience handling large elements in advance of working with the actual hardware that will fly in space.

"Having a functioning demonstration article enables us to see how components, which were developed and tested individually, fit together as a whole system," said Martin Mohan, Webb Telescope program manager for Northrop Grumman Aerospace Systems sector. "The simulator is an effective risk reduction tool to help us validate design approaches early."

John E. Decker, Deputy Associate Director for the Webb Telescope at NASA's Goddard Space Flight Center said, "Simulators are important for the development of any spacecraft, and they are absolutely critical for one with the size and complexity of the Webb Telescope. We have already learned many important lessons from this simulator, and we expect to learn many more."

The simulator is a key element in the company’s extensive test and verification program, which relies on incremental verification, testing, and the use of crosschecks throughout the Webb Telescope’s development. The goal is to ensure that the final end-to-end Observatory test is a confirmation of the expected results. Northrop Grumman’s approach emulates its highly successful Chandra X-ray Observatory test and verification program.

Northrop has conducted a variety of tests with the simulator, including checking the clearances between the sunshield membranes and the telescope, evaluating membrane management hardware, and simulating the backplane support structure's alignment measurements for future testing.

These Webb telescope simulators are full-scale representations of the optical telescope element and sunshield.

Northrop Grumman is the prime contractor for the Webb Telescope, leading a design and development team under contract to NASA's Goddard Space Flight Center. Ball Aerospace & Technologies Corp. is the principal optical subcontractor to Northrop Grumman for the JWST program. ATK builds the telescope backplane and ITT develops the complex cryogenic metrology for optical testing.

The James Webb Space Telescope is the next-generation premier space observatory, exploring deep space phenomena from distant galaxies to nearby planets and stars. The Webb Telescope will give scientists clues about the formation of the universe and the evolution of our own solar system, from the first light after the Big Bang to the formation of star systems capable of supporting life on planets like Earth. It is expected to launch in 2014. The telescope is a joint project of NASA, the European Space Agency and the Canadian Space Agency.