Special for Earth Day: New NSF resources highlight surprising ecological and economic benefits of biodiversity

Press Release 14-056

Multimedia resources explain little-known societal benefits of biodiversity, bust myths and describe new, high-tech approaches for measuring the impacts of environmental change on biodiversity

[Graphic: A slideshow on how biodiversity boosts the economy.]

April 17, 2014

Every organism on Earth, from microbes to plants to large predators, has evolved unique survival mechanisms and distinct ecological roles. For decades, the National Science Foundation (NSF) has funded basic research on how these varied organisms, which make up the Earth's biodiversity, function.

For example, recent findings about how geckos climb vertical walls and walk across ceilings led to the development of new adhesives and wall-climbing robots, which may be used, for example, to produce gravity-defying climbing boots and to help collect space junk. (Learn more about gecko-inspired discoveries in the accompanying slide show.)

Kellar Autumn of Lewis & Clark College, who helped characterize the nanophysics of the gecko's Spider Man-like abilities, said, "Geckos, which evolved 160 million years ago, are so novel that engineers would never have developed nano-adhesive structures without them. It took 15 years and lots of NSF support to understand the basic physical principles of gecko adhesion and then to apply them to make them work. This suggests that there is a library of biodiversity that can be mined for valuable uses if we have enough resources and enough time--in light of high extinction rates--to really understand them."

New resources

Learn more about the amazingly diverse ecological and economic benefits of biodiversity and its enduring mysteries from these accompanying resources:

  • 10 Surprising Ways that Biodiversity Benefits the Economy: A slide show about how basic research on biodiversity drives innovation, boosts the economy and produces other important societal benefits.
  • Biodiversity: A Boon to Brain Research: A video about how two unlikely microbes (that don't even have brains) made possible the development of one of today's most promising brain research tools--which is being used to study many brain diseases and disorders, including schizophrenia, Parkinson's, epilepsy and anxiety.
  • NEON at a Glance: A video about the National Ecological Observatory Network (NEON)--which will be a massive nationwide infrastructure for collecting standardized long-term data on biodiversity (and other ecological variables) throughout the United States. Currently under construction and partially operational, NEON will enable scientists to generate the first apples-to-apples comparisons of ecosystem health over time. NEON will be fully operational in 2017.
  • A Google+ Hangout on the surprising ecological and economic benefits of biodiversity: Held on April 17, 2014, the Hangout covered ecological benefits of biodiversity that have been scientifically tested and others that have yet to be tested; how biodiversity boosts scientific and engineering innovation; and new tools used to measure biodiversity in the face of environmental change.

Panelists in the Hangout were:

  • Ed Boyden of MIT: A neuroscientist and expert on how studies of biodiversity have helped generate revolutionary new research tools. A recent press release on Boyden's brain research reviews the contributions of biodiversity to his research advances.
  • Bradley Cardinale of the University of Michigan: An expert on the impacts of humans on biodiversity and ecosystem health, and on how losses of biodiversity may impact ecological processes.
  • Sarah Bergbreiter of the University of Maryland: An expert in insect-inspired robotics. Bergbreiter's research on micro robots was covered in a recent Science Nation video.
  • Steve Polasky of the University of Minnesota: An expert on integrating ecological and economic analyses, biodiversity conservation and ecosystem services.
  • Elizabeth Blood of NSF: NSF's program director for NEON.


Media Contacts
Lily Whiteman, National Science Foundation, (703) 292-8310, lwhitema@nsf.gov

Related Websites
Following in the Footsteps of Nature (article about gecko-inspired innovations): http://www.nsf.gov/discoveries/disc_summ.jsp?cntn_id=116297

The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2014, its budget is $7.2 billion. NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities and other institutions. Each year, NSF receives about 50,000 competitive requests for funding, and makes about 11,500 new funding awards. NSF also awards about $593 million in professional and service contracts yearly.


Useful NSF Web Sites:
NSF Home Page: http://www.nsf.gov
NSF News: http://www.nsf.gov/news/
For the News Media: http://www.nsf.gov/news/newsroom.jsp
Science and Engineering Statistics: http://www.nsf.gov/statistics/
Awards Searches: http://www.nsf.gov/awardsearch/

Frozen in time: Three-million-year-old landscape still exists beneath the Greenland Ice Sheet

Press Release 14-057

NSF-funded researchers say the massive ice sheet has fixed the landscape in place, rather than scouring it away

[Image: A camp at the edge of the Greenland ice sheet.]

April 17, 2014

Some of the landscape underlying the massive Greenland ice sheet may have been undisturbed for almost 3 million years, ever since the island became completely ice-covered, according to researchers funded by the National Science Foundation (NSF).

Their discovery rests on an analysis of the chemical composition of silts recovered from the bottom of an ice core more than 3,000 meters long. The researchers argue that the find suggests "pre-glacial landscapes can remain preserved for long periods under continental ice sheets."

In the time since the ice sheet formed "the soil has been preserved and only slowly eroded, implying that an ancient landscape underlies 3,000 meters of ice at Summit, Greenland," they conclude.

They add that "these new data are most consistent with [the concept of] a continuous cover of Summit… by ice … with at most brief exposure and minimal surface erosion during the warmest or longest interglacial [periods]."

They also note that fossils found in northern Greenland indicated there was a green and forested landscape prior to the time that the ice sheet began to form. The new discovery indicates that even during the warmest periods since the ice sheet formed, the center of Greenland remained stable, allowing the landscape to be locked away, unmodified, under ice through millions of years of cyclical warming and cooling.

"Rather than scraping and sculpting the landscape, the ice sheet has been frozen to the ground, like a giant freezer that's preserved an antique landscape," said Paul R. Bierman of the Department of Geology and Rubenstein School of the Environment and Natural Resources at the University of Vermont, lead author of the paper.

Bierman's work was supported by two NSF grants from its Division of Polar Programs, 1023191 and 0713956. Thomas A. Neumann, a co-author on the paper who was at the University of Vermont and is now at NASA's Goddard Space Flight Center, was a co-principal investigator on the latter grant.

Researchers from Idaho State University, the University of California, Santa Barbara, and the Scottish Universities Environmental Research Centre at the University of Glasgow also contributed to the paper.

The research also included contributions from two graduate students, both supported by NSF, one of whom was supported by the NSF Graduate Research Fellowships Program.

The team's analysis was published online on April 17 and will appear in the journal Science the following week.

Understanding how Greenland's ice sheet behaved in the past--in particular, how much of it melted during previous warm periods and how it re-grew--is important to developing a scientific understanding of how the ice sheet might behave in the future.

As global average temperatures rise, scientists are concerned about how the ice sheets in Greenland and Antarctica will respond. Vast amounts of freshwater are stored in the ice and may be released by melting, which would raise sea levels, perhaps by many meters.

The magnitude and rate of sea level rise are unknown factors in climate models.

The team based its analysis on material taken from the bottom of an ice core retrieved by the NSF-funded Greenland Ice Sheet Project Two (GISP2), which drilled into the ice sheet near NSF's Summit Station. An ice core is a cylinder of ice in which individual layers, compacted from snowfall over millennia, can be observed and sampled.

Summit is situated at an elevation of 3,216 meters (10,551 feet) above sea level.

In the case of GISP2, the core itself, taken from the center of the present-day Greenland ice sheet, was 3,054 meters (10,000 feet) deep. It provides a history of the balance of gases that made up the atmosphere at the time the snow fell, as well as movements in the ice sheet, stretching back more than 100,000 years. It also contains a mix of silts and sediments at its base, where ice and rock come together.

The scientists looked at the proportions of the elements carbon and nitrogen, and of beryllium-10, an isotope produced by cosmic rays, in sediments taken from the bottom 13 meters (42 feet) of the GISP2 ice core.
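The role of beryllium-10 in this kind of analysis can be illustrated with a back-of-the-envelope decay calculation. This sketch is not the paper's method; it assumes the commonly cited 10Be half-life of roughly 1.39 million years, and that once ice covers the soil it is shielded from new cosmic-ray production, so the inventory only decays:

```python
import math

# Illustrative only: assumes a 10Be half-life of ~1.39 million years
# and no new cosmic-ray production once the soil is buried under ice.
HALF_LIFE_MYR = 1.39

def surviving_fraction(burial_time_myr):
    """Fraction of an initial 10Be inventory remaining after burial."""
    return math.exp(-math.log(2) * burial_time_myr / HALF_LIFE_MYR)

# After roughly 2.7 million years under ice, about a quarter remains:
print(round(surviving_fraction(2.7), 2))  # -> 0.26
```

Measured concentrations well above what such decay alone would predict, compared against reference soils, are one way to argue the sediment records a long pre-glacial exposure history.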

They also compared levels of the various elements with soil samples taken in Alaska, leading them to the conclusion that the landscape under the ice sheet was indeed an ancient one that predates the advent of the ice sheet. The soil comparisons were supported by two NSF grants: 0806394 and 0806399.


Media Contacts
Joshua Brown, University of Vermont, (802) 656-3039, joshua.e.brown@uvm.edu

Principal Investigators
Paul Bierman, University of Vermont, (802) 656-4411, pbierman@uvm.edu


Earth-sized exoplanet discovery


Gemini confirms Earth-sized planet

[Video: NSF-funded Gemini confirms the first potentially habitable Earth-sized planet.]

April 17, 2014

The National Science Foundation-funded Gemini Observatory helped confirm the first potentially habitable Earth-sized exoplanet.

Researchers say this discovery is unique because the planet, called Kepler-186f, resides in a temperate region around its host star where liquid water could exist and possibly sustain life. Earth-sized planets are very difficult to detect because of the low contrast between them and their much brighter host stars.

While the Kepler space telescope made the initial discovery, researchers say two ground-based facilities, the W. M. Keck and Gemini observatories, were critical in confirming the Earth-sized planet. Its host, Kepler-186, is an M1-type dwarf star in the Milky Way galaxy, relatively close to our solar system.

Five small planets have been found orbiting this star; four are in very short-period orbits and too hot for liquid water. The fifth is the Earth-sized planet. Using the Gemini North telescope, researchers were able to probe the star system: the visiting Differential Speckle Survey Instrument on the telescope produced images with extreme detail.

Researchers say the observations from Keck and Gemini on Mauna Kea in Hawaii, combined with other data, calculations and analysis, allowed the team to be 99.8 percent confident that Kepler-186f is real.

The paper is published in the current issue of the journal Science.

--  Dena Headlee, (703) 292-7739 dheadlee@nsf.gov

Investigators
Doug Simons
William Smith

Related Institutions/Organizations
Association of Universities for Research in Astronomy, Inc.

Related Awards
#0647970 Management and Operations of the Gemini Observatory

Shaving nanoseconds from racing processors


University of Wisconsin researcher finds hidden efficiencies in computer architecture

[Image: Mark D. Hill, University of Wisconsin-Madison.]

April 17, 2014

The computer is one of the most complex machines ever devised and most of us only ever interact with its simplest features. For each keystroke and web-click, thousands of instructions must be communicated in diverse machine languages and millions of calculations computed.

Mark Hill knows more about the inner workings of computer hardware than most. As Amdahl Professor of Computer Science at the University of Wisconsin, he studies the way computers transform 0s and 1s into social networks or eBay purchases, following the chain reaction from personal computer to processor to network hub to cloud and back again.

The layered intricacy of computers is intentionally hidden from those who use--and even those who design, build and program--computers. Machine languages, compilers and network protocols handle much of the messy interactions between various levels within and among computers.

"Our computers are very complicated and it's our job to hide most of this complexity most of the time because if you had to face it all of the time, then you couldn't get done what you want to get done, whether it was solving a problem or providing entertainment," Hill said.

During the last four decades of the 20th century, as computers grew faster and faster, it was advantageous to keep this complexity hidden. However, in the past decade, the exponential growth in processing power that we'd grown used to (often attributed to "Moore's law") has started to level off. It is no longer possible to double computer processing power every two years just by making transistors smaller and packing more of them on a chip.
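As a rough worked example of that historical pace (my arithmetic, not the article's): doubling every two years over four decades compounds to roughly a million-fold increase.

```python
# Rough arithmetic behind "doubling every two years" over four decades.
years = 40
doublings = years // 2       # one doubling per two-year period
growth = 2 ** doublings
print(f"about {growth:,}x")  # -> about 1,048,576x, i.e. a million-fold
```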

In response, researchers like Hill and his peers in industry are reexamining the hidden layers of computing architecture and the interfaces between them in order to wring out more processing power for the same cost.

Ready, set...compute

One of the main ways that Hill and others do this is by analyzing the performance of computer tasks. Like a coach with a stopwatch, Hill times how long it takes an ordinary processor to, say, analyze a query from Facebook or perform a web search. He's not only interested in the overall speed of the action, but how long each step in the process takes.

Through careful analysis, Hill uncovers inefficiencies, sometimes major ones, in the workflows by which computers operate. Recently, he investigated inefficiencies in the way that computers implement virtual memory and determined that these operations can waste up to 50 percent of a computer's execution cycles. (Virtual memory is a memory-management technique that maps the addresses used by a program, called virtual addresses, to physical addresses in computer memory, in part so that every program can run as if it were alone on the computer.)
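The parenthetical above can be made concrete with a toy model. This is an illustrative sketch only, not Hill's code: the page size, the one-level lookup table and the function names are all assumptions made for the example.

```python
# Toy model of address translation. Illustrative only -- real hardware
# uses multi-level page tables and TLB caches.
PAGE_SIZE = 4096  # bytes, a typical page size

def paged_translate(page_table, vaddr):
    """Translate a virtual address via a one-level page table.
    Every access needs a table lookup -- the overhead paging adds."""
    vpn, offset = divmod(vaddr, PAGE_SIZE)  # virtual page number + offset
    if vpn not in page_table:
        raise KeyError(f"page fault: no mapping for page {vpn}")
    return page_table[vpn] * PAGE_SIZE + offset

def segment_translate(base, bound, vaddr):
    """Translate via base-and-bound: one bounds check and one add,
    no table walk -- the cheaper path for a large contiguous region."""
    if not 0 <= vaddr < bound:
        raise ValueError("segment violation")
    return base + vaddr

page_table = {0: 7, 1: 3}                 # virtual page -> physical frame
print(paged_translate(page_table, 4100))  # page 1, offset 4 -> 12292
print(segment_translate(0x100000, 1 << 20, 4100))  # 0x100000 + 4100
```

The second path suggests why bypassing the page table for selected, large memory regions (as described below for key applications) can save so many cycles.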

The inefficiencies he found were due to the way computers had evolved over time. Memory had grown a million times bigger since the 1980s, but the way it was used had barely changed. A legacy method called paging, created when memory was far smaller, was preventing processors from achieving their peak potential.

Hill designed a solution that uses paging selectively, adopting a simpler address-translation method for key parts of important applications. This reduced the problem, bringing the wasted cycles down to less than 1 percent. In the age of the nanosecond, fixing such inefficiencies pays dividends. For instance, with such a fix in place, Facebook could buy far fewer computers to do the same workload, saving millions.

"A small change to the operating system and hardware can bring big benefits," he said.

Hill and his colleagues reported the results of their research at the International Symposium on Computer Architecture in June 2013.

Computer companies like Google and Intel are among the richest in the world, with billions in their coffers. So why, one might ask, should university researchers, supported by the National Science Foundation (NSF), have to solve problems with existing hardware?

"Companies can't do this kind of research by themselves, especially the cross-cutting work that goes across many corporations," said Hill. "For those working in the field, if you can cross layers and optimize, I think there's a lot of opportunity to make computer systems better. This creates value in the U.S. for both the economy and all of us who use computers."

"The National Science Foundation is committed to supporting research that makes today's computers more productive in terms of performance, energy-efficiency and helping solve problems arising from the entire spectrum of application domains, while also studying the technologies that will form the basis for tomorrow's computers," said Hong Jiang, a program director in the Computer Information Science and Engineering directorate at NSF.

"In the process of expanding the limits of computation, it's extremely important to find both near-term and long-term solutions to improve performance, power efficiency and resiliency. Professor Mark Hill's pioneering research in computer memory systems is an excellent example of such efforts."

The "divide and conquer" approach to computer architecture design, which kept the various computing layers separate, helped accelerate the industry, while minimizing errors and confusion in an era when faster speeds seemed inevitable. But Hill believes it may be time to break through the layers and create a more integrated framework for computation.

"In the last decade, hardware improvements have slowed tremendously and it remains to be seen what's going to happen," Hill said. "I think we're going to wring out a lot of inefficiencies and still get gains. They're not going to be like the large ones that you've seen before, but I hope that they're sufficient that we can still enable new creations, which is really what this is about."

Most recently, Hill has been exploring how graphics processing units (GPUs), which have become common in personal and cloud computing, can process big-memory tasks more efficiently.

Writing for the proceedings of the International Symposium on High-Performance Computer Architecture, Hill, along with Jason Power and David Wood (also from the University of Wisconsin), showed that it is possible to design virtual memory protocols that are easier to program without significantly slowing overall performance. This opens the door to GPU-accelerated systems that can compute faster than those with only traditional central processing units.

Accelerating during a slow-down

Improvements to virtual memory and GPU performance are a few examples of places where cross-layer thinking has improved computer hardware performance, but they are also emblematic of a wholesale transformation in the way researchers are thinking about computer architecture in the early 21st century.

Hill led the creation of a white paper, authored by dozens of top U.S. computer scientists, that outlined some of the paradigm-shifts facing computing.

"The 21st century is going to be different from the 20th century in several ways," Hill explained. "In the 20th century, we focused on a generic computer. That's not appropriate anymore. You definitely have to consider where that computer sits. Is it in a piece of smart dust? Is it in your cellphone, or in your laptop or in the cloud? There are different constraints."

Among the other key findings of the report: a shift in focus from the single computer to the network or datacenter; the growing importance of communications in today's workflows, especially relating to Big Data; the growth of energy consumption as a first-order concern in chip and computer design; and the emergence of new, unpredictable technologies that could prove disruptive.

These disruptive technologies are still decades away, however. In the meantime, it's up to computer scientists to rethink what can be done to optimize existing hardware and software. For Hill, this effort is akin to detective work, where the speed of a process serves as a clue to what's happening underneath the cover of a laptop.

"It's all about problem solving," Hill said. "People focus on the end of it, which is like finishing the puzzle, but really it's the creative part of defining what the puzzle is. Then it's the satisfaction that you have created something new, something that has never existed before. It may be a small thing that's not well known to everybody, but you know it's new and I just find great satisfaction in that."


NSF has been crucial in supporting Mark D. Hill's research throughout his career. For more than 26 years, he has been the recipient of 19 NSF grants, which supported not only Hill and his collaborators but also three dozen Ph.D. students from his group, who themselves have trained more than 100 scientists. Hear his distinguished lecture at NSF from December 2013.

--  Aaron Dubrow, NSF (512) 820-5785 adubrow@nsf.gov

Investigators
Mark Hill
David Wood
James Larus
Gurindar Sohi
Michael Swift

Related Institutions/Organizations
University of Wisconsin-Madison

Madison, Wisconsin

Related Awards
#8957278 Presidential Young Investigator Award: Cache Memory Design
#9971256 Experimental Partnerships: Multifacet: Exploiting Prediction and Speculation in Multiprocessor Memory Systems
#0551401 CRI: A MASSIV Cluster for Designing Chip Multiprocessors
#9225097 Cooperative Shared Memory and the Wisconsin Wind Tunnel
#1218323 SHF: Small: Energy-Optimized Memory Hierarchies
#0916725 SHF: Small: Managing Non-Determinism in Multithreaded Software and Hardware
#0720565 CSR---AES: Deconstructing Transactional Memory: System Support for Robust Concurrent Programming
#8902536 The Design of Secondary Caches

Years Research Conducted
1989 - 2014

Related Websites
Mark Hill's home page: http://pages.cs.wisc.edu/~markhill/includes/publications.html
21st Century Computer Architecture: A Community White Paper: http://www.cra.org/ccc/files/docs/init/21stcenturyarchitecturewhitepaper.pdf

Early Career Scientists and Engineers receive highest honor from the White House

Press Release 14-055

Twenty NSF-funded scientists recognized for strengthening America's scientific enterprise

[Image: President Obama talks with PECASE recipients at the White House.]

April 16, 2014

On Monday, 102 men and women received the United States government's highest honor for scientists and engineers in the early stages of their independent research careers--the Presidential Early Career Award for Scientists and Engineers (PECASE). The National Science Foundation (NSF) nominated 20 of the awardees.

They received their awards from NSF Director France Córdova at a morning ceremony presided over by John P. Holdren, assistant to the president for science and technology and director of the Office of Science and Technology Policy.

The awardees come from universities around the country and excel in research in a variety of scientific disciplines: biology, computer and information sciences, education and human resources, geosciences, the physical sciences including mathematics, chemistry, physics and materials research, engineering and social, behavioral and economic sciences.

"The PECASE awardees embody America's high priority of producing outstanding scientists and engineers to advance the nation's goals, tackle grand challenges and contribute to all sectors of the economy," said NSF Director France Córdova. "I salute the creativity, ingenuity, dedication and generosity of this year's PECASE winners. I commend them for shouldering the responsibilities to inspire and mentor the next generation and to help America sustain global leadership in science and technology."

All NSF PECASE awardees are drawn from a pool of individuals whose research has been vetted by NSF's rigorous peer-review process and who have received five-year grants from the Faculty Early Career Development (CAREER) Program. CAREER awardees have proven themselves exemplary at integrating research and education within the context of their organization's mission, and selection is highly competitive: 20 percent, or 525, of the nearly 2,600 CAREER award applicants were funded in 2012. Of those, just shy of 4 percent were PECASE winners, the cream of the crop.
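The selection arithmetic above can be checked directly (figures are those quoted in this release; the rounding is mine):

```python
# Sanity check of the selection statistics quoted in the release.
applicants = 2600     # "nearly 2,600 CAREER award applicants"
career_awards = 525   # CAREER awards funded in 2012
pecase = 20           # NSF's PECASE awardees drawn from that pool

print(round(100 * career_awards / applicants, 1))  # -> 20.2 ("20 percent")
print(round(100 * pecase / career_awards, 1))      # -> 3.8 ("just shy of 4 percent")
```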

The Office of Science and Technology Policy within the Executive Office of the President coordinated the awards, which were established by President Clinton in February 1996. Awardees are selected on the basis of two criteria: pursuit of innovative research at the frontiers of science and technology and a commitment to community service as demonstrated through scientific leadership, public education or community outreach.

This year's NSF recipients are:

Theodor Agapie, California Institute of Technology

For the creative design of model complexes for active sites that catalyze the formation and cleavage of O-O bonds relevant to alternative-energy and sustainability; exemplary teaching, mentoring and outreach activities; and a commitment to increase participation of underrepresented groups in science.

Javier Arce-Nazario, University of Puerto Rico, Cayey

For ambitious interdisciplinary research advancing the frontiers of sustainability science related to water issues in extreme climate conditions, and for direct involvement in bringing science into the homes of Puerto Rican people in rural areas of the La Plata watershed.

Sarah Bergbreiter, University of Maryland, College Park

For designing and fabricating ant-scale microrobots with the ability to navigate on rough terrain in order to deepen our understanding of insect biomechanics and locomotion, and for work to increase participation in this field of research.

Moises A. Carreon, Colorado School of Mines

For fundamental research aimed at developing a novel family of crystalline porous membranes for carbon dioxide capture, and for international collaboration and minority student recruitment.

Sigrid Close, Stanford University

For profound discoveries related to the effects of meteoroid impacts on the atmosphere and spacecraft, and for extraordinary informal science education and outreach efforts to bring space science to K-12 students and to the general public.

Raffaella De Vita, Virginia Polytechnic Institute & State University

For outstanding research into female pelvic floor disorders, grounded in engineering knowledge of the structural and mechanical properties of the associated supporting tissues.

Abigail Doyle, Princeton University

For the development of innovative approaches to incorporating fluorine into organic compounds, impacting synthetic and medicinal chemistry and chemical biology, and for mentoring young scientists, broadening participation in STEM disciplines and engaging middle-school teachers and community college students.

Daniel I. Goldman, Georgia Institute of Technology

For dynamic, multi-disciplinary studies of the neuromechanics of locomotion on granular substrates, and for the creative use of robots to generate interest in and teach the principles of science to students, teachers and the public.

Joel Griffitts, Brigham Young University

For creative experimental approaches to studying the molecular negotiations between nitrogen-fixing bacteria and their plant hosts that allow productive symbiotic associations to arise, and for the Symbiosis Learning Consortium that brings undergraduates and high-school students into the research effort as key participants.

Samantha Hansen, University of Alabama

For innovative research that will provide critical constraints on the geodynamic evolution of the Antarctic continent as well as information to better constrain evolution of the Antarctic ice sheets, and for developing novel approaches to introduce underrepresented students to the geosciences.

Jeffrey D. Karpicke, Purdue University

For innovative contributions at the intersection of cognitive science and education to advance our understanding of learning and memory, for applying those insights to practical challenges in science classrooms and for ensuring that classroom teachers learn from these scientific findings. (Also nominated by the Department of Education.)

Rouslan Krechetnikov, University of California, Santa Barbara

For outstanding work combining applied mathematics, analytical mechanics and challenging experimental and theoretical fluid mechanics, including geophysics, micro-hydrodynamics and physics of complex interfaces.

Tamara J. Moore, Purdue University

For transformative research on how young students learn engineering concepts and how to integrate those practices into K-12 teacher development in order to have a transformative impact on underrepresented minority and underprivileged urban K-12 students.

Daniela A. Oliveira, Bowdoin College

For pioneering disruptive approaches to cybersecurity that are enabling transformative solutions to serious financial and economic threats, and for novel outreach activities that successfully encourage the participation of underrepresented groups in cybersecurity-related research issues.

Jonathan W. Pillow, The University of Texas at Austin

For foundational advances in probabilistic methods for understanding how populations of neurons encode and process information, and for leadership in education and broadening participation in computational neuroscience and related fields.

Benjamin Recht, University of California, Berkeley

For visionary research on scalable computational tools for large-scale data analysis and machine learning, and for initiating paradigm shifts in several related disciplines with enormous scientific and societal impact.

David Savitt, University of Arizona

For work on the p-adic Langlands program and generalizations of Serre's modularity conjecture, and for educational activities including organizing the high-school summer program Canada/USA Mathcamp, running graduate workshops, and helping students from underrepresented groups reach their full potential.

Noah Snavely, Cornell University

For innovative research in developing new computer-vision algorithms for scalable 3-D reconstruction; camera location estimation from diverse unknown cameras; and innovations in STEM education.

Junqiao Wu, University of California, Berkeley

For leading-edge research on nanomaterials with phase transitions, and for creating a comprehensive program to educate students and the general public about science and nanotechnology.

Ahmet Yildiz, University of California, Berkeley

For developing state-of-the-art single-molecule approaches for visualization and quantitation of the behavior of molecular motors responsible for cellular traffic, and for outstanding outreach providing innovative educational and research activities to underrepresented groups at local charter high schools.

Note to regional reporters: For more information about, or interviews with, local winners of the Presidential Early Career Award for Scientists and Engineers, please contact the awardees' home institution or agency.


Media Contacts
Lisa-Joy Zgorski, NSF, (703) 292-8311, lisajoy@nsf.gov

Program Contacts
Mayra N. Montrose, NSF, (703) 292-4757, mmontros@nsf.gov

Related Websites
CAREER and PECASE Information: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=503214
Presidential Early Career Awards for Scientists and Engineers: http://nsf.gov/awards/pecase.jsp
