Archive for Materials Genome Initiative
The launch of the DOE’s $120 million Critical Materials Institute (née “Hub”), the fifth energy innovation-oriented “integrated research center” initiated by the Obama administration, strikes me as mainly a balanced approach to the complex problem of securing for the United States adequate supplies of the unique raw materials that are crucial to clean energy, electronics, and other advanced applications.
By “balanced,” I mean an approach that appreciates that many dynamic forces affect the certainty of supplies of, for example, rare earth elements. Besides science and engineering issues, these forces include geology, environmental considerations and mining technology, geopolitics, economics and business considerations, and educational planning. In other words, it is an approach that understands that neither bashing China nor “mine, baby, mine” is, by itself, a realistic long-term strategy.
Although the official announcement of the creation of the five-year CMI project was made last week, the concept of such a hub goes back many months and, as Eileen reported last March (see “DOE to hold Critical Materials Energy Innovation Hub workshop”), the DOE has been actively engaging the science and business communities for some time to discuss missions for such a hub and who should best lead it.
As it turned out, the DOE feels its Ames Laboratory is in the best position to lead what is meant to be a broad collaborative effort that encompasses federal and private labs, universities, and private industry.
In describing this hub, the DOE sketches out four “focus areas” or missions: develop substitutes, improve reuse and recycling, conduct crosscutting research, and diversify supply. As the administration’s point man on the Materials Genome Initiative, Cyrus Wadia, puts it, the purpose of the CMI is “to find solutions that can be applied at all stages of a material’s ‘life cycle’—from new ways to access it at the source, to better ways to recycle and reuse it after it has served its primary functions.”
In the above video, Alex King, director of the Ames Lab and designated leader of the CMI effort, describes one of the goals slightly differently. Besides diversifying supplies and developing substitute materials and tools for recycling, King says the fourth goal is “forecasting.” He describes forecasting as “trying to figure out what materials might become critical in the future” and acknowledges that it is “important, but a little bit different for a national lab.”
The devil, of course, is in the details. For example, in their article in the April 2012 Bulletin (“Issues of Scarce Materials in the United States”), Stephen Freiman and Lynette Madsen laid out the most comprehensive critical materials agenda I’ve seen, and it’s not clear (yet) how well the 17 goals they delineated match up with the directions charted for the CMI. Some, such as Freiman and Madsen’s call to “continue trade agreements with foreign sources to minimize supply disruption risk,” are appropriately outside the scope of CMI. Others, such as their suggestions to “support efforts to collect data on sources of critical elements” and “improve [acquisition] data exchange and analysis mechanisms,” will be addressed by CMI participants, one hopes.
But, given that a national critical materials strategy also must encompass trade and policy issues, it’s only fair that CMI be seen as one crucial part of such a strategy, not a panacea for US interests.
Unfortunately, the trade and policy issues seem to be on the back burner while other political battles are being fought.
Another vital issue Freiman and Madsen raise is getting too little attention: workforce development, namely the lack of trained mining and mineral-processing personnel, of specialists in geology and the other geosciences, and of scientists and engineers in related fields who are trained in sustainability, beneficiation, processing, and recycling. Even if not directly, I think the CMI will provide some new impetus for progress on this front.
In summary, the new CMI seems to be a multidimensional step in the right direction, but the nation’s larger critical materials problem cannot be solved by technocracy alone. I don’t think the White House or Congress is oblivious to this, and interagency task forces have been meeting on the subject. The nation needs the CMI to be an inspiring starting point, not an orphan in US strategic materials plans.
Here are some other Ames Lab videos about critical materials research. Check ’em out:
Their two-dimensionality and structural flatness make graphene films ideal candidates for thin-film devices and for combination with other semiconductor materials. In this work, vertical light-emitting diodes (VLEDs) with a highly reflective membrane as a current-blocking layer and a graphene transparent conductive layer have been fabricated and characterized. The VLEDs show improved optical output and reduced efficiency droop thanks to the current-spreading effect of the hybrid electrode, which prevents current crowding under the top electrode and increases the internal and external quantum efficiency.
[Materials Views] By showing that tiny particles injected into a liquid crystal medium adhere to existing mathematical theorems, physicists at the University of Colorado Boulder have opened the door for the creation of a host of new materials with properties that do not exist in nature. The findings show that researchers can create a “recipe book” of sorts to build new materials using topology, a major mathematical field that describes the properties that do not change when an object is stretched, bent or otherwise “continuously deformed.”

Published in the journal Nature, the study also is the first to show experimentally that some of the most important topological theorems hold up in the real material world, said study senior author Ivan Smalyukh of the CU-Boulder physics department. Once injected into a liquid crystal, the particles behaved as predicted by topology. “Our study shows that interaction between particles and molecular alignment in liquid crystals follows the predictions of topological theorems, making it possible to use these theorems in designing new composite materials with unique properties that cannot be encountered in nature or synthesized by chemists. These findings lay the groundwork for new applications in experimental studies of low-dimensional topology, with important potential ramifications for many branches of science and technology.”

The research supports the goals laid out by the White House’s Materials Genome Initiative, Smalyukh said, which seeks to deploy “new advanced materials at least twice as fast as possible today, at a fraction of the cost.”
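The news item doesn’t name the specific theorems that were tested, but the Gauss–Bonnet theorem is a classic example of the kind of constraint at play; as a hedged illustration (my gloss, not drawn from the paper itself):

```latex
% Gauss–Bonnet theorem: for a closed surface S with Gaussian curvature K
% and Euler characteristic \chi(S), the total curvature is fixed by
% topology alone:
\[
  \int_S K \, dA = 2\pi\,\chi(S)
\]
% \chi = 2 for a sphere and 0 for a torus; no amount of stretching or
% bending changes the integral. Constraints of this kind are why the
% defects a particle induces in the surrounding liquid crystal depend on
% the particle's topology rather than on its detailed shape.
```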
Like a fine wine or an aged cheese, ultrastable glass takes a long time to make, needs special conditions and is considered quite valuable. Unfortunately, manufacturers who want to take advantage of the strengths of ultrastable glass don’t have the luxury of waiting hundreds of years for it to develop. While exploring ways to create this valuable material on a shorter timetable, researchers from the University of Wisconsin-Madison have gained some key insights into the bizarre structure of glasses and how that structure affects their properties. “In attempts to work with aged glasses, for example, people have examined amber,” says study coauthor Juan de Pablo, a University of Chicago professor of molecular theory. “Amber is a glass that has been aged millions of years, but you cannot engineer that material. You get what you get.” In many laboratories, scientists use a technique called vapor deposition to create specialized materials. Previous research by another of the new study’s coauthors, Mark Ediger, found that glasses grown in this manner (within a certain temperature range and on a specially prepared surface) are far more stable than ordinary glasses. Ediger determined that, to achieve this degree of stability, molecules in the glass must be arranged in a tightly packed manner, like the multi-shaped objects in the popular video game Tetris.
Researchers are aiming to develop a new class of materials with remarkable properties using one-atom-thick substances such as graphene in a new collaborative project. The proposal, which will involve researchers from the Universities of Manchester, Cambridge, and Lancaster, has been awarded €13.4 million by the European Research Council to form a “Synergy Group.” It will aim to use two-dimensional substances, such as graphene, to engineer new types of materials that are just a few atoms thick but nevertheless have the power to revolutionize the future development of devices such as solar cells and flexible, transparent electronics. Starting with one-atom-thick substances, which possess remarkable properties, the group will focus on ways in which they can be layered up to form “heterostructures.” These heterostructures will still be just a few atoms thick but will combine the properties of the different two-dimensional materials that comprise them, effectively enabling developers to embed the functions of a device into its very fabric. For example, the research team envisages combining an atomic layer that functions as a sensor with layers that function variously as an amplifier, a transistor, or a solar cell for power generation. The resulting material, still just a few atomic layers thick, would be capable of running a whole circuit.
Batteries for Norfolk Southern Railway No. 999, just like automotive batteries, are rechargeable until they eventually die. A leading cause of damage and death in lead-acid batteries is sulfation, a degradation of the battery caused by frequent charging and discharging that creates an accumulation of lead sulfate. In a recent study, the researchers looked for ways to improve regular battery management practices. The methods had to be nondestructive, simple, and cheap, using as few sensors, electronics, and supporting hardware as possible while still remaining effective at identifying and decreasing sulfation. “We wanted to reverse the sulfation to rejuvenate the battery and bring it back to life,” says Christopher Rahn, professor of mechanical engineering at Penn State. Rahn, along with mechanical engineering research assistants Ying Shi and Christopher Ferone, cycled a lead-acid battery for three months in the same way it would be used in a locomotive. They used electrochemical impedance spectroscopy and full charge/discharge cycles to identify the main aging mechanisms. Through this, the researchers identified sulfation in one of the six battery cells. They then designed a charging algorithm that could charge the battery and reduce sulfation but was also able to stop charging before other forms of degradation occurred. The algorithm successfully revived the dead cell and increased the overall capacity.
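The story describes, but does not publish, the team’s charging algorithm. Below is a minimal sketch of that kind of control logic; the callback interface, thresholds, and currents are illustrative assumptions, not the Penn State implementation:

```python
# Hypothetical sketch of a sulfation-aware charging loop: charge harder
# while impedance suggests lead-sulfate buildup, and stop before the cell
# reaches the gassing region where other degradation modes begin.
# All names and numbers here are assumptions for illustration only.

def desulfation_charge(read_voltage, read_impedance, apply_current,
                       v_gassing=2.40,    # volts/cell: stop before gassing
                       z_sulfated=0.012,  # ohms: above this suggests sulfation
                       i_bulk=10.0,       # amps: stronger charge while desulfating
                       i_float=1.0,       # amps: gentle float charge otherwise
                       max_steps=10_000):
    """Charge a lead-acid cell while monitoring impedance, halting before
    the voltage reaches the gassing threshold."""
    for _ in range(max_steps):
        if read_voltage() >= v_gassing:
            apply_current(0.0)                     # halt before gassing damage
            return read_impedance() <= z_sulfated  # True if sulfation reduced
        # Higher current while high impedance indicates lead-sulfate buildup.
        apply_current(i_bulk if read_impedance() > z_sulfated else i_float)
    apply_current(0.0)
    return False
```

In the real system, the impedance estimate would come from impedance-spectroscopy measurements rather than a single scalar callback.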
Modern information processing allows for breathtaking switching rates of about 100 billion cycles per second. New results from Ferenc Krausz’s Laboratory for Attosecond Physics of the Max Planck Institute of Quantum Optics, Garching, Germany, and Ludwig-Maximilians-Universität, Munich, could pave the way toward signal processing several orders of magnitude faster. In two groundbreaking complementary experiments, a collaboration led by LAP physicists has demonstrated that, under certain conditions, ultrashort light pulses of extremely high intensity can induce electric currents in otherwise insulating dielectric materials. Furthermore, they provided evidence that the fast oscillations of the electric field instantly alter the electrical and optical properties of the material, and that these changes can be reversed on a femtosecond time scale. This opens the door for signal processing rates reaching the petahertz domain, about 10,000 times faster than is possible with the best state-of-the-art solid-state microchips. The experiments were carried out in close cooperation with the theoretical group of Mark Stockman, Georgia State University.
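For scale, the two quoted figures are consistent with each other:

```latex
% ~100 billion cycles per second, sped up by four orders of magnitude,
% lands in the petahertz regime:
\[
  10^{11}\,\mathrm{Hz} \times 10^{4} = 10^{15}\,\mathrm{Hz} = 1\,\mathrm{PHz}
\]
```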
Although nanoparticles with exquisite properties have been synthesized for a variety of applications, their incorporation into functional devices is challenging owing to the difficulty of positioning them at specified sites on surfaces. To develop a materials-general method for synthesizing nanoparticles on surfaces for broader applications, a mechanistic understanding of polymer-mediated nanoparticle formation is crucial. A Northwestern University group, led by Chad A. Mirkin, has designed a four-step synthetic process that enables independent study of the two most critical steps for synthesizing single nanoparticles on surfaces: phase separation of precursors and particle formation. Using this process, they have elucidated the importance of the polymer matrix in the diffusion of metal precursors to form a single nanoparticle, as well as the three pathways that the precursors follow to form nanoparticles. Based on this mechanistic understanding, the synthetic process was generalized to create metal (Au, Ag, Pt, and Pd), metal oxide (Fe2O3, Co2O3, NiO, and CuO), and alloy (AuAg) nanoparticles. This mechanistic understanding and the resulting process represent a major advance in scanning probe lithography as a tool to generate patterns of tailored nanoparticles for integration with solid-state devices.
The stated goal of the Materials Genome Initiative (pdf) is “to double the speed at which we discover, develop and manufacture new materials.” The goals are clear, but how to tackle them is challenging. MGI will draw on the concerted efforts of academia, manufacturers, federal funding agencies and national labs. Meanwhile, each of those constituencies must remain true to its own mission, and there are often dependencies between them, for example, between academia and federal funding agencies.
Also, as MGI’s White House point man, Cyrus Wadia, explained in this interview with us last year, the idea is for MGI to evolve in a grassroots manner, not in a bureaucratic, top-down way. Since it was announced in June 2011, the MGI has transitioned from a twinkle in the eye of the White House OSTP to a multi-agency initiative taking its first toddling steps. And, indeed, with the first drops from the funding tap flowing from diverse funding agencies, it looks like the concept is working.
Even so, getting the materials science community’s collective arms around MGI is not so easy. NIST, as the nation’s data and standards expert, is naturally positioned to take a leadership role in defining the issues and guiding the development of a “materials innovation infrastructure.”
NIST embraced that challenge/opportunity and last May convened a workshop—”Building the Materials Innovation Infrastructure: Data and Standards”—to help define the “cross-cutting and domain-specific data challenges” that need to be overcome.
The workshop convened 125 stakeholders from academia, federal agencies, national labs, industry and professional societies. Most, although not all, participants came from United States organizations. In November, the agency released a report summarizing the outcome of the exercise to evaluate the status of the Materials Innovation Infrastructure and to identify gaps and opportunities. The report also outlines the process the group used to attack the issue.
Jim Warren, NIST scientist and lead author of the report, explained in a phone interview that the MII is “about lowering the barrier to entry for manufacturers” to accelerate materials innovation. He says some disciplines are way ahead in developing a data infrastructure, for example, with regard to data sharing and quality, computer codes, metrics, etc. “We want to harvest the best practices and use them where it makes sense for materials science,” he says.
Warren likens the coalescence of the MII to the birth and growth of the Internet “superhighway.” He says, “We are all willing to pay a small cost for access to the Internet, which makes our life better. The MII imagines something similar for materials data.”
Another established infrastructure the MII can be compared to is the US road and highway system. Some roads are owned at the federal level, some at the state or local level, and others are private. Envision the emerging MII as having a similar mix of owners, access points, etc.
At the workshop, the participants were charged with assessing data infrastructure in four areas: data representation and interoperability, data management, data quality and data usability. To provide a framework for the discussion, participants considered the four areas in the context of two broad topics: length scale challenges and technical applications.
Length scale challenges fall into two categories: challenges relating to the mathematics of the scale and crossing regimes, and challenges relating to the computing power needed to perform the calculations.
For example, phase field methods are used to model microstructure development in the nanometer to micron range. However, microstructure development arguably can be modeled also at the crystal lattice scale with approaches such as density functional theory. Finding the mathematics that transitions between them is something like finding a clutch that can shift between first gear and fourth gear.
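For concreteness, phase-field models evolve a continuum order parameter rather than tracking individual atoms. A standard textbook form (shown for illustration; not taken from the workshop report) is the Allen–Cahn equation:

```latex
% Allen–Cahn phase-field equation. \phi(x,t) is an order parameter that
% distinguishes phases, M is a mobility, \kappa a gradient-energy
% coefficient, and f a double-well free-energy density:
\[
  \frac{\partial \phi}{\partial t}
    = -M\,\frac{\delta F}{\delta \phi}
    = M\!\left(\kappa\,\nabla^{2}\phi - f'(\phi)\right),
  \qquad
  F[\phi] = \int \!\left(\tfrac{\kappa}{2}\,\lvert\nabla\phi\rvert^{2}
            + f(\phi)\right) dV
\]
```

Density functional theory, by contrast, works at the scale of electrons and atoms; the “clutch” problem is passing quantities such as free energies and mobilities consistently from one description to the other.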
Also, it does not take long to peg the available computing power, according to Warren. Say, for example, you want to model one millimeter of a solidification interface. By modeling conditions every 10 angstroms, normal to and along the interface, the computation very quickly generates terabytes of data.
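A back-of-the-envelope check (with assumed, illustrative parameters) shows how fast the numbers grow:

```python
# Back-of-the-envelope check of the example above, assuming a 1 mm x 1 mm
# patch of interface resolved every 10 angstroms, storing one
# double-precision value per grid cell per snapshot.
mm = 1e-3          # meters
spacing = 10e-10   # 10 angstroms, in meters
points_per_side = mm / spacing       # 1e6 grid points along each direction
cells = points_per_side ** 2         # 1e12 cells for the 2D patch
bytes_per_cell = 8                   # one 64-bit float per field value
terabytes = cells * bytes_per_cell / 1e12
print(f"{cells:.0e} cells -> {terabytes:.0f} TB per field per snapshot")
# prints: 1e+12 cells -> 8 TB per field per snapshot (before time history)
```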
The workshop participants divided length scales into different regimes: macro, micro, nano and molecular lengths, and atomic lengths. Within each of these, challenges were prioritized and categorized as short-term or long-term, based on whether their impacts could make a difference in less than five years or more than five years. For example, in the area of data representation and interoperability, participants identified the “definition of data or metadata for particular applications with standards for software to facilitate linkage,” as both a high priority and something achievable in the short term.
The organizers of the workshop also recognized that new materials developed under the MGI construct will be developed with specific applications in mind. Thus, they considered four possible “technical application areas”: electrochemical storage, high-temperature alloys, catalysis and lightweight structural materials. They selected these TAAs because, broadly speaking, they represent areas that are well positioned to adopt an MGI approach. Warren notes, however, that the four TAAs are only representative examples, and workshop organizers and participants acknowledge that many other applications could have been considered instead.
In both contexts, the key metric considered was the amount of time that could be saved if the data challenges were eliminated; cost issues were not addressed directly.
Finally, the report calls out cross-cutting challenges that affect everyone involved in materials design, such as community leadership, data sharing, computational validation, etc.
According to Warren, MGI leaders are optimistic that the initiative will begin to yield benefits as early as 2013, with the rollout of prototype solutions and virtual communities. Contact Warren via email to learn more about MGI and related activities.
We have written numerous times in the past about the Materials Genome Initiative, which was launched in June 2011. Just to review, the MGI sets the goal of cutting in half the current time and cost of bringing new materials from the lab to the marketplace. At the federal level, the MGI is viewed as an interagency effort that aims to accelerate the discovery-to-deployment timeline by creating a materials innovation infrastructure that more closely integrates experimental tools, computational tools and digital data.
In support of MGI, this year NSF launched its “Designing Materials to Revolutionize and Engineer our Future” (DMREF) program to fund proposals that go beyond simple, more traditional collaborations among theorists, computational experts and experimentalists. Instead, DMREF focuses on iterative processes in which data drive theory and simulation, and theory and simulation drive experiments.
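Schematically, that loop can be sketched in a few lines of code. Everything below (the toy surrogate model, the stand-in “experiment,” and the selection rule) is an illustrative assumption, not an NSF-prescribed workflow:

```python
# A schematic sketch of the iterative loop DMREF describes: accumulated
# data drive a (toy) simulation, and the simulation chooses the next
# experiment, whose result feeds back into the data.
import random

def simulate(candidate, data):
    """Toy 'theory/simulation': nearest-neighbor prediction from past data."""
    if not data:
        return 0.0
    nearest = min(data, key=lambda d: abs(d[0] - candidate))
    return nearest[1]

def experiment(candidate):
    """Stand-in for a real measurement: a noisy hidden property curve."""
    return -(candidate - 0.7) ** 2 + random.gauss(0, 0.01)

data = []  # accumulated (composition, measured property) pairs
for _ in range(20):
    pool = [random.random() for _ in range(50)]        # candidate compositions
    best = max(pool, key=lambda c: simulate(c, data))  # simulation picks one
    data.append((best, experiment(best)))              # experiment feeds back
print(max(data, key=lambda d: d[1]))  # best composition found so far
```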
Earlier this year (for FY 2012), NSF organized the first DMREF competition, during which the agency reviewed approximately 140 discrete projects. Ultimately, the agency announced Oct. 11 that it is issuing awards for 22 projects (many of them collaborative) that fund 14 distinct efforts: seven based in the Directorate for Mathematical and Physical Sciences and seven based in the Directorate for Engineering, at a total investment in excess of $12 million.
The good news is that support for DMREF is continuing. NSF announced today that it is holding a similar competition for FY 2013 awards. Although the amount of available funding isn’t certain, the president’s 2013 budget request to Congress included $35 million for DMREF. Meanwhile, NSF has posted a “Dear Colleague Letter” detailing its interests and the process for submitting FY 2013 proposals. The letter notes that NSF seeks research
“…to advance fundamental understanding of materials across length and time scales to elucidate the effects of microstructure, surfaces and coatings on the properties and performance of engineering materials. The ultimate goal is to control material properties through design via the establishment of computational interrelationships between composition, processing, structure, properties, performance and process control, validated and verified through measurement and experimentation. This requires new data analytic tools and statistical algorithms; advances in predictive modeling that leverage machine learning, data mining and sparse approximation; data infrastructure that is accessible, extensible, scalable, and sustainable; and new collaborative capabilities for managing large, complex, heterogeneous, distributed data supporting materials design, synthesis, and longitudinal study. …the proposed research must be a collaborative and iterative process where computation guides experiments and theory, while experiments and theory advance computation. The proposal should provide a plan for enhanced data management that ensures transparency, data-sharing and open source software.”
Regarding collaboration, the agency says that proposals are expected to be both collaborative and iterative; as such, some divisions are particularly interested in receiving proposals from small groups or teams of faculty.
Several additional divisions say they intend to participate in the FY 2013 DMREF awards. The full list of divisions now includes: Chemistry; Materials Research; Chemical, Bioengineering, Environmental and Transport Systems; Civil, Mechanical and Manufacturing Innovation; Electrical, Communications and Cyber Systems; and all divisions in the Directorate for Computer & Information Science & Engineering.
The window for submitting FY 2013 DMREF proposals is Jan. 15–Feb. 15, 2013.
NSF strongly encourages researchers interested in submitting proposals to first contact a program officer at the appropriate division (listed in the DCL). Likewise, final proposals should be submitted to individual programs, where they will be reviewed separately by each division or co-reviewed when appropriate.
Caution: Submitters should adhere to the guidelines listed in the latest version of NSF’s Grant Proposal Guide, which goes into effect on Jan. 14, 2013.
DMREF is part of the OneNSF investment in Cyber-Enabled Materials, Manufacturing, and Smart Systems, which plays a key role in NSF’s growing portfolio of advanced manufacturing investments.
With the election behind it, the Obama administration appears to be quickly returning to a major emphasis on energy and “materials genome” related research.
First, according to a new story coauthored by Cyrus Wadia, the Office of Science and Technology Policy’s point person for the Materials Genome Initiative, $25 million worth of grants recently announced by NSF and DOE will directly support the initiative and are “a significant milestone” for the project.
Wadia, and coauthor Meredith Drosback, a TMS fellow at OSTP, highlight seven particular MGI-related projects that received the new awards:
• A new Lawrence Berkeley National Laboratory/MIT software center focused on computer simulations to rapidly prototype lithium ion battery electrolyte candidates;
• A University of Washington/GM collaboration to model thermoelectric materials to add efficiencies to next-gen auto engines;
• A University of Michigan center to create a suite of software tools to predict the behavior of magnesium alloys in lightweight vehicles;
• A University of Minnesota center to develop computer algorithms for the design of porous materials aimed at delivering advanced utility-scale carbon capture and sequestration technologies;
• A collaboration between the Universities of Pennsylvania and Delaware to create models to predict and assemble efficient and low-cost solar energy biomaterials;
• A project by researchers from University of Virginia and University of Alabama Tuscaloosa that will model, synthesize, and test new materials to be incorporated into circuits for faster computer memory;
• A research network between the University of Illinois and Oak Ridge, Sandia, Argonne and Lawrence Livermore National Labs to develop computer code to better predict the behavior and performance of catalysts, semiconductors and related materials.
We will be working on providing more links to these projects. In the meantime, we are also working on a related story for next week about the new report on the results of a NIST workshop held in May, “Building the Materials Innovation Infrastructure: Data and Standards.”
Also, the administration announced today that it is investing $120 million over the next five years in a new Joint Center for Energy Storage Research that will be led by Argonne National Lab. JCESR also will involve several universities, other federal labs and a handful of private sector partners. Participants include Lawrence Berkeley National Lab, Pacific Northwest National Lab, Sandia National Labs, SLAC National Accelerator Lab, Northwestern University, University of Chicago, University of Illinois-Chicago, University of Illinois-Urbana Champaign, University of Michigan, Dow Chemical, Applied Materials, Johnson Controls and Clean Energy Trust.
According to a DOE news release, “advancements in batteries and energy storage technology are essential for continued efforts to develop a fundamentally new energy economy with decisively reduced dependence on imported oil. Improved storage will be vital to fully integrating intermittent renewable energy sources such as wind and solar into the electrical grid. It will also be critical to transitioning the transportation sector to more flexible grid power.” The hub, formerly known as the Batteries and Energy Storage Hub, is supposed to address:
• Efficacy of materials architectures and structure in energy storage;
• Charge transfer and transport;
• Multi-scale modeling; and
• Probes of energy storage chemistry and physics at all time and length scales.
JCESR is DOE’s fourth Energy Innovation Hub. DOE is planning a fifth hub dedicated to “critical materials” research. The agency is still accepting and evaluating proposals for this hub.