Beyond the obvious lightweighting advantages, the thing that makes composites so appealing to the aerospace market is the design flexibility that permits engineers to tailor laminates to a wide range of performance criteria. But that same flexibility makes it difficult to predict part performance. To meet that challenge, airframers have developed a robust “building block” approach to physical structural testing. It works, but it’s expensive and time-consuming, and it typically results in overly conservative structural design, especially as increasingly complex composite structures proliferate in aerospace applications.
In response, aircraft engineers are moving from the test fixture to the computer screen. Advancements in software-based nonlinear finite element analysis (FEA) methods and, especially, the introduction of multiscale, composites-specific analysis tools have prompted dramatic growth in FE-based simulation of in-service part behavior — virtual testing — that can augment and, in many cases, offset physical testing, reducing the time and cost of development.
But what exactly is virtual testing? Jocelyn Gaudin, structure analysis R&T manager, Airbus (Toulouse, France), and Michel Mahe, the company’s head of Advanced Numerical Simulation, provide this definition: “Virtual testing is usually described as the capability to provide, by simulation, a blind prediction of a real-world structure’s physical behavior. The prediction is expected to provide the structure’s strength value in order to ensure proper sizing against in-service conditions. It should also demonstrate the capability to describe in-depth local effects, progressive material damage up to localized material failure, and all the cascading effects up to final structural collapse.”
Simply put, it’s a tall order, one that the composites industry and academia have been working on for many years. “We’ve been building aircraft out of aluminum for decades, and engineers have gotten to a point where they’re very confident in simulating metal structures,” explains Jerad Stack, CEO of software supplier Firehole Composites (Laramie, Wyo.). “It’s only been in the last few years that we’ve been able to transition and gain confidence in simulating composites as well.”
“Simulation of composites is being proven by developing the modeling methods that more closely correlate with the existing mechanical testing data,” explains Robert Yancey, senior director of global aerospace and marine at Altair Engineering Inc. (Troy, Mich.), makers of the trademarked HyperWorks simulation software suite. “As simulation technology and methods get closer to replicating existing test data, there will be greater confidence in the ability to use it in the future to reduce the amount of mechanical testing or at least help target the best material samples to be tested.”
Simulation impacts all levels of the testing pyramid and product development, says Kyle Indermuehle, technical lead, aerospace, for SIMULIA products at Dassault Systèmes (Providence, R.I.), the U.S. source for trademarked Abaqus unified FEA software. At the coupon level, simulation is used to quickly evaluate various materials, layup possibilities, and other attributes. At the structural detail and subcomponent level, it is used to fully understand the performance and the damage characteristics and to gain confidence that the part or vehicle will survive operational loads. And during finished structure and primary component testing, simulation is employed to more accurately assess the actual strength of aircraft test structures to improve confidence in the expensive, large-scale structural tests that are required prior to certification. These tools also are helping aircraft manufacturers better understand the causes of structural failure.
If these tools can be validated, then simulation models can be used to virtually test a host of load conditions and scenarios. Validation is now at the top of the to-do list. In its Space Technology Roadmap for Materials, Structures, Mechanical Setup, and Manufacturing (released April 2012), NASA called for development of high-fidelity analytical tools, failure prediction capabilities from both deterministic and probabilistic standpoints, and tool verification with test data to create a “new paradigm for developing and qualifying safe structures ... without undue conservatism that causes mass penalties, and in a ... cost-effective manner.” This is especially needed, NASA says, for structures “with new materials, designs with little heritage experience, and which may be complicated by multifunctional capabilities.”
A major roadblock to overcome, however, is the lack of confidence in virtual testing of high-performance composites, especially given the scarcity of structural failure data against which to benchmark.
Moving beyond black aluminum
“Composites require more sophisticated simulations than traditional materials because there are multiple failure modes to be considered,” explains Indermuehle. “With metals, engineers can simply look at a Von Mises stress criterion and have a good understanding if the part is going to fail. With composites, engineers have to look at fiber damage, matrix cracking and delamination in each of the plies. Because of this, very accurate damage models are needed in the simulation code, and physical coupon testing needs to be performed to determine both the stiffness properties and the damage properties of the composite.”
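Indermuehle’s contrast can be made concrete with a toy Python sketch. The allowables and stresses below are invented for illustration, and the ply check is a reduced, in-plane form of the widely used Hashin-style criteria, not any particular code’s implementation: a ductile metal part needs one equivalent-stress check, while a single composite ply already needs separate fiber and matrix checks.

```python
import math

def von_mises(s1, s2, s3):
    """Equivalent stress from principal stresses -- the single check
    typically sufficient for a ductile metal."""
    return math.sqrt(0.5 * ((s1 - s2)**2 + (s2 - s3)**2 + (s3 - s1)**2))

def ply_failure_indices(s11, s22, t12, Xt, Yt, S):
    """Simplified in-plane Hashin-style indices for one ply; failure is
    predicted in a mode when its index reaches 1.0. (Real codes also
    check compression modes and interlaminar delamination.)"""
    fiber = (s11 / Xt)**2 + (t12 / S)**2    # fiber-dominated mode
    matrix = (s22 / Yt)**2 + (t12 / S)**2   # matrix-dominated mode
    return {"fiber": fiber, "matrix": matrix}

# Hypothetical ply stresses and allowables (MPa): the fiber mode is far
# from failure while the matrix mode has already exceeded its index.
idx = ply_failure_indices(s11=200, s22=60, t12=40, Xt=2000, Yt=60, S=80)
```

Here the ply would be flagged for matrix cracking (index 1.25) long before fiber failure (index 0.26) — the kind of mode-by-mode bookkeeping, repeated for every ply, that a single Von Mises check cannot provide.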
Although much research and method development has been done regarding failure theories, damage modeling and fatigue-life prediction, there is still much to be done. “Until recently, industry has consistently used heritage designs based on 30-year-old methods, using open-hole tension laminate allowables, and treating the composite as if it were a black aluminum,” reports Emmett Nelson, Firehole’s chief technology officer. Historically, this has led to extreme conservatism in critical composite structures engineered for space vehicles and commercial and military aircraft and in numerous subscale testing programs.
If engineers are to be less conservative, yet still reliable, Dr. Carl Rousseau, aeronautical engineer, Lockheed Martin Aeronautics (LM Aero, Fort Worth, Texas), believes manufacturers must “more closely control key parameters, gain a better understanding of complex failure mode interactions, and account explicitly for more stochastic effects, through a combination of testing and analysis.”
Altair’s Yancey isolates the problem: “Failure often occurs at the fiber/matrix level in the composite. Yet, from a simulation standpoint, people are generally not modeling the composite at the fiber/matrix level. Instead they’re kind of smearing the properties of the lamina or the ply into orthotropic material properties.”
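The “smearing” Yancey describes can be sketched with the classic rule-of-mixtures — a deliberately minimal homogenization (real tools use far richer micromechanics) in which distinct fiber and matrix stiffnesses collapse into orthotropic ply constants and the constituent identities disappear.

```python
def smeared_lamina_moduli(Ef, Em, Vf):
    """Rule-of-mixtures homogenization of a unidirectional ply.
    Ef, Em: fiber and matrix Young's moduli (Pa); Vf: fiber volume fraction.
    Returns the smeared longitudinal and transverse ply moduli."""
    Vm = 1.0 - Vf
    E1 = Vf * Ef + Vm * Em            # fibers and matrix in parallel (Voigt)
    E2 = 1.0 / (Vf / Ef + Vm / Em)    # fibers and matrix in series (Reuss)
    return E1, E2

# Representative carbon fiber (230 GPa) in epoxy (3.5 GPa), 60% fiber volume
E1, E2 = smeared_lamina_moduli(Ef=230e9, Em=3.5e9, Vf=0.6)
```

The smeared ply is stiff along the fiber (about 139 GPa) but an order of magnitude softer transversely (about 8.6 GPa); once the analysis proceeds with only E1 and E2, the separate fiber-level and matrix-level stress states Yancey mentions are no longer visible.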
Yancey points to techniques used in university labs for decades that are now more widely available as commercial tools designed to model composites at the micro-level without adding a lot of time to the simulation. Among them are Helius:MCT (from Firehole); MCQ-Composites (AlphaStar, Long Beach, Calif.); and DIGIMAT (e-Xstream Engineering SA, Louvain-la-Neuve, Belgium; now part of Santa Ana, Calif.-based MSC.Software). Because they enable modeling with greater fidelity, Yancey believes these micromechanical modeling tools will enable manufacturers to build lighter composite structures. “Still, it will take time to gain confidence in these less conservative, but more accurate methods,” he adds.
“It is a long process to transition methods developed in academia into daily use in industry,” agrees Firehole’s Nelson, but he notes, “Over the past five years, many strides have been made, and we expect to see the impact of simulation on design continue to grow over the next five years. The combination of simulation and testing can be used to create a very optimized structure.”
Analyzing at the fiber/matrix level
According to Firehole’s Stack, the Helius tool was built specifically for composites. “It’s very domain-specific and utilizes multi-scale analysis, which allows users to understand both the fiber and the matrix at their level,” he says. As previously noted, the traditional approach for a large structural analysis is to compute only homogenized composite stresses. These calculations do not represent stresses in the fiber and the matrix, but it is the fiber/matrix stresses that drive composite behavior, explains Stack.
Helius:MCT employs multicontinuum theory (MCT) to “efficiently extract average constituent stresses and strains from composite stress and strain.” MCT is a version of multiscale analysis that deals with composites at both the lamina scale and the micro scale. Readily coupled with the FE method, the Helius:MCT framework applies separate and distinct failure theories to the constituent materials individually to predict composite response. The Helius:Fatigue solution applies this framework to the problem of fatigue. It uses the constituent stresses calculated in Helius:MCT and applies the established kinetic theory of fracture — a proven, physics-based theory of polymer fatigue life — to the matrix stress in the composite. According to Stack, matrix failure is the key to predicting composite fatigue failure. “In contrast, the fiber failure planes remain nearly unchanged,” he adds, because “their strength is unaffected by cycle loading.”
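The central bookkeeping step — recovering average constituent stresses from the homogenized composite stress — can be illustrated with a toy Python sketch. The stress-amplification matrix `A_m` below is an invented placeholder; in MCT proper it comes from micromechanics-derived transformation relations, not a diagonal guess.

```python
import numpy as np

def constituent_averages(sigma_comp, A_m, Vf):
    """Decompose a homogenized in-plane ply stress [s11, s22, t12] into
    volume-average matrix and fiber stresses. A_m maps composite stress
    to average matrix stress; the mixture rule
        sigma_comp = Vf * sigma_f + (1 - Vf) * sigma_m
    then recovers the fiber average."""
    sigma_m = A_m @ sigma_comp
    Vm = 1.0 - Vf
    sigma_f = (sigma_comp - Vm * sigma_m) / Vf
    return sigma_f, sigma_m

# Placeholder amplification: the matrix sees little of the axial load but
# slightly amplified transverse and shear stress (values are illustrative).
A_m = np.diag([0.1, 1.2, 1.1])
sigma = np.array([100.0, 10.0, 5.0])
sigma_f, sigma_m = constituent_averages(sigma, A_m, Vf=0.6)
```

Separate failure criteria — or, in the fatigue case, a matrix-life model — can then be applied to `sigma_f` and `sigma_m` individually rather than to the smeared ply stress.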
Reportedly, Helius:MCT has been rigorously compared to experimental test data and validated through two World Wide Failure Exercises, several industry test programs, and years of use inside the U.S. Department of Defense (DoD). In a joint venture, the U.S. Air Force Research Laboratory (AFRL) Space Vehicles Directorate, Moog-CSA Engineering (Mountain View, Calif.) and Firehole conducted tests to benchmark failure analysis of large composite aerospace structures against rare experimental failure testing. Helius:MCT was used to assess stresses in the fiber and matrix. Failure analysis of two large space structures was conducted in an industry setting and compared to experiments. The program reportedly showed that traditional analysis techniques are often unable to fully capture a composite structure’s response to loading, while the multiscale, progressive failure analysis method provided good correlation with the experimental data. The program provides a platform for improving confidence in analysis of composite structures. Firehole’s Nelson and Dr. Jeffry Welsh, an AFRL aerospace engineer, predict that, if adopted, the method will “lead to mass optimization of structures and a reduction in subscale testing, providing improved structural performance and greatly reduced development time.”
A progressive failure method based on Helius:MCT also was applied by The Boeing Co. (Seattle, Wash.) and NSE Composites (Seattle, Wash.) for residual strength evaluation of unstiffened, notched composite panels, using Abaqus (SIMULIA) finite element code. “As commercial aircraft companies are developing and testing potential designs using composite laminates, the large notched panel problem becomes cost- and time-prohibitive due to the sheer size of the panels required for experimental testing,” explain Boeing Commercial Airplanes stress analyst Kazbek Karayev et al. in a paper about the program (for full source information, see the note at the end of this article).
“First and foremost, the analysis method must be capable of accurately predicting the trajectory of damage evolution from its initiation at the notch tip to final failure. Accurately capturing the damage trajectory is mandatory because certain trajectories are easier to arrest in aircraft structures than others,” say Karayev et al. Accurate prediction of the final failure load also is important because the residual strength of the notched panel is used to determine safety factors.
Initially, 11 benchmark panels — all constructed of the same unidirectional carbon/epoxy material but with a wide range of layups, thicknesses, notch lengths, and in-plane loadings — were used to calibrate the methodology. Subsequently, an additional 23 panels, made from different unidirectional carbon/epoxy materials with a range of notch sizes and ply layup sequences, were used to validate the calibrated model.
In the end, the damage trajectories were correctly predicted in all but one of the 34 panels, and the predicted final failure loads were within 15 percent of the measured final failure loads for 32 of the 34 panels.
“The method, however, is not without limitations,” caution Karayev et al. “First, as the notch size and panel size becomes larger, the predicted final failure load typically shows more deviation from the measured final failure load. Secondly, the results are mesh sensitive; thus, the methodology is developed for a characteristic mesh density and mesh structure.” Accordingly, the authors began working to develop a progressive failure methodology that reduces mesh sensitivity.
Testing and analysis go hand in hand
Benchmarking is critical as new methods for analyzing composites are incorporated into finite element codes. Ronald Krueger, senior research scientist at the National Institute of Aerospace (Hampton, Va.), under the auspices of ASTM International’s (West Conshohocken, Pa.) Committee D30 on Composite Materials, is drafting a new guide for the development of benchmark solutions for composite delamination growth analysis.
“Having the ability to predict delamination propagation, onset and growth in the commercial finite element codes used by industry will enable the use of analyses to reduce the amount of subcomponent testing now required in the building block approach for certification of damage-tolerant composite structures,” says Krueger. Like many, he believes that combining testing and analysis, rather than testing alone, will speed the development of composite structure design and certification.
“We’re definitely seeing a push for reduction in engineering cycle time,” explains Dr. Olivier Guillermin, director of product and marketing strategies at Siemens PLM Software (Plano, Texas). Siemens now supplies FiberSIM software for composites design, which was developed by recently acquired VISTAGY Inc. (Waltham, Mass.), and offers its own NX CAE software for simulation. “In composites, we design the material as much as we design the part, so the design phase of composites is much more complex than with metals. With so many variables at play ... simulation allows manufacturers to focus on the best options as opposed to spending money on physical testing of too many options. Historically, it’s been about validation, but now it is more about optimization.”
New software tools, like Altair’s FE-based OptiStruct, are focusing on optimization. “Analysis tools help you know how much material you need ... to meet the structural requirements,” says Altair’s Yancey. “OptiStruct gives you insight into where you don’t need material. It helps users determine where thinner laminates are appropriate or where an open structure or hole can be placed to be able to save weight.”
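Yancey’s point — analysis tells you how much material a load path actually needs — reduces, in its simplest form, to a sizing calculation like this toy sketch. The running load, allowable, and ply thickness are invented round numbers, and free-size optimization of the kind OptiStruct performs works element by element over many load cases rather than on one laminate.

```python
import math

def min_ply_count(running_load, allowable_stress, ply_thickness):
    """Smallest whole number of plies whose total thickness carries a
    membrane running load (N/m) without exceeding the stress allowable (Pa)."""
    t_required = running_load / allowable_stress  # t = N / sigma_allow
    n = t_required / ply_thickness
    return math.ceil(n - 1e-9)  # small tolerance guards float round-off

# Hypothetical numbers: 1.2 MN/m running load, 400 MPa allowable,
# 0.125 mm cured ply thickness -> a 24-ply (3 mm) laminate suffices
plies = min_ply_count(1.2e6, 400e6, 0.125e-3)
```

Anywhere the computed ply count falls below the laminate actually drawn, material — and weight — can come out; optimization tools automate exactly that search across the whole structure.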
Getting the method off the ground
Yancey believes that virtual testing will see its widest implementation in the highly competitive commercial aircraft market. “The aerospace industry moves cautiously, so increases in the use of simulation will proceed at a measured pace,” he says. “But, over the next 10 years, there will be an increased confidence in the use of simulation.”
To that end, the European Commission is funding MAAXIMUS (More Affordable Aircraft structure through eXtended, Integrated and Mature nUmerical Sizing), a consortium that is coordinating development of a physical platform to develop and validate composite technologies for low-weight aircraft and a virtual structural-development platform for faster identification and earlier validation of the best solutions. The project began in April 2008 with 57 partners from 18 countries. Airbus’ Gaudin is the MAAXIMUS project coordinator.
The consortium’s working theory is that a significant step forward in simulation-based design will enable faster development and “right-first-time” validation of an optimized composite fuselage. With high-fidelity modeling and confidence in virtual assessment of structural behavior during the numerical optimization process, the group hopes to reduce development cycle times — from preliminary design through full-scale testing — by 20 percent and cut related nonrecurring costs by 10 percent. The group also is assessing a new certification philosophy, based on virtual testing.
In a program update last year, Gaudin reported high confidence in virtual testing. To date, new solvers for large nonlinear barrel problems have been developed, and models that simulate the fiber placement process and its expected defects have been initiated, though not yet validated. “Virtual testing will be a major asset to freeze a trouble-free design earlier than today. It will provide more mature aircraft at entry into service, with fewer service bulletins or post-entry-into-service modifications” — a key to airline satisfaction.
LM Aero fellow Dr. Steve Engelstad believes the future will hold an “optimum mix of building block testing and analysis.” He says, “Today’s structural analysis software tools have the capability to analyze at extensive levels of fidelity and detail; however, the analyst must always balance criticality of the structural detail with the cost of the analysis.” He adds, “Most high-fidelity modeling is very time-consuming and thus expensive.” Engelstad says that progressive damage methods have not yet received enough verification and validation within LM Aero to be used for structural analysis.
“Since actual testing is so expensive, it would be great if we could do more in the virtual realm, but I think we’re way too early to see if anything significant is going to come from it,” warns Don Adams, president, Wyoming Test Fixtures (Salt Lake City, Utah). “In general, I don’t think we want to do less testing. Many people would agree, in fact, that we want to do more testing, but it’s cost-prohibitive to do so. We have to draw the line somewhere, and I think what we have right now is probably a reasonable compromise between the luxury of doing more testing and getting ourselves into trouble for not doing enough testing.”
Boeing’s Adam Sawicki, ASTM D30 chairman, adds, “If we were making the same sorts of aircraft structures that we were years ago, and we were looking for the same levels of performance, then I think we would be seeing less testing and more analyses. However, given that we’re continually looking to improve performance and push toward new architectures and new material configurations, there will always be a necessity for new validation of models.” As more novel designs are conceived, and materials are applied more aggressively, this “mandates a certain level of testing,” he adds.
It remains to be seen how much virtual testing will impact composite structure development and certification. “In terms of part tolerance and variation, we are still arguably about 10 times less accurate with composites than with metals,” estimates Guillermin. But in the short term, most people agree that virtual testing will enable composite manufacturers to save time and money. “We’re certainly using more complex models today, and using simulation to identify critical load cases and things of that nature,” Sawicki sums up, “so even when we’re having to move forward and conduct test validation, we’re using simulation to test with more intelligence.”
Note: Quoted material attributed to Kazbek Karayev et al. is taken from “Residual strength evaluation of typical aircraft composite structures with a large notch,” Kazbek Z. Karayev, Pierre J. Minguet, Sangwook Lee, Vladimir Balabanov and Nav Muraliraj, The Boeing Co. (Seattle, Wash.); Thomas H. Walker, NSE Composites (Seattle, Wash.); and Emmett E. Nelson and Don Robbins, Firehole Composites (Laramie, Wyo.). Published by the American Institute of Aeronautics and Astronautics Inc. (Reston, Va.), with permission from The Boeing Co., 2012.