Contents

Industry Spotlight

Departments
Editorial: To Collaborate, You Need People
Industry News: Recent Announcements and Upcoming Events
Software Profile: The New Face of ANSYS ICEM CFD

Features
FEA in Micro-Robotics: Researchers use ANSYS to develop micron-sized, self-powered mobile mechanisms
For ANSYS, Inc. sales information, call 1.866.267.9724, or visit www.ansys.com on the Internet. Go to www.ansyssolutions.com/subscribe to subscribe to ANSYS Solutions.
Editorial Director: John Krouse (jkrouse@compuserve.com)
Managing Editor: Jennifer L. Hucko (jennifer.hucko@ansys.com)
Designers: Miller Creative Group (info@millercreativegroup.com)
Art Director: Paul DiMieri (paul.dimieri@ansys.com)
Ad Sales Manager: Ann Stanton (ann.stanton@ansys.com)
Circulation Manager: Elaine Travers (elaine.travers@ansys.com)
Editorial Advisor: Kelly Wall (kelly.wall@ansys.com)
CFD Update Advisor: Chris Reeves (chris.reeves@ansys.com)
ANSYS Solutions is published for ANSYS, Inc. customers, partners, and others interested in the field of design and analysis applications.
The content of ANSYS Solutions has been carefully reviewed and is deemed to be accurate and complete. However, neither ANSYS, Inc., nor Miller Creative Group guarantees or warrants accuracy or completeness of the material contained in this publication. ANSYS, ANSYS DesignSpace, CFX, ANSYS DesignModeler, DesignXplorer, ANSYS Workbench Environment, AI*Environment, CADOE and any and all ANSYS, Inc. product names are trademarks or registered trademarks of subsidiaries of ANSYS, Inc. located in the United States or other countries. ICEM CFD is a trademark licensed by ANSYS, Inc. All other trademarks or registered trademarks are the property of their respective owners. POSTMASTER: Send change of address to ANSYS, Inc., Southpointe, 275 Technology Drive, Canonsburg, PA 15317, USA. © 2004 ANSYS, Inc. All rights reserved.
www.ansys.com
ANSYS Solutions
Summer 2004
Industry News
Recent Announcements
EASA 3.0: The New Standard for Efficient Application Development
EASA enables ultra-rapid creation and deployment of Web-enabled applications that can drive most software, including ANSYS and CFX. EASA also can be used to integrate several tools, thus automating processes involving, say, CAD, FEA and even in-house codes. EASA is available as a software product for authoring and publishing your own custom applications. Alternatively, several ASDs are now using EASA to create turnkey applications to your specification as a service. New features in EASA 3.0 include:
- Connectivity to relational databases such as SQL Server and Oracle, and to database applications such as ERP, CRM and PLM systems.
- Improved security for Internet use via Secure Socket Layer (SSL) technology, enabling you to host applications for use over the Internet.
- Multi-language EASAPs: create your application in your language, and users see it in their preferred language. Supported character sets include Roman, Chinese, Japanese, Russian and Arabic.
- New parametric study and optimization capabilities.
- New API. EASA's differentiator has always been allowing non-programmers to create professional-grade Web-enabled applications around their underlying software. Now an API allows EASA authors who have programming skills to take applications to the next level with custom code.
For more information, visit www.ease.aeat.com.
2004 International ANSYS Conference Hailed a Success
Engineering professionals from throughout the world gathered at the Hilton Pittsburgh in May for the 2004 International ANSYS Conference to discover the true meaning behind what it is to "Profit from Simulation."
Vision and strategy set the theme for the general session. Kicking off the conference with a welcome address, ANSYS president and CEO Jim Cashman set the stage for keynote speaker Brad Butterworth of Team Alinghi. As the cunning strategist aboard the Team Alinghi yachts, Brad shared his experience and discussed how the America's Cup winner is using ANSYS integrated simulation solutions to defend its title in the 2007 competition. After the morning break, ANSYS presented its Technology Roadmap, the company's successful, ongoing strategy for integrating the power of the entire ANSYS, Inc. family of products into the ultimate engineering simulation solution. Then, Bruce Toal, director of Marketing and Solutions, High Performance Technical Computing Division at Hewlett-Packard Company, spoke about the company's Adaptive Enterprise for Design and Manufacturing. Following a day of technical and general sessions and visits to exhibitor booths, attendees enjoyed a conference social sponsored by Hewlett-Packard on Monday evening. Standing ovations and triumphant applause echoed throughout the ballroom during the social as Jim Cashman presented Dr. John Swanson, ANSYS founder, with an award recognizing him as the recipient of the 2004 AAES John Fritz Medal. ANSYS' long-standing partners and key customers took to the podium for the Tuesday general session. LMS International's Tom Curry, executive vice president and chief marketing officer, spoke about the product creation process; Tom guides the company's growth in predictive computer-aided engineering, physical prototyping and related services. Herman Miller's Larry Larder, director of engineering services, discussed how the company uses ANSYS simulation technologies to experiment and innovate in the office furniture industry.
SGI's director of product marketing, Shawn Underwood, presented the future of high-performance computing, followed by Dr. Paresh Pattani, director of HPC and Workstation Applications at Intel Corporation, who focused on the paradigm shift in high-performance computing. Jorivaldo Medeiros, technical consultant at PETROBRAS, offered his ANSYS success story on how the company drives development and innovation in equipment technology. In addition, ANSYS became the first engineering simulation company to solve a 111-million-degrees-of-freedom structural analysis model. After lunch, the Management Track addressed strategies for implementing new technologies and explaining the benefits of engineering simulation to management.
ANSYS Breaks Engineering Simulation Solution Barrier
ANSYS, Inc. has become the first engineering simulation company to solve a structural analysis model with more than 100 million degrees of freedom (DOF), making it possible for ANSYS customers to solve models of aircraft engines, automobiles, construction equipment and other complete systems. In a joint effort with Silicon Graphics, Inc. (SGI), the 111 million DOF structural analysis problem was completed in only a few hours using an SGI Altix computer. DOF refers to the number of equations being solved in an analysis, giving an indication of a model's size. ANSYS' ability to solve models this large opens the door to an entirely new simulation paradigm. Prior to this capability, a simulation could be conducted only at a less detailed level for a complete model, or only at the individual component level for a detailed model. "Now, it will be possible to simulate a detailed, complete model directly, potentially shortening design time from months to weeks. Equally important, having a high-fidelity comprehensive model can allow trouble spots to be detected much earlier in the design process. This may greatly reduce additional design costs and can provide an even shorter time to market," said Jin Qian, senior analyst at Deere & Company Technical Center. According to Marc Halpern, research director at Gartner, although simulation accelerates the delivery of quality products to market, users have faced major challenges to realizing the full value. For example, hardware and software limitations have historically made realistic simulations elusive when realism involves highly detailed models and complex physical behavior. "Manufacturers are looking for more accurate, large-system simulations to improve their time-to-money," said Charles Foundyller, CEO at Daratech, Inc. "This announcement means that users now have a clear roadmap to improved productivity." As hardware advances in speed and capacity, ANSYS is committed to being the leader in developing CAE software applications that take advantage of the latest computing power. This leadership provides customers with the best engineering simulation tools for their product development process to help achieve better cost, quality and time metrics. This powerful new offering from ANSYS speaks to its commitment to develop and deliver the best in advanced engineering solutions. In turn, ANSYS has entered into a three-year partnership with SGI to advance the capabilities of ANSYS in parallel processing and large-memory solutions.
Safe Technology Incorporates AFS Strain-Life Cast Iron Database in fe-safe Safe Technology Ltd has been granted a license to use the AFS cast iron database from the research report Strain-Life Fatigue Properties Database for Cast Iron in its state-of-the-art durability analysis software suite for finite element models, fe-safe. Safe Technology Ltd is a technical leader in the design and development of durability analysis software that pushes the boundaries of fatigue analysis software to ensure greater accuracy and confidence in modern fatigue analysis methods for industrial applications. The availability of the AFS database within fe-safe ensures that users will have access to the most up-to-date and accurate cast-iron materials data for their durability analyses. The AFS Ductile Iron and the Gray Iron Research Committees have developed a Strain-Life Fatigue Properties Database for Cast Iron. This database represents the capability of the domestic casting industry and is available as a special AFS publication. It is the culmination of a five-year effort in partnership
with the DOE Industrial Technology Program. The scope of this information includes 22 carefully specified and produced castings from ASTM/SAE standard grades of irons, including Austempered Gray Iron (AGI), whose specification is under development. Each grade is comprehensively characterized from an authoritative source with chemical analysis, microstructure analysis, hardness tests, monotonic tension tests and compression tests. This information is contained in user-friendly digital files on two CD-ROMs for importing into computer-aided design software. AFS publications are described online at www.afsinc.org/estore/. For more information, visit www.safetechnology.com.
Product Development Platform Will Simulate and Optimize Design Performance for Autodesk Inventor Professional Customers
Autodesk will license ANSYS simulation technologies and package them as an integral part of the Autodesk Inventor Professional 9.0 product and future releases. Powered by ANSYS part-level stress and resonant frequency simulation technologies, Autodesk Inventor Professional 9.0 will enable design engineers to create more cost-effective and robust designs, based on how the products function in the real world, by facilitating quick and easy what-if studies right within the software's graphical user interface.
"Autodesk is proud to be working with an industry innovator like ANSYS," said Robert Kross, vice president of the Manufacturing Solutions Division at Autodesk. "This reinforces our commitment to deliver proven and robust technologies to manufacturers, in order to help them deliver better quality products and bring them to market faster. Inventor Pro 9.0 will make simulation (CAE) functionality available to a broader mechanical design community, while protecting customers' business investment by seamlessly integrating with other high-end ANSYS offerings. Our customers will surely benefit from this relationship." The total solution will help product development teams make more informed decisions earlier in the design process, allowing them to reduce costs and development time while designing better and more innovative products. "This new offering from Autodesk will be viewed very strategically by their customers. As they deploy simulation tools throughout their product design process, the Autodesk-ANSYS offering will be a key component of a customer's overall simulation strategy," said Mike Wheeler, vice president and general manager of the Mechanical Business Unit at ANSYS. "ANSYS is proud to be part of the design effort to create this next-generation tool as part of our overall ANSYS Workbench development plan."
Upcoming Events
Date | Event | Location
August 29 - September 3 | ICAS 2004 | Yokohama, Japan
September 5-8 | RoomVent 2004 | Coimbra, Portugal
September 6-9 | 17th International Symposium on Clean Room Technology | Bonn, Germany
September 7-8 | European UGM for Automotive Applications | Neu-Ulm, Germany
September 19-20 | Radtherm User Conference | Troy, Michigan, USA
September 21-22 | German Aerospace Congress 2004 | Dresden, Germany
September 22-25 | 3rd International Symposium on Two-Phase Flow Modeling and Experimentation | Pisa, Italy
September 29-30 | Calculation & Simulation in Vehicle Building (Numerical Analysis and Simulation in Vehicle Engineering) | Würzburg, Germany
September 29-30 | Pump Users International Forum 2004 | Karlsruhe, Germany
September 28 - October 2 | ASME DETC/CIE Conference | Salt Lake City, Utah, USA
October 4 | 2004 PLM European Event | UK
October 4-5 | Daratech DPS | Novi, Michigan, USA
October 12 | ANSYS Multiphysics Seminar | Sweden
October 13 | Construtec Conference | Spain
October 20 | ANSYS 9.0 Update Seminar | Sweden
October 28-29 | ANSYS User Conference | Mexico
Industry Spotlight
Computational simulation is being applied across the chemical and process industries to challenges such as:
- Scale-up: extrapolating a process from laboratory and pilot-plant scale to the industrial plant scale, which may require an investment of many millions of dollars.
- Process intensification: combining different processes into smaller, compact and efficient units, instead of treating them as individual processes.
- Retrofitting: upgrading a plant to become more efficient within the many constraints of its existing footprint.
This issue of ANSYS Solutions provides examples of these in offshore oil production, wastewater treatment and chemical processing; many other examples highlighting the benefits to be obtained can be found on the ANSYS CFX website at www.ansys.com/cfx. These problems are inherently multi-scale, combining different physical and chemical processes at the molecular level with the macro-flow processes transporting a reacting fluid around the complex geometries of a large industrial chemical reactor. Recent advances in modeling capabilities, combined with the scalable parallel performance of low-cost hardware and the powerful geometry and meshing tools in the ANSYS software modules, open up many new opportunities to achieve major benefits in the complex and demanding world of the chemical and process industries.
Offshore platform
Case-in-point:
Integral Two-Phase Flow Modeling in Natural Gas Processing
Customized version of CFX reduces costs by 70%, compared to the conventional route without CFD, in developing gas separator equipment.
By Marco Betting, Team Leader Twister Technology; Bart Lammers, Fluid Dynamics Specialist; and Bart Prast, Fluid Dynamics Specialist, Twister BV Natural gas processing involves dedicated systems to remove water, heavy hydrocarbons and acidic vapors from the gas stream to make it suitable for transportation to the end-customer. From a process engineering perspective, these systems are combinations of flashes, phase separations, flow splitters, and heat and mass exchangers exhaustively designed to achieve required export specifications. While the process engineer is concerned with finding the optimal system configuration using pre-defined process steps and equilibrium thermodynamics, the flow-path designer tries to optimize the performance of each individual process step in the system based on an understanding of both two-phase flow behavior and non-equilibrium thermodynamics. The fluid dynamics interaction between subsequent process steps is not always
Normalized total C8 fraction in the vortex section of the Twister Supersonic Separator, showing C8 separation.
taken into account to its full extent, even though this can strongly influence the total system performance. Developing and designing new equipment for the process industry is a time-consuming and expensive exercise. Twister BV (www.twisterbv.com) offers innovative gas processing solutions that can play an essential role in meeting these challenges. The team has been developing the Twister Supersonic Separator, which is based on a
Twister separator schematic: expander, cyclone separator and compressor sections; feed gas enters at 70 bar and 5 °C, with outlets for liquids + slip-gas and dry gas.
In Twister, the feed gas is expanded to supersonic velocity, thereby creating a homogeneous mist flow. During the expansion, a strong swirl is generated via a delta wing, causing the droplets to drift toward the circumference of the tube. Finally, a co-axial flow splitter (vortex finder) skims the liquid-enriched flow from the dried flow in the core. The two flows are recompressed in co-axial diffusers, resulting in a final pressure approximately 35% lower than the feed pressure.
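The thermodynamic driving force behind this process can be sketched with textbook isentropic-flow relations for an ideal gas. The Mach number, specific-heat ratio and feed conditions below are illustrative assumptions, not Twister design data, and real natural gas at 70 bar departs significantly from ideal-gas behavior (which is exactly why the customized code described later uses non-equilibrium thermodynamics):

```python
# Textbook isentropic-flow relations: static-to-stagnation ratios as a
# function of Mach number M and specific-heat ratio gamma.
# Illustrative only; natural gas at 70 bar is far from an ideal gas.

def pressure_ratio(M, gamma=1.3):
    """p / p0 after isentropic expansion to Mach M."""
    return (1.0 + 0.5 * (gamma - 1.0) * M ** 2) ** (-gamma / (gamma - 1.0))

def temperature_ratio(M, gamma=1.3):
    """T / T0 after isentropic expansion to Mach M."""
    return 1.0 / (1.0 + 0.5 * (gamma - 1.0) * M ** 2)

p0 = 70.0       # feed pressure [bar]
T0 = 278.15     # feed temperature [K] (5 degrees C)
M = 1.4         # an assumed supersonic Mach number in the tube

p = p0 * pressure_ratio(M)
T = T0 * temperature_ratio(M)
print(f"At M = {M}: p ~ {p:.1f} bar, T ~ {T:.0f} K")
```

The static temperature falls tens of kelvin below the feed temperature; it is this sharp cooling that drives the vapor past saturation and produces the mist that the swirl then separates.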
Twister separator
unique combination of known physical processes, combining aerodynamics, thermodynamics and fluid dynamics to produce a revolutionary gas conditioning process. The route from a new Twister tube concept to marketable hardware via several production field trials has been a major undertaking. Reducing costs in the cycle of designing, testing and redesigning Twister prototypes for the challenging conditions involved in high-pressure sour natural gas processing is of great importance. The introduction of computational fluid dynamics into the Twister development effort four years ago resulted in a cost reduction of approximately 70% compared to the conventional route without CFD.
The customized CFX code includes:
- Multi-component gases with several condensable species
- A homogeneous nucleation model to determine the droplet number density
- A growth model to allow for the change in size of the particles through condensation and evaporation
- Droplet coalescence models depending on droplet size, number density and turbulence intensities
- Slip models to predict the separation of the droplets from the continuous phase
- Turbulent dispersion of the droplets
- Coupling of the above models via mass, momentum and energy equations, with the energy balance affected by the release of latent heat during condensation and evaporation
The development and validation of the customized CFX code was of paramount importance in maturing the Twister separator for commercial application in the oil and gas industry. This custom version of CFX-5 includes all first-order effects needed for determining the performance of liquid/gas separators preceded by an expander or throttling valve.
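The way these sub-models couple can be illustrated with a deliberately simplified toy calculation. Every rate expression and constant below is an invented placeholder, not the proprietary modeling in the customized CFX-5 code; the point is only the feedback structure, in which nucleation creates droplets, growth condenses vapor onto them, and latent heat warms the gas while depleting the vapor:

```python
# Toy illustration of the nucleation / growth / latent-heat coupling.
# All rate expressions and constants are invented placeholders.

import math

def nucleation_rate(S):
    """Hypothetical nucleation rate [droplets/kg/s] vs. supersaturation S."""
    return 0.0 if S <= 1.0 else 1e12 * math.exp(-5.0 / math.log(S) ** 2)

def droplet_growth_rate(S):
    """Hypothetical per-droplet mass growth rate [kg/s]."""
    return 1e-16 * (S - 1.0)

S = 3.0          # supersaturation of the condensable species
n = 0.0          # droplet number density [droplets/kg of gas]
y_liq = 0.0      # condensed liquid mass fraction
T = 215.0        # gas temperature [K], post-expansion
cp = 2200.0      # gas heat capacity [J/(kg K)]
h_fg = 4.0e5     # latent heat of condensation [J/kg]
dt = 1e-5        # time step [s]

for _ in range(1000):
    dn = nucleation_rate(S) * dt           # new droplets nucleate
    dy = n * droplet_growth_rate(S) * dt   # existing droplets grow
    n += dn
    y_liq += dy
    T += h_fg * dy / cp                    # latent heat warms the gas
    S = max(1.0, S - 50.0 * dy)            # condensation depletes vapor

print(f"n = {n:.3g} /kg, y_liq = {y_liq:.3g}, T = {T:.2f} K")
```

In the real code these balances are solved per control volume, fully coupled with the mass, momentum and energy equations of the flow.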
For a process engineer, the quality of the gas coming over the top of the separator is determined by the phase equilibrium after an isenthalpic flash, presuming a certain liquid carry-over. The flow-path designer is concerned with reducing the carry-over by optimizing the flow variables of the separator, based on a feed with presumed droplet sizes.
Using the customized two-phase code, the flow path designer can study the influence of the geometry of a choke valve on the resulting droplet size distribution and better assess the performance of the separator based thereon.
Mach number distribution in the Twister tube (contour scale 0.0 to 1.4).
*I.P. Jones et al., "The use of coupled solvers for multiphase and reacting flows," 3rd International Conference of CSIRO, 10-12 December 2003, Melbourne, Australia.
Analysis, imaging and visualization technologies are being applied increasingly in medical applications, particularly in evaluating different approaches to surgery and determining the best way to proceed in an operation. In this growing field, one of the primary focuses of our work is applying finite element analysis to orthopedic surgery: specifically, the specialized area of osteotomy, in which bones are surgically segmented and repositioned to correct various deformities. We chose ANSYS for this work because of the reliability and flexibility of the software in handling the irregular geometries and nonlinear properties inherent in these materials. Medical imaging technologies such as CT, MRI, PET or SPECT deliver slice or projection images of internal areas of the human body. These tools are generally used to visualize configurations of bones, organs and tissue, but they also have the ability to export image data and additional information in commonly known medical file formats like DICOM.
These files then can be processed by third-party computer programs for assessing and diagnosing the condition of the patient and planning surgical intervention, that is, how the surgical procedure will be performed. Other very promising fields include telesurgery, virtual environments in medical school education and prototype modeling of artificial joints. The goal of the research is to develop computer applications in the field of orthopedic surgery, especially osteotomy intervention procedures based on CT images. The team at the Institute of Informatics uses this simulation technology to examine theories underlying new types of surgeries, as well as to aid doctors in treating individual patients undergoing hip joint correction. These two approaches have many common tasks: extracting image data from diverse medical image exchange format files, enhancing images, choosing the appropriate segmentation techniques, CAD-oriented volume reconstruction, data exchange with FEM/FEA tools, and geometric description of the virtual surgery.
Building Orthopedic Models
CT data files. The first step in building an orthopedic model is extracting an image file from medical data exchange formats. As CT images represent the X-ray absorption of a given cross-section, the intensity values of their pixels represent this 12-bit absorption rate, rather than common color ranges. Since the slice density is usually reduced to a minimum for in-vivo scanning, considerable information often is lost, especially in complex regions of the human body. For visualization purposes, this deficiency can be compensated for with interpolation techniques, but no lost anatomical data can be recovered this way. Using these files for FEA work thus often requires further enhancement.
Image enhancement and segmentation. As given tissue structures have their own absorption rate intervals, a windowing technique might be sufficient for a simple visualization. However, because these intervals can overlap, other tissue parts that differ from our VOI (Volume Of Interest) remain in the image after applying the intensity window. Some conventional procedures, like morphological or spectral-space filtering, must be applied, as well as specific techniques for CT segmentation. We found that other methods, such as region growing and gradient-based segmentation, achieved excellent results for bone structures.
Volume reconstruction. The final goal of the project is to develop an application to be used in surgery planning on a routine basis by medical staff without experience in using CAD-related software. We wanted this application to be able to transfer structural data into finite element modeling and analysis software. Thus, volumetric information must be represented in a geometrically appropriate way. There is a difference between simple surface rendering and geometrical volume reconstruction in CAD systems.
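The windowing and region-growing steps described above can be sketched in a few lines. The tiny 5x5 "image" and the 800-1000 threshold interval below are invented for illustration and are not clinical bone-windowing values:

```python
# Minimal sketches of intensity windowing of 12-bit CT values and of
# seeded region growing. Synthetic data; thresholds are illustrative.

import numpy as np

def window(img, lo, hi):
    """Keep intensities inside [lo, hi]; zero everything else."""
    out = img.copy()
    out[(img < lo) | (img > hi)] = 0
    return out

def region_grow(img, seed, lo, hi):
    """Collect pixels 4-connected to `seed` with intensity in [lo, hi]."""
    mask = np.zeros(img.shape, dtype=bool)
    stack = [seed]
    while stack:
        r, c = stack.pop()
        if not (0 <= r < img.shape[0] and 0 <= c < img.shape[1]):
            continue
        if mask[r, c] or not (lo <= img[r, c] <= hi):
            continue
        mask[r, c] = True
        stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return mask

ct = np.array([[100, 120, 900, 910, 130],
               [110, 880, 920, 930, 140],
               [120, 870, 905, 150, 160],
               [130, 140, 150, 160, 890],   # bottom-right blob is a
               [140, 150, 160, 880, 895]])  # separate high-density region

bone = window(ct, 800, 1000)
grown = region_grow(ct, (0, 2), 800, 1000)
print(int(bone.astype(bool).sum()), int(grown.sum()))
```

Windowing keeps every pixel in the interval (ten here), while region growing from a seed keeps only the connected structure (seven here), which is why it works better when intensity intervals of different tissues overlap.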
Volumetric data has to be represented using solid modeling primitives and reconstructed using related concepts: keypoints, parametric splines, line loops, ruled and planar surfaces, volumes and solids. When extracting contour points of ROIs (Regions Of Interest), we need to reduce the number of points to approximately 10-15% by keeping only points with rapidly changing surroundings. These points then can be interpolated with splines, the splines assembled into surfaces, and the surfaces into solids. The major difficulty is that CAD-related systems are designed to work with regular-shaped objects, and bone structures are not like that. However, to be able to execute FEA, it is necessary to use this approach. Moreover, virtual surgery interventions have to be carried out on this representation, or in such a way that a proper geometrical representation of the modified bone structure remains easy to regain. As is often the case, conversion problems may occur when exchanging data between CAD systems, so we perform the above volume reconstruction procedure directly in the FEM software, using the built-in tools provided in the package. After testing many FEM programs, we chose ANSYS software for this task. Figure 1 illustrates how we
Figure 2. Part of theoretical path and planar intersection of the cutting tool.
reconstructed in ANSYS an 8-inch part of a femur (a pipe-like bone) using the aforementioned procedure. The entire reconstruction procedure was implemented in a simple ANSYS script file. A natural extension of this method also seems suitable for bones containing more parts, holes, etc. In this case, the Boolean operations between solids provided by ANSYS give us a powerful tool. Another challenging problem currently being investigated is the reconstruction of those parts of the bones where the CT slices contain varying topology (e.g., when reaching a junction in some special bones).
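The contour-point reduction mentioned earlier, keeping roughly 10-15% of the points and preferring those with rapidly changing surroundings, can be sketched with a turning-angle criterion. The 0.3 rad threshold and the keep-every-k backbone below are illustrative choices of ours, not the authors' exact method:

```python
# Sketch of contour-point reduction: keep a point only where the
# contour direction changes sharply (a corner), plus every k-th point
# so the later spline fit stays anchored. Parameters are illustrative.

import math

def turning_angle(p_prev, p, p_next):
    """Absolute change of direction of the polyline at p, in radians."""
    a1 = math.atan2(p[1] - p_prev[1], p[0] - p_prev[0])
    a2 = math.atan2(p_next[1] - p[1], p_next[0] - p[0])
    d = abs(a2 - a1)
    return min(d, 2.0 * math.pi - d)

def decimate(points, min_angle=0.3, keep_every=8):
    """Reduce a closed contour to its corners plus a sparse backbone."""
    kept = []
    n = len(points)
    for i, p in enumerate(points):
        sharp = turning_angle(points[i - 1], p, points[(i + 1) % n]) >= min_angle
        if sharp or i % keep_every == 0:
            kept.append(p)
    return kept

# A closed, circle-like bone contour of 100 points with one sharp notch.
pts = [(math.cos(2 * math.pi * i / 100), math.sin(2 * math.pi * i / 100))
       for i in range(100)]
pts[50] = (pts[50][0] + 0.3, pts[50][1])   # inject a corner
kept = decimate(pts)
print(f"kept {len(kept)} of {len(pts)} points")
```

On this contour the routine keeps the sparse backbone plus the three points around the notch, landing close to the 10-15% quoted above while preserving the feature a spline would otherwise smooth away.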
Figure 3. Subtraction of the cutting tool from a bone section profile in 2-D, and the 3-D outcome.
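The subtraction shown in Figure 3 was performed on ANSYS solid models; the same idea can be sketched on a voxel grid. This is a simplification of the CAD-based approach actually used, with an invented cylinder standing in for the femur section and a slab standing in for the cutting tool:

```python
# Voxel-grid sketch of the Boolean subtraction in the virtual
# osteotomy: remove the cutting tool's volume from the bone volume.

import numpy as np

n = 32
z, y, x = np.mgrid[0:n, 0:n, 0:n]

# "Bone": a cylinder along z, standing in for the femur shaft section.
bone = (x - n / 2) ** 2 + (y - n / 2) ** 2 <= (n / 3) ** 2

# "Cutting tool": a thin planar slab through the middle, like a saw cut.
tool = np.abs(z - n / 2) <= 1

cut_bone = bone & ~tool   # Boolean subtraction: bone minus tool
print(bone.sum(), cut_bone.sum(), (bone & tool).sum())
```

The solid-model version in ANSYS keeps an exact geometric boundary rather than a voxel staircase, which is what makes the subsequent meshing and FEA possible.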
Figure 6. Different bar hole types and variable helix paths to improve efficiency of lengthening.
obtained the desired solid object (Figure 4). This Boolean subtraction was also executed by ANSYS. As previously mentioned, we also work on pre-operative analysis and comparison of hip joint osteotomy. The 3-D reconstruction of this region is more difficult because of the information loss during the CT scanning procedure: there are many consecutive slice pairs with large differences. In this case, interpolation gives no satisfying results, and we are investigating general methods to reduce the level of user action required. Our interface for virtual surgery is GLUT-based, containing I/O tools for importing existing meshes and exporting the model into a FEM/FEA environment. Besides using scripts similar to those described above for building up the geometry, we also take advantage of the mesh generation and management capabilities of ANSYS in data exchange. That way, we can import a tetrahedron mesh used in OpenGL technology into ANSYS for FEA, for example, and ANSYS geometry also can be exported as a tetrahedron mesh for visualization purposes. Figure 5 shows an example of a tetrahedron mesh visualization in OpenGL.
FEM/FEA results. Using the volume reconstruction approach, we needed only to translate our internal representation to the scripting language. Material types and parameters also can be defined using scripts. The bone material model we used is linear isotropic. After applying constraints and forces on the nodes of the solids, we tested the stress and displacement of the bone structure. Using the obtained results, a comparison can be made among the known osteotomy interventions of a certain type. For femur lengthening, our experience indicated that the highest stress values occurred around the starting and ending boreholes of the cut, so we also considered the usability of different borehole types and helixes with variable pitch, as shown in Figure 6.
Dr. András Hajdu is an instructor with the Institute of Informatics at the University of Debrecen in Hungary and can be contacted at hajdua@inf.unideb.hu. His research is supported by OTKA grants T032361, F043090 and IKTA 6/2001. Zoltán Zörgő (zorgoz@inf.unideb.hu) is a PhD student at the Institute.
FEA in Micro-Robotics
Researchers use ANSYS to develop micron-sized, self-powered mobile mechanisms.
By Bruce Donald, Craig McGray, and Igor Paprotny of the Micro-Robotics Group, Computer Science Department, Dartmouth College; Daniela Rus, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology; and Chris Levey, Dartmouth Thayer School of Engineering
Mobile robots with dimensions in the millimeter to centimeter range have been developed, but the problem of constructing such systems at micron scales remains largely unsolved.
The anticipated applications for mobile micro-robots are numerous: manipulation of biological cells in fighting cancer, for example, or stealth surveillance technology in which clouds of flying micro-robots could monitor sites relatively undetected by sight or radar. Micrometer-sized robots could actively participate in the self-assembly of higher-order structures, linking to form complex assemblies analogous to biological systems. One could envision such self-assembly taking place inside a human body, growing prosthetic devices at their destination, for example, thus alleviating the need for intrusive surgery. Targeting these types of potential future micro-robotic applications, the Micro-Robotics Group at Dartmouth College has been developing a new class of untethered micro-actuators. Measuring less than 80 µm in length, these actuators are powered through a novel capacitive-coupled power delivery mechanism, allowing actuation without a physical connection to the power source. Finite element analysis using ANSYS allowed us to test the feasibility of the power delivery mechanism prior to actual fabrication of the device. The micro-actuators are designed to move in a stepwise manner utilizing the concept of scratch-drive actuation (SDA). The functionality of a scratch-drive
Figure 1. Concept behind scratch-drive actuation, which moves the micro-actuators in a stepwise manner. An electrical potential applied between the back-plate (1) and an underlying substrate (2) causes the back-plate to bend down, storing strain energy, while the edge of a bushing (3) is pushed forward. When the potential is removed from the back-plate, the strain energy is released and the backplate snaps back to its original shape, causing the actuator to move forward.
actuator is shown in Figure 1. The actuation cycle begins when an electrical potential is applied between the back-plate and an underlying substrate. The back-plate bends downward, storing strain energy, while the edge of a bushing is pushed forward. When the potential is removed, the strain energy is released and the back-plate snaps back to its original shape. The actuation cycle is now complete, and the actuator has taken a step forward. In contrast to traditional SDA power delivery schemes (such as rails or spring tethers), our designs induce the potential onto the back-plate using
www.ansys.com
ANSYS Solutions
Summer 2004
a capacitive circuit formed between underlying interdigitated electrodes and the back-plate of the actuator. A circuit representation of the system, as shown in Figure 2, indicated that the back-plate potential should be approximately midway between the potentials of the underlying electrodes. We validated the power delivery concept for the specific geometry of our design by modeling the system through electrostatic analysis in ANSYS. Figure 3 shows the volume model of the actuator and the electrode field. The results of the analysis are shown in Figure 4, indicating the electrical potentials of the conductive elements in the model. Additionally, a cut through the air element shows the electrical potential of the field propagating through it. The potentials of the electrodes in this example were set to 0 V (blue) and 100 V (red), which represented the model boundary conditions. The required potential of the back-plate was solved to be approximately 50 V, validating the circuit-model approximation.

We also discovered that the potential of the back-plate changes only slightly as a function of the orientation of the drive in relation to the electrode field. This indicates that the actuator can be powered regardless of its orientation, so long as the device remains inside the electrode field. Additionally, we used the ANSYS model to visualize the intensity of the electric field propagating through the bottom layer of the insulation material, as shown in Figure 5. We suspect charging of the device due to charge migration in the direction of the field, with charges embedding in the insulating layer underneath the drive. We anticipate that these charges will cluster along the areas where the electric field is strongest. In future experiments, attempts will be made to image this pattern using a scanning electron microscope.
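The circuit-model approximation can be checked with a one-line calculation: a floating conductor coupled capacitively to two electrode sets sits at the capacitance-weighted average of their potentials. A minimal Python sketch (the capacitance values are illustrative, not measured from the actual device):

```python
def floating_plate_potential(v1, c1, v2, c2):
    """Potential of a floating conductor capacitively coupled to two
    electrodes at potentials v1 and v2 (charge-neutral plate)."""
    return (c1 * v1 + c2 * v2) / (c1 + c2)

# Electrodes at 0 V and 100 V, as in the ANSYS boundary conditions.
# With roughly equal coupling capacitances (hypothetical 1 pF each),
# the back-plate sits near 50 V, matching the FEA result.
v_plate = floating_plate_potential(0.0, 1e-12, 100.0, 1e-12)
print(v_plate)  # 50.0
```

Unequal capacitances shift the plate toward the more strongly coupled electrode, which is why the FEA check across drive orientations mattered.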
Following the finite element analysis, we successfully fabricated and actuated an untethered scratch-drive actuator capable of motion at speeds of up to 1.5 mm/s, a good pace for such a tiny device. Our current work is focused on how to apply these actuators to create steerable autonomous micro-robotic systems. We anticipate further use of ANSYS to model the electrostatic and mechanical interaction of the system components to further shorten our development cycle. In particular, we plan to use the ANSYS coupled-physics solver to determine the snap-down and operational characteristics of our actuators.
Figure 3. Volume model of the actuator and the electrode field, prior to solving the model in ANSYS.
Figure 4. Results of the electrostatic analysis, indicating the calculated potentials of the different model components after applying the boundary conditions.
Figure 5. Intensity of the electric field propagating through the bottom insulation layer of the actuator.
Software Profile
Getting Geometry In

ANSYS ICEM CFD is well known for its ability to get geometry from virtually any source: native CAD packages, IGES, ACIS or other formats. The package continues to be unique among mesh generators in its ability to use geometry in both CAD and faceted representations. Faceted geometry is commonly used for rapid prototyping (stereolithography, STL), reverse engineering (where the STL geometry comes from techniques such as digital photo scanning) and biomedical applications (where the geometry can come from techniques such as magnetic resonance imaging [MRI]).

Fault-Tolerant Meshing

Having the geometry in hand doesn't do you any good if you can't create a mesh. Fault-tolerant meshing algorithms remain the heart of the ANSYS ICEM CFD meshing suite. Using an octree-based meshing algorithm, ANSYS ICEM CFD Tetra generates a volumetric mesh of tetrahedral elements that are projected to the underlying surface model. This methodology renders the mesh independent of the CAD surface patch structure, making the meshing algorithm highly fault-tolerant: sliver surfaces, small gaps and surface overlaps cause no problem, and the mesh has the ability to walk over small details in the model. Control is in the hands of the user, who has the flexibility to define which geometric details are ignored and which are represented accurately by the mesh. Tetra's computation speed has been improved with V5.0. As an example, a test model of 250,000 elements and moderate geometry complexity required 32% less CPU time during meshing when compared with the previous version. The Delaunay tet meshing algorithm was added to the meshing suite in the previous version and has undergone numerous improvements, including support for density volumetric mesh controls.

ANSYS ICEM CFD remains the clear choice for meshing complex geometry. Shown is a tet/prism mesh for a race car wheel and suspension.
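The octree idea can be sketched in a few lines of Python (a toy sphere stands in for the CAD surface; none of this is ICEM CFD code): cells that straddle the surface are subdivided, everything else stays coarse, so the refinement follows the geometry rather than any CAD patch structure.

```python
import math

def straddles_sphere(center, half, r=0.35, c=(0.5, 0.5, 0.5)):
    """True when the cube (center, half-width) cuts the sphere surface:
    its nearest point lies inside the sphere and its farthest outside."""
    near = far = 0.0
    for x, cx in zip(center, c):
        d = abs(x - cx)
        near += max(d - half, 0.0) ** 2
        far += (d + half) ** 2
    return math.sqrt(near) <= r <= math.sqrt(far)

def octree(center=(0.5, 0.5, 0.5), half=0.5, depth=0, max_depth=4):
    """Refine only cells that cut the surface, leaving the rest coarse:
    the graded cell population an octree mesher later projects onto
    the underlying surface model."""
    if depth == max_depth or not straddles_sphere(center, half):
        return [(center, half)]
    h = half / 2.0
    cells = []
    for dx in (-h, h):
        for dy in (-h, h):
            for dz in (-h, h):
                child = (center[0] + dx, center[1] + dy, center[2] + dz)
                cells += octree(child, h, depth + 1, max_depth)
    return cells

cells = octree()
print(len(cells), "cells; finest half-width", min(h for _, h in cells))
```

The leaf sizes come out graded: fine along the surface, coarse away from it, which is the fault-tolerance point — sliver patches and small gaps in the surface description do not dictate the cell structure.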
One major development is that V5.0 is the first version of ANSYS ICEM CFD capable of running within the ANSYS Workbench Environment, the common platform for all ANSYS products. For viscous CFD applications, tet meshes can be improved by adding a layer of prism elements for improved near-wall resolution of boundary layer
Prism before
Prism after
Images showing a cut through a hybrid hex/tet mesh of a wind tunnel/missile configuration before and after adding a layer of prism elements on the wind tunnel walls. Note that the prism layer is included for both the hex and tet zones (new feature in V5.0).
flows. ANSYS ICEM CFD Prism also has been improved for this release. Prism layers can now be grown from surface mesh without the need for an attached volume tet mesh. Perhaps more significantly, prism layers can now be grown from both tri and quad elements. This means that it is now possible to grow a prism layer in a combined hybrid hex/tet mesh.
efficient. Most operations now take advantage of multi-selection methods, such as box and polygon select. The addition of blocking hotkeys is a real time-saver, giving the user single-keystroke access to the most frequently used operations. For shell meshing, V5.0 offers unstructured 2-D blocks, combining the best of ANSYS ICEM CFD Hexa and the patch-based mesher formerly known as Quad. The creation of blocks for 2-D shell meshing has been automated, so that blocks can be created automatically for all selected surfaces.
Mesh Editing
ANSYS ICEM CFD offers maximum flexibility in its mesh editing tools, whether via global smoothing algorithms or techniques to repair or recreate individual problem elements. These tools provide one last place to work around any bottlenecks. Noteworthy are new unstructured hex mesh smoothing algorithms, which strive for mesh smoothness and near-wall orthogonality while preserving mesh spacing normal to the wall. Two new quality metrics have been added to help quantify mesh smoothness: adjacent cell volume ratio and opposite face area ratio.
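The first of the two new metrics is simple to state: for every pair of adjacent cells, take the larger volume over the smaller and report the worst case. A sketch on a hypothetical four-cell mesh (toy data, not ICEM CFD output):

```python
def adjacent_volume_ratio(volumes, adjacency):
    """Worst ratio of volumes between face-adjacent cells; 1.0 is a
    perfectly smooth mesh, large values flag abrupt size jumps."""
    worst = 1.0
    for a, b in adjacency:
        va, vb = volumes[a], volumes[b]
        worst = max(worst, max(va, vb) / min(va, vb))
    return worst

# Four hex cells in a row; the 3 -> 4 size jump dominates the metric.
vols = {1: 1, 2: 2, 3: 2, 4: 6}
print(adjacent_volume_ratio(vols, [(1, 2), (2, 3), (3, 4)]))  # 3.0
```

A smoothing pass would aim to drive this value back toward 1.0 without disturbing the near-wall spacing.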
Scripting Tools
ANSYS ICEM CFD provides a powerful suite of tools for geometry creation, model diagnosis and repair, meshing and mesh editing. All of these tools are exposed at the command-line level, providing a formidable toolbox for the development of vertical applications. Every operation performed can be stored in a script for replay on model variants. This power can be extended by using the Tcl/Tk scripting language, enabling the development of entire applications. These tools enable users to get around virtually any geometry or meshing bottleneck, getting the mesh you need using the geometry you have.
LEFT: Representation of a cross-section of an abdominal aortic aneurysm (AAA) with a bifurcating stent-graft. RIGHT: Representation of an aortic artery aneurysm (bulge on left) between the renal artery (to the kidneys, top) and the iliac bifurcation (to the legs). Aside from the color shading chosen, this is what the surgeon would see before starting to implant the stent-graft.
Wall displacements and pressure/stress levels for Re_steady = 1200, using CFX and ANSYS: (left) axisymmetric AAA, and (right) stented AAA, where the stent-graft clearly shields the weakened aneurysm wall from the blood flow.
Schematic representation of an axisymmetric AAA, including implanted stent-graft with relevant analytical data.
Using five case histories, we employed CFX and ANSYS Structural to compute the incipient migration forces of a stented graft under different placement conditions. In the process, we modeled different artery neck configurations, variable arterial wall thicknesses, transient hemodynamics and multi-structure interactions. The actual stented AAA model in ANSYS consisted of a lumen or bulge in the artery wall, an endovascular graft shell, a cavity of stagnant blood and the AAA wall. The iterative fluid-structure interaction was computationally intense, as ANSYS Structural and CFX exchanged coupled variations in wall flex and geometry, requiring several new flow and structure results at each time step. The ANSYS Structural problem centered on nonlinear, large-deformation, contact and dynamic analyses.
shear stress, inappropriate configurations of the healthy aortic neck section, tissue problems in the aortic neck segment and biomechanical degradation of the prosthetic material. To set the model stent-graft into motion, an increasing pull force was applied with an APDL subroutine. Coulomb's Law was used for each contact element's friction coefficient, but the simulations revealed a nonlinear correlation at large displacements between the migration force needed to move the stent and the friction coefficients. The simulations also revealed that the risk of displacement rises sharply in patients with high blood pressure. Coupled ANSYS and CFX fluid-structure simulations verified that a stent-graft can significantly reduce the risk of an aneurysm rupture even when high blood pressure is the fundamental cause. Clearly, these tools for blood-flow-stent-artery interactions are valid, predictive and powerful for optimal surgical recommendations, improved stent designs and proper stent placement.
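The migration criterion itself is a Coulomb threshold: slip begins once the applied pull exceeds the friction the contact surfaces can supply. A deliberately simplified sketch with hypothetical forces (the simulations described above found the real force-friction relationship to be nonlinear at large displacements, which this linear check does not capture):

```python
def migrates(pull_force, mu, normal_force):
    """Coulomb threshold: the stent-graft starts to slip once the
    applied pull exceeds the friction the contact can supply."""
    return pull_force > mu * normal_force

# Hypothetical values in newtons: hemodynamic drag on the graft grows
# with blood pressure, so the same anchoring friction (mu * N = 6 N
# here) is exceeded sooner in a hypertensive patient.
mu, normal = 0.3, 20.0
print(migrates(4.5, mu, normal))  # False: lower-pressure drag holds
print(migrates(7.0, mu, normal))  # True: higher-pressure drag slips
```

This is why the displacement risk rises sharply with blood pressure: the friction limit is fixed by the contact, while the destabilizing drag scales with the pressure load.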
For this study, CFX-4 was linked to ANSYS with Fortran to perform fluid-structure interaction. Presently, generalized, fully representative stented abdominal aortic aneurysm configurations are being analyzed, employing ANSYS and CFX-5.
During oil processing, heavier products are broken down by high temperatures into lighter products in cokers. This cracking process strips off lighter liquid hydrocarbon products such as naphtha and gas oils, leaving heavier coke behind. The challenge that CSIRO Minerals has been helping Syncrude resolve is how best to reduce the coke deposits that build up in its fluid coker stripper while maintaining or improving hydrocarbon stripping.
Syncrude Canada Ltd. is the world's largest producer of crude oil from oil sands and the largest single-source producer in Canada. CSIRO (Australia's Commonwealth Scientific and Industrial Research Organisation) is one of the world's largest and most diverse scientific global research organizations. CSIRO Minerals is a long-time user of CFX and, in collaboration with the Clean Power from Lignite CRC, developed the fluidized bed model in CFX-4. Because of its robust multiphase capability and its ability to be extended into new application areas, CFX is used extensively by CSIRO Minerals in undertaking complex CFD modeling of multiphase, combustion and reacting processes in the mineral processing, chemical and petrochemical industries.

In the past, physical modeling had been used to understand the flow of solids and gas in the stripper. This modeling is performed at ambient conditions, so scaling of both the physical size and materials is required to approximate the actual high temperature and pressure in the stripper. This scaling process can introduce some uncertainty in understanding the actual stripper operation.
Maintenance work on a coker unit at Syncrude's oil sands plant in Alberta, Canada.
By using CFD modeling to complement the physical modeling programs, scaling is eliminated and the actual dimensions and operating conditions are used. Furthermore, CFX simulation provides much greater detail of the flows and forces in the stripper than can be obtained from physical models or from the plant, given the difficulty of making measurements and visualizing the flow in complex multiphase systems. Syncrude senior research associates Dr. Larry Hackman and Mr. Craig McKnight explain that extensive cold flow modeling (but not CFD modeling) had previously been used to investigate the operation of the fluid bed coker stripper and the gas and solids behavior in the unit. McKnight notes that this project with CSIRO Minerals resulted in detailed, high-quality reports, which provide a new understanding of the fluid coker stripper operation. Hackman indicated, "By using CFX to gain a better understanding, it is anticipated that design changes will be identified to improve stripping efficiency, reduce shed fouling and optimize stripper operation."

To most efficiently perform the simulations and utilize the results, the two companies are leveraging the distance separating their facilities. When it is night in Edmonton, Alberta, Canada, where Syncrude Research is located, CSIRO Minerals staff is hard at work in Australia performing analyses and posting results (including pictures and animations) on their extranet. The next morning, the group in Canada can view the progress of the modeling work and provide feedback for a quick turnaround. In this way, CSIRO is utilizing CFX technology to assist Syncrude in determining how best to utilize their current plant to get maximum throughput and thus make the most of their capital investment.
Three-dimensional fluidized bed model of the Syncrude fluid coker stripper, shown at 0.0, 5.0, 9.0, 13.0, 16.5 and 20 seconds. The model predicts the motion of bubbles (in purple) rising from injectors in the lower part of the bed and the complex flow behavior of coke particles. Flow simulations provide insights into the stripper operation, which are then used to improve the design. (Legend: gas volume fraction, 0.45 to 0.75.)
CFX data can be interpolated directly onto ANSYS CDB files, providing a flexible route to transfer CFX results to an existing ANSYS mesh.
CFD meshing technology provides a comprehensive CAD-to-meshing solution for CFD applications. As this is often the most time-consuming stage of CFD simulation, this represents a genuine time-saving benefit to ANSYS CFX users. The latest release introduces a fluid-structure interaction (FSI) capability for cases in which the interaction of a fluid with the surrounding solid is important, such as fluid-induced stresses and heat transfer. A simple-to-use, one-way transfer of data from a CFX solution to ANSYS provides for seamless passing of thermal and load information from fluids to structural analysis. This approach automatically interpolates the data into the ANSYS CDB file format. For more complex FSI situations, such as large-scale solid deformation or motions in which the two-way influences are important, CFX-5 can dynamically interact with ANSYS stress analysis. ANSYS, Inc. has the unique distinction of offering the industry's only native connection between such components, which means ease of use, flexibility and reliability.
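The one-way transfer can be pictured as a mapping problem: each node of the structural mesh must receive a load value from the CFD solution, whose nodes generally do not coincide with it. A nearest-neighbor sketch (real tools use more careful, conservative interpolation than this; the data below are made up):

```python
def map_loads(cfd_nodes, cfd_pressures, fea_nodes):
    """One-way FSI sketch: give each structural node the pressure of
    the nearest CFD node (a crude stand-in for true interpolation)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    mapped = []
    for fn in fea_nodes:
        nearest = min(range(len(cfd_nodes)),
                      key=lambda i: dist2(cfd_nodes[i], fn))
        mapped.append(cfd_pressures[nearest])
    return mapped

# Three CFD nodes along a wall, two non-coincident structural nodes.
cfd = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
p = [101.0, 104.0, 110.0]
print(map_loads(cfd, p, [(0.1, 0.0), (1.9, 0.0)]))  # [101.0, 110.0]
```

The production path additionally conserves integrated force and heat flux across the interface, which a plain nearest-neighbor lookup does not guarantee.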
Reacting particles are a feature of this release, including a fully featured coal combustion model.
How many holes do we need to dig? Construction costs can exceed $1 million for a new 22m diameter tank at a water treatment plant.
Concentration profiles through a cross section of the clarifier, approaching 8000 mg/l solids in the blanket. This tank features an energy-dissipating influent (EDI), an optimized stilling well diameter and additional Stamford baffling below the effluent weir.
A useful post-processing idea is to track streamlines for the solid-phase velocity field, in this case colored by the G scalar to show where floc may experience the greatest shear.
Upfront Analysis in the Global Enterprise
same processes and techniques. Using analysis as an integrated part of product development enables engineers from around the world to collaborate in unprecedented ways. Many of Delphi Corporation's customers are global companies that market and sell their products around the world. It is therefore important for all of Delphi's resources to be used to satisfy our customers' needs regardless of where the need arises. Recent programs at Delphi Electronics and Safety (formerly Delphi Delco Electronics Systems) have involved just such a scenario. Engineers from three different countries have been involved in the design process from the moment contracts are awarded. Even while some of the system features are being finalized, the resources of the company around the world are mobilized to analyze and evaluate the component designs. Finite element analysis is used extensively to evaluate component performance. In many cases the early analysis indicates that modifications are necessary. The modifications are made and assessed until all problems are eliminated. Engineers responsible for making design modifications can use local resources as well as those abroad to ensure the viability of their design. For example, many engineers at Delphi Electronics and Safety's design centers around the world have been trained to use first-order analysis tools. These engineers are usually able to use analysis to eliminate many design flaws. However, they often need help in completing the picture, either because of a shortage of time and other resources, or because they lack the specialty skills that are available at other sites. Finally, one of the most important reasons for performing upfront CAE is simply that many of our customers require it. In many cases, customers have developed extensive validation requirements that use simulation extensively in the concept approval phase.
Early simulation is especially important when engineers at dispersed locations must collaborate in product development.
By Fereydoon Dadkhah, Mechanical Analysis and Simulation, Delphi Electronics and Safety

Two important side effects of the continuing pressure to reduce product development time and costs have been the increased use of analysis in the early stages of design and the development and manufacturing of many products at overseas sites. Upfront analysis has been identified by many companies as a critical stage of product development due to the many benefits it provides. Done properly, upfront analysis can shorten the design cycle of a product drastically by identifying problems early, before a substantial investment of time and material has been made in the product. In the earlier stages of design, engineers have more options at their disposal when changing a design to address problems uncovered by analysis. As a product's design approaches completion, many design modification options are eliminated for a variety of reasons, such as manufacturability, cost, system integration and packaging. Therefore, fixes for problems that are discovered later in the process are generally more expensive to implement. Once a problem is discovered using upfront analysis, all the viable design options can also be evaluated by employing the same analysis techniques. As a result, when a prototype is finally built and tested, it is much more likely to pass the tests than if upfront analysis had not been used.

Another fact of today's global economic environment is that many companies have moved beyond establishing manufacturing-only facilities overseas to performing some of their product development activities at the overseas locations as well. This global footprint can lead to situations where a product is conceived and its performance requirements specified in country A, it is then designed and tested in country B, and it is mass produced in country C.
Therefore, development centers have to be flexible enough to respond to the needs of their local market as well as be able to develop products for different, distant markets. Once again, shortened design schedules make the use of CAE mandatory, especially in the early stages. Because of the distributed product development process, it is important that all the engineers and designers use the
Simulation at Work
Founded in 1895, DePuy is the oldest manufacturer of orthopedic implants in the United States, with a reputation for innovation in new product development. The company has patented a wide range of replacement knee systems, the first of which was developed more than 20 years ago. One of these incorporates a state-of-the-art mobile bearing, which offers a wide range of options to allow the surgeon to match the implant to the patient's anatomy. Figure 1 illustrates a typical replacement knee. In one recent application, two sizes of a replacement knee design were analyzed at different angles of articulation using ANSYS. Initially, finite element results were compared with known experimental measurements obtained on one of the two sizes at three angles of articulation. Once correlation had been achieved, the same methodology was used to analyze the other design at various angles.
Both the femoral component and the bearing were meshed with 3-D higher-order tetrahedral elements. The meshing of the two parts was made fully parameterized. The mesh on the underside of the femoral component was made sufficiently fine to ensure minimal loss of accuracy in the geometry of the curved contact surfaces. A coarser mesh was used in the interior and on the upper side of the femoral component, since its material was significantly stiffer than that of the bearing, and, consequently, very little structural deformation was expected. Another option was to mesh the contact surfaces of the femoral component with rigid target elements, with the load applied to a pilot node. A similar approach was used for the bearing, as the size of the elements was more critical in the contact region than on other, non-contacting surfaces. However, a mesh density even finer than that on the contact surfaces of the femoral component was desirable in the bearing to ensure a good resolution of the contact area and stresses. An indiscriminate refinement of the mesh on all the upper surfaces of the bearing proved to be computationally too expensive, and a new meshing procedure was developed and tested by IDAC, a finite element analysis and computer-aided engineering consulting firm and the leading UK provider of ANSYS and DesignSpace software.
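The refinement strategy described, fine on the contact surfaces and coarsening with distance from them, can be expressed as a sizing function instead of an indiscriminate global refinement. A sketch with hypothetical sizes in millimeters (not IDAC's actual procedure):

```python
def element_size(dist_to_contact, h_min=0.2, h_max=2.0, growth=0.3):
    """Target element size: h_min on the contact patch, growing
    linearly with distance and capped at the coarse interior size."""
    return min(h_min + growth * dist_to_contact, h_max)

# Fine where contact stresses must be resolved, coarse in the stiff
# femoral interior where little deformation is expected.
print(element_size(0.0))   # 0.2  (on the contact surface)
print(element_size(10.0))  # 2.0  (deep interior, capped)
```

Concentrating elements this way buys contact-stress resolution where it matters without paying for it over the whole bearing surface.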
Figure 1. One of DePuy's knee implants incorporates a mobile bearing that offers a wide range of options to allow the surgeon to match the implant to a patient's anatomy.
Figure 2. Solid geometry of orthopedic knee design in ANSYS after importation of the CAD model in Parasolid format.
Figure 3. Analysis shows stress distribution in contact area between the bearing and the femoral component.
Faster...
Quickly study the design impact of varying geometry, even without a parametric CAD model.
By Pierre Thieffry, ParaMesh and Variational Technology Solutions Specialist, and Raymond Browell, Product Manager, ANSYS, Inc.

The combination of the ANSYS Workbench Environment and DesignXplorer VT provides ANSYS users with powerful tools for gaining significant insight into designs when working with a CAD system. Bi-directional parametric associativity with the parent CAD package, made possible by the ANSYS Workbench Environment, makes understanding the design impact of varying geometry easy and comprehensive. But what if this is an old design and the user cannot find the geometry files for the part? Or perhaps you have the geometry, but it is in a non-associative format such as IGES or Parasolid. Maybe the geometry is parametric and regenerates robustly, but the parameters created by the designer are not the ones that make sense for the analyst. For instance, the parameter definitions might be chained together so that it is impossible to vary one feature without changing others. Perhaps a consultant provided the user with only the FEA or math model, and the original geometry used to create the model isn't available or would take too much time to recreate. This is quite often the case with legacy models.

Typically, this would be the end of the story. But with the combination of ParaMesh and DesignXplorer VT, it is just the beginning. Take an inside look at how these tools can be used to study a legacy model like the engine torsional damper model shown in Figure 1. To optimize this engine damper, perform the following six-step procedure:

1. From an existing ANSYS database, write a .cdb file.
2. Import the .cdb file into ParaMesh.
3. Create the mesh morphing parameters within ANSYS ParaMesh. (Note the names of the parameters and their order of creation; this information will be needed in the following steps.)
4. Declare the mesh morphing parameters by editing the ANSYS input file.
5. Perform the parametric solution using ANSYS DesignXplorer VT.
6. Post-process with Solution Viewer, the DesignXplorer VT post-processor within the ANSYS Environment, and, if desired, optimize the results.
Figure 3. Extremes in part geometry obtained by varying mesh morphing parameters (inner_diam shown at its maximum value).
The fourth parameter is the location of the start of the bevel angle bend and is named Angle_position, which has a range of −6 to +2 mm with an initial value of about 20 mm. Comparing the images in Figure 3 indicates the extremes of the part. The boundary conditions applied are: symmetry conditions on the planar faces, the structure clamped on the central hole and an inward radial pressure applied on the external surface. For Step 4, edit the ANSYS input file (see Figure 4) and declare the ParaMesh mesh morphing parameters so that DesignXplorer VT will know to solve for them. As seen in the sample file, the parameter declarations are straightforward. To access the ParaMesh parameters from within the ANSYS DesignXplorer VT solution, use the SXGEOM command.
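Step 4 amounts to inserting a handful of commands into the input file, which is easy to script. A hypothetical Python helper (the function name and deck layout are illustrative; the SXRFIL/SXGEOM commands and parameter names are those from the sample file in Figure 4):

```python
def declare_parameters(deck_lines, rsx_name, params):
    """Insert SXRFIL/SXGEOM declarations ahead of FINISH so that
    DesignXplorer VT knows which ParaMesh parameters to solve for.
    `params` must be listed in their ParaMesh creation order."""
    decls = [f"SXRFIL,{rsx_name},rsx"]
    decls += [f"SXGEOM,{p}" for p in params]
    i = deck_lines.index("FINISH")
    return deck_lines[:i] + decls + deck_lines[i:]

deck = ["/SX", "SXMETH,,AUTO", "FINISH", "/SOLU", "STAOP,SX"]
out = declare_parameters(deck, "tor_spring",
                         ["inner_diam", "hole_diam",
                          "angle_position", "hole_position"])
print(out[2])  # SXRFIL,tor_spring,rsx
```

Scripting the edit also enforces the consistency the procedure warns about: the names and order written into the deck come straight from the list used when the parameters were created.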
/SX
SXMETH,,AUTO
! Define the output results
SXRSLT,disp,NODE,U,ALL,,
SXRSLT,sigma,ELEM,S,ALL,,
SXRSLT,mass,ELEM,MASS,ALL,,
! Define the file where the parameters have been created
SXRFIL,tor_spring,rsx
! Declare the shape parameters
SXGEOM,inner_diam
SXGEOM,hole_diam
SXGEOM,angle_position
SXGEOM,hole_position
FINISH
/SOLU
! Prepare for a DXVT solution
STAOP,SX
Figure 4. Sample section of the ANSYS input file.
Starting with a sensitivity histogram as shown in Figure 5, the sensitivity of maximum stress with respect to each of the mesh morphing parameters is evident. These values are interpreted such that, for a change from the minimum value to the maximum value of the parameter hole_position, the maximum stress increases by 103 MPa. For the hole_diam parameter, the maximum stress decreases as the parameter increases.
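Histogram values of this kind can be reproduced from any response function by sweeping each parameter across its full range and recording the change in output. A sketch with a made-up linear stress response (not the damper's actual surface; positive means stress rises with the parameter, negative means it falls):

```python
def sensitivity(f, base, ranges):
    """Max-minus-min output change as each parameter sweeps its range
    while the others stay at their base values."""
    out = {}
    for name, (lo, hi) in ranges.items():
        at = lambda v: f({**base, name: v})
        out[name] = at(hi) - at(lo)
    return out

# Hypothetical stress response in MPa.
stress = lambda p: 300 + 20 * p["hole_position"] - 5 * p["hole_diam"]
s = sensitivity(stress,
                {"hole_position": 0.0, "hole_diam": 4.0},
                {"hole_position": (-3.0, 3.0), "hole_diam": (2.0, 6.0)})
print(s)  # {'hole_position': 120.0, 'hole_diam': -20.0}
```

One-at-a-time sweeps like this ignore parameter interactions, which is one reason the full response surface discussed next is more informative than the histogram alone.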
This brings us to Step 5, solving the model with the mesh morphing parameters using DesignXplorer VT. DesignXplorer VT uses a new and exclusive technique called Variational Technology. In a traditional finite element analysis, each change in the value of any input variable requires a new finite element analysis. To perform a what-if study in which several input variables are varied over a certain range, a considerable number of finite element analyses may be required to satisfactorily evaluate the results over the range of the input variables. With other methods, each design candidate requires a complete re-mesh and re-solve. The benefit of Variational Technology is that only one solution is required to make the same type of forecast that other methods provide. The response surface created by DesignXplorer VT is an explicit approximation function of the finite element results expressed as a function of all selected input variables. Variational Technology provides more accurate results, faster. Now post-process the analysis using the Solution Viewer, the DesignXplorer VT post-processor within the ANSYS Environment.
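The cost difference can be made concrete by counting solver calls: a sampling-based study pays one finite element solution per design point, while the Variational Technology route pays for one solve and then evaluates an explicit approximation function. A toy comparison (the quadratic "result" stands in for a real FE model, and the approximation is taken as exact here for simplicity):

```python
solves = 0  # counts full re-mesh + re-solve cycles

def fe_solve(x):
    """Stand-in for a complete finite element solution at one design."""
    global solves
    solves += 1
    return 300 + 20 * x - 4 * x * x  # pretend FE stress result

# Sampling route: one FE solution per candidate design.
sampled = [fe_solve(x / 10.0) for x in range(101)]

# Variational Technology route: one solve yields an explicit function
# of the input variable, evaluated instantly at every candidate.
approx = lambda x: 300 + 20 * x - 4 * x * x
surface = [approx(x / 10.0) for x in range(101)]

print(solves)  # 101 FE solutions sampled vs 1 for the VT route
```

In practice the approximation carries some error away from the base solution, but the evaluations remain effectively free, which is what makes the instant what-if browsing described below possible.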
Figure 6. Histogram showing sensitivity of stress, displacement and mass with respect to morphing parameters.
DesignXplorer VT's Solution Viewer allows the user to view the sensitivity of multiple results to the input parameters. In the histogram shown in Figure 6, we see the sensitivity of the maximum von Mises stress, maximum displacement and model mass with respect to all of the mesh morphing parameters. These sensitivities are relative ones. The angle_position parameter has essentially no effect on the stress.
With the initial structure, there is a maximum von Mises stress of 305 MPa, a maximum displacement of 0.06 mm and a mass of 116 g. The goal is to keep the maximum stress under 265 MPa while keeping the mass as low as possible. To reach this objective, the above sensitivities give some ideas about the changes to be made. The parameter hole_position has the most influence on the stress and has to be lowered. Moreover, it does not affect the mass, so it is a critical parameter for stress reduction only. The same holds for the inner_diam parameter's effect on the mass: it is the most influential on mass and has little effect on stresses. To reach the objective, expect these two parameters to be lowered.
Note the complexity of the response of this torsional damper with respect to the input parameters. As seen in the design curves, all but one of the responses to the input parameters are nonlinear. The response of the stresses to hole_diameter has a distinct kink in it. The reason is that the maximum stress jumps from one location to another as the parameters change. Simple Design of Experiments (DOE) curve fitting of results to selected samples would not typically have discovered this. One of the unique features of DesignXplorer VT is the instant, real-time availability of the entire finite element solution anywhere in the parameter domain. Pick any combination of parameter values and see a contour display of the finite element results. Unlike DOE, DesignXplorer VT already has these results directly available to the user. Figure 9 shows contours of von Mises stress from directly inside the Solution Viewer, for the parameter hole_position at -3 mm, 0 and 3 mm. The color scheme is the same for all meshes, so the evolution with the given parameter is directly visible.
hole_position at -3mm
Additionally, DesignXplorer VT's Solution Viewer allows you to view your parametric response either as design curves, such as those in Figure 7, or as response surfaces, as shown in Figure 8.
hole_position at 0
hole_position at 3mm
Figure 8. Response surface created by Solution Viewer.

Figure 9. Stress plots created by Solution Viewer with respect to varying parameter hole_position.
Software Highlights
DesignXplorer VT also includes powerful optimization and tolerance capabilities. Using the optimization capabilities built into the Solution Viewer, you can optimize the part. As stated before, the goal is to minimize the mass of the part while keeping its maximum stress under 265 MPa. In this case the optimization needs only 63 iterations, completed in 60 seconds, an amazingly short time considering the number of iterations. This is because, as mentioned earlier, DesignXplorer VT has the entire finite element solution anywhere in the parameter domain; no additional solutions are required. The final maximum stress is 265 MPa, more than 10% below the initial stress value. The final mass has also been lowered to 101 g, a saving of 13%, so the result is a better solution in terms of both stress and mass. The powerful combination of ParaMesh and DesignXplorer VT opens doors to analyses that were never before possible. Previously, you had to guess, or optimize manually. Now parametric analysis is available no matter what environment is being used: Workbench for those who have parameterized CAD models, and ParaMesh with DesignXplorer VT for those with models without parametric CAD. That is the value this powerful combination of ANSYS ParaMesh and ANSYS DesignXplorer provides: more design insight, faster...even for legacy models.
Procedure Overview
1. Create an ANSYS .cdb file
2. Import the .cdb file into ANSYS ParaMesh
3. Create the mesh morphing parameters within ANSYS ParaMesh
4. Declare the mesh morphing parameters by editing the ANSYS input file (file.dat); be sure to use consistent names
5. Edit the input file (file.dat) by adding the following commands: SXGEOM, SXRFIL, SXMETH
6. Perform the parametric solution by using ANSYS DesignXplorer VT

Follow these six steps in using ParaMesh and DesignXplorer VT to optimize legacy models, even if you do not have geometry or if the geometry is non-associative. ParaMesh easily prepares these models so they can be studied with DesignXplorer VT to arrive at quick insight into the design impact of varying the geometry.
Tech File
Node-to-Node Elements
In the early days of finite element analysis, there was one type of contact element: the node-to-node variety. The early versions of node-to-node contact elements were CONTAC12 (2-D) and CONTAC52 (3-D). More recently, CONTA178 (2-D and 3-D) was introduced to encompass the capabilities of both of these elements and also introduce some new features, such as additional contact algorithms. Node-to-node contact elements are simple and solve relatively quickly. Their basic function is to monitor the movement of one node with respect to another node. When the gap between these nodes closes, the contact element allows load to transfer from one node to the other. What does this really mean and how does ANSYS know when the nodes are touching?
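As a rough sketch of how such an element might be defined in an APDL input file (the node numbers, type number and penalty-algorithm choice here are illustrative assumptions, not taken from the article):

```apdl
! Minimal sketch: one CONTA178 node-to-node contact element between
! two existing nodes. Node numbers 101 and 102 are assumed.
ET,2,CONTA178       ! node-to-node contact element type
KEYOPT,2,2,1        ! assumption: select the pure penalty algorithm
R,2                 ! real constant set; gap defaults from node locations
TYPE,2
REAL,2
E,101,102           ! create the contact element between the two nodes
```

Once defined, the element monitors the gap between the two nodes and transfers load when the gap closes.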
These characteristics are true for all types of contact elements. While they may seem a little primitive when compared with the newer contact elements, node-to-node contact elements have a lot going for them. They've been around long enough to have had their bugs worked out many years ago, and their extensive use over several decades means that there is a vast experience base to draw upon when setting up and debugging an analysis. CONTAC12 and CONTAC52 can have nodes that are either coincident or non-coincident. While the majority of applications involve using non-coincident nodes, coincident nodes can be useful for certain analyses. If coincident nodes are used, the orientation of the contact surface that exists between the two nodes must be defined. The initial gap or interference condition can be provided by the user as either positive (gap) or negative (interference), or automatically calculated from the relative positions of the nodes. Node-to-node contact is also available in COMBIN40. COMBIN40 is a rather unique element because it also includes a spring-slider, a damper (which works in parallel with the spring-slider) and a mass at each node. Any of these features can be used alone or simultaneously with any or all of the other features. While node-to-node contact elements are very useful, there are some limitations that must be kept in mind when using them. One limitation is that the orientation of the gap is not updated when large deflection analyses are performed. Another limitation is that these elements do not account for moment equilibrium. This does not present a problem when a line drawn between the nodes is normal to the contact surface, because in this instance the moments are zero, but care should be taken in each analysis to recognize whether this is the case. If not, it is important to consider what effect this might have on the results.
It is the responsibility of the analyst to recognize whether this condition is present and whether it introduces an unacceptable error that invalidates the usefulness of the analysis. Node-to-node elements can always be generated manually, and, depending on the model, you can often use the EINTF command to make them as well.
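A hedged sketch of the EINTF approach (the tolerance value, type number and node-selection criterion are assumptions):

```apdl
! Sketch: generate node-to-node contact elements between pairs of
! nearly coincident nodes on two previously meshed parts.
ET,3,CONTA178       ! active element type for the generated elements
R,3                 ! real constant set (defaults)
TYPE,3
REAL,3
NSEL,S,LOC,Z,0      ! assumption: both mating faces lie near z = 0
EINTF,0.001         ! pair up selected nodes within a 0.001 tolerance
ALLSEL
```

EINTF creates one element for each pair of selected nodes falling within the tolerance, which is far faster than issuing E commands node pair by node pair.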
Node-to-Surface Elements
The next evolution in contact elements was the introduction of node-to-surface contact elements, such as CONTAC26 (2-D), CONTAC48 (2-D), CONTAC49 (3-D), and the recent addition of CONTA175 (2-D and 3-D). The major enhancement offered by node-to-surface contact elements is that they allow a node to contact anywhere along an edge (in 2-D) or a surface (in 3-D). Rather than being confined to contacting a specific node, a node can contact anywhere along the edge or face of an element. This has significant benefits when objects translate or rotate relative to each other. Node-to-surface contact elements are capable of simulating large relative movements with accuracy. Because CONTA175 includes all the capabilities of the other node-to-surface contact elements and has other features that these elements do not have, CONTA175 will replace the other node-to-surface elements in future versions of ANSYS. Beginning in ANSYS 8.1, CONTAC26, CONTAC48 and CONTAC49 will be undocumented, and they will eventually be removed from ANSYS. There are several ways to generate node-to-surface contact elements. They can be made manually, but this becomes impractical when making more than a few elements. GCGEN and ESURF are commands frequently used to generate node-to-surface contact elements: GCGEN is the easiest and quickest way to make CONTAC48 and CONTAC49 elements, while ESURF is used to make CONTA175 elements. To use GCGEN, you create two components, one containing the nodes from one of the contact surfaces and another containing the elements from the other contact surface, and then use GCGEN to automatically generate node-to-surface contact elements between every node and every element in these components.
To use ESURF, you select the elements that the CONTA175 elements will be attached to and their nodes that are on the surface you wish to place the contact elements onto, making sure that you have the proper element attributes active (TYPE, REAL and MAT), and then issue the ESURF command. Last but not least, the Contact Wizard can be used to generate node-to-surface contact elements and is usually the easiest and quickest way of making them.
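The ESURF steps just described might look like the following sketch (the component name, type number and face location are assumptions):

```apdl
! Sketch: overlay CONTA175 node-to-surface elements on one face.
ET,4,CONTA175       ! node-to-surface contact element type
R,4                 ! real constant set for this contact pair
TYPE,4              ! make the proper element attributes active
REAL,4
MAT,1
ESEL,S,ELEM,,BLOCK1 ! assumed component: elements under the contact face
NSLE,S              ! select the nodes of those elements
NSEL,R,LOC,X,25     ! assumption: the contact face lies at x = 25
ESURF               ! generate CONTA175 elements on the selected nodes
ALLSEL
```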
Surface-to-Surface Elements
The latest evolution of contact element technology has been in the area of surface-to-surface contact. This allows contact to take place between one or more edges in 2-D, or one or more surfaces in 3-D. There are several important characteristics that make surface-to-surface contact elements very different from their less sophisticated ancestors. Surface-to-surface contact is not defined by a single element, but by two types of elements called targets and contacts. Any number of target and contact elements can be identified as being a set or group. Contact can take place between any contact elements and any target elements that are in this group. ANSYS uses the real constant number to identify the target and contact elements that are in a group. All target and contact elements in this group have the same real constant number. Two-dimensional contact problems can be simulated using either CONTA171 or CONTA172 with TARGE169, while three-dimensional problems would use either CONTA173 or CONTA174 with TARGE170.
CONTA171 and CONTA173 are appropriate for edges and surfaces made from linear (no midside nodes) elements, while CONTA172 and CONTA174 can be used with edges and surfaces made from quadratic (midside-node) elements. Both CONTA172 and CONTA174 can be used in a degenerate form on surfaces made from linear elements. The introduction of surface-to-surface contact elements has brought about big improvements in solution efficiency and has also broadened the types of contact problems that can be modeled. They offer many new and improved features, such as the ability to contact and then bond two surfaces together, automatic opening or closing of gaps to a uniform value, and a variety of contact algorithms, to name just a few. You can generate surface-to-surface contact elements by using a series of NSEL, ESEL and ESURF commands. The Contact Wizard automates these steps and makes the generation of surface-to-surface contact elements quick and easy in both 2-D and 3-D. Now that we have been introduced to the contact elements that are at our disposal, we'll follow up next time with some helpful hints on how to use them.
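The NSEL/ESEL/ESURF sequence for a 3-D pair can be sketched as follows (component names, type numbers and the face location are assumptions):

```apdl
! Sketch: build a surface-to-surface contact pair (TARGE170 + CONTA174).
ET,10,TARGE170      ! target elements
ET,11,CONTA174      ! contact elements (quadratic underlying mesh assumed)
R,10                ! one shared real constant set identifies the pair
MAT,1

! Target side
ESEL,S,ELEM,,PART_A ! assumed component: elements of the target body
NSLE,S
NSEL,R,LOC,Z,0      ! assumption: mating faces lie near z = 0
TYPE,10
REAL,10
ESURF               ! overlay TARGE170 elements

! Contact side
ESEL,S,ELEM,,PART_B
NSLE,S
NSEL,R,LOC,Z,0
TYPE,11
REAL,10             ! same real constant number groups the pair
ESURF               ! overlay CONTA174 elements
ALLSEL
```

Because both sides share real constant set 10, ANSYS treats them as one contact pair; the Contact Wizard issues essentially this sequence automatically.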
Part two of this article, to appear in the next issue of ANSYS Solutions, will discuss various aspects of using contact elements, including modeling tips and setting appropriate stiffness.
As every experienced FEA analyst knows, no two contact problems are exactly alike, so there is no silver-bullet combination of KEYOPT and real constant settings that will work for all problems. That explains the many features available today within the contact elements. It also explains, in part, the rationale behind the different default settings sometimes found in the different environments. As migration between the Workbench and ANSYS environments progresses, it is important for analysts to recognize that, although the contact technology used in both of these environments is exactly the same, some of the default KEYOPT and real constant settings are not. Tables 1 and 2 summarize all surface-to-surface contact element (CONTA171-CONTA174)
KEYOPTs and real constant properties with their respective default settings in each environment. Those that have different defaults in the different environments are highlighted.

KEYOPT(1): Select Degrees of Freedom (DOF)
This option gives you the freedom to assign the contact DOF set consistent with the physics of the underlying elements. ANSYS surface-to-surface contact technology offers an impressive combination of structural, thermal, electric and magnetic capabilities. When building pairs through the ANSYS environment with traditional ANSYS Parametric Design Language (APDL), users must
Table 1. Surface-to-surface contact KEYOPT defaults.

KEYOPT  Description                                 ANSYS (APDL)    ANSYS (Wizard)
1       Selects DOF                                 manual          automatic *
2       Contact algorithm                           Aug. Lagrange   Aug. Lagrange
3       Stress state when superelement is present   no super elem   no super elem
4       Location of contact detection point         Gauss           Gauss
5       CNOF/ICONT adjustment                       no adjust       no adjust
6       (blank)
7       Element-level time increment control        no control      no control
8       Asymmetric contact selection                no action       no action
9       Effect of initial penetration or gap        include all     include all
10      Contact stiffness update                    btwn loadsteps  btwn substeps *
11      Beam/shell thickness effect                 exclude         exclude
12      Behavior of contact surface                 standard        standard

* Default differs between environments.
Notes:
1. FKN = 10 if only linear contact is active (bonded, no separation). If any nonlinear contact is active, all regions will have FKN = 1 (including bonded, no separation).
2. Depends on contact behavior, rigid vs. flexible target, KEYOPT(9) and NLGEOM ON/OFF.
3. Calculated as a function of the highest conductivity and the overall model size.
4. 10% of target length for NLGEOM,OFF; 2% of target length for NLGEOM,ON.
set this option manually. The default will always be KEYOPT(1) = 0 (for UX, UY). When building contact pairs in the ANSYS environment using the Contact Wizard, KEYOPT(1) is set automatically according to the DOF set of the underlying element. In Workbench, this option is also set automatically, depending on the underlying element DOF set.

KEYOPT(2): Contact Algorithm
ANSYS contact technology offers many algorithms to control how the code enforces compatibility at a contacting interface. The penalty method (KEYOPT(2) = 1) is a traditional algorithm that enforces contact compatibility by using a contact spring to establish a relationship between the two surfaces. The spring stiffness is called the penalty parameter or, more commonly, the contact stiffness. The spring is inactive when the surfaces are apart (open status), and becomes active when the surfaces begin to interpenetrate. The augmented Lagrange method (KEYOPT(2) = 0) uses an iterative series of penalty methods to enforce contact compatibility. Contact tractions (pressure and friction stresses) are augmented during equilibrium iterations so that the final penetration is smaller than the allowable tolerance. This offers better conditioning than the pure penalty method and is less sensitive to the magnitude of contact stiffness used, but may require more iterations than the penalty method. The Multi-Point Constraint (MPC) method (KEYOPT(2) = 2) enforces contact compatibility by using internally generated constraint equations to establish a relationship between the two surfaces. The DOFs of the contact surface nodes are eliminated. No normal or tangential stiffness is required. For small deformation problems, no
iterations are needed in solving the system equations. Since there is no penetration or contact sliding within a tolerance, MPC represents true linear contact behavior. For large deformation problems, the MPC equations are updated during each iteration. This method applies to bonded surface behavior only. It is also useful for building surface constraint relationships similar to CERIG and RBE3. MPC is available as a standard option when modeling bonded contact in both the ANSYS and Workbench environments. The pure Lagrange multiplier method (KEYOPT(2) = 3) adds an extra degree of freedom (contact pressure) to satisfy contact compatibility. Pure Lagrange enforces near-zero penetration with the pressure DOF. Unlike the penalty and augmented Lagrange algorithms, it does not require a normal contact stiffness. Pure Lagrange does require a direct solver, can be more computationally expensive and can have convergence difficulties related to overconstraining, but it is a very useful algorithm when zero penetration is critical. It can also be combined with the penalty algorithm in the tangential direction (KEYOPT(2) = 4) when zero penetration is critical and friction is also present. The ANSYS environment uses the augmented Lagrange method by default. The Workbench environment currently uses the penalty method, but the default can be changed via the Options Menu at 8.1. MPC is available as a standard alternative in both environments. The pure Lagrange options are available in ANSYS and, at Version 8.1, in the Workbench environment; they can also be accessed in Workbench via the pre-processor command builder. Table 3 summarizes all the algorithms with the pros and cons of each.

KEYOPT(9): Effect of Initial Penetration or Gap
Properly accounting for or controlling interferences and gaps can sometimes be the difference between success and failure in simulating a complicated contact relationship. There are several contact options available to control how the code accounts for initial interference or gap effects:
(0) Include everything: Include an initial interference from the geometry and the specified offset (if any).
(1) Exclude everything: Ignore all initial-interference effects.
(2) Include with ramped effects: Ramp the interference to enhance convergence.
(3) Include offset only: Base initial interference on the specified offset only.
(4) Include offset only w/ ramp: Base initial interference on the specified offset only, and ramp the interference effect to enhance convergence.
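For pairs built with APDL, these choices are set with the KEYOPT command; a minimal sketch (the contact element type number, 11, is an assumption):

```apdl
! Sketch: choose the algorithm and initial-interference treatment
! for an existing contact pair defined with element type 11.
KEYOPT,11,2,0    ! KEYOPT(2) = 0: augmented Lagrange (ANSYS default)
KEYOPT,11,9,2    ! KEYOPT(9) = 2: include interference with ramped effects
```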
Table 3. Pros and cons of the contact algorithms.

Pure Lagrange
Pros: offers near-zero penetration; zero elastic slip (no contact stiffness required).
Cons: might require more iterations; might also require adjustment of the chatter control parameters unique to this algorithm; can produce overconstraints in the model.
Recommended: when zero penetration is critical.

Lagrange & penalty (tangential)
Pros: same as pure Lagrange, plus simulation of friction is handled most efficiently.
Cons: same as pure Lagrange.
Recommended: when zero penetration is critical and friction is present.

MPC
Pros: more efficient than traditional bonded contact; offers contact between mixed element types; offers CERIG/RBE3-type constraints.
Recommended: for large bonded contact models to enhance run time, and for contact between mixed element types and surface constraint applications.
In ANSYS, the default (KEYOPT(9) = 0) is to include everything. In Workbench, the default is to exclude everything (1) when linear contact (bonded, no separation) is defined, and to include with ramped effects (2) when nonlinear contact (frictional, frictionless, rough) is defined.

KEYOPT(10): Contact Stiffness Update
When using the penalty and/or augmented Lagrange method, contact stiffness has long been recognized as a critical property that influences both accuracy and convergence. Too high a stiffness will ultimately lead to convergence difficulty; too low a stiffness will result in over-penetration and an inaccurate assessment of surface pressures and stresses at the interface. In an effort to arrive at a good balance between these extremes, automatic stiffness updating between loadsteps (KEYOPT(10) = 0), between substeps (KEYOPT(10) = 1), or between iterations (KEYOPT(10) = 2) was introduced as an enhancement to traditional trial-and-error methods. In ANSYS, when contact is built via APDL, the default is to update stiffness between loadsteps. When contact is built via the Wizard, the default has been changed to update between substeps, which is considered to produce the most robust contact simulation in most cases. In Workbench, the default behavior is still between loadsteps, but the default can be changed via the Options Menu at Version 8.1. These defaults may change in future releases as further enhancements are made.

KEYOPT(12): Behavior of Contact Surface
ANSYS contact technology offers a rich library of surface behavior options to simulate every possible situation. These options are as follows:
(0) Standard: (Referred to as Frictionless or Frictional in Workbench) Normal contact closing and opening behavior, with normal sticking/sliding friction behavior when a nonzero friction coefficient is defined.
(1) Rough: Normal contact closing and opening behavior, but no sliding can occur (similar to having an infinite coefficient of friction).
(2) No Separation: Target and contact surfaces are tied once contact is established (sliding is permitted). This is not available as a standard option in Workbench, but can be accessed via the pre-processor command builder.
(3) Bonded: Target and contact surfaces are glued once contact is established.
(4) No Separation (always): (Referred to simply as No Separation in Workbench) Any contact detection points initially inside the pinball region, or that come into contact, are tied in the normal direction (sliding is permitted).
(5) Bonded Contact (always): (Referred to simply as Bonded in Workbench) Any contact detection points initially inside the pinball region, or that come into contact, are bonded. (DesignSpace default)
(6) Bonded Contact (initial contact): Bonds only surfaces in initial contact; initially open surfaces will remain open. This is not available as a standard option in Workbench, but can be accessed via the pre-processor command builder.
The default surface behavior in ANSYS is nonlinear Standard, for simulating the most general normal contact closing and opening behavior with normal sticking/sliding friction. In Workbench, the default behavior, set up with automatic contact detection to simulate an assembly, is linear Bonded Contact (always); this default can be changed via the Options Menu at Version 8.1.

Real Constant(3): Normal Penalty Stiffness Factor (FKN)
Users control the initial contact stiffness by multiplying the calculated value by a factor, FKN. The default value for FKN used in ANSYS (APDL or Wizard) is 1.0. In Workbench, FKN = 10 if only linear contact is active (bonded or no separation). If any nonlinear contact is active, all regions will have FKN = 1 (including bonded and no separation).

Real Constant(14): Thermal Contact Conductance (TCC)
This constant dictates the thermal resistance across the interface of contacting bodies in applications involving thermal analysis. The default value in ANSYS for TCC is zero (a perfect insulator). In Workbench, the default is automatically calculated as a function of the highest thermal conductivity of the contacting parts and the overall model size, thus essentially modeling perfect thermal contact.
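For an APDL-built pair, these properties can be overridden after the fact with KEYOPT and RMODIF; a sketch under assumed numbering (the type/real constant set number 11 and the specific values are illustrative only):

```apdl
! Sketch: override surface behavior and two pair properties.
KEYOPT,11,12,5     ! Bonded Contact (always) surface behavior
RMODIF,11,3,0.1    ! real constant 3 (FKN): soften the normal stiffness
RMODIF,11,14,1000  ! real constant 14 (TCC), in consistent thermal units
```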
Guest Commentary
Putting Quality Assurance
Part 2 of 2
The NAFEMS document Management of Finite Element Analysis Guidelines to Best Practice states that a quality assurance program should be developed to serve an organization, not vice versa. To address this concern and the barriers described in Part 1 of this series, IMPACT Engineering Solutions has developed a suite of QA tools that can be customized and scaled to meet the needs of a wide range of product development teams and industries. This suite of QA tools includes: process audits, management education, user skill-level assessment, user education/continuous improvement, pre- and post-analysis checklists, project documentation, data management, and analysis correlation guidelines.
Process Audit
The first step in establishing a QA program should be to document existing processes and company goals, including technical, organizational and competitive goals. Developing an understanding of how products are developed, what the historical issues and challenges have been, what interactions exist, and how simulation technologies can best impact a company's bottom line should precede any recommendations. A process audit should evaluate not only the tools used by an engineering department but also identify additional state-of-the-art tools that can impact the design process or allow simulation activities to grow beyond current limitations. A process audit should help ensure that all groups involved in the design process are on the same page. Finally, the process audit should put some monetary values to typical tasks so that potential savings and opportunities for gains can be more readily identified. The report generated from the process audit should be a living document that allows periodic review of critical components and observations.
Management Education
A recent survey indicated that the users who responded considered management, for various reasons, to be the greatest barrier to the success of FEA in product design. Helping
keep skills of users sharp. A company can't be confident that users are state-of-the-art in their techniques and tools unless they are exposed to people and techniques outside of their familiar surroundings. The process audit conducted at the beginning of the program should identify the critical skills and techniques needed to maximize the benefits of simulation, while the skills assessment should identify which users need work in those techniques. Employee growth should be planned, not expected to happen haphazardly. Knowledge and documentation of the next plateau for each user or group of users, with clear milestones, will help ensure that quality is maintained. It is also preferable to insist that all users at an organization go through a standard set of courses so that all are using the same language and have been exposed to the same data.
Project Documentation
Too few companies have standard report formats for analysis, and many companies don't mandate reports at all. Beyond the obvious loss of intellectual capital a company experiences when an analyst leaves the organization without documenting their work, a company loses one of the most important quality control tools in the analysis process when reports aren't completed. A QA program for analysis must include a report format that transcends groups, specializations and departments. Analysis data on seemingly unrelated components could still provide insight and prevent repetition of work. In addition to providing details of the recent work, a project report should include references to similar historical projects, test data and correlation criteria. A report should indicate the source of inputs and assumptions, as well as comment on the validity of those assumptions. Additionally, a company would benefit from linking test and analysis reports, even to the point of using similar formats for the two related tasks.

Data Management
As companies begin to evaluate their PLM (product lifecycle management) structures, the organization of analysis and other product performance data must be included in the initial planning. D.H. Brown and Associates have investigated the needs of CAE data management and have found that structured PDM (product data management) systems may not be up to the task. PDM systems were typically developed to manage revisions and bill hierarchies, not the simplified geometries, results formats, and validation databases required for an analysis program. While every company must develop the PLM and data management system that best fits its organization, a QA program for analysis must tap into that system, formalize it if need be, and provide means for policing the archiving of analysis data so that a company's intellectual property and investment in simulation are secure.