Meta-modeling for Manufacturing Processes
2011, Intelligent Robotics and Applications
Abstract
Meta-modeling for manufacturing processes describes a procedure for creating reduced numeric surrogates that capture the cause-effect relationships between setting parameters as input and product quality variables as output of manufacturing processes. Within this method, expert knowledge, empirical data and physical process models are transformed such that machine-readable, reduced models describe the behavior of the process with sufficient precision. Three phases, comprising definition, generation of data and creation of the model, are applied iteratively until the required model quality is reached. In manufacturing systems, such models allow the generation of starting values for setting parameters based on the manufacturing task and the requested product quality. In-process, these reduced models can be used to determine the operating point and to search for alternative setting parameters that optimize the objective of the manufacturing process, the product quality. This opens the path to self-optimization of manufacturing processes. The method is illustrated using the gas metal arc welding process.
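The three-phase cycle described in the abstract can be sketched in code. The sketch below is an illustrative reduction, not the paper's method: a stand-in quadratic function replaces the real welding process, the reduced model is a simple piecewise-linear interpolant, and each iteration adds a sample where the surrogate error is largest until the required model quality is reached.

```python
# Illustrative three-phase meta-modeling loop: definition, data
# generation, model creation, iterated until the target quality holds.
# process() is a hypothetical stand-in for the real physical model.

def process(x):
    return x ** 2  # assumed cause-effect relationship (not from the paper)

def build_surrogate(xs, ys):
    # reduced model: piecewise-linear interpolant of the samples
    pts = sorted(zip(xs, ys))
    def surrogate(x):
        for (a, fa), (b, fb) in zip(pts, pts[1:]):
            if a <= x <= b:
                t = (x - a) / (b - a)
                return (1 - t) * fa + t * fb
        return pts[-1][1]
    return surrogate

# Phase 1: definition -- parameter range and required model quality
lo, hi, tol = 0.0, 2.0, 0.05
xs = [lo, hi]  # initial sampling plan

while True:
    ys = [process(x) for x in xs]          # Phase 2: data generation
    surrogate = build_surrogate(xs, ys)    # Phase 3: model creation
    # check the surrogate at interval midpoints; refine at the worst one
    mids = [0.5 * (a + b) for a, b in zip(sorted(xs), sorted(xs)[1:])]
    errs = {m: abs(surrogate(m) - process(m)) for m in mids}
    worst, err = max(errs.items(), key=lambda kv: kv[1])
    if err <= tol:
        break
    xs.append(worst)

print(len(xs), err)
```

With this quadratic stand-in, the loop terminates after bisecting the range down to intervals of width 0.25, i.e. 9 samples with a worst remaining error of 1/64.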
Related papers
2019 24th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), 2019
Short product lifecycles and a high variety of products force industrial manufacturing processes to change frequently. Due to the manual nature of many quality analysis techniques, they can significantly slow down the adaptation of production systems or make production unprofitable. Automating them can therefore be a key technology for keeping pace with future market demand. The methodology presented here aims at a meta-model supporting the automation of PFMEA. The method differentiates product requirements, production steps and quality measures in such a way that complex quality requirements can be addressed in any instance of a factory using a common meta-modeling language. Index Terms: production planning, process control, quality management, design for quality
Achieving predictable, reliable, and cost-effective operations in wire and arc additive manufacturing is a key concern during production of complex-shaped functional metallic components for demanding applications, such as those found in aerospace and automotive industries. A metamodel combining localized submodels of the different physical phenomena during welding can ensure stable material deposition. Such a metamodel would necessarily combine submodels from multiple domains, such as materials science, thermomechanical engineering, and process planning, and it would provide a holistic systems perspective of the modeled process. An approach using causal graph-based modeling and Bayesian networks is proposed to develop a metamodel for a test case using wire and arc additive manufacturing with cold metal transfer. The developed modeling approach is used to characterize the effect of manufacturing variables on product dimensional quality in the form of a causal graph. A quantitative simulation using Bayesian networks is applied to the causal graph to enable process parameter tuning. The Bayesian network inference mechanism predicts the effects of the parameters on results, whereas, conversely, with known targets, it can predict the required parameter values. Validation of the developed Bayesian network model is performed using experimental tests.
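The forward/inverse inference mechanism described above can be illustrated with a minimal discrete Bayesian network, evaluated by exhaustive enumeration. The two-parameter causal graph and all probability values below are invented for the example; they are not the wire-arc/CMT submodels from the paper.

```python
# Minimal discrete Bayesian network over an assumed causal graph:
# wire feed rate and travel speed -> bead width. Structure and all
# probabilities are illustrative, not taken from the paper.

p_wire  = {"low": 0.5, "high": 0.5}   # prior over wire feed rate
p_speed = {"low": 0.5, "high": 0.5}   # prior over travel speed

# assumed conditional table P(width = "wide" | wire, speed)
p_wide = {
    ("low",  "low"):  0.4,
    ("low",  "high"): 0.1,
    ("high", "low"):  0.9,
    ("high", "high"): 0.6,
}

def joint(wire, speed, width):
    pw = p_wide[(wire, speed)]
    return p_wire[wire] * p_speed[speed] * (pw if width == "wide" else 1 - pw)

def p_width_given(wire, width):
    # forward inference: effect of a parameter setting on quality
    return sum(joint(wire, s, width) for s in p_speed) / p_wire[wire]

def p_wire_given(width):
    # inverse inference: parameter posterior for a target quality
    num = {w: sum(joint(w, s, width) for s in p_speed) for w in p_wire}
    z = sum(num.values())
    return {w: v / z for w, v in num.items()}

print(p_width_given("high", "wide"), p_wire_given("wide")["high"])
```

The same joint distribution answers both directions: predicting quality from parameters, and, with a known target, ranking the parameter values that make it likely.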
Process optimization and process adjustment methods are discussed. Combining an EWMA chart with a Shewhart chart is traditionally recommended as a means of providing good protection against both small and large shifts in the process mean; however, using an EWMA together with a Shewhart chart, we find no performance improvement. In conjunction with some commonly used control charts, these adjustment techniques are then applied to a manufacturing process, combining process adjustment with SPC.
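For reference, a minimal sketch of running an EWMA chart alongside Shewhart individual limits on the same observations; the data (a small sustained drift) and the chart constants are invented. It shows the textbook motivation for the combination the abstract evaluates: the EWMA flags a small shift that 3-sigma Shewhart limits never detect.

```python
# EWMA chart run alongside Shewhart individual limits; observations
# (a small sustained drift) and chart constants are invented.
lam, L = 0.2, 3.0          # EWMA weight and control-limit multiplier
mu0, sigma = 10.0, 1.0     # in-control mean and known std. deviation

obs = [10.1, 9.8, 10.3, 10.2, 10.6, 10.9, 11.1, 11.4, 11.6, 11.8]

z = mu0
signals = []
for i, x in enumerate(obs, start=1):
    z = lam * x + (1 - lam) * z   # EWMA recursion
    # time-varying EWMA control-limit half-width
    half = L * sigma * ((lam / (2 - lam)) * (1 - (1 - lam) ** (2 * i))) ** 0.5
    signals.append((abs(z - mu0) > half,          # EWMA signal
                    abs(x - mu0) > 3 * sigma))    # Shewhart signal

first_ewma = next((i for i, (e, _) in enumerate(signals, 1) if e), None)
print(first_ewma, any(s for _, s in signals))
```

Here the EWMA chart signals at observation 10 while the Shewhart chart never does, the classic small-shift behaviour.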
Quality has become one of the most important customer decision factors in the selection among competing products and services. Therefore, understanding the meaning of quality and the efforts taken towards its improvement plays a pivotal role in the growth of a company's business. According to TQM, the most effective way to improve the quality of a product is to improve the process used to manufacture it. Improvement of a process is a long-term task which includes problem definition, locating the source of a problem, identifying root causes, both primary and secondary, and the steps taken towards optimization. The identification of the problem is possible either by visual inspection or through statistical process control. The objective of the statistical assessment process is to determine whether all major manageable causes of process instability have been removed. Process capability analysis is one of the major tools used to determine whether process performance is capable or incapable within a specified tolerance. This article presents an overview of the use of process capability indices for improving the quality of certain processes, with a case study from the automobile industry. A variety of quality tools, including flowcharts, cause-and-effect diagrams, control charts, process capability indices and experimental design, are illustrated throughout the manuscript. The complete underbody of the car, which includes various sub-assemblies such as the rear wheel house assembly, front long members, dash and cowl, front apron and others, is assembled by spot welding or MIG/MAG welding.
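The capability indices mentioned in the abstract have the standard definitions Cp = (USL - LSL)/(6s) and Cpk = min(USL - m, m - LSL)/(3s). A minimal computation on invented sample data and specification limits:

```python
from statistics import mean, stdev

# Process capability on an invented sample against invented spec limits
lsl, usl = 9.0, 11.0
sample = [9.8, 10.1, 10.0, 10.2, 9.9, 10.1, 10.0, 9.9, 10.1, 9.9]

mu, s = mean(sample), stdev(sample)
cp  = (usl - lsl) / (6 * s)              # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * s)  # also accounts for centering
print(round(cp, 2), round(cpk, 2))
```

Because this sample happens to be centred exactly between the limits, Cp and Cpk coincide; an off-centre process would show Cpk < Cp.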
Computers & Industrial Engineering, 1990
Abstract: The application of regression metamodels to simulation outputs is illustrated in this paper. With increasing interest in applying simulation to complex manufacturing problems, regression metamodels can greatly reduce the cost, time, and amount of effort spent in conducting simulation. These models can also be generalized within the bounds defined for the problem's parameters. The use of a regression metamodel to conduct sensitivity analysis, its application in "optimizing" manufacturing systems, and the validity of the model are illustrated. An example is given with a maintenance float problem.
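A first-order regression metamodel of this kind can be fitted with ordinary least squares. The sketch below uses a hypothetical stand-in "simulation" of utilisation versus maintenance-float size, not the paper's model; the fitted slope b1 then serves as the sensitivity measure within the sampled bounds.

```python
# First-order regression metamodel y = b0 + b1*x fitted to outputs of
# a stand-in "simulation" (invented utilisation vs. float size n).
def simulate(n):
    return 0.6 + 0.05 * n - 0.002 * n * n   # hypothetical response

xs = [1, 2, 3, 4, 5, 6]
ys = [simulate(n) for n in xs]

# ordinary least squares via the normal equations
m = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
b1 = (m * sxy - sx * sy) / (m * sxx - sx * sx)
b0 = (sy - b1 * sx) / m

def predict(x):
    # the metamodel replaces further simulation runs within the bounds
    return b0 + b1 * x

print(round(b1, 4), round(predict(3.5), 4))
```

Once fitted, `predict` can be evaluated at any float size inside the design bounds at negligible cost compared with rerunning the simulation.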
Lecture Notes in Production Engineering, 2014
Within the Cluster of Excellence "Integrative Production Technology for High-Wage Countries" one major focus is the research and development of self-optimising systems for manufacturing processes. Self-optimising systems, with their ability to analyse data, to model processes and to take decisions, offer an approach to mastering processes without explicit control functions. After a brief introduction, two approaches to self-optimising strategies are presented. The first example demonstrates the autonomous generation of technology models for a milling operation. Process knowledge is a key factor in manufacturing and is also an integral part of the self-optimisation approach. In this context, process knowledge in a machine-readable format is required in order to provide self-optimising manufacturing systems with a basis for decision making and optimisation strategies. The second example shows a model-based, self-optimised injection moulding manufacturing system. To compensate for process fluctuations and guarantee a constant part quality of the manufactured products, the self-optimising approach uses a model which describes the pvT-behaviour and controls the injection process by determining the process-optimised trajectory of temperature and pressure in the mould.
2012
Response Surface Methodology (RSM), introduced by Box & Wilson (1951), explores the relationships between explanatory and response variables in complex settings and provides a framework to identify the correct settings of the explanatory variables to yield the desired response. RSM involves setting up sequential experimental designs followed by the application of elementary optimization methods to identify the direction of improvement in the response. In this paper, an application of RSM using a two-factor two-level Central Composite Design (CCD) is explained for a diesel engine nozzle manufacturing sub-process. The analysis shows that one of the factors has a significant influence in improving the desired values of the response. The implementation of RSM is done using the DoE plug-in available in the R software.
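For illustration, the coded design points of a two-factor CCD can be generated directly. The rotatable axial distance alpha = sqrt(2) and the five centre replicates are common textbook choices assumed here; the paper's exact design may differ.

```python
from itertools import product

# Coded design points of a two-factor CCD: 2^2 factorial corners,
# 4 axial (star) points at distance alpha, and centre replicates.
# alpha and the number of centre runs are assumed textbook values.
alpha = 2 ** 0.5   # rotatable alpha for two factors
factorial = [(float(a), float(b)) for a, b in product((-1, 1), repeat=2)]
axial = [(-alpha, 0.0), (alpha, 0.0), (0.0, -alpha), (0.0, alpha)]
center = [(0.0, 0.0)] * 5

design = factorial + axial + center
print(len(design))
```

This yields the familiar 13-run plan; each coded point is then mapped back to physical factor levels before the experiments are run.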
Lecture Notes in Production Engineering, 2014
Decision making for competitive production in high-wage countries is a daily challenge where rational and irrational methods are used. The design of decision making processes is an intriguing, discipline-spanning science. However, there are gaps in understanding the impact of the known mathematical and procedural methods on the usage of rational choice theory. Following Benjamin Franklin's rule for decision making, formulated in London in 1772, which he called "Prudential Algebra" in the sense of prudential reasons, one of the major ingredients of Meta-Modelling can be identified, finally leading to one algebraic value labelling the results (criteria settings) of alternative decisions (parameter settings). This work describes advances in Meta-Modelling techniques applied to multidimensional and multi-criterial optimization in laser processing, e.g. sheet metal cutting, including the generation of fast and frugal Meta-Models with controlled error based on mathematical, physical or numerical model reduction. Reduced models are derived to avoid any unnecessary complexity. The advances of the Meta-Modelling technique are based on three main concepts: (i) classification methods that decompose the space of process parameters into feasible and non-feasible regions, or into monotone regions, facilitating optimization; (ii) smart sampling methods for faster generation of a Meta-Model; and (iii) a method for multi-dimensional interpolation using a radial basis function network that continuously maps the discrete, multi-dimensional sampling set containing the process parameters as well as the quality criteria. Both model reduction and optimization on a multi-dimensional parameter space are improved by exploring the data mapping within an advancing "Cockpit" for Virtual Production Intelligence.
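Concept (iii), interpolation with a radial basis function network, can be sketched in a few lines: Gaussian kernels centred on the sampled parameter values, with weights obtained by solving the kernel linear system. The sample data and the kernel width below are invented for the example.

```python
from math import exp

# RBF interpolation of a sampled parameter -> quality mapping.
# Sample points and the kernel width eps are invented.

def solve(A, b):
    # naive Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def kernel(r, eps=1.0):
    return exp(-(eps * r) ** 2)   # Gaussian radial basis function

centers = [0.0, 1.0, 2.0]   # sampled process parameter values
values  = [0.0, 1.0, 4.0]   # sampled quality criterion

A = [[kernel(abs(c - d)) for d in centers] for c in centers]
w = solve(A, values)

def interpolate(x):
    # continuous mapping through the discrete sampling set
    return sum(wi * kernel(abs(x - c)) for wi, c in zip(w, centers))

print(round(interpolate(1.0), 6))
```

The interpolant reproduces every sampled value exactly and fills in the parameter space between samples continuously, which is what makes it usable for optimization.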
2007
ABSTRACT Process modelling can play an important part in understanding and improving engineering design processes. Meta-data, i.e. numerical information describing the state of a modelled process, can be used to enhance modelling fidelity and to measure predicted behaviour. We explore the application of meta-data using a "proxy" process represented using an Applied Signposting Model. The proxy has been derived from a larger model of the turbine cooling system design process.
A framework for the optimization of process parameters in material processing and production is described. The framework is designed for the effective set-up and solution of optimization problems as part of process design, as well as to support the development of numerical models by inverse identification of model parameters. The general framework is outlined, supplemented by a neural networks module in order to enable real-time decision support. A simulator based on a meshless method with radial basis functions (RBF) has been utilized.

Optimization Framework. The optimization part of the framework is designed as a stand-alone optimization system. Its development was centered around a library of optimization techniques for industrial problems where optimization is carried out on the basis of computationally expensive numerical simulations whose results contain a substantial level of numerical noise [1,2]. This has been predominantly treated by algorithms based on adaptive approximation of the response functions. Successive approximations of the sampled response over suitably sized domains enable the exploitation of higher-order function information. A restricted step approach is used to ensure global convergence, and adaptive sampling strategies play a significant role in reducing the necessary number of evaluations of the response functions. Work was initiated as an attempt to re-implement the C library IOptLib [1-3] in a rigorous object-oriented manner in order to more easily master the complexity of the developed algorithms and to speed up the development process. The framework is being extended to enable straightforward inclusion and seamless use of third-party optimizers. This requires careful design of abstraction levels and standardization of input/output and calling conventions, which is achieved by suitable wrappers when third-party software is incorporated.
Further steps will be made towards a more unified treatment of different kinds of problems, such as constrained/unconstrained or single-objective/multi-objective optimization. A multidisciplinary approach is also considered, in that different simulators may be used for the different problem fields involved in the definition of an optimization problem.

Neural Networks Approximation Module. In several practical cases the process design parameters must be adapted quickly in order to produce results that comply with customer requests. With the classical approach to the optimization of process parameters, the long computational times needed for each run of the process simulation at trial design parameters can limit the applicability of optimization in an industrial environment. A solution has been conceived in the form of an approximation of the system response, calculated on the basis of a sampled response prepared in advance, either by runs of the numerical model or by measurements previously performed on the process of interest with varying process parameters. The optimization procedure that produces process design parameters consistent with the current requirements is then performed on the surrogate model based on the approximated response.
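The restricted-step idea described under the optimization framework can be sketched as follows: fit a local quadratic surrogate through three response samples, step towards its minimiser, and clip the step to a trust radius. The "response" function is a cheap stand-in for an expensive simulation, and all constants are illustrative.

```python
# Restricted-step optimization on a local quadratic surrogate.
# response() is a stand-in for an expensive simulation run; its
# minimum (x = 3) and the constants below are invented.
def response(x):
    return (x - 3.0) ** 2 + 1.0

def quad_min(x0, h):
    # sample three points around x0 and fit y = a*x^2 + b*x + c exactly
    xs = (x0 - h, x0, x0 + h)
    ys = tuple(response(p) for p in xs)
    a = (ys[0] - 2 * ys[1] + ys[2]) / (2 * h * h)
    b = (ys[2] - ys[0]) / (2 * h) - 2 * a * x0
    return -b / (2 * a) if a > 0 else x0 + h   # vertex of the parabola

x, radius = 0.0, 1.0
for _ in range(10):
    target = quad_min(x, 0.5)
    step = max(-radius, min(radius, target - x))   # restricted step
    x += step
print(x)
```

Clipping the step keeps the iterate inside the region where the local surrogate is trustworthy, which is what gives restricted-step methods their robustness against approximation error and noise.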
