Tolerancing for an Apple Pie: A Fundamental Theory of Tolerances
Abstract Tolerancing began with the notion of limits imposed on the dimensions of realized parts both to maintain functional geometric dimensionality and to enable cost-effective part fabrication and inspection. Increasingly, however, component fabrication depends on more than part geometry, as many parts are fabricated from a “recipe” rather than from dimensional instructions for material addition or removal. Referred to as process tolerancing, this is the case, for example, with IC chips. In tolerance optimization, a typical objective is cost minimization while achieving required functionality or “quality.” This article takes a different look at tolerances, suggesting that rather than merely ensuring that parts achieve a desired functionality at minimum cost, the underlying goal of product design is to make money (more is better), and tolerances comprise additional design variables amenable to optimization in a decision-theoretic framework. We further recognize that tolerances introduce additional product attributes that relate to product characteristics such as consistency, quality, reliability, and durability. These important attributes complicate the computation of the expected utility of candidate designs, requiring additional computational steps for their determination. The resulting theory of tolerancing illuminates the assumptions and limitations inherent to Taguchi’s loss function. We illustrate the theory using the example of tolerancing for an apple pie, which conveniently demands consideration of tolerances on both quantities and processes, and the interaction among these tolerances.
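The record does not reproduce the article's formulation, but the Taguchi loss function it critiques has a standard quadratic form whose expected value under a normal process distribution is easy to state. The sketch below is background only; the symbols `mu`, `sigma`, `target`, and `k`, and all numbers, are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def taguchi_loss(y, target, k):
    """Taguchi's quadratic loss: L(y) = k * (y - target)**2."""
    return k * (y - target) ** 2

def expected_loss(mu, sigma, target, k):
    """Closed form under y ~ Normal(mu, sigma):
    E[L] = k * (sigma**2 + (mu - target)**2)."""
    return k * (sigma ** 2 + (mu - target) ** 2)

# Monte Carlo check: realized dimensions scatter around the process mean.
rng = np.random.default_rng(0)
mu, sigma, target, k = 10.02, 0.05, 10.0, 100.0
samples = rng.normal(mu, sigma, 100_000)
print(taguchi_loss(samples, target, k).mean())  # close to the closed form
print(expected_loss(mu, sigma, target, k))
```

Tightening the tolerance (smaller sigma) lowers expected loss but typically raises fabrication cost; this trade-off is what the article reframes as expected-utility maximization over tolerances treated as design variables.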
- Award ID(s): 1923164
- PAR ID: 10416842
- Date Published:
- Journal Name: Journal of Mechanical Design
- Volume: 145
- Issue: 6
- ISSN: 1050-0472
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Sheet Metal (SM) fabrication is perhaps one of the most common metalworking techniques. Despite its prevalence, SM design is manual and costly, with rigorous practices that restrict the search space, yielding suboptimal results. In contrast, we present a framework for the first automatic design of SM parts. Focusing on load-bearing applications, our novel system generates a high-performing, manufacturable SM part that adheres to the numerous constraints that SM design entails: the resulting part minimizes manufacturing costs while satisfying structural, spatial, and manufacturing constraints. In other words, the part should be strong enough, not disturb the environment, and be producible by the manufacturing process. These desiderata sum up to an elaborate, sparse, and expensive search space. Our generative approach is a carefully designed exploration process comprising two steps. In Segment Discovery, connections from the input load to attachable regions are accumulated, and during Segment Composition, the best-performing valid combination is selected. For Discovery, we define a slim grammar and sample it for parts using a Markov-Chain Monte Carlo (MCMC) approach, run in intercommunicating instances (i.e., chains) for diversity. This, followed by a short continuous optimization, enables building a diverse and high-quality library of substructures. During Composition, a valid and minimal-cost combination of the curated substructures is selected. To improve compliance significantly without additional manufacturing costs, we reinforce candidate parts onto themselves, a unique SM capability called self-riveting. We provide our code and data at https://github.com/amir90/AutoSheetMetal. We show that our generative approach produces viable parts for numerous scenarios. We compare our system against a human expert and observe improvements in both part quality and design time. We further analyze our pipeline's steps with respect to resulting quality and have fabricated some results for validation. We hope our system will stretch the field of SM design, replacing costly expert hours with minutes of standard CPU time, making this cheap and reliable manufacturing method accessible to anyone.
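The abstract names intercommunicating MCMC chains that sample a design grammar during Segment Discovery. The authors' grammar and cost model are not given in this record; the sketch below only illustrates the underlying Metropolis acceptance pattern on a toy state (a tuple of segment lengths) with a hypothetical cost function, and a real system would run several such chains that exchange states for diversity.

```python
import math
import random

def metropolis_chain(init, propose, cost, beta=1.0, steps=1000, seed=0):
    """Minimal Metropolis sampler: always accepts a cheaper proposal and
    accepts a costlier one with probability exp(-beta * cost increase)."""
    rng = random.Random(seed)
    state = init
    library = [init]
    for _ in range(steps):
        cand = propose(state, rng)
        delta = cost(cand) - cost(state)
        if delta <= 0 or rng.random() < math.exp(-beta * delta):
            state = cand
            library.append(state)
    return library

# Toy stand-in for a grammar-derived part: perturb one segment length.
def propose(s, rng):
    i = rng.randrange(len(s))
    return s[:i] + (max(1, s[i] + rng.choice((-1, 1))),) + s[i + 1:]

cost = lambda s: abs(sum(s) - 12)   # hypothetical manufacturing cost
parts = metropolis_chain((3, 3, 3), propose, cost)
```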
Abstract Compared to conventional fabrication, additive manufacturing (AM) enables production of far more complex geometries with less tooling and increased automation. However, despite the common perception of AM’s “free” geometric complexity, this freedom comes with a literal cost: more complex geometries may be challenging to design, potentially manifesting as increased engineering labor cost. Being able to accurately predict design cost is essential to reliably forecasting large-scale design for additive manufacturing projects, especially those using expensive processes like laser powder bed fusion of metals. However, no studies have quantitatively explored designers’ ability to complete this forecasting. In this study, we address this gap by analyzing the uncertainty of expert design cost estimation. First, we establish a methodology to translate computer-aided design data into descriptive vectors capturing design for additive manufacturing activity parameters. We then present a series of case study designs, with varied functionality and geometric complexity, to experts and measure their estimates of design labor for each case. Summary statistics of the cost estimates and a linear mixed-effects model predicting labor responses from participant and design attributes were used to estimate the significance of factors on the responses. A task-based CAD model complexity calculation is then used to infer an estimate of the magnitude and variability of normalized labor cost, to understand more generalizable attributes of the observed labor estimates. These two analyses are discussed in the context of the advantages and disadvantages of relying on human cost estimation for additive manufacturing forecasts, as well as future work that can prioritize and mitigate such challenges.
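A linear mixed-effects model of the kind described can be set up with statsmodels; the sketch below is one plausible setup, not the study's actual model, and the column names (`labor`, `complexity`, `participant`) and toy numbers are invented for illustration. A random intercept per expert captures rater-to-rater variability.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per (participant, design) labor estimate.
data = pd.DataFrame({
    "labor":       [4.0, 6.5, 3.2, 7.1, 5.0, 8.3, 2.9, 6.8],  # estimated hours
    "complexity":  [1.0, 2.0, 1.0, 2.5, 1.5, 3.0, 0.8, 2.2],  # complexity score
    "participant": ["a", "a", "b", "b", "c", "c", "d", "d"],
})

# Fixed effect for design complexity; random intercept per expert.
model = smf.mixedlm("labor ~ complexity", data, groups=data["participant"])
result = model.fit()
print(result.summary())
```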
With rapid innovation in the electronics industry, product obsolescence forecasting has become increasingly important. More accurate obsolescence forecasting would have cost-reduction effects in product design and part procurement over a product’s lifetime. Currently, many obsolescence forecasting methods require manual input or perform market analysis on a part-by-part basis, practices that are not feasible for large bills of materials. In response, this paper introduces an obsolescence forecasting framework that can be scaled to meet industry needs while remaining highly accurate. The framework utilizes machine learning to classify parts as either active (in production) or obsolete (discontinued). This classification and labeling of parts can be useful for part selection in the design stage and, during inventory management, for evaluating the chance that suppliers might stop production. A case study utilizing the proposed framework is presented to demonstrate and validate the improved accuracy of obsolescence risk forecasting. As shown, the framework correctly identified active and obsolete products with an accuracy as high as 98.3%.
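The abstract does not specify the classifier or the part features, so the following is only a generic sketch of the active-versus-obsolete classification step using scikit-learn; the feature columns (years on market, supplier count, shipment trend) and the tiny dataset are invented for illustration.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical features per part: years on market, active suppliers,
# shipment trend (+1 growing, -1 shrinking); labels: 0=active, 1=obsolete.
X = [[2, 5, 1], [12, 1, -1], [1, 7, 1], [15, 0, -1], [4, 3, 0], [10, 1, -1]]
y = [0, 1, 0, 1, 0, 1]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(accuracy_score(y_te, clf.predict(X_te)))
```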
Disassembly is an integral part of maintenance, upgrade, and remanufacturing operations to recover end-of-use products. Optimization of disassembly sequences and the capability of robotic technology are crucial for managing the resource-intensive nature of dismantling operations. This study proposes an optimization framework for disassembly sequence planning under uncertainty that considers human-robot collaboration. The proposed model combines three attributes, disassembly cost, disassembleability, and safety, to find the optimal path for dismantling a product and to assign each disassembly operation to either a human or a robot. A multi-attribute utility function is employed to address uncertainty and make tradeoffs among the attributes. Disassembly time reflects the cost of disassembly and is assumed to be an uncertain parameter with a Beta probability density function; disassembleability evaluates the feasibility of conducting an operation with a robot; finally, the safety index ensures the safety of human workers in the work environment. The optimization model identifies the best disassembly sequence while making tradeoffs among the attributes. An example of a desktop computer illustrates how the proposed model works. The model identifies the optimal disassembly sequence with lower disassembly cost, higher disassembleability, and an increased safety index while allocating disassembly operations between human and robot. A sensitivity analysis is conducted to show the model’s performance when the disassembly cost for the robot changes.
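The abstract specifies a multi-attribute utility over cost, disassembleability, and safety with Beta-distributed disassembly time, but not its functional form. The sketch below assumes a weighted additive utility, with illustrative weights, Beta parameters, and attribute scores, and estimates expected utility by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_utility(a, b, t_min, t_max, u_ease, u_safety,
                     weights=(0.5, 0.3, 0.2), n=50_000):
    """Weighted additive multi-attribute utility. Disassembly time is
    uncertain, t ~ t_min + (t_max - t_min) * Beta(a, b); shorter times
    map to higher time-utility. Other attribute scores are fixed in [0, 1]."""
    t = t_min + (t_max - t_min) * rng.beta(a, b, n)
    u_time = 1.0 - (t - t_min) / (t_max - t_min)
    w1, w2, w3 = weights
    return float((w1 * u_time + w2 * u_ease + w3 * u_safety).mean())

# Illustrative comparison: the same operation performed by robot vs. human.
print(expected_utility(2, 5, 10, 40, u_ease=0.8, u_safety=0.95))  # robot
print(expected_utility(5, 2, 10, 40, u_ease=1.0, u_safety=0.60))  # human
```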