Browsing by Author "Camelio, Jaime A."
Now showing 1 - 20 of 28
- Advancing Manufacturing Quality Control Capabilities Through The Use Of In-Line High-Density Dimensional Data. Wells, Lee Jay (Virginia Tech, 2014-01-15). Through recent advancements in high-density dimensional (HDD) measurement technologies, such as 3D laser scanners, data sets consisting of an almost complete representation of a manufactured part's geometry can now be obtained. While HDD measurement devices have traditionally been used in reverse engineering applications, they are beginning to be applied as in-line measurement devices. Unfortunately, appropriate quality control (QC) techniques have yet to be developed to take full advantage of this new data-rich environment; for the most part they rely on extracting discrete key product characteristics (KPCs) for analysis. Maximizing the potential of HDD measurement technologies requires a new quality paradigm. Specifically, when presented with HDD data, quality should not only be assessed by discrete KPCs but should consider the entire part being produced; anything less results in valuable data being wasted. This dissertation addresses the need for adapting current techniques and developing new approaches for the use of HDD data in manufacturing systems to increase overall QC capabilities. Specifically, this research effort focuses on the use of HDD data for 1) developing a framework for self-correcting compliant assembly systems, 2) using statistical process control to detect process shifts through part surfaces, and 3) performing automated part inspection for non-feature-based faults. The overarching goal of this research is to identify how HDD data can be used within these three research focus areas to increase QC capabilities while following the principles of the aforementioned new quality paradigm.
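As a rough illustration of the second focus area above (statistical process control on part surfaces), the sketch below reduces each high-density scan to a single deviation statistic against the nominal geometry and monitors it on a Shewhart individuals chart. This is a minimal stand-in, not the dissertation's method; the point clouds, noise levels, and simulated defect are synthetic placeholders.

```python
import numpy as np

def deviation_statistic(scan_points, nominal_points):
    """Summarize a high-density scan as the mean absolute deviation
    from the nominal (CAD) geometry, point for point."""
    return np.mean(np.abs(scan_points - nominal_points))

def individuals_chart_limits(phase1_stats):
    """Shewhart individuals-chart limits from an in-control sample,
    estimating sigma from the average moving range (MR-bar / 1.128)."""
    x = np.asarray(phase1_stats, dtype=float)
    center = x.mean()
    sigma_hat = np.mean(np.abs(np.diff(x))) / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

rng = np.random.default_rng(0)
nominal = rng.normal(size=(5000, 3))            # stand-in for a CAD point cloud
# Phase I: in-control scans establish the control limits.
phase1 = [deviation_statistic(nominal + rng.normal(0, 0.02, nominal.shape), nominal)
          for _ in range(30)]
lcl, cl, ucl = individuals_chart_limits(phase1)

# Phase II: a scan with a localized surface shift signals above the UCL.
shifted = nominal + rng.normal(0, 0.02, nominal.shape)
shifted[:500] += 0.15                            # simulated surface defect
stat = deviation_statistic(shifted, nominal)
print(f"statistic={stat:.4f}, UCL={ucl:.4f}, out of control: {stat > ucl}")
```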
- Advancing the Utility of Manufacturing Data for Modeling, Monitoring, and Securing Machining Processes. Shafae, Mohammed Saeed Abuelmakarm (Virginia Tech, 2018-08-23). The growing adoption of smart manufacturing systems and related technologies (e.g., embedded sensing, internet-of-things, cyber-physical systems, big data analytics, and cloud computing) promises a paradigm shift in the manufacturing industry. Such systems enable extracting and exchanging actionable knowledge across the different entities of the manufacturing cyber-physical system and beyond. From a quality control perspective, this allows for more opportunities to realize proactive product design; real-time process monitoring, diagnosis, prognosis, and control; and better product quality characterization. However, a multitude of challenges arise with the growing adoption of smart manufacturing, including industrial data characterized by increasing volume, velocity, variety, and veracity, as well as the security of the manufacturing system in the presence of growing connectivity. Taking advantage of these emerging opportunities and tackling the upcoming challenges require creating novel quality control and data analytics methods, which not only push the boundaries of the current state-of-the-art research, but also discover new ways to analyze and utilize the data. One of the key pillars of smart manufacturing systems is real-time automated process monitoring, diagnosis, and control for process/product anomalies. For machining applications, deterioration in quality measures may traditionally occur due to a variety of assignable causes of variation, such as poor cutting tool replacement decisions and inappropriate choice of cutting parameters. Additionally, due to increased connectivity in modern manufacturing systems, process/product anomalies intentionally induced through malicious cyber-attacks, aimed at degrading process performance and/or part quality, are becoming a growing concern in the manufacturing industry. Current methods for detecting and diagnosing traditional causes of anomalies are primarily lab-based and require experts to perform initial set-ups and continual fine-tuning, reducing their applicability in industrial shop-floor settings. Efforts accounting for process/product anomalies due to cyber-attacks are still in their early stages. Therefore, more foundational research is needed to develop a clear understanding of this new type of cyber-attack and its effects on machining processes, to ensure smart manufacturing security on both the cyber and the physical levels. With a primary focus on machining processes, the overarching goal of this dissertation is to explore new ways to expand the use and value of manufacturing data-driven methods for better applicability in industrial shop-floors and increased security of smart manufacturing systems. As a first step toward achieving this goal, the work in this dissertation pursues it in three distinct areas of interest: (1) Statistical Process Monitoring of Time-Between-Events Data (e.g., failure-time data); (2) Defending against Product-Oriented Cyber-Physical Attacks on Intelligent Machining Systems; and (3) Modeling Machining Process Data: Time Series vs. Spatial Point Cloud Data Structures.
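The first area of interest, statistical process monitoring of time-between-events data, can be illustrated with a minimal t-chart sketch. This is not the specific monitoring scheme developed in the dissertation; it simply sets probability limits under an assumed exponential in-control distribution, using synthetic failure-time data and a few hypothetical observations.

```python
import numpy as np

def t_chart_limits(in_control_times, alpha=0.0027):
    """Probability limits for a time-between-events (t) chart, assuming the
    in-control times are exponentially distributed."""
    theta = np.mean(in_control_times)        # estimated in-control mean TBE
    lcl = -theta * np.log(1 - alpha / 2)     # P(T < LCL) = alpha/2
    ucl = -theta * np.log(alpha / 2)         # P(T > UCL) = alpha/2
    return lcl, theta, ucl

rng = np.random.default_rng(1)
baseline = rng.exponential(scale=100.0, size=50)   # e.g., hours between failures
lcl, center, ucl = t_chart_limits(baseline)
print(f"LCL={lcl:.2f}  center={center:.2f}  UCL={ucl:.2f}")

# Individual observations are flagged when they fall outside the limits,
# e.g., an abnormally short gap between failures or an unusually long one.
for t in (0.05, 95.0, 900.0):
    print(f"TBE={t:7.2f}  signal={'yes' if (t < lcl or t > ucl) else 'no'}")
```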
- Assessing sustainability of the continuous improvement process through the identification of enabling and inhibiting factors. Madrigal, Johanna (Virginia Tech, 2012-08-09). This research presents results on innovation management practices and the sustainability of continuous improvement. Innovation is recognized as a growth tool for economies in general; however, not all economic sectors have innovation as a strategy. This research served as a case study analyzing how innovation is managed within innovative firms in order to help less innovative sectors, such as the wood products industry, become profitable. Among the observed innovation management practices, this study identified the use of continuous improvement to support incremental innovation. Although continuous improvement is well known and accepted, there are still challenges in reaching a sustainable state of continuous improvement. This research also addresses the difficulty of sustaining continuous improvement through a longitudinal case study. A literature review was conducted to identify factors influencing the sustainability of continuous improvement. These factors were gathered into a research framework, which served as the main source for the questionnaire used as the research tool. Utilizing this tool, the study evaluated hypotheses relating to the effects of time, location, and company type on the behavior of the enabling and inhibiting factors, and the relationships among them. Results demonstrated that time has no effect on the factors affecting the sustainability of continuous improvement, although changes affect how the factors are perceived as success factors in sustaining continuous improvement. The study also concluded that company type and location impact how the inhibiting and enabling factors are perceived as supporting the sustainability of continuous improvement. Finally, the study revealed that these factors are correlated with one another; thus sustainability is the result of a dynamic, multifactor process rather than a single factor. In addition to this new framework, the study also developed a self-assessment tool for continuous improvement practitioners. With this tool, the newly developed framework can be continuously monitored, and proper, informed action can be taken by managers to address any observed gap in sustaining continuous improvement. Finally, the study also provides an example of interdisciplinary research, combining quantitative methods from statistics with qualitative methods from the business and social sciences.
- Can a Patient's In-Hospital Length of Stay and Mortality Be Explained by Early-Risk Assessments? Azadeh-Fard, Nasibeh; Ghaffarzadegan, Navid; Camelio, Jaime A. (PLOS, 2016-09-15). Objective: To assess whether a patient's in-hospital length of stay (LOS) and mortality can be explained by early objective and/or physicians' subjective risk assessments. Data Sources/Study Setting: Analysis of a detailed dataset of 1,021 patients admitted to a large U.S. hospital between January and September 2014. Study Design: We empirically test the explanatory power of objective and subjective early-risk assessments using various linear and logistic regression models. Principal Findings: The objective measures of early warning can only weakly explain LOS and mortality. When controlled for various vital signs and demographics, objective signs lose their explanatory power. LOS and death are more associated with physicians' early subjective risk assessments than with the objective measures. Conclusions: Explaining LOS and mortality requires variables beyond patients' initial medical risk measures. LOS and in-hospital mortality are more associated with the way in which the human element of healthcare service (e.g., physicians) perceives and reacts to the risks.
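The kind of regression comparison reported above can be sketched in a few lines: fit logistic regressions of mortality on an objective early-warning score and on a subjective physician assessment, and compare their discriminative power. The data below are synthetic and the effect sizes invented; this only illustrates the modeling approach, not the study's data or results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 1000
objective = rng.normal(size=n)      # e.g., a standardized early-warning score
subjective = rng.normal(size=n)     # e.g., a standardized physician risk rating
# Synthetic outcome constructed to depend mostly on the subjective assessment.
p = 1 / (1 + np.exp(-(-2.0 + 0.3 * objective + 1.2 * subjective)))
died = rng.binomial(1, p)

for name, x in [("objective", objective), ("subjective", subjective)]:
    model = LogisticRegression().fit(x.reshape(-1, 1), died)
    auc = roc_auc_score(died, model.predict_proba(x.reshape(-1, 1))[:, 1])
    print(f"{name:10s} AUC = {auc:.3f}")
```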
- Capturing Key Knowledge Exchanges within the Design Process of Transformable Shading Systems. Kalantar Mehrjardi, Negar (Virginia Tech, 2016-07-01). In the field of sustainable architecture, transformability is an important way of actively responding to ambient conditions while also meeting the needs of occupants and addressing issues of building performance. This research contributes knowledge for architects about the potential of kinetics to make shading systems respond effectively to changes in their environment. Within contemporary architecture, there is a growing interest in motion; buildings and their parts are gradually shifting from static to dynamic. However, contemporary activities in architecture show a lack of a holistic approach to the design of motion in architecture, and the design of motion as an alternative mode of design thinking is still in its infancy. Consequently, the existing tradition of static forms being the sole forms taught in architectural studies should be reevaluated as a design strategy. This research is a step toward better understanding the key knowledge exchanges within the design process of transformable shading systems. It investigates, explores, and proposes how the concept of transformability in designing shading systems can be suggested, depicted, or physically incorporated in building envelopes. In order to realize the full potential of the design process of transformable shading systems, this study presents a design workflow for a specific case, called AURA, that helps create openings for establishing a proper design methodology for transformable shading systems. While the workflow is concerned with identifying the key decision nodes, it is anticipated that in-depth development will determine critical parameters addressing transformation itself as a design parameter of transformable shading systems. Two studio-based courses offered at Virginia Tech and Texas A&M by the author will become a testing ground for evaluating the key decision nodes found in the design process of AURA within the context of architectural programs, bringing forth the opportunity to expand the current domain of transformable shading systems to a broader perspective of architecture pedagogy. In this way, this research is a step toward adding value directly to the content of the curricula, and thus to the field of design education as a whole.
- Characterization of a Sea-State Simulator for Ergonomic Studies. Bateman, David Brenton (Virginia Tech, 2011-03-30). With the use of tow-tank experiments, data may be generated for ships of various classes using comprehensive instrumentation. This data gives the ability to determine the response of ships to various sea-state conditions far in advance of their construction and launch. However, this data does not indicate the effects of those sea states on the individuals aboard the ship. In order to define these effects, a sea-state simulator must be designed and built. Once construction is completed, a series of tests must be conducted to determine the response of the simulator. This response allows comparison with actual tow-tank data to determine whether the simulator is capable of supporting the desired research.
- Compressive Sensing Approaches for Sensor based Predictive Analytics in Manufacturing and Service Systems. Bastani, Kaveh (Virginia Tech, 2016-03-14). Recent advancements in sensing technologies offer new opportunities for quality improvement and assurance in manufacturing and service systems. The sensor advances provide a vast amount of data, accommodating quality improvement decisions such as fault diagnosis (root cause analysis) and real-time process monitoring. These quality improvement decisions are typically made based on predictive analysis of the sensor data, so-called sensor-based predictive analytics. Sensor-based predictive analytics encompasses a variety of statistical, machine learning, and data mining techniques to identify patterns between the sensor data and historical facts. Given these patterns, predictions are made about the quality state of the process, and corrective actions are taken accordingly. Although the recent advances in sensing technologies have facilitated quality improvement decisions, they typically result in high-dimensional sensor data, making the use of sensor-based predictive analytics challenging due to the inherently intensive computation. This research begins in Chapter 1 by raising an interesting question: are all these sensor data required for making effective quality improvement decisions, and if not, is there any way to systematically reduce the number of sensors without affecting the performance of the predictive analytics? Chapter 2 addresses this question by reviewing the related research in the area of signal processing, namely compressive sensing (CS), a novel sampling paradigm as opposed to the traditional sampling strategy following the Shannon-Nyquist rate. By CS theory, a signal can be reconstructed from a reduced number of samples; this motivates developing CS-based approaches to facilitate predictive analytics using a reduced number of sensors. The proposed research methodology in this dissertation encompasses CS approaches developed to deliver the following two major contributions: (1) CS sensing to reduce the number of sensors while capturing the most relevant information, and (2) CS predictive analytics to conduct predictive analysis on the reduced sensor data. The proposed methodology has a generic framework which can be utilized for numerous real-world applications. However, for the sake of brevity, the validity of the proposed methodology has been verified with real sensor data associated with multi-station assembly processes (Chapters 3 and 4), additive manufacturing (Chapter 5), and wearable sensing systems (Chapter 6). Chapter 7 summarizes the contributions of the research and discusses potential future research directions with applications to big data analytics.
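The core compressive-sensing idea, reconstructing a sparse signal from far fewer random measurements than the Shannon-Nyquist rate would suggest, can be sketched with an l1-regularized (Lasso) recovery. The toy example below with a synthetic sparse signal is only a hedged illustration of the principle, not the CS sensing or CS predictive-analytics methods of the dissertation.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n, m, k = 200, 60, 8                      # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)  # random sensing matrix, m << n
y = A @ x_true                            # compressed measurements

# Basis-pursuit-style recovery via l1 regularization.
x_hat = Lasso(alpha=1e-3, max_iter=50_000, fit_intercept=False).fit(A, y).coef_
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"recovered from {m}/{n} measurements, relative error = {err:.3f}")
```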
- Computational Simulation and Machine Learning for Quality Improvement in Composites Assembly. Lutz, Oliver Tim (Virginia Tech, 2023-08-22). In applications spanning the aerospace, marine, automotive, energy, and space travel domains, composite materials have become ubiquitous because of their superior stiffness-to-weight ratios as well as their corrosion and fatigue resistance. However, from a manufacturing perspective, these advanced materials have introduced new challenges that demand the development of new tools. Due to their complex anisotropic and nonlinear material properties, composite materials are more difficult to model than conventional materials such as metals and plastics. Furthermore, there are ultra-high precision requirements in safety-critical applications that are yet to be reliably met in production. Toward developing new tools that address these challenges, this dissertation aims to (i) build high-fidelity numerical simulations of composite assembly processes, (ii) bridge these simulations to machine learning tools, and (iii) apply data-driven solutions to process control problems while identifying and overcoming their shortcomings. This is accomplished in case studies that model the fixturing, shape control, and fastening of composite fuselage components. Therein, simulation environments are created that interact with novel implementations of modified proximal policy optimization, based on a newly developed reinforcement learning algorithm. The resulting reinforcement learning agents are able to successfully address the optimization problems that underpin the process and quality requirements.
- Cyber-Physical Security for Additive Manufacturing Systems. Sturm, Logan Daniel (Virginia Tech, 2020-12-16). Additive manufacturing (AM) is a growing segment of the advanced manufacturing field and is being used to fabricate an increasing number of critical components, from aerospace parts to medical implants. At the same time, cyber-physical attacks targeting manufacturing systems have continued to rise. For this reason, there is a need to research new techniques and methods to ensure the integrity of parts fabricated on AM systems. This work seeks to address this need by first performing a detailed analysis of vulnerabilities in the AM process chain and how these attack vectors could be used to execute malicious part sabotage attacks. This work demonstrated the ability of an internal void attack on the .STL file to reduce the yield load of a tensile specimen by 14% while escaping detection by operators. To mitigate these vulnerabilities, a new impedance-based approach for in situ monitoring of AM systems was created. Two techniques for implementing this approach were investigated: direct embedding of sensors in AM parts, and the use of an instrumented fixture as a build plate. The ability to detect changes in material as small as 1.38% of the printed volume (53.8 mm³) on a material jetting system was demonstrated. For metal laser powder bed fusion systems, a new method was created for representing side-channel meltpool emissions. This method reduces the quantity of data while remaining sensitive enough to detect changes to the toolpath and process parameters caused by malicious attacks. Enabling the SCMS to validate part quality during fabrication required a way to receive baseline part quality information across an air gap. To accomplish this, a new process-noise-tolerant method of cyber-physical hashing for continuous data sets was presented. This method was coupled with new techniques for storing, transmitting, and reconstructing the baseline quality data, implemented using stacks of "ghost" QR codes stored in the toolpath to transmit information through the laser position. A technique for storing and transmitting quality information in the toolpath files of parts using acoustic emissions was also investigated. The ATTACH (additive toolpath transmission of acoustic cyber-physical hash) method used speed modulation of infill roads in a material extrusion system to generate acoustic tones containing quality information about the part. These modulations were inserted without affecting the build time or requiring additional material, and did not affect the quality of the part that contained them. Finally, a framework for the design and implementation of a SCMS for protecting AM systems against malicious cyber-physical part sabotage attacks was created. The IDEAS (Identify, Define, Establish, Aggregate, Secure) framework provides a detailed reference for engineers to use in securing AM systems by leveraging the previous work in vulnerability assessment, creation of new side-channel monitoring techniques, concise representation of quality data, and secure transmission of information to air-gapped systems through physical emissions.
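One building block mentioned above, the process-noise-tolerant comparison of continuous side-channel data against a transmitted baseline, can be illustrated with a toy sketch. The quantization scheme, step size, and tolerance here are invented for the example and are much simpler than the dissertation's cyber-physical hashing method; the trace is synthetic.

```python
import hashlib
import numpy as np

STEP = 0.05  # quantization step chosen well above the expected process noise

def quantize(trace):
    """Map a continuous trace to coarse integer levels so ordinary process
    noise changes the encoded values by at most one level."""
    return np.round(np.asarray(trace) / STEP).astype(np.int64)

def baseline_payload(trace):
    """Quantized baseline plus a SHA-256 digest for integrity of the payload itself."""
    q = quantize(trace)
    return q, hashlib.sha256(q.tobytes()).hexdigest()

def conforms(measured, baseline_q, tol_levels=1, max_bad_fraction=0.01):
    """Noise-tolerant check: flag sabotage only if too many samples deviate
    from the baseline by more than tol_levels quantization steps."""
    bad = np.abs(quantize(measured) - baseline_q) > tol_levels
    return bad.mean() <= max_bad_fraction

rng = np.random.default_rng(4)
nominal = np.sin(np.linspace(0, 6, 200))               # stand-in emission trace
ref_q, ref_digest = baseline_payload(nominal)

benign = nominal + rng.normal(0, 0.005, nominal.size)  # ordinary process noise
tampered = nominal.copy()
tampered[80:120] += 0.3                                # simulated sabotage

print("benign conforms:  ", conforms(benign, ref_q))   # expected: True
print("tampered conforms:", conforms(tampered, ref_q)) # expected: False
```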
- Cyber-Physical Security for Advanced Manufacturing. Desmit, Zachary James (Virginia Tech, 2018-01-16). The increased growth of cyber-physical systems, which control multiple production processes within the manufacturing industry, has left the industry susceptible to cyber-physical attacks. Differing from traditional cyber-attacks in their ability to alter the physical world, cyber-physical attacks have been increasing in number since the early 2000s. To combat and ultimately prevent the malicious intent of such attacks, the field of cyber-physical security was launched. Cyber-physical security efforts can be seen across many industries that employ cyber-physical systems, but little work has been done to secure manufacturing systems. Through the completion of four research objectives, this work provides the foundation necessary to begin securing manufacturing systems from cyber-physical attacks. First, this work is motivated through a systematic review of the literature surrounding the topic. This objective not only identifies and highlights the need for research efforts within the manufacturing industry, but also defines the research field. Second, a framework is developed to identify cyber-physical vulnerabilities within manufacturing systems. Third, the framework is further developed into a tool allowing manufacturers to more easily identify the vulnerabilities that exist within their manufacturing systems. This tool allows a manufacturer to utilize the developed framework and begin the steps necessary to secure the manufacturing industry. Finally, game-theoretic models are applied to cyber-physical security in manufacturing to model the interactions between adversaries and defenders. The results of this work provide the manufacturing industry with the tools and motivation necessary to begin securing manufacturing facilities from malicious cyber-physical attacks and to create a more resilient industry.
- Cybersecurity for the Internet of Things: A Micro Moving Target IPv6 Defense. Zeitz, Kimberly Ann (Virginia Tech, 2019-09-04). As the use of low-power and low-resource embedded devices continues to increase dramatically with the introduction of new Internet of Things (IoT) devices, security techniques compatible with these devices are necessary. This research advances knowledge in the area of cybersecurity for the IoT through the exploration of a moving target defense that limits the time attackers may conduct reconnaissance on embedded systems, while considering the challenges presented by IoT devices such as resource and performance constraints. We introduce the design and optimizations for µMT6D, a Micro-Moving Target IPv6 Defense, including a description of the modes of operation and the use of lightweight hash algorithms. Through simulations and experiments, µMT6D is shown to be viable for use on low-power and low-resource embedded devices in terms of its footprint, power consumption, and energy consumption increases in comparison to the security benefits gained. Finally, this work provides information on other future considerations and possible avenues of further experimentation and research.
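The core moving-target idea, having communicating hosts independently derive the same short-lived IPv6 address from shared information and a rotation interval, can be sketched as below. This is only an assumption-laden illustration: the prefix, rotation period, and use of SHA-256 (rather than the lightweight hash algorithms the dissertation evaluates) are placeholders, and the sketch omits the actual µMT6D protocol details.

```python
import hashlib
import ipaddress
import time

PREFIX = "2001:db8::/64"      # documentation prefix as a stand-in network
ROTATION_SECONDS = 60         # how often the host address rotates

def rotating_address(shared_secret: bytes, node_id: bytes, now: float) -> str:
    """Derive a 64-bit interface identifier from a hash of a shared secret,
    a node identifier, and the current rotation interval, so sender and
    receiver can compute the same short-lived address independently."""
    interval = int(now // ROTATION_SECONDS)
    digest = hashlib.sha256(shared_secret + node_id + interval.to_bytes(8, "big")).digest()
    iid = int.from_bytes(digest[:8], "big")     # low 64 bits of the address
    network = ipaddress.ip_network(PREFIX)
    return str(network.network_address + iid)

secret, node = b"pre-shared-secret", b"node-42"
print(rotating_address(secret, node, time.time()))
print(rotating_address(secret, node, time.time() + ROTATION_SECONDS))  # next interval
```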
- Data Analytics for Statistical Learning. Komolafe, Tomilayo A. (Virginia Tech, 2019-02-05). The prevalence of big data has rapidly changed the usage and mechanisms of data analytics within organizations. Big data is a widely used term without a clear definition. The difference between big data and traditional data can be characterized by four Vs: velocity (the speed at which data is generated), volume (the amount of data generated), variety (the data can take on different forms), and veracity (the data may be of poor or unknown quality). As many industries begin to recognize the value of big data, organizations try to capture it through means such as side-channel data in a manufacturing operation, unstructured text data reported by healthcare personnel, demographic information of households from census surveys, and the range of communication data that define communities and social networks. Big data analytics generally follows this framework: first, a digitized process generates a stream of data; this raw data stream is pre-processed to convert the data into a usable format; and the pre-processed data is analyzed using statistical tools. In this stage, called statistical learning of the data, analysts have two main objectives: (1) develop a statistical model that captures the behavior of the process from a sample of the data, and (2) identify anomalies in the process. However, several open challenges still exist in this framework for big data analytics. Recently, data types such as free-text data are also being captured. Although many established processing techniques exist for other data types, free-text data come from a wide range of individuals and are subject to syntax, grammar, language, and colloquialisms that require substantially different processing approaches. Once the data is processed, open challenges still exist in the statistical learning step of understanding the data. Statistical learning aims to satisfy two objectives: (1) develop a model that highlights general patterns in the data, and (2) create a signaling mechanism to identify whether outliers are present in the data. Statistical modeling is widely utilized, as researchers have created a variety of statistical models to explain everyday phenomena such as energy usage behavior, traffic patterns, and stock market behavior, among others. However, new applications of big data with increasingly varied designs present interesting challenges. Consider the example of free-text analysis posed above. There is a renewed interest in modeling free-text narratives from sources such as online reviews, customer complaints, or patient safety event reports into intuitive themes or topics. As previously mentioned, documents describing the same phenomena can vary widely in their word usage and structure. Another recent interest area of statistical learning is using the environmental conditions in which people live, work, and grow to infer their quality of life. It is well established that social factors play a role in overall health outcomes; however, clinical application of these social determinants of health is a recent and open problem. These examples are just a few of many wherein new applications of big data pose complex challenges requiring thoughtful and inventive approaches to processing, analyzing, and modeling data. Although a large body of research exists in the area of anomaly detection, increasingly complicated data sources (such as side-channel related data or network-based data) present equally convoluted challenges.
For effective anomaly detection, analysts define parameters and rules so that when large collections of raw data are aggregated, pieces of data that do not conform are easily noticed and flagged. In this work, I investigate the different steps of the data analytics framework and propose improvements for each step, paired with practical applications, to demonstrate the efficacy of my methods. This work focuses on the healthcare, manufacturing, and social-networking industries, but the materials are broad enough to have wide applications across data analytics generally. My main contributions can be summarized as follows:
  - In the big data analytics framework, raw data initially goes through a pre-processing step. Although many pre-processing techniques exist, several challenges remain in pre-processing text data, and I develop a pre-processing tool for text data.
  - In the next step of the data analytics framework, there are challenges in both statistical modeling and anomaly detection.
    - I address statistical modeling in two ways: (1) there are open challenges in defining models to characterize text data, and I introduce a community extraction model that autonomously aggregates text documents into intuitive communities/groups; and (2) in healthcare, it is well established that social factors play a role in overall health outcomes, yet developing a statistical model that characterizes these relationships is an open research area, so I develop statistical models that generalize relationships between the social determinants of health of a cohort and general medical risk factors.
    - I address anomaly detection in two ways: (1) a variety of anomaly detection techniques already exist, but some of these methods lack a rigorous statistical investigation, making them ineffective for practitioners, so I identify critical shortcomings of a proposed network-based anomaly detection technique and introduce methodological improvements; and (2) manufacturing enterprises, which are now more connected than ever, are vulnerable to anomalies in the form of cyber-physical attacks, so I develop a sensor-based side-channel technique for anomaly detection in a manufacturing process.
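As a deliberately minimal illustration of the text pre-processing step listed above (the dissertation's tool is far more involved), the sketch below cleans short free-text reports into tokens ready for topic or community extraction; the stopword list and example narratives are invented.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "was", "in", "on", "for", "with"}

def preprocess(report: str) -> list:
    """Minimal free-text pre-processing: lowercase, drop punctuation and digits,
    tokenize on whitespace, and remove common stopwords and very short tokens."""
    text = re.sub(r"[^a-z\s]", " ", report.lower())
    return [tok for tok in text.split() if tok not in STOPWORDS and len(tok) > 2]

reports = [
    "Patient was given the wrong medication dose on the night shift.",
    "Wrong-site marking was missed prior to the procedure.",
]
tokens = [preprocess(r) for r in reports]
print(tokens[0])
print(Counter(t for toks in tokens for t in toks).most_common(5))
```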
- Design, Implementation and Use of In-Process Sensor Data for Monitoring Broaching and Turning Processes: A Multi-Sensor Approach. Rathinam, Arvinth Chandar (Virginia Tech, 2013-06-02). Real-time quality monitoring continues to gain interest within the manufacturing domain as new and faster sensors are developed. Unfortunately, most quality monitoring solutions are still based on collecting data from the end product. From a process improvement point of view, it is more advantageous to proactively monitor quality directly in the process rather than the product, so that the consequences of a defective part can be minimized or even eliminated. In this dissertation, new methods for in-line process monitoring using multiple sensors are explored. In the first case, a new cutting-force-based monitoring methodology was developed to detect out-of-control conditions in a broaching operation. The second part of this thesis focuses on the development of a test bed for monitoring the tool condition in a turning operation. The constructed test bed combines multiple sensor signals, including temperature, vibration, and energy measurements. Here, the proposed statistical process control (SPC) strategy integrates sensor data with engineering knowledge to produce quick, reliable results using proven profile monitoring techniques. Whereas existing methods based on raw process data require monitoring more features to avoid loss of information, this technique is straightforward and able to monitor the process comprehensively with fewer features. Consequently, it also adds to the group of tools available to the practitioner.
- The Effect of Uncertain End-of-Life Product Quality and Consumer Incentives on Partial Disassembly Sequencing in Value Recovery Operations. Rickli, Jeremy Lewis (Virginia Tech, 2013-08-19). This dissertation addresses gaps in the interaction between End-of-Life (EoL) product acquisition systems and disassembly sequencing. The research focuses on two remanufacturing research problems: (1) modeling uncertain EoL product quality, quantity, and timing with regard to EoL product acquisition and disassembly sequencing, and (2) designing EoL product acquisition schemes considering EoL product uncertainty. The main research objectives within these areas are analyzing, predicting, and controlling EoL product uncertainty, and incorporating EoL product uncertainty into operational- and strategic-level decisions. This research addresses these objectives by developing a methodology to determine optimal or near-optimal partial disassembly sequences using infeasible sequences while considering EoL product quality uncertainty. Consumer incentives are integrated into the methodology to study the effect of EoL product take-back incentives, which also allows for the study of EoL product quantity uncertainty. EoL product age distributions are key to integrating the disassembly sequencing method with EoL product acquisition management, acting both as an indicator of quality and as a basis for determining return quantity when considering incentives. At a broader level, this research makes it possible to study the impact of EoL product quality, and to an extent quantity, uncertainty resulting from strategic-level (acquisition scheme) decisions on operational (disassembly sequencing) decisions. This research is motivated by the rising importance of value recovery and sustainability to manufacturers. Extended Producer Responsibility (EPR) and Product Stewardship (PS) policies are, globally, changing the way products are treated during their use-life and EoL. Each new policy places a greater responsibility on consumers and manufacturers to address the EoL of a product. Manufacturers, in particular, may have to fulfill these obligations by such means as contracting third parties for EoL recovery or performing recovery in-house. The significance of this research is linked to the growing presence of remanufacturing and recovery in the US and global economy, whether via profitable ventures or environmental regulations. Remanufacturing, in particular, was surveyed by the US International Trade Commission in 2011-2012, which determined that remanufacturing grew by 15% to $43 billion, supported 180,000 full-time jobs from 2009-2011, and is continuing to grow. A multi-objective genetic algorithm (GA) for partial disassembly sequencing is used as the solution procedure to determine the optimal or near-optimal partial disassembly sequence considering a continuous age distribution of EoL or available consumer products, with and without a consumer take-back incentive. The multi-objective GA, novel to the presented approach, relies on infeasible sequences to converge to optimal or near-optimal disassembly sequences. It is verified with a discrete economic and environmental impact case prior to incorporating EoL product age distributions. Considering the age distribution of acquired EoL products allows decisions to be made based not only on expected profit, but also on profit variance and profit probability per EoL product, which had not been observed in previous literature.
As such, the research presented herein provides three contributions to disassembly and EoL product acquisition research: (1) integrating EoL product age distributions into partial disassembly sequencing objective functions, (2) accounting for partial disassembly sequence expected profit, profit variation, and profit probability, as compared to disassembly sequencing methods that have historically considered only expected profit, and (3) studying the impact of EoL product age distributions and consumer take-back incentives on optimal or near-optimal partial disassembly sequences. Overall, this doctoral research contributes to the body of knowledge in the value recovery, reverse logistics, and disassembly research fields, and is intended to be used in the future to develop and design efficient EoL product acquisition systems and disassembly operations.
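The role of the EoL product age distribution in the objective functions described above can be illustrated with a toy Monte Carlo evaluation of a candidate stopping depth for partial disassembly: expected profit, profit variance, and profit probability are estimated by sampling ages from an assumed distribution. All cost, value, incentive, and decay numbers below are invented, and this sketch evaluates candidate depths directly rather than running the multi-objective GA.

```python
import numpy as np

rng = np.random.default_rng(5)

def evaluate_depth(depth, age_samples, incentive=2.0, n_ops=5):
    """Toy objective for a partial disassembly sequence stopped after `depth`
    of `n_ops` operations: recovered value decays with product age, each
    operation adds cost, and the take-back incentive is paid per acquired unit."""
    base_value = 60.0 * depth / n_ops                  # value recoverable at this depth
    op_cost = 2.0 * depth                              # cumulative disassembly cost
    profit = base_value * np.exp(-0.08 * age_samples) - op_cost - incentive
    return profit.mean(), profit.std(), (profit > 0).mean()

ages = rng.gamma(shape=4.0, scale=2.0, size=10_000)    # assumed EoL age distribution (years)
for depth in range(1, 6):
    mean, std, p_pos = evaluate_depth(depth, ages)
    print(f"stop after {depth} ops: E[profit]={mean:6.2f}  sd={std:5.2f}  P(profit>0)={p_pos:.2f}")
```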
- Enabling Connections in the Product Lifecycle using the Digital Thread. Hedberg, Thomas Daniel Jr. (Virginia Tech, 2018-11-01). Product lifecycles are complex heterogeneous systems. Applying control methods to lifecycles requires significant human capital. Additionally, measuring lifecycles relies primarily on domain expertise and estimates. Presented in this dissertation is a way to semantically represent a product lifecycle as a cyber-physical system, enabling the application of control methods to the lifecycle. Control requires a model, and no models currently exist that integrate each phase of the lifecycle. The contribution is an integration framework that brings all phases and systems of a lifecycle together. First, a conceptual framework and technology innovation are presented. Next, dynamically linking product lifecycle data is described, along with how that linked data could be certified and traced for trustworthiness. After that, the discussion focuses on how the trusted linked data could be combined with machine learning to drive applications throughout the product lifecycle. Last, a case study is provided that integrates the framework and technology. Integrating all of this would enable efficient and effective measurement of the lifecycle to support prognostic and diagnostic control of that lifecycle and related decisions.
- Essays on Risk Indicators and Assessment: Theoretical, Empirical, and Engineering Approaches. Azadeh Fard, Nasibeh (Virginia Tech, 2016-01-15). Risk indicators are metrics widely used in risk management to indicate how risky an activity is. Among different types of risk indicators, early warning systems are designed to help decision makers predict and be prepared for catastrophic events. Especially in complex systems, where outcomes are often difficult to predict, early warnings can help decision makers manage possible risks and take a proactive approach. Early prediction of catastrophic events and outcomes is at the heart of risk management and helps decision makers take appropriate actions to mitigate the possible effects of such events. For example, physicians would like to prevent any adverse events for their patients and to use all pieces of information that support accurate early diagnosis and intervention. In this research, we first study risk assessment for occupational injuries using accident severity grade as an early-warning indicator. We develop a new severity scoring system that considers multiple injury severity factors and can be used as part of a novel three-dimensional risk assessment matrix covering an incident's severity, frequency, and preventability. We then study the predictability of health outcomes based on early risk indicators. A systems model of patient health outcomes and hospital length of stay is presented, based on initial health risk and physician assessment of risk. The model elaborates on the interdependent effects of hospital service and a physician's subjective risk assessment on length of stay and mortality. Finally, we extend our research to study the predictive power of early warning systems and prognostic risk indicators in predicting different health outcomes such as mortality, disease diagnosis, adverse outcomes, care intensity, and survival. This study provides a theoretical framework on why risk indicators can or cannot predict healthcare outcomes, and how better predictors can be designed. Overall, these three essays shed light on the complexities of risk assessment, especially in the health domain and in contexts where individuals continuously observe and react to risk indicators. Furthermore, our multi-method research approach provides new insights into improving the design and use of risk measures.
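The three-dimensional risk matrix mentioned in the first essay can be illustrated with a toy scoring function that combines severity, frequency, and preventability into a single priority value; the 1-to-5 scales and multiplicative form below are assumptions for the example, not the dissertation's scoring system.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    severity: int        # 1 (minor) .. 5 (catastrophic)
    frequency: int       # 1 (rare)  .. 5 (frequent)
    preventability: int  # 1 (hard to prevent) .. 5 (easily preventable)

def risk_priority(incident: Incident) -> int:
    """Combine the three dimensions into one priority score; multiplication
    mirrors the classic severity-times-occurrence style of risk matrices."""
    return incident.severity * incident.frequency * incident.preventability

incidents = [
    Incident(severity=4, frequency=2, preventability=5),  # severe, rare, very preventable
    Incident(severity=2, frequency=5, preventability=3),  # minor but frequent
]
for inc in sorted(incidents, key=risk_priority, reverse=True):
    print(inc, "->", risk_priority(inc))
```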
- Heterogeneous Sensor Data based Online Quality Assurance for Advanced Manufacturing using Spatiotemporal Modeling. Liu, Jia (Virginia Tech, 2017-08-21). Online quality assurance is crucial for elevating product quality and boosting process productivity in advanced manufacturing. However, the inherent complexity of advanced manufacturing, including nonlinear process dynamics, multiple process attributes, and low signal-to-noise ratio, poses severe challenges for both maintaining stable process operations and establishing efficacious online quality assurance schemes. To address these challenges, four different advanced manufacturing processes, namely fused filament fabrication (FFF), binder jetting, chemical mechanical planarization (CMP), and the slicing process in wafer production, are investigated in this dissertation for applications of online quality assurance, utilizing various sensors such as thermocouples, infrared temperature sensors, and accelerometers. The overarching goal of this dissertation is to develop innovative integrated methodologies tailored to these individual manufacturing processes while addressing their common challenges, to achieve satisfactory performance in online quality assurance based on heterogeneous sensor data. Specifically, three new methodologies are created and validated using actual sensor data: (1) real-time process monitoring methods using Dirichlet process (DP) mixture models for timely detection of process changes and identification of different process states for FFF and CMP; the proposed methodology is capable of tackling non-Gaussian data from heterogeneous sensors in these advanced manufacturing processes for successful online quality assurance. (2) A spatial Dirichlet process (SDP) for modeling complex multimodal wafer thickness profiles and exploring their clustering effects; the SDP-based statistical control scheme can effectively detect out-of-control wafers and achieve wafer thickness quality assurance for the slicing process with high accuracy. (3) An augmented spatiotemporal log Gaussian Cox process (AST-LGCP) quantifying the spatiotemporal evolution of porosity in binder jetting parts, capable of predicting high-risk areas on consecutive layers; this work fills the long-standing research gap of lacking rigorous layer-wise porosity quantification for parts made by additive manufacturing (AM), and provides the basis for facilitating corrective actions for product quality improvement in a prognostic way. These developed methodologies surmount common challenges of advanced manufacturing that paralyze traditional methods in online quality assurance, and embody key components for implementing effective online quality assurance with various sensor data. There is promising potential to extend them to other manufacturing processes in the future.
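As a rough illustration of the first methodology (DP-mixture-based monitoring), the sketch below uses scikit-learn's truncated variational approximation of a Dirichlet-process Gaussian mixture to learn in-control process states from sensor features and flags observations with unusually low likelihood; the data are synthetic and the scheme is far simpler than the dissertation's methods.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(6)
# Synthetic two-dimensional sensor features from two in-control process states.
in_control = np.vstack([rng.normal([0.0, 0.0], 0.3, (200, 2)),
                        rng.normal([2.0, 1.0], 0.3, (200, 2))])
shifted = rng.normal([5.0, 4.0], 0.3, (50, 2))   # a new, out-of-control state

# Truncated Dirichlet-process mixture: the number of active components is
# inferred from the data rather than fixed in advance.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(in_control)

# Observations far less likely than the in-control data are flagged.
threshold = np.quantile(dpgmm.score_samples(in_control), 0.01)
flags = dpgmm.score_samples(shifted) < threshold
print(f"flagged {flags.sum()} of {len(shifted)} shifted observations")
```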
- Impedance-based Nondestructive Evaluation for Additive Manufacturing. Tenney, Charles M. (Virginia Tech, 2020-09-15). Impedance-based Non-Destructive Evaluation for Additive Manufacturing (INDEAM) is rooted in the field of Structural Health Monitoring (SHM). INDEAM generalizes the structure-to-itself comparisons characteristic of the SHM process through the introduction of inter-part comparisons: instead of comparing a structure to itself over time, potentially damaged structures are compared to known-healthy reference structures. The purpose of INDEAM is to provide an alternative to conventional nondestructive evaluation (NDE) techniques for additively manufactured (AM) parts. In essence, the geometric complexity characteristic of AM processes and the phase change of the feedstock during fabrication complicate the application of conventional NDE techniques, respectively, by limiting direct access of measurement probes to surfaces and by permitting the introduction of internal defects that are not present in the feedstock. NDE approaches capable of surmounting these challenges are typically highly expensive. In the first portion of this work, the procedure for impedance-based NDE is examined in the context of INDEAM. In consideration of the additional variability inherent in inter-part comparisons, as opposed to part-to-itself comparisons, the metrics used to quantify damage or change to a structure are evaluated. Novel methods of assessing damage through impedance-based evaluation are proposed and compared to existing techniques. In the second portion of this work, the INDEAM process is applied to a wide variety of test objects. This portion considers how the sensitivity of the INDEAM process is affected by defect type, defect size, defect location, part material, and excitation frequency. Additionally, a procedure for studying the variance introduced during the process of instrumenting a structure is presented and demonstrated.
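A common way to quantify the inter-part comparisons described above is a scalar damage metric over the measured impedance signature; the root-mean-square deviation (RMSD) sketch below is one such conventional SHM metric, shown on synthetic signatures. It is included only as context; the dissertation evaluates this family of metrics and proposes new ones.

```python
import numpy as np

def rmsd(reference, test):
    """Root-mean-square deviation between two impedance signatures, a common
    damage metric in impedance-based structural health monitoring."""
    ref = np.asarray(reference, dtype=float)
    tst = np.asarray(test, dtype=float)
    return np.sqrt(np.sum((tst - ref) ** 2) / np.sum(ref ** 2))

rng = np.random.default_rng(7)
freqs = np.linspace(10e3, 100e3, 500)                        # excitation frequencies (Hz)
reference = 1.0 + 0.2 * np.sin(freqs / 7e3)                  # known-healthy reference part
nominal_part = reference + rng.normal(0, 0.005, freqs.size)  # inter-part variability only
defective_part = reference + rng.normal(0, 0.005, freqs.size)
defective_part[200:260] += 0.15                              # local signature shift from a defect

print(f"nominal part   RMSD = {rmsd(reference, nominal_part):.4f}")
print(f"defective part RMSD = {rmsd(reference, defective_part):.4f}")
```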
- Integrated Design and Manufacturing [IDM] Framework for the Modular Construction Industry. Alkahlan, Bandar Suliman (Virginia Tech, 2016-07-01). If we look at the construction industry, particularly the modular single-family construction industry, we often see that the design stage is distinctly separate from the construction and fabrication stages. This separation has been occurring for some time now; however, there is often a noticeable lack of understanding of the constraints in linking architectural design to modular construction for single-family housing. In addition, no framework exists that supports overcoming these constraints in the architectural design process while simultaneously bringing knowledge of fabrication, materials selection, and modular construction to the early stage of design. Many architects also lack knowledge of fabrication and modular construction constraints. This research focused on mapping the design and manufacturing processes for a specific scale of project: residential single-family units. The research also aimed to understand the relationships among design, the role of emerging technologies, and manufacturing within the modular home construction industry in order to develop a design process based upon mass customization rather than mass production. Thus, qualitative research methods based upon a grounded theory approach were used for evaluating, capturing, and structuring knowledge. To achieve the greatest possible amount of useful information, case studies of on-site visits to manufactured housing production facilities and structured, in-depth, open-ended interviews of architects, engineers, production managers, business managers, and other knowledge-holders within the manufactured modular housing industry were performed. The aim of this research was to map the design and modular home manufacturing processes in an effort to better understand the relationships between these two domains. The Integration Definition (IDEF0) for Function Modeling was used as a graphical presentation technique. The goal of using such a graphical technique was, first, to understand and analyze the functions of the existing "As-is" design-manufacture communication process; and second, to enhance and improve communication and productivity among people working in the design, manufacturing, and production sectors. Using this graphical modeling method assisted with mapping the design and modular manufacturing processes, including organizations, teams, decisions, actions, and activities. Through this mapping process, strategies to improve the emergent relationships were proposed as a new "To-be" design and manufacturing framework for modular single-family housing projects.
- Monitoring and Prognostics for Broaching Processes by Integrating Process Knowledge. Tian, Wenmeng (Virginia Tech, 2017-08-07). With the advancement of sensor technology and data processing capacities, various types of high-volume data are available for process monitoring and prognostics in manufacturing systems. In a broaching process, a multi-toothed broaching tool removes material from the workpiece through the sequential engagement and disengagement of multiple cutting edges. The quality of the final part, including its geometric integrity and surface finish, is highly dependent upon the broaching tool condition. Though there has been a considerable amount of research on tool condition monitoring and prognostics for various machining processes, the broaching process is unique in the following aspects: (1) a broaching process involves multiple cutting edges, which jointly contribute to the final part quality; and (2) resharpening and any other process adjustments can only be performed by replacing the whole broaching tool, or at least a whole segment of the tool. The overarching goal of this research is to explore how engineering knowledge can be used to improve process monitoring and prognostics for a complex manufacturing process like broaching. This dissertation addresses the need for developing new monitoring and prognostics approaches based on various types of data. Specifically, the research effort focuses on (1) the use of in-situ force profile data for real-time process monitoring and fault diagnosis; (2) degradation characterization for broaching processes at the individual-component level based on image processing; and (3) system-level degradation modeling and remaining useful life prediction for broaching processes based on multiple images.
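The third research thrust, system-level degradation modeling and remaining useful life (RUL) prediction, can be illustrated in grossly simplified form: fit a linear trend to an image-derived wear measurement and extrapolate to a failure threshold. The wear data, linear trend assumption, and threshold below are invented for the example and are not the dissertation's models.

```python
import numpy as np

def remaining_useful_life(cycles, wear, threshold):
    """Fit a linear degradation trend to observed wear and extrapolate the
    number of additional cycles until the failure threshold is reached."""
    slope, intercept = np.polyfit(cycles, wear, deg=1)
    if slope <= 0:
        return np.inf                       # no measurable degradation trend yet
    cycles_at_threshold = (threshold - intercept) / slope
    return max(cycles_at_threshold - cycles[-1], 0.0)

rng = np.random.default_rng(8)
cycles = np.arange(1, 41)                                    # broaching cycles observed so far
wear = 0.002 * cycles + rng.normal(0, 0.001, cycles.size)    # e.g., flank wear from images (mm)
rul = remaining_useful_life(cycles, wear, threshold=0.15)
print(f"estimated remaining useful life: about {rul:.0f} cycles")
```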