Browsing by Author "Van Aken, Eileen M."
Now showing 1 - 20 of 51
- An Activity-Based Costing and Theory of Constraints Model for Product-Mix Decisions. Gurses, Ayse Pinar (Virginia Tech, 1999-06-29). The objective of this thesis is to demonstrate the use of the Activity-Based Costing (ABC) approach together with the Theory of Constraints (TOC) philosophy in determining the optimal product-mix and restrictive bottlenecks of a company. The contribution of this thesis is a new product-mix decision model that uses activity-based cost information. This new model is proposed to be used with the TOC philosophy in order to improve the financial performance of a company. Four case studies, all of which are based on hypothetical data, are prepared in this research to show the applicability of the proposed model in different manufacturing environments. Specifically, the first case study shows that the conventional product-mix decision model and the model developed in this thesis can give significantly different results regarding the best product-mix and associated bottlenecks of a company. The second case study demonstrates the use of the proposed product-mix decision model in a complex manufacturing environment. Specifically, this case study shows how companies should consider alternatives such as activity flexibility and outsourcing to improve their profitability figures. The third case study is an extension of the second case study, and it is prepared to illustrate that the proposed model can be extended to include more than one time period. The final case study demonstrates the applicability of the proposed model in a lean manufacturing environment. Using the proposed model developed in this research will give managers more accurate information regarding the optimum product-mix and critical bottlenecks of their companies. By applying the TOC philosophy based on this information, managers will be able to take the right actions that will improve the profitability of their companies. Specifically, they will be able to observe the effects of several alternatives, such as activity flexibility and outsourcing, on the throughput of the whole system. In addition, the proposed model should help managers avoid making decisions that sub-optimize the system. This may occur, for example, when using only the most efficient methods to produce each product even though the capacities of these methods are limited and other, less efficient methods are currently available in the company. By extending the model to include more than one time period, managers will be able to estimate the potential bottlenecks and the amount of idle capacity of each non-bottleneck activity performed in the company ahead of time. This information is powerful and can give companies a substantial advantage over their competitors because the users of the new model will have enough time to improve the performance of their potential bottlenecks and to search for more profitable uses of excess capacity before the actual production takes place.
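At its core, the product-mix decision described above is a constrained optimization over activity capacities. As a rough, hypothetical illustration only (the two products, three activities, margins, capacities, and demand ceilings below are invented, not taken from the thesis), a throughput-maximizing mix can be sketched as a small linear program; activities with little or no slack at the optimum are the candidate bottlenecks a TOC-style analysis would examine next.

```python
# Illustrative product-mix sketch: maximize throughput subject to activity capacities.
# All numbers, product names, and activity names are hypothetical.
import numpy as np
from scipy.optimize import linprog

margin = np.array([45.0, 60.0])        # throughput margin per unit (price minus materials)
usage = np.array([
    [2.0, 4.0],                        # machining minutes per unit
    [1.5, 1.0],                        # assembly minutes per unit
    [0.5, 2.5],                        # inspection minutes per unit
])
capacity = np.array([2400.0, 1800.0, 1200.0])   # available minutes per period
demand = np.array([500.0, 400.0])               # market demand ceilings

# linprog minimizes, so negate the margins to maximize total throughput.
res = linprog(c=-margin, A_ub=usage, b_ub=capacity,
              bounds=[(0, d) for d in demand], method="highs")

mix = res.x
slack = capacity - usage @ mix
print("Optimal mix (units):", np.round(mix, 1))
print("Total throughput:", round(float(margin @ mix), 2))
# Activities with (near) zero slack are the candidate bottlenecks.
print("Activity slack (min):", np.round(slack, 1))
```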
- Analysis of Communication Patterns During Construction Production Planning. Ghosh, Somik (Virginia Tech, 2012-02-24). The construction industry ranks high in the number of occupational incidents due to the complex and interdependent nature of the tasks. However, construction firms using lean construction have reported better safety performance than the rest. The situation reflects the limitation of traditional planning methods used in construction firms, which focus on project-level planning at the expense of production-level planning. Lean construction involves participants in the formal production planning process to minimize variability in workflow, thus reducing the probability of incidents. Considering the involvement of various participants in the production planning process, this research study hypothesized that communication levels afforded by participants during formal production planning have a positive impact on safety performance. The goal of this research study was to understand the role of communication in the formal production planning process and its impact on safety performance. A case study approach was adopted for analyzing two projects, one following formal production planning and another following traditional project planning. The weekly subcontractor coordination meeting was selected as the unit of analysis. Data were collected using direct observations, open-ended interviews, and examination of archival documents. For this study, the independent variables were the categories of communication and the dependent variable was the recordable incidence rate (safety performance). Communication data were analyzed using Robert Bales' Interaction Process Analysis. Based on the analyses, the participants involved in formal production planning demonstrated more sensitivity and a higher degree of control by frequently providing suggestions/opinions, more enthusiasm in exchanging commitments, sincerity by declining inquiries for commitments in cases of conflict of interest, and greater involvement by engaging in frequent dialogues with others. In addition, participants involved in production planning adopted a proactive approach toward safety performance by ensuring that safety was considered while preparing production plans, thus helping improve awareness. The findings indicated a better safety record by the project following formal production planning in comparison to the other project. The research study provides a "meso" level understanding of the role of communication among project participants during formal production planning, and indicates that production planning might have a beneficial impact on safety performance.
- Analysis of the Effect of Ordering Policies for a Manufacturing Cell Transitioning to Lean Production. Hafner, Alan D. (Virginia Tech, 2003-06-11). Over the past two decades, Lean Production has begun to replace traditional manufacturing techniques around the world, mainly due to the success of the Toyota Motor Company. One key to Toyota's success that many American companies have not been able to emulate is the transformation of their suppliers to the lean philosophy. This lack of supplier transformation in America is due to a variety of reasons, including differences in supplier proximity, supplier relationships, supplier performance levels, and the ordering policies used for supplied parts. The focus of this research is analyzing the impact of ordering policies for supplied parts on a manufacturing cell utilizing Lean Production techniques. This thesis presents a simulation analysis of a multi-stage, lean manufacturing cell that produces a family of products. The analysis investigates how the ordering policy for supplied parts affects the performance of the cell under conditions of demand variability and imperfect supplier performance. The ordering policies evaluated are a periodic-review inventory control policy (s, S) and two kanban policies. The performance of the cell is measured by the flowtime of products through the cell, the on-time delivery to the customer, the number of products shipped each week, the amount of work-in-process inventory in the cell, the approximate percentage of time the cell is stocked out, and the average supplied part inventory levels for the cell. Using this simulation model, an experimental analysis is conducted using an augmented central composite design. Then, a multivariate analysis is performed on the results of the experiments. The results obtained from this study suggest that the preferred ordering policy for supplied parts is the (s, S) inventory policy for most levels of the other three factors and most of the performance measures. This policy, however, results in increased levels of supplied part inventory, which is the primary reason for its high performance on most response variables. This increased inventory is in direct conflict with the emphasis on inventory and waste reduction, one of the key principles of Lean Production. Furthermore, the inflated kanban policy tends to perform well at high levels of supplier on-time delivery and low levels of customer demand variability. These results are consistent with the proper conditions under which to implement Lean Production: good supplier performance and level customer demand. Thus, while the (s, S) inventory policy may be advantageous as a company begins transitioning to Lean Production, the inflated kanban policy may be preferable once the company has established good supplier performance and level customer demand.
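For readers unfamiliar with the (s, S) policy evaluated above, the following is a minimal, self-contained sketch of how a periodic-review (s, S) rule behaves under variable demand and imperfect supplier delivery. All parameters (weekly review, demand distribution, on-time probability, and the s and S values) are illustrative assumptions, not the thesis simulation model.

```python
# Minimal (s, S) periodic-review sketch for one supplied part; parameters are invented.
import random

random.seed(42)

s, S = 80, 200              # reorder point and order-up-to level
on_time_prob = 0.9          # chance a pending order arrives in a given week
weeks = 52
inventory, pipeline = S, 0
stockout_weeks = 0
inventory_trace = []

for week in range(weeks):
    # A pending order arrives this week with probability on_time_prob;
    # otherwise it slips to a later week (late supplier delivery).
    if pipeline and random.random() < on_time_prob:
        inventory += pipeline
        pipeline = 0

    demand = max(0, int(random.gauss(40, 12)))   # variable weekly demand
    shipped = min(inventory, demand)
    inventory -= shipped
    if shipped < demand:
        stockout_weeks += 1

    # Periodic review: when the inventory position falls below s, order up to S.
    position = inventory + pipeline
    if position < s:
        pipeline += S - position

    inventory_trace.append(inventory)

print("Average supplied-part inventory:", round(sum(inventory_trace) / weeks, 1))
print("Weeks with a stockout:", stockout_weeks)
```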
- Analysis of Worker Assignment Policies on Production Line Performance Utilizing a Multi-skilled Workforce. McDonald, Thomas N. (Virginia Tech, 2004-02-27). Lean production prescribes training workers on all tasks within the cell to adapt to changes in customer demand. Multi-skilling of workers can be achieved by cross-training. Cross-training can be improved and reinforced by implementing job rotation. Lean production also prescribes using job rotation to improve worker flexibility and worker satisfaction and to increase worker knowledge of how their work affects the rest of the cell. Currently, there is minimal research on how to assign multi-skilled workers to tasks within a lean production cell while considering multi-skilling and job rotation. In this research, a new mathematical model was developed that assigns workers to tasks, while ensuring job rotation, and determines the levels of skill, and thus training, necessary to meet customer demand, quality requirements, and training objectives. The model is solved using sequential goal programming to incorporate three objectives: overproduction, cost of poor quality, and cost of training. The results of the model include an assignment of workers to tasks, a determination of the training necessary for the workers, and a job rotation schedule. To evaluate the results on a cost basis, the costs associated with overproduction, defects, and training were used to calculate the net present cost for one year. The solutions from the model were further analyzed using a simulation model of the cell to determine the impact of job rotation and multi-skilling levels on production line performance. The measures of performance include average flowtime, work-in-process (WIP) level, and monthly shipments (number produced). Using the model, the impact of alternative levels of multi-skilling and job rotation on the performance of cellular manufacturing systems is investigated. Understanding the effect of multi-skilling and job rotation can aid both production managers and human resources managers in determining which workers need training and how often workers should be rotated to improve the performance of the cell. The lean production literature prescribes training workers on all tasks within a cell and developing a rotation schedule to reinforce the cross-training. Four levels of multi-skilling and three levels of job rotation frequency are evaluated for both a hypothetical cell and a case application in a relatively mature actual production cell. The results of this investigation provide insight on how multi-skilling and job rotation frequency influence production line performance and provide guidance on training policies. The results show that there is an interaction effect between multi-skilling and job rotation for flowtime and work-in-process in both the hypothetical cell and the case application, and for monthly shipments in the case application. Therefore, the effect of job rotation on performance measures is not the same at all levels of multi-skilling, indicating that inferences about the effect of changing multi-skilling, for example, should not be made without considering the job rotation level. The results also indicate that the net present cost is heavily influenced by the cost of poor quality. The results for the case application indicated that the maturity level of the cell influences the benefits derived from increased multi-skilling and affects several key characteristics of the cell. As a cell becomes more mature, it is expected that the quality levels increase and that the skill levels on tasks normally performed increase. Because workers in the case application already have a high skill level on some tasks, the return on training is not as significant. Additionally, the mature cell has relatively high quality levels from the beginning, and any improvements in quality would be in small increments rather than in large breakthroughs. The primary contribution of this research is the development of a sequential goal programming worker assignment model that addresses overproduction, poor quality, cross-training, and job rotation in order to meet the prescription in the lean production literature of only producing to customer demand while utilizing multi-skilled workers. Further contributions are the analysis of how multi-skilling level and job rotation frequency impact the performance of the cell. Lastly, a contribution is the application of optimization and simulation methods for comprehensively analyzing the impact of worker assignment on performance measures.
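The sequential (lexicographic) goal-programming idea behind the worker-assignment model can be illustrated on a toy problem: optimize the highest-priority goal first, then re-solve the next goal with the earlier optimum held as a constraint. The sketch below, with invented workers, tasks, and cost figures, uses a generic mixed-integer solver and is not the dissertation's model (which also handles skill levels, rotation schedules, and overproduction goals).

```python
# Toy lexicographic goal programming: minimize training cost first, then poor-quality cost.
# All workers, tasks, and costs are hypothetical.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

training_cost = np.array([[0.0, 4.0, 6.0],   # cost to train worker i for task j
                          [3.0, 0.0, 5.0],
                          [7.0, 2.0, 0.0]])
quality_cost = np.array([[2.0, 5.0, 4.0],    # expected poor-quality cost if i performs j
                         [6.0, 1.0, 3.0],
                         [4.0, 6.0, 2.0]])

n = 3
integrality = np.ones(n * n)                 # binary assignment variables x[i, j]
worker_rows = np.kron(np.eye(n), np.ones(n)) # each worker does exactly one task
task_cols = np.kron(np.ones(n), np.eye(n))   # each task gets exactly one worker
assign = LinearConstraint(np.vstack([worker_rows, task_cols]), 1, 1)

# Priority 1: minimize training cost.
stage1 = milp(c=training_cost.ravel(), constraints=[assign],
              integrality=integrality, bounds=Bounds(0, 1))

# Priority 2: hold the training optimum fixed, then minimize poor-quality cost.
hold = LinearConstraint(training_cost.reshape(1, -1), -np.inf, stage1.fun + 1e-6)
stage2 = milp(c=quality_cost.ravel(), constraints=[assign, hold],
              integrality=integrality, bounds=Bounds(0, 1))

x = np.round(stage2.x).reshape(n, n).astype(int)
print("Assignment matrix (rows = workers, cols = tasks):")
print(x)
print("Training cost:", training_cost.ravel() @ stage2.x,
      "| Poor-quality cost:", quality_cost.ravel() @ stage2.x)
```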
- Application of Systems Engineering Analysis Methods to Examine Engineering Transfer Student Persistence. Smith, Natasha Leigh (Virginia Tech, 2020-01-20). The demand for engineering graduates in the United States continues to grow, yet the number of students entering post-secondary education is declining, and graduation rates have seen little to no change over the last several decades. Engineering transfer students are a growing population and can help meet the nation's needs; however, there is little research on the persistence of this population after they transfer to the receiving institution. Student persistence is dependent on a complex set of interactions over time. Management systems engineering provides a framework for working with complex systems through system analysis and design, with a focus on the interactions of the system components. This research includes multiple management systems engineering analysis methods used to define and develop a systems view of engineering transfer student persistence. This work includes a comprehensive literature review to identify factors affecting engineering transfer student persistence, an empirical analysis of an institutional dataset, and development of a simulation model to demonstrate the throughput of engineering transfer students. Findings include 34 factors identified in the literature as affecting engineering student persistence. The review also highlighted two important gaps in the literature: a focus on post-transfer success almost exclusively in the first post-transfer year, and a significant interest in vertical transfer students with little consideration given to lateral transfer students. The empirical analysis addressed the gaps found in the literature. Vertical and lateral engineering transfer students were found to experience different levels of transfer shock, which also impacts their 4-year graduation rates. The analysis also found that transfer shock was not unique to the first post-transfer term; it was also present in the second and third post-transfer terms and was reframed as transfer adjustment. The simulation model uncovers leaving patterns of engineering transfer students, including students who leave engineering in the second year as well as those who graduate with an engineering degree in the third year. Overall, this research identifies explicit factors that affect engineering transfer student persistence and suggests a new systems engineering approach for understanding student persistence and how institutions can affect change.
- Benchmarking Performance Measurement and the Implementation of Lean Manufacturing in the Secondary Wood Processing Rough Mill. Cumbo, Dan; Kline, D. Earl; Van Aken, Eileen M.; Smith, Robert L. (Virginia Tech, 2004-09). It is hypothesized that, while other components of the secondary wood products value stream (e.g., moulding, turning, sanding) are being integrated and “leaned up,” so to speak, the rough mill represents a real or perceived barrier to full implementation of lean manufacturing tools, techniques, and concepts. This study investigated the implementation of lean manufacturing in the rough mill as well as performance measurement and metrics at both the rough mill and overall business levels. Data were collected from a nationwide survey of secondary wood processing facilities.
- Comparison of transfer shock and graduation rates across engineering transfer student populations. Smith, Natasha L.; Grohs, Jacob R.; Van Aken, Eileen M. (2021-10-20). Background: Increasing the persistence of engineering transfer students can help meet the US national priority of increasing the number of engineering graduates. Many transfer students experience a decrease in their grade point average (GPA) at their receiving institution, known as transfer shock, which can lead to them leaving the institution. This GPA decrease is found to be more prevalent in engineering transfer students. Purpose/Hypothesis: The purpose of this study is to analyze a single institutional dataset to determine when transfer shock occurs, how it differs among engineering transfer student subgroups, and if transfer shock is a predictor of graduation within 4 years in engineering. Design/Method: A 10-year dataset with 789 engineering transfer students was used in this study, and the engineering transfer students were split into four subgroups. Multiple statistical analyses were conducted, including Welch's F-test, chi-square, and logistic regression, to understand differences in transfer shock during the first three terms of enrollment as well as 4-year graduation rates among each subgroup. Results: Transfer shock extends through the first three post-transfer terms, resulting in transfer norming. The engineering transfer student subgroups experience different levels of transfer norming; however, the subgroups were not predictors of graduation. The predictors were the transfer GPA and the transfer norming in the first three post-transfer terms of enrollment. Conclusions: Engineering transfer students are not a homogeneous population; there are key differences between lateral and vertical transfer students. More strategic, longitudinal programming and decision-making should be considered by institutions.
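As a hedged illustration of the kind of logistic-regression step described in the Design/Method section above, the sketch below fits 4-year graduation on transfer GPA and a post-transfer GPA change ("transfer norming") using synthetic data; the variable names, coefficients, and distributions are invented and do not reflect the actual institutional dataset.

```python
# Synthetic-data logistic regression sketch: graduation ~ transfer GPA + transfer norming.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 789
transfer_gpa = rng.normal(3.2, 0.4, n).clip(2.0, 4.0)
norming = rng.normal(-0.2, 0.3, n)     # assumed GPA change over early post-transfer terms

# Assumed relationship: graduation odds rise with transfer GPA and less-negative norming.
logit_p = -6.0 + 1.6 * transfer_gpa + 2.0 * norming
graduated = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([transfer_gpa, norming]))
model = sm.Logit(graduated, X).fit(disp=False)
print(model.summary(xname=["const", "transfer_gpa", "transfer_norming"]))
```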
- Creating a Positive Departmental Climate at Virginia Tech: A Compendium of Successful Strategies. Finney, Jack W.; Finkielstein, Carla V.; Merola, Joseph S.; Puri, Ishwar; Taylor, G. Don; Van Aken, Eileen M.; Hyer, Patricia B.; Savelyeva, Tamara (Virginia Tech, 2008-05-05). “Creating a Positive Departmental Climate at Virginia Tech: A Compendium of Successful Strategies” was created as part of the AdvanceVT Departmental Climate Initiative (DCI). The Department Climate Committee collected policies and practices from a variety of sources to provide department chairs and heads with opportunities to learn about departmental issues at Virginia Tech, to understand more fully the ways in which these issues manifest themselves within departments, and to share both successful and unsuccessful strategies illustrative of the different approaches departments have taken towards promoting effective, efficient, and pleasant work environments.
- Critical Success Factors for Sustaining Kaizen Event Outcomes. Glover, Wiljeana Jackson (Virginia Tech, 2010-04-05). A Kaizen event is a focused and structured improvement project, using a dedicated cross-functional team to improve a targeted work area, with specific goals, in an accelerated timeframe. Kaizen events have been widely reported to produce positive change in business results and human resource outcomes. However, it can be difficult for many organizations to sustain or improve upon the results of a Kaizen event after it concludes. Furthermore, the sustainability of Kaizen event outcomes has received limited research attention to date. This research is based on a field study of 65 events across eight manufacturing organizations that used survey data collected at the time of the event and approximately nine to eighteen months after the event. The research model was developed from Kaizen event practitioner resources, Kaizen event literature, and related process improvement sustainability and organizational change literature. The model hypothesized that Kaizen Event Characteristics, Work Area Characteristics, and Post-Event Characteristics were related to Kaizen event Sustainability Outcomes. Furthermore, the model hypothesized that Post-Event Characteristics would mediate the relationship between Kaizen Event and Work Area Characteristics and the Sustainability Outcomes. The study hypotheses were analyzed through multiple regression models, and generalized estimating equations were used to account for potential nesting effects (events within organizations). The factors that were most strongly related to each Sustainability Outcome were identified. The Work Area Characteristics of learning and stewardship and of experimentation and continuous improvement, and the Post-Event Characteristics of performance review and accepting changes, were significant direct or indirect predictors of multiple Sustainability Outcomes, and these findings were generally supported by the literature. There were also some unanticipated findings, particularly regarding the modeling of the Sustainability Outcomes of result sustainability and goal sustainability, which appear to illustrate potential issues regarding how organizations define and track the performance of Kaizen events over time and present areas for future research. Overall, this study advances academic knowledge regarding Kaizen event outcome sustainability. The findings also present guidelines so that practitioners may better influence the longer-term impact of Kaizen events on their organizations. The research findings may also extend to other improvement activities, thus presenting additional areas for future work.
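The generalized-estimating-equations step mentioned above can be sketched as follows: a sustainability outcome is regressed on event-level predictors while an exchangeable correlation structure accounts for events nested within organizations. Variable names and data are fabricated for illustration and do not correspond to the study's measures or results.

```python
# GEE sketch with events nested within organizations (synthetic data, invented variables).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_orgs, events_per_org = 8, 8
org = np.repeat(np.arange(n_orgs), events_per_org)
org_effect = rng.normal(0, 0.3, n_orgs)[org]          # shared organization-level effect

goal_clarity = rng.normal(0, 1, org.size)
performance_review = rng.normal(0, 1, org.size)
sustainability = (0.4 * goal_clarity + 0.6 * performance_review
                  + org_effect + rng.normal(0, 0.5, org.size))

df = pd.DataFrame({"sustainability": sustainability, "goal_clarity": goal_clarity,
                   "performance_review": performance_review, "org": org})

model = sm.GEE.from_formula(
    "sustainability ~ goal_clarity + performance_review",
    groups="org", data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian())
print(model.fit().summary())
```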
- A Cross-Cultural Examination: Effects of Reward Systems and Cultures on Low Severity Risk-Taking Behavior in Construction. Thongsamak, Sasima (Virginia Tech, 2007-09-04). The overall research objective was to identify the effects of reward systems (rewards and a penalty) on risk-taking behavior and performance (quality and time) of construction workers from different cultures (American, Asian, and Latin American cultures). This research used the sociotechnical system as the underlying, guiding scientific framework. The research found that Americans and Latin Americans had higher risk-taking behavior than Asians (p<0.01). No difference in risk-taking behavior was found between Americans and Latin Americans at the 0.05 level. Although culture may influence individuals' risk-taking behavior, the results from this study showed that risk-taking behavior could be altered and suppressed by providing individuals with the proper safety training, education, and safety equipment. Customized safety training for people from different cultures would be useful because the cultural elements that contribute to high risk-taking behavior could be addressed. The results also showed that the effects of reward systems on risk-taking behavior were not statistically significant (p>0.1). One possible reason no difference was found is that the tasks used in this study did not give participants enough opportunity to take risks. The effects of reward systems on risk-taking behavior may have been reduced by this limited opportunity for risky behavior. It is suspected that if the tasks had contained more opportunities for participants to take risks, differences in risk-taking behavior would have been significant. The researcher concluded that risk perception is situation-specific and has an influence on the individual's risk-taking behavior in that particular situation but cannot be used to predict risk-taking behavior. Also, general locus of control and general self-efficacy cannot be used to predict risk-taking behaviors. These findings are consistent with many studies that explore locus of control (Iversen & Rundmo, 2002; Rolison & Scherman, 2002; Crisp & Barber, 1995) and with many researchers who suggest that self-efficacy is situation-specific (Murdock et al., 2005; Martin et al., 1995; Perraud, 2000; Slanger & Rudestam, 1997).
- Cyber-Physical Security for Advanced Manufacturing. Desmit, Zachary James (Virginia Tech, 2018-01-16). The increased growth of cyber-physical systems, controlling multiple production processes within the manufacturing industry, has led to an industry susceptible to cyber-physical attacks. Differing from traditional cyber-attacks in their ability to alter the physical world, cyber-physical attacks have been increasing in number since the early 2000s. To combat and ultimately prevent the malicious intent of such attacks, the field of cyber-physical security was launched. Cyber-physical security efforts can be seen across many industries that employ cyber-physical systems, but little work has been done to secure manufacturing systems. Through the completion of four research objectives, this work provides the foundation necessary to begin securing manufacturing systems from cyber-physical attacks. First, this work is motivated through the systematic review of literature surrounding the topic. This objective not only identifies and highlights the need for research efforts within the manufacturing industry, but also defines the research field. Second, a framework is developed to identify cyber-physical vulnerabilities within manufacturing systems. Third, the framework is further developed into a tool allowing manufacturers to more easily identify the vulnerabilities that exist within their manufacturing systems. This tool will allow a manufacturer to utilize the developed framework and begin the steps necessary to secure the manufacturing industry. Finally, game-theoretic models are applied to cyber-physical security in manufacturing to model the interactions between adversaries and defenders. The results of this work provide the manufacturing industry with the tools and motivation necessary to begin securing manufacturing facilities from malicious cyber-physical attacks and create a more resilient industry.
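As a purely illustrative example of the game-theoretic modeling mentioned above, the sketch below sets up a tiny attacker-defender game over hypothetical manufacturing strategies and searches for pure-strategy equilibria by checking mutual best responses. The strategy names and payoffs are invented assumptions, not the dissertation's models.

```python
# Toy attacker-defender game; payoffs and strategies are hypothetical.
import numpy as np

defender = ["monitor_network", "harden_controller"]
attacker = ["spoof_sensor", "inject_gcode"]

# Rows = defender strategy, columns = attacker strategy.
defender_payoff = np.array([[ 2, -4],
                            [-3,  3]])
attacker_payoff = np.array([[-2,  4],
                            [ 3, -3]])

equilibria = []
for i in range(len(defender)):
    for j in range(len(attacker)):
        # A cell is a pure Nash equilibrium if neither player can gain by deviating alone.
        best_def = defender_payoff[i, j] >= defender_payoff[:, j].max()
        best_att = attacker_payoff[i, j] >= attacker_payoff[i, :].max()
        if best_def and best_att:
            equilibria.append((defender[i], attacker[j]))

print("Pure-strategy equilibria:", equilibria or "none (mixed strategies needed)")
```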
- A Data Clustering Approach to Support Modular Product Family Design. Sahin, Asli (Virginia Tech, 2007-09-21). Product Platform Planning is an emerging philosophy that calls for the planned development of families of related products. It is markedly different from the traditional product development process and relatively new in engineering design. Product families and platforms can offer a multitude of benefits when applied successfully, such as economies of scale from producing larger volumes of the same modules, lower design costs from not having to redesign similar subsystems, and many other advantages arising from the sharing of modules. While advances in this area are promising, there still remain significant challenges in designing product families and platforms. This is particularly true for defining the platform components, platform architecture, and significantly different platform and product variants in a systematic manner. The lack of a precise definition for platform design assets in terms of relevant customer requirements, distinct differentiations, engineering functions, components, component interfaces, and the relations among them poses a major obstacle for companies seeking to take full advantage of the potential benefits of a product platform strategy. The main purpose of this research is to address the above-mentioned challenges during the design and development of modular platform-based product families. It focuses on providing answers to a fundamental question, namely, how can a decision support approach from product module definition to the determination of platform alternatives and product variants be integrated into product family design? The method presented in this work emphasizes the incorporation of critical design requirements and specifications for the design of distinctive product modules to create platform concepts and product variants using a data clustering approach. A case application developed in collaboration with a tire manufacturer is used to verify that this research approach is suitable for reducing the complexity of design results by determining design commonalities across multiple design characteristics. The method was found helpful for determining and integrating critical design information (i.e., component dimensions, material properties, modularization driving factors, and functional relations) systematically into the design of product families and platforms. It supported decision-makers in defining distinctive product modules within the families and in determining multiple platform concepts and derivative product variants.
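The clustering idea can be sketched as grouping candidate components by similarity across several design characteristics and reading each cluster as a candidate shared module. The component names, feature values, and three-cluster cut below are hypothetical, only loosely inspired by the tire application, and are not the dissertation's data.

```python
# Hierarchical clustering sketch: group components by multiple design characteristics.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

components = ["bead_A", "bead_B", "tread_std", "tread_winter", "sidewall_lo", "sidewall_hi"]
# Columns (all invented): normalized dimension, material stiffness, functional-coupling score.
features = np.array([
    [0.20, 0.80, 0.10],
    [0.22, 0.78, 0.12],
    [0.70, 0.30, 0.90],
    [0.75, 0.35, 0.85],
    [0.40, 0.55, 0.50],
    [0.45, 0.60, 0.48],
])

tree = linkage(pdist(features), method="ward")   # Ward linkage on Euclidean distances
labels = fcluster(tree, t=3, criterion="maxclust")

for cluster in sorted(set(labels)):
    members = [c for c, lab in zip(components, labels) if lab == cluster]
    print(f"Candidate module {cluster}: {members}")
```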
- A Decision Support System for Advanced Composites Manufacturing Cost Estimation. Eaglesham, Mark Alan (Virginia Tech, 1998-04-10). The increased use of advanced composites in aerospace manufacturing has led to the development of new production processes and technology. The implementation of advanced composites manufacturing technology is poorly served by traditional cost accounting methods, which distort costs by using inappropriate volume-based allocations of overhead. Activity-based costing has emerged as a methodology which provides more accurate allocation of costs to products or activities by their usage of company resources. Better designs may also be produced if designers could evaluate the cost implications of their choices early in the design process. This research describes a methodology whereby companies can improve product cost estimation at the conceptual design phase, using intelligent searching and arrangement of existing accounting data to enable designers to access the activity cost information more readily. The concept has considerable scope for application in industry because it will allow companies to make better use of information that is already being recorded in their information systems, by providing it in a form which will enable designers to make better informed decisions during the design process. The design decision support framework is illustrated by applying it to a typical problem in aerospace composites manufacturing. Feasibility of the approach is demonstrated using a prototype software model of the Design Decision Support System, implemented using commercially available software.
- The defence performance measurement framework: Measuring the performance of defence organisations at the strategic level. Soares, Joaquim; Letens, Geert; Vallet, Nathalie; Van Bockhaven, Wouter; Keathley-Herring, Heather; Van Aken, Eileen M. (2022). As the gap between strategic commitments and budgetary constraints continues to grow, defence organisations have introduced performance management initiatives to support decision-making and to improve governance. However, introducing managerial practices in public organisations, including defence, proves to be challenging. As performance management initiatives within defence suffer from an implementation gap, strategic benefits are not being harnessed. In our study, we first exploit the results of a Systematic Literature Review to better anchor the encountered challenges within the literature. We then apply thematic analysis to a unique dataset from twelve NATO countries to propose a new defence-specific performance management framework for the strategic level. As the new framework preserves the benefits of existing initiatives while mitigating most recorded challenges, it is proposed as a new guide for designing and assessing defence performance management efforts. Thereby, professionals and scholars are provided with a powerful instrument to address the implementation gap. Moreover, the theoretical and empirical lens adopted ensures alignment between performance management initiatives, defence policy, defence strategy, and strategic objectives. Notably, policy goals and strategic “ends” are clearly connected to critical processes and resources. Thereby, the new framework better supports discussions with key defence stakeholders pertaining to the gap between commitments and constraints.
- Design and Implementation Factors for Performance Measurement in Non-profit Organizations: A Literature Review. Treinta, Fernanda T.; Moura, Louisi Francis; Almeida Prado Cestari, Jose M.; de Lima, Edson Pinheiro; Deschamps, Fernando; Gouvea da Costa, Sergio Eduardo; Van Aken, Eileen M.; Munik, Juliano; Leite, Luciana R. (2020-08-07). Purpose: Performance measurement systems (PMSs) in non-profit organizations (NPOs) are more complex than in for-profit organizations. NPOs have an orientation toward social mission and values, and they consider not only organizational efficiency and viability, but also the social impact of the organization. This research provides a comprehensive synthesis of PMSs in NPOs. Design/Methodology/Approach: A literature review, supported by bibliometric and network analyses, is used to examine a set of 240 articles related to this research field. Topics that are most prevalent in this research area and their interrelationships are identified, presenting an outline of current efforts. Findings: Beyond the descriptive analyses of the paper set, a framework is proposed for organizing the design and implementation factors of PMSs in non-profit organizations, identifying the main requirements for their successful development. Originality/Value: Research on performance measurement in non-profit organizations is still in its early stages of development, with many opportunities to further develop the field. Conceptual frameworks and models, as well as specific theories, are being generated for this field of research, and the process of adapting models from the general field of performance measurement is taking place. The meta-framework that organizes the main research topics of PMSs in non-profit organizations and the framework that consolidates the factors influencing the design and implementation of PMSs in non-profit organizations represent this paper's contribution.
- Determinants of team effectiveness for cross-functional organizational design teams. Van Aken, Eileen M. (Virginia Tech, 1995-12-01). Recent research indicates that teams are an essential element of most leading organizations (Mohrman, Cohen, & Mohrman, 1995). With the proliferation of team use comes the need for research to better define the design and management requirements unique to specific types of teams. This research focused on cross-functional design teams tasked with the organizational redesign of sociotechnical work systems. A design team is a cross-functional, multi-level team with the responsibility to create and often implement a plan for work system redesign. The research objective was to develop a deeper understanding of the team characteristics (called design features) that were most related to team effectiveness. Team effectiveness was defined to include both team performance and team member satisfaction. Cross-functional design teams were studied across two large organizations, and key learnings were identified from a third large organization with substantial experience in team-based work redesign. Quantitative and qualitative data were collected from team members using survey questionnaires and interviews. The data analysis strategy included Within and Between Analysis (which uses analysis of variance, correlations, and analysis of covariance) and multiple regression techniques to identify design features most related to team effectiveness at the team level. Results indicated that team skills and clarity in team sponsor expectations were significantly related to team performance at the team level (r = 0.83, p < 0.005, and r = 0.89, p < 0.005, respectively).
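A toy version of the team-level analysis described above might look like the following: individual survey responses are aggregated to team means, and design features are then correlated with team performance. Team IDs, design-feature names, and values are fabricated for illustration and do not reproduce the dissertation's data or its Within and Between Analysis procedure.

```python
# Aggregate individual responses to the team level, then correlate design features
# with team performance. All data below are invented.
import pandas as pd

responses = pd.DataFrame({
    "team":            ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"],
    "team_skills":     [4.2, 4.0, 4.4, 3.1, 3.3, 4.8, 4.6, 4.7, 2.9, 3.2],
    "sponsor_clarity": [4.5, 4.3, 4.6, 3.0, 3.2, 4.9, 4.8, 4.7, 2.8, 3.0],
    "performance":     [4.4, 4.2, 4.5, 3.2, 3.1, 4.9, 4.8, 4.8, 3.0, 3.1],
})

team_level = responses.groupby("team").mean(numeric_only=True)
print(team_level.corr().round(2).loc[["team_skills", "sponsor_clarity"], ["performance"]])
```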
- Development of a Comprehensive Framework for the Efficiency Measurement of Road Maintenance Strategies using Data Envelopment Analysis. Ozbek, Mehmet Egemen (Virginia Tech, 2007-09-19). For the last two decades, the road maintenance concept has been gaining tremendous attention. This has brought about new institutional changes, predominant of which is the challenge for maintenance managers to achieve maximum performance from the existing road system. Such challenge makes it imperative to implement comprehensive systems that measure road maintenance performance. However, the road maintenance performance measurement systems developed and implemented by researchers and state departments of transportation (DOTs) mainly focus on the effectiveness measures, e.g., the level-of-service. Such measurement systems do not sufficiently elaborate on the efficiency concept, e.g., the amount of resources utilized to achieve such level-of-service. Not knowing how "efficient" state DOTs are in being "effective" can lead to excessive and unrealistic maintenance budget expectations. This issue indicates the need for a performance measurement approach that can take the efficiency concept into account. Another important concept that is not investigated in the current road maintenance performance measurement systems is the effect of the environmental factors (e.g., climate, location, etc.) and operational factors (e.g., traffic, load, design-construction adequacy, etc.) on the performance of the road maintenance process. This issue, again, indicates the need for a performance measurement approach that can take such external and uncontrollable factors into account. The purpose of this research is to develop and implement a comprehensive framework that can measure the relative efficiency of different road maintenance strategies given the (i) multiple inputs and outputs that characterize the road maintenance process and (ii) uncontrollable factors (e.g., climate, traffic, etc.) that affect the performance of such process. It is challenging to measure the overall efficiency of a process when that process has multiple inputs and outputs and is affected by multiple factors. To address this challenge, an innovative approach to efficiency measurement, Data Envelopment Analysis, is used in this research. It is believed that this research, by taking the efficiency concept into account, will significantly improve the ways that are currently used to model and measure the performance of road maintenance. The findings of this research will contribute new knowledge to the asset management field in the road maintenance domain by providing a framework that is able to differentiate effective and efficient maintenance strategies from effective and inefficient ones.
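For readers unfamiliar with DEA, the sketch below solves an input-oriented CCR envelopment model for a handful of made-up maintenance units: each unit's efficiency is the smallest factor by which its inputs could be scaled while a nonnegative combination of peer units still matches its outputs. The data and the choice of inputs and outputs are illustrative assumptions, not the dissertation's framework.

```python
# Input-oriented CCR DEA sketch using linear programming; all data are invented.
import numpy as np
from scipy.optimize import linprog

# Rows = maintenance units; inputs = (budget $M, crew-hours); output = level-of-service score.
inputs = np.array([[4.0, 300.0],
                   [6.0, 420.0],
                   [5.0, 500.0],
                   [3.5, 260.0]])
outputs = np.array([[82.0], [90.0], [78.0], [80.0]])
n, m = inputs.shape
s = outputs.shape[1]

for o in range(n):
    # Decision vector: [theta, lambda_1 .. lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-inputs[o].reshape(m, 1), inputs.T])
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro  (outputs at least matched)
    A_out = np.hstack([np.zeros((s, 1)), -outputs.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -outputs[o]],
                  bounds=[(0, None)] * (n + 1),
                  method="highs")
    print(f"Maintenance unit {o + 1}: efficiency = {res.x[0]:.3f}")
```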
- Display Technology and Ambient Illumination Influences on Visual Fatigue at VDT Workstations. Bangor, Aaron W. (Virginia Tech, 2000-12-19). The concept of "visual fatigue" has been studied for 70 years or more. In that time, no single metric for measuring visual fatigue nor one agreed-upon set of tasks to induce visual fatigue has been settled upon. Not even a robust definition of visual fatigue has been established. This research worked to solve some of those problems. This research first set out to develop an index of visual fatigue that could be used effectively in quantifying the subjective experience of visual fatigue. It then sought to create a set of measurable tasks, representative of office work, that would induce visual fatigue. Using these two developments, an experiment with human participants was conducted to validate them and to work toward resolving two issues in the visual fatigue field: how visual display technology and ambient illumination affect the onset of visual fatigue. A 4x4 within-subjects design was developed and executed to study how these two independent variables affected ratings of visual fatigue, performance on the task battery, subjective image quality judgments, and contrast sensitivity shifts. Two cathode ray tube (CRT) and two active-matrix LCD (AMLCD) monitors were used in this study. While the monitors caused significant differences in reports of visual fatigue, performance, subjective image quality, and contrast sensitivity loss in many instances, only a slight effect of display technology was found. Four of eleven visual fatigue and two of eight subjective image quality dimensions showed that the LCD monitors induced more visual fatigue and were rated poorer than the CRT monitors. Ambient illumination levels of 0, 300, 600, and 1200 lux affected all four groups of dependent variables. On the whole, lighting caused visual fatigue, with "watery eyes" and "glare from lights" being adversely affected by brighter lighting. The 0 and 1200 lux levels were associated with the worst performance, while 300 lux was associated with the best performance. Subjective image quality was affected by lighting, with increasing lighting causing bothersome screen reflections and more temporal (e.g., flicker and jitter) distortions; 600 lux induced more reports of image sizing anomalies. Finally, it caused significantly worse shifts at the 6.0 c/deg spatial frequency on the contrast sensitivity test. The data show that lighting of 300 lux is the best of these four illumination levels. The results of this study contribute to the body of research in the areas of display technology and ambient illumination, and several developments from this research are offered to the research community: a complete survey metric of visual fatigue, a standardized battery of tasks for studying visual fatigue and image quality, and a comprehensive subjective image quality survey.
- The Effects of Business Process Management Cognitive Resources and User Cognitive Differences on Outcomes of User Comprehension. Swan, Bret R. (Virginia Tech, 2007-03-26). There is a growing need to study factors that affect user comprehension of Business Process Management (BPM) information portrayed by graphical process models (GPMs). For example, deployment of BPM Systems, unique types of enterprise-level information systems, has dramatically increased in recent years. This increase is primarily because BPM Systems give a variety of managers across an enterprise the ability to directly design, configure, enact, monitor, diagnose, and control business processes that other types of enterprise systems do not. This is possible because BPM Systems uniquely rely on GPMs derived from formal graph theory. Besides controlling the business processes, these GPMs, such as metagraphs and Unified Modeling Language (UML) diagrams, portray business process information (BPI) and prompt BPM managers to apply their training and expertise to deal with BPM situations. As a result, GPMs are the primary information artifacts for decision-making and communication among different, often geographically dispersed stakeholders. Therefore, user comprehension of these unique GPMs is critical to the efficient and effective development, deployment, and utilization of BPM Systems. User comprehension outcomes are jointly affected by (1) the BPM cognitive resources available to each manager (including the type of GPM, BPI, and user educational training and experience), and (2) cognitive differences between individual BPM managers (such as their mental workload, cognitive styles, and cognitive abilities). Although research has studied GPMs in various contexts, there is apparently no empirical research investigating GPM user comprehension in the context of BPM Systems. This research makes an important contribution by addressing this gap in the literature. Statement of the Objective: The purpose of this research is to empirically study how BPM cognitive resources and cognitive differences between individuals affect outcomes of GPM user comprehension. This research centered on the following objectives: A. Investigate whether more positive user comprehension outcomes are produced by novice users if a single GPM technique is used to portray different types of BPI (e.g., as with metagraphs) or if different GPM techniques are used to portray different types of BPI (e.g., as with UML diagrams). B. Investigate whether one type of BPI is more easily comprehended and interpreted by novice users irrespective of the type of GPM or the type of educational training of the user. C. Investigate whether users with a specific type of user educational training can more easily comprehend and interpret BPM information irrespective of the type of GPM or the type of BPI. D. Evaluate influences of individual cognitive differences (i.e., mental workload, cognitive styles, and cognitive abilities) on outcomes of user comprehension. In order to accomplish these objectives, this study: (a) defined a theoretical framework conceptualizing user comprehension outcomes in terms of the interaction between cognitive resources external to the user and individual differences affecting how users cognitively process BPI, (b) empirically tested an operational research model of GPM user comprehension that is based on the theoretical framework, and (c) interpreted the experimental results in the context of related literatures. Description of Research Methods: This study empirically tested relationships between several variables representing BPM cognitive resources and individual cognitive differences hypothesized as influencing the outcomes of user comprehension. A laboratory experiment, involving 87 upper-level undergraduate students from two universities, analyzed relationships between participant comprehension of two types of GPMs (i.e., metagraphs and UML diagrams) used to portray three types of BPI (i.e., task-centric, resource-centric, and information-centric BPI) by novice GPM users possessing different educational training (i.e., industrial engineering, business management, and computer science training). Dependent variables included assessments of task accuracy, task timeliness, subjective mental workload, and self-efficacy. Covariate effects were also analyzed for two types of participant cognitive abilities (i.e., general cognitive ability (GCA) and attentional abilities) and two types of participant cognitive styles (extroversion-introversion and sensing-intuitive). Multivariate analysis techniques were used to analyze and interpret the data. Discussion of Results: The type of GPM and participants' GCA produced significant effects on the dependent variables in this study. For example, metagraph users produced significantly more desirable results than UML users across all dependent variables, contrary to what was hypothesized. However, if only the BPM cognitive resources (i.e., GPM Type, BPI Type, and the Type of Participant Education) were studied in relation to user comprehension outcomes, spurious conclusions would have been reached. When individual cognitive differences were included in the research model and analyses, results showed participants with higher GCA produced significantly more positive user comprehension outcomes compared to participants with lower GCAs. Also, many of the impacts of differences in the types of BPI and the types of UET were moderated by the differences in participants' GCA and attentional abilities. In addition, the relationship between subjective mental workload and task performance (i.e., accuracy and timeliness) suggests a possible GPM cognitive "profile" for user comprehension tasks in a BPM Systems context. These results have important implications for future research and practice in several bodies of knowledge, including GPM user comprehension in management systems engineering, BPM modeling, BPM Systems, HCI, and cognitive ergonomics literature.
- An Empirical Analysis of Rating Effectiveness for a State Quality Award. Sienknecht, Ronald Theodore Jr. (Virginia Tech, 1999-06-28). This research clarified existing inconsistencies in self-assessment literature, and added to the body of knowledge for rating effectiveness of organizational assessments by defining relationships among rating effectiveness criteria (ratee, rater, rating scale, rating process) and measures (interrater reliability, halo error, leniency and severity, range restriction) based on an extensive literature review. A research framework was developed from this review, and was employed in computing rating effectiveness measures at the individual (i.e., examiner or eight rating scale dimensions) and sector (e.g., Private Manufacturing Sector, Private Service Sector, Public Local Sector, Public State & Federal Sector) levels for a State Quality Award (SQA) using data from the 1998 applications. Interrater reliability (measured by intraclass correlations for each rating scale dimension) was low to moderate, and differed by dimension. Halo error (measured by the determinant of the dimension intercorrelation matrices for each examiner) was present for all examiners. Leniency and severity (measured by the presence of a statistically significant Rater main effect for each dimension) was present for 11 of 32 cases, and differed by dimension. Range restriction (measured by variance analysis for each dimension) was present for 22 of 32 cases, and differed by dimension. A post-hoc principal component analysis indicated poor internal reliability for the rating scale. To improve, the SQA should replace the existing rating scale and provide in-depth training on all elements of the rating process. The importance of the SQA using boxplots, histograms, and rating effectiveness measures to make fully informed decisions was discussed.
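The halo-error indicator described above (the determinant of an examiner's dimension intercorrelation matrix) is straightforward to illustrate: determinants near zero suggest that the eight dimension ratings move together. The synthetic ratings below contrast a halo-prone examiner with a more discriminating one; they are invented and are not the 1998 SQA data.

```python
# Halo-error illustration: determinant of the dimension intercorrelation matrix
# for two synthetic examiners.
import numpy as np

rng = np.random.default_rng(7)
n_applicants, n_dimensions = 12, 8

# A "halo-prone" examiner: one overall impression drives all eight dimension scores.
impression = rng.normal(0, 1, (n_applicants, 1))
halo_ratings = impression + rng.normal(0, 0.2, (n_applicants, n_dimensions))

# A more discriminating examiner: dimensions rated largely independently.
independent_ratings = rng.normal(0, 1, (n_applicants, n_dimensions))

for label, ratings in [("halo-prone", halo_ratings), ("discriminating", independent_ratings)]:
    det = np.linalg.det(np.corrcoef(ratings, rowvar=False))
    print(f"{label:>15} examiner: det(intercorrelation matrix) = {det:.4f}")
```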