Browsing by Author "Cheng, Hao"
Now showing 1 - 5 of 5
- ASAS-NANP SYMPOSIUM: prospects for interactive and dynamic graphics in the era of data-rich animal science
  Morota, Gota; Cheng, Hao; Cook, Dianne; Tanaka, Emi (2021-02)
  Statistical graphics and data visualization play an essential but under-utilized role in data analysis in animal science, and in visually illustrating the concepts, ideas, and outputs of research and curricula. The recent rise of web technologies and the ubiquitous availability of web browsers make it easier to share interactive and dynamic graphics. Interactivity and dynamic feedback enhance human-computer interaction and data exploration. Web applications such as decision support systems, coupled with multimedia tools, synergize with interactive and dynamic graphics. However, the importance of graphics for effectively communicating data and understanding data uncertainty, and the state of the field of interactive and dynamic graphics, are underappreciated in animal science. To address this gap, we describe the current state of graphical methodology and technology that might be more broadly adopted, including a conceptual framework for effective graphics construction. The ideas and technology are illustrated using publicly available animal datasets. We foresee that the many new types of big and complex data generated in precision livestock farming create exciting opportunities for applying interactive and dynamic graphics to improve data analysis and support data-driven decisions.
- Bond-Slip Monitoring of Concrete Structures Using Smart Sensors—A Review
  Huo, Linsheng; Cheng, Hao; Kong, Qingzhao; Chen, Xuemin (MDPI, 2019-03-11)
  Concrete structures with various reinforcements, such as steel bars, composite material tendons, and recently steel plates, are commonly used in civil infrastructure. When an external force overcomes the strength of the bond between the reinforcement and the concrete, bond-slip occurs, resulting in a relative displacement between the reinforcing materials and the concrete. Monitoring bond health plays an important role in guaranteeing structural safety. Recently, researchers have recognized the importance of bond-slip monitoring and performed many related investigations. In this paper, a state-of-the-art review of various smart sensors based on the piezoelectric effect and fiber-optic technology, as well as the corresponding techniques for bond-slip monitoring, is presented. Since piezoelectric sensors and fiber-optic sensors are widely used in bond-slip monitoring, their principles and relevant monitoring methods are also introduced. In particular, the piezoelectric-based bond-slip monitoring methods, including the active sensing method, the electro-mechanical impedance (EMI) method, and passive sensing using the acoustic emission (AE) method, and the fiber-optic-based bond-slip detection approaches, including fiber Bragg grating (FBG) and distributed fiber-optic sensing, are highlighted. This paper provides guidance for practical applications and future development of bond-slip monitoring.
- A Multiple-Trait Bayesian Variable Selection Regression Method for Integrating Phenotypic Causal Networks in Genome-Wide Association Studies
  Wang, Zigui; Chapman, Deborah; Morota, Gota; Cheng, Hao (Genetics Society of America, 2020-12-01)
  Bayesian regression methods that incorporate different mixture priors for marker effects are used in multi-trait genomic prediction. These methods can also be extended to genome-wide association studies (GWAS). In multiple-trait GWAS, incorporating the underlying causal structures among traits is essential for comprehensively understanding the relationship between genotypes and traits of interest. Therefore, we develop a GWAS methodology, SEM-Bayesian alphabet, which, by applying the structural equation model (SEM), can be used to incorporate causal structures into multi-trait Bayesian regression methods. SEM-Bayesian alphabet provides a more comprehensive understanding of the genotype-phenotype mapping than standard multi-trait GWAS by performing GWAS based on indirect, direct, and overall marker effects. The superior performance of SEM-Bayesian alphabet was demonstrated by comparing its GWAS results with those of similar multi-trait GWAS methods on real and simulated data. The software tool JWAS offers open-source routines to perform these analyses.
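  The relationship among direct, indirect, and overall marker effects under an SEM can be illustrated with a minimal numerical sketch. This is not the SEM-Bayesian alphabet implementation (which lives in JWAS); it is a two-trait toy example with assumed values, using the standard SEM identity that overall effects equal (I − Λ)⁻¹ times the direct effects, where Λ holds the causal structural coefficients among traits.

  ```python
  import numpy as np

  # Hypothetical 2-trait example. Lambda[i, j] = assumed causal effect of
  # trait j on trait i; here trait 1 causally affects trait 2 with
  # coefficient 0.4, and there is no feedback.
  Lambda = np.array([[0.0, 0.0],
                     [0.4, 0.0]])

  # Assumed direct effects of a single marker on each trait.
  b_direct = np.array([0.5, 0.2])

  # SEM: y = Lambda @ y + X b + e  =>  overall effect = (I - Lambda)^{-1} b.
  b_overall = np.linalg.solve(np.eye(2) - Lambda, b_direct)
  # Effects mediated through the phenotypic causal network.
  b_indirect = b_overall - b_direct

  print(b_overall)   # [0.5 0.4]
  print(b_indirect)  # [0.  0.2]
  ```

  Here the marker's overall effect on trait 2 (0.4) combines its direct effect (0.2) with an indirect contribution (0.4 × 0.5 = 0.2) transmitted from trait 1 through the causal network, which is the decomposition a multi-trait SEM-based GWAS reports.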
- Resolving atomic-scale phase transformation and oxygen loss mechanism in ultrahigh-nickel layered cathodes for cobalt-free lithium-ion batteries
  Wang, Chunyang; Han, Lili; Zhang, Rui; Cheng, Hao; Mu, Linqin; Kisslinger, Kim; Zou, Peichao; Ren, Yang; Cao, Penghui; Lin, Feng; Xin, Huolin L. (2021-06-02)
  Doped LiNiO2 has recently become one of the most promising cathode materials for its high specific energy, long cycle life, and reduced cobalt content. Despite this, the degradation mechanism of LiNiO2 and its derivatives remains elusive. Here, by combining in situ electron microscopy and first-principles calculations, we elucidate the atomic-level chemomechanical degradation pathway of LiNiO2-derived cathodes. We uncover that the O1 phase formed at high voltages acts as a preferential site for rock-salt transformation via a two-step pathway involving cation mixing and shear along (003) planes. Moreover, electron tomography reveals that planar cracks, nucleated simultaneously from the particle interior and surface, propagate along the [100] direction on (003) planes, accompanied by concurrent structural degradation in a discrete manner. Our results provide an in-depth understanding of the degradation mechanism of LiNiO2-derived cathodes and point out that suppressing the O1 phase and oxygen loss is key to stabilizing LiNiO2 for next-generation high-energy cathode materials.
- VTag: a semi-supervised pipeline for tracking pig activity with a single top-view camera
  Chen, Chun-Peng J.; Morota, Gota; Lee, Kiho; Zhang, Zhiwu; Cheng, Hao (Oxford University Press, 2022-06)
  Precision livestock farming has become an important research focus with the rising demand for meat production in the swine industry. Currently, farming practice is widely supported by computer vision (CV) technology, which automates the monitoring of pig activity solely from video recordings. Automation is achieved by deriving imagery features that guide CV systems to recognize animals' body contours, positions, and behavioral categories. Nevertheless, the performance of CV systems is sensitive to the quality of these features: when a CV system is deployed in a variable environment, its performance may decrease because the features do not generalize well across different illumination conditions. Moreover, most CV systems are built with supervised learning, which requires intensive labeling of ground truths for the training process. Hence, a semi-supervised pipeline, VTag, is developed in this study. The pipeline focuses on long-term tracking of pig activity without requiring any pre-labeled video; only a small amount of human supervision is needed to build a CV system. The pipeline can be deployed rapidly, as only one top-view RGB camera is needed for the tracking task. Additionally, the pipeline is released as a software tool with a friendly graphical interface for general users. Across the presented datasets, the average tracking error was 17.99 cm. With the prediction results, pig moving distance per unit time can be estimated for activity studies, and, as motion is monitored, a heat map of the spatial hot spots visited by the pigs can serve as useful guidance for farming management. The presented pipeline saves massive laborious work in preparing training datasets, and its rapid deployment paves the way for pig behavior monitoring.
  Lay Summary: Collecting detailed measurements of animals through cameras has become an important focus with the rising demand for meat production in the swine industry. Currently, researchers use computational approaches to train models to recognize pig morphological features and monitor pig behaviors automatically. Though little human effort is needed after model training, current solutions require a large amount of pre-selected images for the training process, and this expensive preparation makes such practice difficult for many farms to implement. Hence, a pipeline, VTag, is presented to address these challenges in our study. With little supervision, VTag can automatically track the positions of multiple pigs from a single top-view RGB camera, and no pre-labeled images are required to establish a robust pig tracking system. Additionally, the pipeline is released as a software tool with a friendly graphical user interface that is easy for general users to learn. Across the presented datasets, the average tracking error is 17.99 cm, which is shorter than one-third of the pig body length in the study. The estimated pig activity from VTag can serve as useful farming guidance. The presented strategy saves massive laborious work in preparing labeled training datasets and setting up monitoring environments, and the rapid deployment of the tracking system paves the way for pig behavior monitoring.
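  The activity measures the abstract mentions, moving distance per unit time and a spatial heat map, can be derived from tracked positions with straightforward arithmetic. The sketch below is not VTag's code; it is a minimal illustration with made-up coordinates, assuming the tracker outputs one (x, y) centroid per frame in centimeters and a known frame rate.

  ```python
  import numpy as np

  # Hypothetical tracker output: one pig centroid (x, y) per frame, in cm.
  track = np.array([[0.0, 0.0],
                    [3.0, 4.0],
                    [3.0, 4.0],   # pig stands still for one frame
                    [6.0, 8.0]])
  fps = 2.0  # assumed camera frame rate (frames per second)

  # Per-frame displacement: Euclidean distance between consecutive centroids.
  steps = np.linalg.norm(np.diff(track, axis=0), axis=1)
  total_distance = steps.sum()              # total path length in cm
  duration = (len(track) - 1) / fps         # elapsed time in seconds
  speed = total_distance / duration         # moving distance per unit time

  # Spatial occupancy for a heat map: 2-D histogram of visited positions.
  heatmap, _, _ = np.histogram2d(track[:, 0], track[:, 1], bins=4)

  print(total_distance)  # 10.0
  ```

  Each histogram cell counts how often the pig was observed there, so high-count cells correspond to the spatial hot spots the abstract describes as guidance for farming management.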