Center for Advanced Innovation in Agriculture
Browsing Center for Advanced Innovation in Agriculture by Author "Cheng, Hao"
Now showing 1 - 2 of 2
- ASAS-NANP SYMPOSIUM: prospects for interactive and dynamic graphics in the era of data-rich animal science. Morota, Gota; Cheng, Hao; Cook, Dianne; Tanaka, Emi (2021-02)
  Statistical graphics and data visualization play an essential but underutilized role in data analysis in animal science, as well as in visually illustrating the concepts, ideas, and outputs of research and curricula. The recent rise of web technologies and the ubiquitous availability of web browsers make it easier to share interactive and dynamic graphics. Interactivity and dynamic feedback enhance human-computer interaction and data exploration, and web applications such as decision support systems, coupled with multimedia tools, synergize with interactive and dynamic graphics. However, the importance of graphics for effectively communicating data and understanding data uncertainty, as well as the state of the field of interactive and dynamic graphics, is underappreciated in animal science. To address this gap, we describe the current state of graphical methodology and technology that might be more broadly adopted, including a conceptual framework for constructing effective graphics. The ideas and technology are illustrated using publicly available animal datasets. We foresee that the many new types of big and complex data generated in precision livestock farming create exciting opportunities for applying interactive and dynamic graphics to improve data analysis and make data-supported decisions (a minimal interactive-graphics sketch follows this list).
- VTag: a semi-supervised pipeline for tracking pig activity with a single top-view camera. Chen, Chun-Peng J.; Morota, Gota; Lee, Kiho; Zhang, Zhiwu; Cheng, Hao (Oxford University Press, 2022-06)
  Precision livestock farming has become an important research focus with the rising demand for meat production in the swine industry. Currently, farm monitoring is widely conducted with computer vision (CV), which automates the monitoring of pig activity solely from video recordings. Automation is achieved by deriving imagery features that guide CV systems to recognize animals' body contours, positions, and behavioral categories. Nevertheless, the performance of CV systems is sensitive to the quality of these imagery features: when a CV system is deployed in a variable environment, its performance may decrease because the features do not generalize well across different illumination conditions. Moreover, most CV systems are built with supervised learning, which requires intensive effort to label ground truths for training. Hence, a semi-supervised pipeline, VTag, was developed in this study. The pipeline focuses on long-term tracking of pig activity and requires no pre-labeled video, only a small amount of human supervision, to build a CV system. The pipeline can be deployed rapidly because only one top-view RGB camera is needed for the tracking task. Additionally, the pipeline was released as a software tool with a user-friendly graphical interface for general users. Across the presented datasets, the average tracking error was 17.99 cm. From the predicted positions, pig moving distance per unit time can be estimated for activity studies, and a heat map of the spatial hot spots visited by the pigs can serve as useful guidance for farm management. The presented pipeline saves a massive amount of laborious work in preparing training datasets, and its rapid deployment paves the way for pig behavior monitoring (a minimal analysis sketch follows this list).
  Lay Summary: Collecting detailed measurements of animals through cameras has become an important focus with the rising demand for meat production in the swine industry. Currently, researchers use computational approaches to train models that recognize pig morphological features and monitor pig behaviors automatically. Although little human effort is needed after model training, current solutions require a large amount of pre-selected images for training, and this expensive preparation work makes it difficult for many farms to implement such practices. Hence, our study presents a pipeline, VTag, to address these challenges. With little supervision, VTag can automatically track the positions of multiple pigs from a single top-view RGB camera. No pre-labeled images are required to establish a robust pig tracking system. Additionally, the pipeline was released as a software tool with a user-friendly graphical interface that is easy for general users to learn. Across the presented datasets, the average tracking error is 17.99 cm, which is shorter than one-third of the pig body length in the study. The estimated pig activity from VTag can serve as useful farming guidance. The presented strategy saves a massive amount of laborious work in preparing training datasets and setting up monitoring environments, and the rapid deployment of the tracking system paves the way for pig behavior monitoring.
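To make the kind of interactive, browser-shareable graphic described in the ASAS-NANP symposium abstract more concrete, here is a minimal sketch. It is not the authors' code and may use different tools than their illustrations; the synthetic dataset, column names, and output file name below are hypothetical, and Plotly is used only as one example of a library that exports standalone interactive HTML.

```python
# Minimal sketch of an interactive, shareable graphic of the kind described
# in the symposium abstract. NOT the authors' code; the dataset is synthetic
# and the column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import plotly.express as px

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "animal_id": np.arange(n),
    "body_weight_kg": rng.normal(95, 10, n),        # hypothetical trait
    "avg_daily_gain_kg": rng.normal(0.9, 0.15, n),  # hypothetical trait
    "breed": rng.choice(["A", "B", "C"], n),
})

# Hovering over points reveals the underlying record; color encodes breed.
fig = px.scatter(
    df,
    x="body_weight_kg",
    y="avg_daily_gain_kg",
    color="breed",
    hover_data=["animal_id"],
    title="Interactive scatter plot (synthetic example)",
)
# Writing a standalone HTML file lets the graphic be shared in any web browser.
fig.write_html("interactive_scatter.html")
```

Exporting to HTML is one simple way to achieve the easy sharing via web browsers that the abstract emphasizes; dashboard frameworks would be needed for the decision-support applications it mentions.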
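The VTag abstract mentions estimating moving distance per unit time and building a heat map of spatial hot spots from tracked positions. The sketch below illustrates that downstream analysis under stated assumptions; it is not VTag's implementation, and the synthetic track, frame rate, pen dimensions, and bin count are hypothetical.

```python
# Minimal sketch of the downstream analysis the VTag abstract describes:
# moving distance per unit time and an occupancy heat map from tracked
# positions. NOT VTag's code; the track is synthetic and the frame rate,
# pen size, and bin count are hypothetical assumptions.
import numpy as np

fps = 30  # assumed camera frame rate
rng = np.random.default_rng(1)
# Synthetic track: (n_frames, 2) array of x, y positions in centimeters.
track_cm = np.cumsum(rng.normal(0, 2, size=(30 * 60, 2)), axis=0) + 250

# Per-frame displacement, summed and scaled to distance per minute.
step_cm = np.linalg.norm(np.diff(track_cm, axis=0), axis=1)
distance_per_min = step_cm.sum() / (len(step_cm) / fps) * 60
print(f"Estimated moving distance: {distance_per_min:.1f} cm/min")

# Occupancy heat map: 2D histogram of visited positions over a 5 m x 5 m pen.
heatmap, x_edges, y_edges = np.histogram2d(
    track_cm[:, 0], track_cm[:, 1],
    bins=25, range=[[0, 500], [0, 500]],
)
# Cells with the highest counts are the spatial hot spots visited by the pig.
hot_spot = np.unravel_index(np.argmax(heatmap), heatmap.shape)
print("Hottest cell (x-bin, y-bin):", hot_spot)
```

In practice the histogram would be rendered as an image (for example with matplotlib) to produce the hot-spot map the abstract describes as guidance for farm management.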