Browsing by Author "Yao, Yaxing"
Now showing 1 - 13 of 13
- Augmented Reality’s Potential for Identifying and Mitigating Home Privacy Leaks. Cruz, Stefany; Danek, Logan; Liu, Shinan; Kraemer, Christopher; Wang, Zixin; Feamster, Nick; Huang, Danny Yuxing; Yao, Yaxing; Hester, Josiah (Internet Society, 2023). Users face various privacy risks in smart homes, yet they have limited ways to learn the details of such risks, such as the data practices of smart home devices and their data flows. In this paper, we present Privacy Plumber, a system that enables a user to inspect and explore the privacy “leaks” in their home using an augmented reality tool. Privacy Plumber allows the user to learn and understand the volume of data leaving the home and how that data may affect their privacy, in the same physical context as the devices in question, because the privacy leaks are visualized with augmented reality. Privacy Plumber uses ARP spoofing to gather aggregate network traffic information and presents it through an overlay on top of the device in a smartphone app. This increased transparency aims to help the user make privacy decisions and mend potential privacy leaks, for example by instructing Privacy Plumber on which devices to block and on what schedule (e.g., turning off Alexa while sleeping). Our initial user study with six participants demonstrates participants’ increased awareness of privacy leaks from smart devices, which further informs their privacy decisions (e.g., which devices to block).
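The aggregation step that the abstract describes (summing per-device traffic volume once ARP spoofing has redirected LAN packets through the monitoring host) can be sketched in a few lines. This is a minimal illustration only; the function name and the (MAC address, byte count) record format are assumptions, not the paper's actual implementation.

```python
from collections import defaultdict

def aggregate_traffic(packet_records):
    """Sum observed bytes per device.

    packet_records: iterable of (device_mac, byte_count) tuples, e.g.
    captured after ARP spoofing redirects LAN traffic through the
    monitoring host. Record format is an illustrative assumption.
    """
    totals = defaultdict(int)
    for mac, size in packet_records:
        totals[mac] += size
    return dict(totals)

# Example: three packets from two hypothetical devices
volumes = aggregate_traffic([
    ("aa:bb:cc:00:00:01", 1500),
    ("aa:bb:cc:00:00:02", 400),
    ("aa:bb:cc:00:00:01", 600),
])
```

A real deployment would feed this from a live packet capture and pass the per-device totals to the AR overlay for display.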
- DeFi Auditing: Mechanisms, Effectiveness, and User Perceptions. Ding, Feng; Rupert, Hitsch; Qin, Kaihua; Gervais, Arthur; Wattenhofer, Roger; Yao, Yaxing; Wang, Ye (Springer Nature, 2023-03-01).
- A Diary Study in Social Virtual Reality: Impact of Avatars with Disability Signifiers on the Social Experiences of People with Disabilities. Zhang, Kexin; Deldari, Elmira; Yao, Yaxing; Zhao, Yuhang (ACM, 2023-10-22). People with disabilities (PWD) have shown a growing presence in the emerging social virtual reality (VR). To support disability representation, some social VR platforms have started to include disability features in avatar design. However, it is unclear how disability disclosure via avatars (and the way it is presented) would affect PWD’s social experiences and interaction dynamics with others. To fill this gap, we conducted a diary study with 10 PWD who freely explored VRChat, a popular commercial social VR platform, for two weeks, comparing their experiences between using regular avatars and avatars with disability signifiers (i.e., avatar features that indicate the user’s disability in real life). We found that PWD preferred using avatars with disability signifiers and wanted to further enhance their aesthetics and interactivity. However, such avatars also caused embodied, explicit harassment targeting PWD. We revealed the unique factors that led to such harassment and derived design implications and protection mechanisms to inspire safer and more inclusive social VR.
- An Empathy-Based Sandbox Approach to Bridge the Privacy Gap among Attitudes, Goals, Knowledge, and Behaviors. Chen, Chaoran; Li, Weijun; Song, Wenxin; Ye, Yanfang; Yao, Yaxing; Li, Toby (ACM, 2024-05-11). Managing privacy to reach privacy goals is challenging, as evidenced by the privacy attitude-behavior gap. Mitigating this discrepancy requires solutions that account for both system opaqueness and users’ hesitation to test different privacy settings due to fears of unintended data exposure. We introduce an empathy-based approach that allows users to experience how privacy attributes may alter system outcomes in a risk-free sandbox environment, from the perspective of artificially generated personas. To generate realistic personas, we introduce a novel pipeline that augments the outputs of large language models (e.g., GPT-4) using few-shot learning, contextualization, and chain-of-thought prompting. Our empirical studies demonstrated the adequate quality of generated personas and highlighted the changes in privacy-related applications (e.g., online advertising) caused by different personas. Furthermore, users demonstrated cognitive and emotional empathy towards the personas when interacting with our sandbox. We offer design implications for downstream applications that improve user privacy literacy.
- Exploring Tenants’ Preferences of Privacy Negotiation in Airbnb. Yao, Yaxing; Wang, Zixin; Huang, Danny (2023-08-11).
- From Awareness to Action: Exploring End-User Empowerment Interventions for Dark Patterns in UX. Lu, Yuwen; Zhang, Chao; Yang, Yuewen; Yao, Yaxing; Li, Toby (ACM, 2024-04-23). The study of UX dark patterns, i.e., UI designs that seek to manipulate user behaviors, often for the benefit of online services, has drawn significant attention in the CHI and CSCW communities in recent years. To complement previous studies that address dark patterns from (1) the designer’s perspective, through education and advocacy for ethical designs, and (2) the policymaker’s perspective, through new regulations, we propose an end-user-empowerment intervention approach that helps users (1) raise their awareness of dark patterns and understand the underlying design intents, and (2) take action to counter the effects of dark patterns using a web augmentation approach. Through a two-phase co-design study, comprising 5 co-design workshops (N=12) and a 2-week technology probe study (N=15), we report findings on users’ needs, preferences, and challenges in handling dark patterns, and we investigate their feedback and reactions when their awareness of, and ability to act on, dark patterns is supported in a realistic in-situ setting.
- “If sighted people know, I should be able to know:” Privacy Perceptions of Bystanders with Visual Impairments around Camera-based Technology. Zhao, Yuhang; Yao, Yaxing; Fu, Jiaru; Zhou, Nihan (USENIX Security, 2023-08-11). Camera-based technology can be privacy-invasive, especially for bystanders who can be captured by the cameras but do not have direct control of or access to the devices. The privacy threats become even more significant for bystanders with visual impairments (BVI), since they cannot visually discover the use of cameras nearby and effectively avoid being captured. While some prior research has studied visually impaired people’s privacy concerns as direct users of camera-based assistive technologies, no research has explored their unique privacy perceptions and needs as bystanders. We conducted an in-depth interview study with 16 visually impaired participants to understand BVI’s privacy concerns, expectations, and needs in different camera usage scenarios. A preliminary survey with 90 visually impaired respondents and 96 sighted controls was conducted to compare BVI and sighted bystanders’ general attitudes towards cameras and to elicit camera usage scenarios for the interview study. Our research revealed BVI’s unique privacy challenges and perceptions around cameras, highlighting their needs for privacy awareness and protection. We summarized design considerations for future privacy-enhancing technologies to fulfill BVI’s privacy needs.
- The influence of explanation designs on user understanding differential privacy and making data-sharing decision. Wen, Zikai Alex; Jia, Jingyu; Yan, Hongyang; Yao, Yaxing; Liu, Zheli; Dong, Changyu (Elsevier, 2023-09). Differential privacy (DP) technologies are being promoted by organizations to encourage data sharing, but without a proper understanding of how these technologies work, individuals may make incorrect data-sharing decisions. A design gap exists in effectively communicating the workings of DP technologies, such as Local DP, to users. Our research aimed to fill this gap through the use of an explanatory illustration. We conducted an online survey with 228 participants to assess the impact of different explanation designs on understanding DP and data-sharing decisions. Our study found that the visual explanatory illustration was more effective than the textual description in helping individuals comprehend Local DP’s privacy protection against large organizations, with the illustration group exhibiting a 51.4% increase in comprehension. The study also found that improved knowledge of privacy-enhancing technologies does not guarantee willingness to share protected data. To prevent misinformed decisions, future research could focus on designing a more effective way of communicating the privacy protections of these technologies to users, building on the insights gained from our study.
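For concreteness, the core idea of Local DP that such explanations try to convey can be illustrated with the classic randomized-response mechanism. This is a minimal sketch of the general technique, not the specific protocol or explanation studied in the paper: each user reports their true bit only with probability e^ε/(e^ε + 1), and the aggregator debiases the noisy mean to recover the population proportion.

```python
import math
import random

def randomized_response(true_bit, epsilon, rng=random.random):
    # Report the truth with probability p = e^eps / (e^eps + 1),
    # otherwise flip the bit; higher epsilon means less noise.
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_bit if rng() < p else 1 - true_bit

def estimate_proportion(reports, epsilon):
    # Debias the observed mean of the randomized reports:
    # E[observed] = p*pi + (1-p)*(1-pi), solved for pi.
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)
```

With ε = ln 3, each user tells the truth with probability 3/4, so no individual report reveals their true answer with certainty, yet the aggregate estimate remains unbiased.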
- An investigation of teenager experiences in social virtual reality from teenagers', parents', and bystanders' perspectives. Deldari, Elmira; Poveda, Julio; Freed, Diana; Yao, Yaxing (2023). The recent rise of social virtual reality (VR) platforms has introduced new technology characteristics and user experiences, which may lead to new forms of online harassment, particularly among teenagers (aged 13-17). In this paper, we took a multi-stakeholder approach and investigated teenagers’ experiences and safety threats in social VR from three complementary perspectives: teenagers, parents, and bystanders. Through an interview study with 24 participants (8 teenagers, 7 parents, and 9 bystanders), we found several safety threats that teenagers may face, such as virtual grooming, ability-based discrimination, and unforeseeable threats in private rooms. We highlight new forms of harassment in the social VR context, such as erotic role-play and abuse through phantom sense, as well as the discrepancies among teenagers, parents, and bystanders regarding their perceptions of such threats. We draw design implications to better support safer social VR environments for teenagers.
- RedCapes: the Design and Evaluation of a Game Towards Improving Autistic Children's Privacy Awareness. Yuan, Xiaowen; Ye, Hongni; Tang, Ziheng; Zhu, Xiangrong; Yao, Yaxing; Tong, Xin (ACM, 2023-11-13). Autistic children have differences in social communication, making them vulnerable to privacy risks in social contexts. Research on typical development (TD) children’s privacy learning often neglects autistic children’s unique needs. Therefore, our study aims to understand their challenges in learning privacy and to design an effective privacy education game for them. We designed a serious game, RedCapes, and recruited 9 autistic children and 6 TD children to evaluate the game. Our findings suggested that RedCapes improved autistic children’s privacy awareness. Compared to TD children, autistic children have more difficulty identifying relevant privacy risk factors and understanding the full consequences of privacy violations. We propose three design implications for future privacy education games for autistic children. Our work contributes insights into autistic children’s challenges in learning privacy, a serious game prototype for privacy education, and design recommendations for future privacy education games focused on autistic children.
- Towards Understanding Family Privacy and Security Literacy Conversations at Home: Design Implications for Privacy Literacy Interfaces. Alghythee, Kenan; Hrncic, Adel; Singh, Karthik; Kunisetty, Sumanth; Yao, Yaxing; Soni, Nikita (ACM, 2024-05-11). Policymakers and researchers have emphasized the crucial role of parent-child conversations in shaping children’s digital privacy and security literacy. Despite this emphasis, little is known about the current nature of these parent-child conversations, including their content, structure, and children’s engagement during them. This paper presents the findings of an interview study involving 13 parents of children under the age of 13, reflecting on their privacy literacy practices at home. Through qualitative thematic analysis, we identify five categories of parent-child privacy and security conversations and examine parents’ perceptions of their children’s engagement during these discussions. Our findings show that although parents used different conversation approaches, rule-based conversations were among the most common approaches taken by our participants, with example-based conversations perceived to be effective by parents. We propose important design implications for developing effective privacy educational technologies for families to support parent-child conversations.
- Understanding Chinese Internet Users' Perceptions of, and Online Platforms' Compliance with, the Personal Information Protection Law (PIPL). Zhou, Morgana Mo; Qu, Zhiyan; Wan, Jinhan; Wen, Bo; Yao, Yaxing; Lu, Zhicong (ACM, 2024-04-23). The Personal Information Protection Law (PIPL) was implemented in November 2021 to safeguard the personal information rights and interests of Internet users in China. However, the impact and existing shortcomings of the PIPL remain unclear, carrying significant implications for policymakers. This study examined privacy policies on 13 online platforms before and after the PIPL. Concurrently, it conducted semi-structured interviews with 30 Chinese Internet users to assess their perceptions of the PIPL. Users were also given tasks to identify non-compliance within the platforms, assessing their ability to address related privacy concerns effectively. The research revealed various instances of non-compliance in post-PIPL privacy policies, especially concerning inadequate risk assessments for sensitive data. Although users identified some non-compliant activities, such as app eavesdropping, issues related to individual consent proved challenging. Surprisingly, over half of the interviewees believed that the government could access their personal data without explicit consent. Our findings and implications can be valuable for lawmakers, online platforms, users, and future researchers seeking to enhance personal privacy practices both in China and globally.
- Users' Perceptions of Online Child Abuse Detection Mechanisms. Deldari, Elmira; Thakkar, Parth; Yao, Yaxing (ACM, 2024-04-23). Child sexual exploitation and abuse (CSEA) online has become a major safety issue for children accessing the Internet. To combat CSEA, electronic service providers (ESPs) have implemented various mechanisms to detect child sexual abuse materials (CSAM). However, these mechanisms, despite their capability to prevent the mass distribution of CSAM online, may raise significant privacy concerns among general users. In this paper, we conducted a semi-structured interview study with 23 participants to understand their privacy perceptions of two types of online CSAM detection mechanisms. Our results suggest that users were concerned about the transparency of the detection process, inappropriate access to users' data, and the unclear boundaries of such mechanisms. Our results also highlight that, even though the majority of participants chose to sacrifice their privacy for societal benefits, they still had privacy concerns that need to be addressed. We discuss the design and policy implications for ESPs to improve users' awareness of the data practices of these mechanisms, alleviate users' privacy concerns, and increase societal benefits.