Scholarly Works, School of Communication
- An Agenda for Open Science in Communication
  Dienlin, Tobias; Johannes, Niklas; Bowman, Nicholas David; Masur, Philipp K.; Engesser, Sven; Kümpel, Anna Sophie; Lukito, Josephine; Bier, Lindsey M.; Zhang, Renwen; Johnson, Benjamin K.; Huskey, Richard; Schneider, Frank M.; Breuer, Johannes; Parry, Douglas A.; Vermeulen, Ivar; Fisher, Jacob T.; Banks, Jaime; Weber, René; Ellis, David A.; Smits, Tim; Ivory, James Dee; Trepte, Sabine; McEwan, Bree; Rinke, Eike Mark; Neubaum, German; Winter, Stephan; Carpenter, Christopher J.; Krämer, Nicole; Utz, Sonja; Unkel, Julian; Wang, Xiaohui; Davidson, Brittany I.; Kim, Nuri; Won, Andrea Stevenson; Domahidi, Emese; Lewis, Neil A.; de Vreese, Claes (2021-02)
  Over the last 10 years, many canonical findings in the social sciences have come to appear unreliable. This so-called "replication crisis" has spurred calls for open science practices, which aim to increase the reproducibility, replicability, and generalizability of findings. Communication research is subject to many of the same challenges that have caused low replicability in other fields. As a result, we propose an agenda for adopting open science practices in Communication, which includes the following seven suggestions: (1) publish materials, data, and code; (2) preregister studies and submit registered reports; (3) conduct replications; (4) collaborate; (5) foster open science skills; (6) implement Transparency and Openness Promotion Guidelines; and (7) incentivize open science practices. Although our agenda focuses mostly on quantitative research, we also reflect on open science practices relevant to qualitative research. We conclude by discussing potential objections and concerns associated with open science practices.
- The effectiveness of credibility indicator interventions in a partisan context
  Duncan, Megan A. (SAGE, 2019-12-01)
  Audiences, who cannot investigate the credibility of most news stories for themselves, rely on noncontent heuristic cues to form credibility judgments. For most media, these heuristics have been stable over time. Emerging formats of journalism, however, require audiences to learn what new heuristic credibility cues mean about a story's credibility. In an experiment, participants (N = 254) were given instructions about how to interpret the credibility cues in three formats as they read a politicized news story; these conditions were compared with a control condition with no instructions. Results show that the timing and source of the instructions increase their effectiveness.
- Faculty Perceptions of Research Assessment at Virginia Tech
  Miles, Rachel A.; Pannabecker, Virginia; Kuypers, Jim A. (Levy Library Press, 2020-07-07)
  In the spring of 2019, survey research was conducted at Virginia Polytechnic Institute and State University (Virginia Tech), a large, public, Carnegie-classified R1 institution in southwest Virginia, to determine faculty perceptions of research assessment as well as how and why they use researcher profiles and research impact indicators. The Faculty Senate Research Assessment Committee (FSRAC) reported the quantitative and qualitative results to the Virginia Tech Board of Visitors to demonstrate the need for systemic, political, and cultural change regarding how faculty are evaluated and rewarded at the university for their research and creative projects. The survey research and subsequent report started a gradual process to move the university to a more responsible, holistic, and inclusive research assessment environment. Key results from the survey, completed by close to 500 faculty from across the university, include:
  (a) the most frequently used researcher profile systems and the primary ways they are used (e.g., profiles are used most frequently for showcasing work, with results indicating that faculty prefer to use a combination of systems for this purpose);
  (b) the primary reasons faculty use certain research impact indicators (e.g., number of publications is frequently used but much more likely to be used for institutional reasons than personal or professional reasons);
  (c) faculty feel that research assessment is most fair at the department level and least fair at the university level; and
  (d) faculty do not feel positively towards their research being assessed for the allocation of university funding.
- Reluctant to Share: How Third Person Perceptions of Fake News Discourage News Readers From Sharing “Real News” on Social Media
  Yang, Fan; Horning, Michael A. (Sage, 2020)
  Rampant fake news on social media has drawn significant attention. Yet much remains unknown about how such imbalanced evaluations of self versus others shape social media users’ perceptions and their subsequent attitudes and behavioral intentions regarding social media news. An online survey (N = 335) was conducted to examine the third person effect (TPE) in fake news on social media; it suggested that users perceive a greater influence of fake news on others than on themselves. However, although users evaluated fake news as socially undesirable, they were still unsupportive of government censorship as a remedy. In addition, the perceived prevalence of fake news led audiences to report significantly less willingness to share any news on social media, whether online or offline.
- Taking it from the team: Assessments of bias and credibility in team-operated sports media
  Mirer, Michael; Duncan, Megan A.; Wagner, Michael (2018-10-29)
  Team- and league-operated media play a growing role in the sports media system. Few studies have looked at how audiences perceive the credibility of in-house content, which regularly mimics traditional sports journalism. An experimental analysis finds that, even among fans, independent media content is rated more credible than content produced in-house. Fans view stories accusing their team of wrongdoing as biased even as they find them credible.
- Technologies, Ethics and Journalism’s Relationship with the Public
  Duncan, Megan A.; Culver, Kathleen Bartzen (Cogitatio Press, 2020-07-27)
  Drones can provide a bird’s-eye view of breaking news and events that can be streamed live or used in edited news coverage. Past research has focused on the training and ethics of journalists and drone operators; little attention, however, has been given to audiences and their acceptance and perceptions of the ethics involved. Extending Diffusion of Innovation Theory, we suggest that audiences who are open to personal technology use will perceive news media using unmanned aerial vehicles (UAVs) as more ethical. In a survey (N = 548) of adults living in the United States, we explore the correlations between trust, technology, privacy, and the use of UAVs. Results suggest all three are positively correlated with openness toward drone journalism. We find the audience has preferences for the types of news stories that should be covered using drones: participants indicated they welcome drone journalism for traffic and investigative stories, but not for coverage of celebrities and politicians. The findings have implications for newsrooms, suggesting that transparency and outreach to educate people about the technology could help build trust. Further, the results suggest that Diffusion of Innovation Theory can be applied to technology use mediated through news media.
- What's in a Label? Negative Credibility Labels in Partisan News
  Duncan, Megan A. (SAGE, 2020-10-13)
  Concern that partisan audiences blindly follow partisan news brands while being unable to distinguish credible news from hoax news dominates media criticism and theoretical inquiry. Companies and media literacy advocates have suggested credibility labels as a solution. This experiment tests the effectiveness of credibility labels at the intersection of partisan news brands and partisan news stories. Using news credibility theory and the Partisan Media Opinion hypothesis, it investigates the effects credibility labels have on partisan audiences, partisan news brands, and partisan news stories. It finds that credibility labels may be an effective news literacy tool, and that credibility is enhanced when the news story’s ideological perspective does not match the ideology of the news brand.