University of Portsmouth’s Dr. Karen Middleton Provides Expert Testimony
Dr. Karen Middleton, a researcher from the University of Portsmouth’s School of Strategy, Marketing, and Innovation, recently provided expert testimony to the UK Parliament’s Science, Innovation and Technology Committee inquiry into social media misinformation and harmful algorithms. The inquiry was launched in response to growing concerns about the rapid advancement of technology, the politicization of social media platforms, and the potential inadequacy of current UK online safety laws.
Addressing the Rise of Misinformation and Harmful Content
The inquiry examines the consequences of AI-driven algorithms, particularly in light of recent events in which misleading images shared on platforms such as Facebook and X incited Islamophobic protests. Dr. Middleton’s testimony emphasized the lack of accountability within the current advertising ecosystem, where programmatic advertising models can inadvertently fund the spread of misinformation and hate speech. She highlighted how the complexity and automation of these systems often obscure funding trails, making it difficult to hold accountable those responsible for harmful content.
The Role of Digital Advertising in Funding Misinformation
With the UK’s programmatic digital advertising spending projected to rise significantly from an estimated £3.7 billion in 2024, ensuring ethical and responsible advertising practices becomes paramount. Dr. Middleton explained how platforms such as Meta, TikTok, and X prioritize engagement-driven content, even when it is misleading or harmful. This focus on engagement, coupled with the opaque nature of programmatic advertising supply chains, allows harmful content to be monetized without advertisers’ knowledge.
Recommendations for a Safer Online Environment
Dr. Middleton proposed several recommendations to address these challenges, including:
- Embedding safety-by-design principles into digital platforms.
- Increasing transparency in ad placement to ensure advertisers know where their ads are appearing.
- Leveraging AI-driven safety technology to proactively moderate content and identify harmful material.
- Implementing stronger regulations and industry practices to prevent advertising revenue from supporting harmful content.
Collaboration with the Conscious Advertising Network
As a Volunteer Advisor to the Conscious Advertising Network (CAN), Dr. Middleton actively works to promote responsible advertising practices and combat hate speech and harmful content online. CAN expressed its support for Dr. Middleton’s testimony and emphasized the importance of continued discussions in Parliament regarding transparency and accountability in the advertising ecosystem.
The University of Portsmouth’s Commitment to Addressing Online Harms
Dr. Middleton’s involvement in the inquiry reflects the University of Portsmouth’s commitment to addressing critical societal challenges through research and collaboration. Her expertise in marketing practice, teaching, and research, combined with her work with organizations like CAN, positions her as a leading voice in the fight against online misinformation. The university continues to support research and initiatives aimed at making the Internet a safer place for everyone.