Article Instance
API Endpoint for journals.
GET /api/articles/59641/?format=api
{ "pk": 59641, "title": "Algorithmic Personalization Features and Democratic Values: What Regulation Initiatives Are Missing", "subtitle": null, "abstract": "<p>2024 was poised to be the largest election year in history, with pivotal elections in Asia, Europe, and the Americas encompassing regional, legislative, and presidential contests, capturing the attention of half the globe. In an era dominated by social media, these elections were influenced by information dissemination through digital platforms.</p>\n<p>Over the last two decades, the landscape of public discourse in matters of civic concern has undergone a transformative shift, moving from traditional media to personalized digital social media outlets. Algorithmic features now selectively match content to users, fostering engagement but also giving rise to issues such as echo chambers, filter bubbles, and sensationalized content. While much attention has been devoted to the challenges posed by personalized discourse, this paper sheds light on a critical aspect that has been overlooked in regulatory paradigms: the erosion of an open, public sphere for discourse due to individualized manipulated content matching.</p>\n<p>In a well-functioning democratic system, an informed citizenry is paramount. However, amplification algorithms and recommendation systems—referred to as “personalization features” here—employ manipulated, partial, and even contradictory messages designed for targeted audiences, while excluding users who might object to, correct, or protest them. This paper aims to address the question of whether democratic public discourse can be preserved amidst the individualized flow of manipulated content on social media.</p>\n<p>Existing regulatory approaches—including content regulation, algorithm regulation, and privacy regulation—primarily focus on data profiling, algorithmic content matching, and the nature of discourse within personalized spheres. They prove inadequate in mitigating harm to the public sphere resulting from the selective distribution of individualized manipulated content, which hinders open public discussion. Even regulatory initiatives specifically targeting accessible public discourse fall short in addressing both the individualized and manipulated aspects of personalized speech.</p>\n<p>The research findings highlight a pressing concern—the inherent contradiction between the individualized, manipulated flow of content and the principles of democratic deliberation. Urgent regulatory frameworks are needed to safeguard a public space for deliberation that is accessible to all, facilitating joint decisions and the construction of pluralistic democratic societies. Potential solutions may involve establishing alternative spaces for public discourse and recognizing distinct considerations for issues in the public sphere. Additionally, a multifaceted strategy beyond online discourse or algorithm regulation is proposed, aiming to foster constructive dialogue, respect, and tolerance in the digital ecosystem, complemented by public education.</p>", "language": null, "license": { "name": "", "short_name": "", "text": null, "url": "" }, "keywords": [], "section": "Article", "is_remote": true, "remote_url": "https://escholarship.org/uc/item/96b9x2c4", "frozenauthors": [ { "first_name": "Sharon", "middle_name": "", "last_name": "Bassan", "name_suffix": "", "institution": "", "department": "" } ], "date_submitted": null, "date_accepted": null, "date_published": "2025-12-03T01:08:00Z", "render_galley": null, "galleys": [ { "label": "PDF", "type": "pdf", "path": "https://journalpub.escholarship.org/ucilr/article/59641/galley/45613/download/" } ] }
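For consumers of this endpoint, a minimal sketch of pulling the PDF galley link out of an article record is shown below. It assumes only the JSON shape shown above (the `galleys` list with `type` and `path` fields); the `pdf_galley_urls` helper name is illustrative, not part of the API.

```python
# Sketch: extract PDF galley download URLs from an article record as
# returned by GET /api/articles/<pk>/?format=api. Field names are taken
# from the response above; the helper itself is a hypothetical client utility.

def pdf_galley_urls(article):
    """Return the download URLs of all galleys whose type is 'pdf'."""
    return [
        galley["path"]
        for galley in article.get("galleys", [])
        if galley.get("type") == "pdf"
    ]

# Minimal record built from the fields shown in the response above.
article = {
    "pk": 59641,
    "galleys": [
        {
            "label": "PDF",
            "type": "pdf",
            "path": "https://journalpub.escholarship.org/ucilr/article/59641/galley/45613/download/",
        }
    ],
}

print(pdf_galley_urls(article)[0])
```

In a real client, `article` would come from deserializing the HTTP response body (e.g. `response.json()`); filtering on `type` rather than `label` avoids depending on display text.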