The Social Media Strategies to Produce Ideology and Discursive Hegemony

Social media has become one of the most powerful tools for shaping public opinion and discourse in the modern world. With billions of users across platforms like Facebook, Twitter, Instagram, and TikTok, social media allows people and organizations to reach wider audiences than ever before in human history. This enormous reach also enables social media platforms and their most influential users to impact ideologies, control narratives, and establish discursive hegemony on a global scale.

This article will examine how social media is utilized by individuals, corporations, political parties, and governments to propagate ideologies and establish dominant discourses. It will look at factors like algorithmic curation of content, selective censorship, optimized messaging, targeted advertisements, coordinated inauthentic behavior, and influencer marketing that allow the manipulation of public perception and the cementing of ideological frameworks. By focusing on real-world examples and case studies, this article will highlight how powerful entities are leveraging social media strategies and technologies to shape identities and societies in alignment with their interests.

The Rise of Social Media

The use of social media has exploded over the past 15 years, irrevocably changing the information and communication landscape. Facebook, the largest social network, launched in 2004 and now has over 2.8 billion monthly active users worldwide (1). Twitter, launched in 2006, boasts 237 million monetizable daily active users who post over 500 million tweets per day (2). Instagram, which debuted in 2010, has over 1 billion monthly users and is increasingly favored by younger demographics (3).

With deep penetration across languages, countries, and socioeconomic divisions, social media has become the predominant way billions of people consume information, express opinions, and engage with the wider world. Approximately 72% of Americans report using some social media platform regularly (4). Globally, over 3.6 billion people actively use social media, a number projected to grow to over 4.4 billion by 2025 (5).

This meteoric rise and ubiquitous adoption enable social media platforms and their top users to shape ideologies and dominate narratives by controlling the flow of information from sources to audiences. By funneling cultural discourse through their curated interfaces, social media allows a small group of people working for these companies or wielding outsized influence on them to dictate perceptions and beliefs for billions of users. This concentration of communicative power facilitates ideological manipulation.

Propagating Ideology through Algorithms and Engagement Metrics

Social media companies employ proprietary algorithms to curate the content that users see. These algorithms analyze thousands of data points to determine the ads, posts, and accounts that users will encounter in their feeds and search results (6). Rather than showing all possible content chronologically, algorithms selectively identify and rank content to optimize user engagement based on preferences and behaviors (7). They control visibility, rewarding content that generates likes, shares, comments, and other reactions.

This algorithmic curation has profound implications for ideology. It allows social media platforms to surface specific narratives and influencers that align with their interests while downranking dissenting views (8). By controlling exposure through algorithms optimized for engagement over accuracy or impartiality, platforms can boost ideologically aligned content and drown out competing perspectives and truths (9).

For example, Facebook’s algorithm heavily weights interactions like sharing and commenting. This creates “filter bubbles” around users by highlighting polarizing content that provokes reactions while limiting exposure to contrasting narratives (10). Outrage and affirmation are boosted over dialogue. According to ex-Facebook executive Chamath Palihapitiya, “The short-term, dopamine-driven feedback loops that we have created are destroying how society works” by driving disconnection and extremism through one-sided algorithmic amplification (11).
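The engagement-weighted ranking described above can be illustrated with a toy sketch. The weights, post fields, and scoring function here are hypothetical inventions for illustration, not any platform's actual algorithm; the point is only that ranking purely by predicted reactions systematically favors provocative content over measured content.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

# Hypothetical weights: reactions that spread content (shares, comments)
# count more than passive likes, mirroring the tendency described above
# to amplify posts that provoke responses.
WEIGHTS = {"likes": 1.0, "shares": 5.0, "comments": 3.0}

def engagement_score(post: Post) -> float:
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["shares"] * post.shares
            + WEIGHTS["comments"] * post.comments)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Rank purely by predicted engagement -- accuracy and viewpoint
    # diversity play no role in the ordering.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy analysis", likes=120, shares=4, comments=10),
    Post("Outrage-bait hot take", likes=80, shares=40, comments=60),
])
print([p.text for p in feed])
```

Even though the analysis post collects more likes, the outrage post wins the ranking because shares and comments are weighted more heavily, which is the "filter bubble" dynamic in miniature.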

Selective Censorship and Account Suspension

Social media platforms apply restrictive speech policies and selective suspensions to curb ideological opponents and reinforce dominant discourses. Content or accounts deemed unacceptable are “shadow banned” from algorithms or removed outright for violating community standards (12).

This moderation frequently targets dissenting ideologies and narratives outside the mainstream. Far-right accounts associated with reactionary views or conspiracies like QAnon face routine suspensions on platforms like Facebook and Twitter, limiting their reach (13). Simultaneously, prominent leftist accounts criticizing capitalism and establishment politics also get banned for ideological reasons masked as violations of policy and civility (14).

While social media companies claim speech policies aim to reduce harm, inconsistent applications against ideological rivals open the door to accusations of censorship (15). This ideological curation compounds the bias of engagement algorithms. Ultimately, it allows social media platforms to firm up discursive hegemony by silencing ideological diversity under the pretense of providing safer, less toxic environments for users.

Strategic Messaging and Framing

Social media enables political groups, corporations, and influencers to strategically craft messaging to align with their ideological frameworks. This messaging primes audiences to unconsciously accept dominant narratives as it simultaneously frames opposing perspectives as flawed or unethical.

Political PR teams adeptly leverage social media for propaganda (16). They conduct focus groups and opinion polls to identify persuasive narratives. These narratives are translated into concise, compelling messages like slogans, hashtags, and soundbites optimized for sharing on social networks. Repetition of phrases like “Make America Great Again” or shareable infographics spreads ideological messaging widely through viral distribution (17).

Corporations also frame messaging on social media to promote their brand values and worldview (18). For example, Gillette’s controversial “The Best Men Can Be” ad campaign implicitly criticized toxic masculinity in an attempt to frame the #MeToo era through a progressive ideological lens aligned with their products (19). Such calculated messaging primes consumers to see social issues from the corporation’s perspective.

Influencers are likewise paid by interest groups to propagandize ideas to their followers. Political influencers on YouTube, Instagram, and TikTok often receive money from PACs, campaigns, or external donors to normalize specific ideological stances (20). The manufactured authenticity of influencers makes them powerful conduits for ideological manipulation through messaging.

Microtargeted Political Advertisements

The vast data collection apparatus of social media allows political groups to conduct pinpoint microtargeting of persuadable voters with personalized ideological messaging (21). By harnessing personal data like location, interests, relationships, and browsing history, campaigns can segment audiences into precise profiles for tailored communication (22). Campaigns then serve these voter profiles advertisements containing ideologically framed narratives designed to resonate by exploiting individual biases and desires (23).

For example, the Trump 2016 campaign used Cambridge Analytica to develop psychographic voter models that served rural conservative voters ads focused on immigration while suburban moderates saw economically focused ones (24). This ad microtargeting happened without the knowledge or consent of users or the general public. The personally targeted nature of ideological messaging enhances its effectiveness: microtargeting ensures that voters only see advertisements calibrated to push their unique ideological buttons.
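The segmentation logic described above can be sketched as a minimal toy model. The voter records, attributes, and segment rules here are entirely fabricated for illustration; real campaign targeting uses far richer data and statistical models, but the mechanism of mapping profile predicates to tailored ad themes is the same in principle.

```python
# Hypothetical voter records built from collected profile data.
voters = [
    {"id": 1, "region": "rural", "ideology": "conservative"},
    {"id": 2, "region": "suburban", "ideology": "moderate"},
    {"id": 3, "region": "rural", "ideology": "conservative"},
]

# Each segment maps a profile predicate to a tailored ad theme,
# echoing the Cambridge Analytica-style targeting described above.
segments = {
    "immigration_ads": lambda v: v["region"] == "rural" and v["ideology"] == "conservative",
    "economy_ads": lambda v: v["region"] == "suburban" and v["ideology"] == "moderate",
}

def assign_ads(voters, segments):
    # Match every voter against every segment rule; each voter only
    # ever sees the messaging calibrated to their own profile.
    assignments = {}
    for voter in voters:
        for theme, matches in segments.items():
            if matches(voter):
                assignments[voter["id"]] = theme
    return assignments

print(assign_ads(voters, segments))
```

The key property, as the article notes, is that no voter sees the messaging aimed at any other segment, so the public never observes the full set of contradictory appeals a campaign is making.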

Coordinated Inauthentic Behavior

Social media also enables coordinated campaigns of inauthentic accounts that artificially boost ideological messaging and frame debates. Groups of fake accounts act in a coordinated fashion to influence discourse through disinformation, propaganda, and algorithm manipulation (25). They use networks of bots, cyborgs, and paid trolls to create the illusion of grassroots support for ideologies and drown out organic discourse (26).

State actors like Russia, China, and Iran deploy inauthentic accounts to further foreign policy interests and divide adversaries ideologically (27). Domestic political groups also utilize these tactics, with private companies offering “astroturfing” services to spread disinformation that benefits clients (28). Coordinated inauthentic behavior allows the scaling of ideological manipulation by obscuring the orchestrated nature of social media activity.
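One crude signal of the coordinated behavior described above is many accounts posting identical text in concert. The sketch below is a toy heuristic with invented account names and messages, not a production detection system; real platforms combine many such signals (timing, network structure, account metadata) to identify inauthentic networks.

```python
from collections import Counter

# Hypothetical stream of (account_id, message) posts. Coordinated
# inauthentic networks often push identical text across many accounts
# to simulate grassroots support.
posts = [
    ("bot_01", "Candidate X is the people's choice!"),
    ("bot_02", "Candidate X is the people's choice!"),
    ("bot_03", "Candidate X is the people's choice!"),
    ("user_a", "Interesting debate tonight."),
    ("user_b", "Candidate X is the people's choice!"),
]

def flag_coordinated(posts, threshold=3):
    # Flag any message posted verbatim by `threshold` or more accounts --
    # a deliberately crude proxy for coordination.
    counts = Counter(msg for _, msg in posts)
    return {msg for msg, n in counts.items() if n >= threshold}

print(flag_coordinated(posts))
```

A single heuristic like this is easy to evade with small text variations, which is partly why, as the article argues, coordinated campaigns succeed at obscuring their orchestrated nature.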

Influencer Marketing for Ideological Normalization

Social media firms and interest groups often utilize influencer marketing to subtly integrate ideological messaging into popular culture. By paying social media influencers to reference or endorse worldviews in their content, ideological frameworks can be normalized among younger demographics (29). Pop culture opinion leaders lend their social capital to propagate beliefs, products, and politicians to impressionable followers.

Instagram and TikTok trends frequently originate from financial partnerships between influencers and brands (30). Politicians also contract influencers – a practice called “social media contracting” – to appear organically aligned with their ideologies and make them seem “cool” (31). Influencers of all sizes participate for easy money and exposure, and YouTube product sponsorships routinely feature ideological messaging too. The perceived authenticity of influencers enhances this ideological marketing.

Case Studies: Ideological Power and Social Media

Several real-world examples illuminate how social media permits the convergence of communicative, economic, political, and technological power into discursive hegemony:

Rise of Fascism in Brazil
Jair Bolsonaro leveraged social media to help gain power in Brazil by spreading far-right fascist ideologies and disinformation (32). His use of Facebook, Twitter, and WhatsApp to target conservatives with extremist messaging helped radicalize Brazilian politics (33). Coordinated propaganda and fake news boosted Bolsonaro and radical right allies by constructing false narratives, leading to democratic backsliding in Brazil (34).

IGD and Online Radicalization
The Identitarian movement (IGD) widely propagates white nationalism through sophisticated use of social media platforms like YouTube and Telegram (35). Growing online radicalization of young men coincides with exposure to Identitarian ideology, which reframes racism to game algorithms and evade censorship (36). The decentralized, tech-savvy nature of the IGD takes advantage of social media to recruit: its ideology spreads through activism, messaging, and coordinated media manipulation (37).

Rohingya Genocide in Myanmar
Hate speech and disinformation against the Rohingya Muslim minority in Myanmar proliferated on Facebook, helping fuel real-world violence amounting to genocide (38). With low internet literacy, language barriers, and little moderation, Facebook became a “one-stop shop for manipulating public opinion” in Myanmar (39). Unmitigated racist narratives and anti-Rohingya propaganda on social media fomented the genocide (40).

Growth of Networked Social Justice
Social justice movements like Black Lives Matter harness social media for mass mobilization and to shift culture through ideological messaging (41). Savvy use of Twitter, Facebook, and TikTok helps activists shape narratives, forcing mainstream recognition of marginalized voices (42). Videos of police violence shared virally spawned protests and calls for structural reforms aligned with leftist ideology (43). Social media is thus simultaneously utilized by both progressive and reactionary movements.

Attention Economy Incentivizes Disinformation
The business models of social media platforms incentivize engagement with disinformation that affirms ideological biases, monetized through advertising revenue – what some call the “disinformation-industrial complex” (44). Maximizing time-on-site and data collection means algorithmically promoting polarizing, false content that aligns with pre-existing beliefs (45). Tolerating disinformation generates attention and ideological radicalization while creating societal discord that platforms profit from (46).

Conclusions

This analysis reveals how social media facilitates digital hegemony through interconnected technological, economic, and ideological power. Manipulation stems from centralized control of narratives and attention by platforms, along with state and commercial interests weaponizing social media to manufacture consent and fracture oppositional discourse.

While social media holds democratizing potential by expanding participation in civic culture, the current realities of platform capitalism counter this by permitting consolidation of communicative control and the means of ideological production by a small cadre of companies and elites. Countering disinformation without infringing speech remains an intractable challenge.

Nevertheless, expanding media literacy, teaching critical thinking, diversifying algorithms, and enhancing platform transparency and oversight offer pathways to mitigate ideological manipulation. Fundamentally, promoting independent, ethical journalism and empowering user agency through technology reform and education are essential to upholding the inclusive, pluralistic digital discourse necessary for functional democracies.

References:

  1. Clement, J. (2020). Global social networks ranked by number of users 2020. Statista. https://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users/
  2. Twitter. (2020). Q2 2020 Letter to Shareholders. https://s22.q4cdn.com/826641620/files/doc_financials/2020/q2/Q2-2020-Shareholder-Letter.pdf
  3. Hootsuite. (2021). The Global State of Digital in 2021 Report. https://hootsuite.com/pages/digital-trends-2021
  4. Perrin, A., & Anderson, M. (2019). Share of U.S. adults using social media, including Facebook, is mostly unchanged since 2018. Pew Research Center. https://www.pewresearch.org/fact-tank/2019/04/10/share-of-u-s-adults-using-social-media-including-facebook-is-mostly-unchanged-since-2018/
  5. Clement, J. (2020). Global social network penetration rate as of January 2020. Statista. https://www.statista.com/statistics/278414/number-of-worldwide-social-network-users/
  6. Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and information technology, 15(3), 209-227.
  7. Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130-1132.
  8. Tufekci, Z. (2018). YouTube, the Great Radicalizer. The New York Times. https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html
  9. Vaidhyanathan, S. (2018). Antisocial media: How Facebook disconnects us and undermines democracy. Oxford University Press.
  10. Pariser, E. (2011). The filter bubble: How the new personalized web is changing what we read and how we think. Penguin.
  11. Wong, J. C. (2017). Former Facebook executive: social media is ripping society apart. The Guardian. https://www.theguardian.com/technology/2017/dec/11/facebook-former-executive-ripping-society-apart
  12. Roettgers, J. (2019). Study Claims YouTube Algorithms Push Users to Extreme Content. Variety. https://variety.com/2019/digital/news/youtube-algorithms-extremist-content-study-1203205828/
  13. Lima, C. (2020). 100,000 QAnon conspiracy theorists have been banned from Facebook in recent months. Fox News. https://www.foxnews.com/tech/facebook-bans-thousands-of-qanon-conspiracy-theory-accounts-in-new-purge
  14. Glaser, A. (2019). Twitter’s Disproportionate Banhammer. Slate. https://slate.com/technology/2019/12/twitter-leftist-account-ban-chapo-trap-house-suspend.html
  15. Zhou, M. (2020). Why Twitter Cracking Down on QAnon Is So Unusual. Vox. https://www.vox.com/recode/2020/7/22/21333350/twitter-bans-qanon-crackdown-trump
  16. Bradshaw, S., & Howard, P. N. (2019). The Global Organization of Social Media Disinformation Campaigns. Journal of International Affairs, 71(1.5), 23-32.
  17. Peters, M. A., & Besley, T. (2020). “Make America Great Again”: Donald Trump, Twitter and the 2016 election. Educational Philosophy and Theory, 52(9), 879-892.
  18. Obar, J. A., Wildman, S. S., & Bachen, C. (2018). Social media, political alienation, and political efficacy. Computers in human behavior, 87, 68-74.
  19. Ottovordemgentschenfelde, S. (2021). “The Best Men Can Be”? A critical visual analysis of Gillette’s “toxic masculinity” campaign in US American online news media. Critical studies in media communication, 38(2), 105-120.
  20. Gainous, J., & Wagner, K. M. (2013). Tweeting to power: The social media revolution in American politics. Oxford University Press.
  21. Stukal, D., Sanovich, S., Bonneau, R., & Tucker, J. A. (2017). Detecting bots on Russian political Twitter. Big data, 5(4), 310-324.
  22. Hersh, E. (2015). Hacking the electorate: How campaigns perceive voters. Cambridge University Press.
  23. Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.
  24. Solon, O., & Siddiqui, S. (2018). Cambridge Analytica boasts of dirty tricks to swing elections. The Guardian. https://www.theguardian.com/uk-news/2018/mar/19/cambridge-analytica-execs-boast-dirty-tricks-honey-traps-elections
  25. Shao, C., Ciampaglia, G. L., Varol, O., Yang, K. C., Flammini, A., & Menczer, F. (2018). The spread of low-credibility content by social bots. Nature communications, 9(1), 1-9.
  26. Marwick, A., & Lewis, R. (2017). Media manipulation and disinformation online. New York: Data & Society Research Institute.
  27. DiResta, R., Shaffer, K., Ruppel, B., Sullivan, D., Matney, R., Fox, R., … & Johnson, B. (2019). The tactics & tropes of the Internet Research Agency. New Knowledge.
  28. Howard, P. N., Ganesh, B., Liotsiou, D., Kelly, J., & François, C. (2018). The IRA, Social Media and Political Polarization in the United States, 2012-2018. Computational Propaganda Research Project.
  29. Abidin, C. (2016). Visibility labour: Engaging with Influencers’
SAKHRI Mohamed

I hold a bachelor's degree in political science and international relations as well as a Master's degree in international security studies, alongside a passion for web development. During my studies, I gained a strong understanding of key political concepts, theories in international relations, security and strategic studies, as well as the tools and research methods used in these fields.
