Greener Journal of Biomedical and Health Sciences
Vol. 9(1), pp. 6-22, 2026
ISSN: 2672-4529
Copyright ©2026, Creative Commons Attribution 4.0 International.
https://gjournals.org/GJBHS
DOI: https://doi.org/10.15580/gjbhs.2026.1.011226006
Scholar, Eagle Scholars Forge,
Sele Media Africa.
Type: Research
Full Text: PDF, PHP, HTML, EPUB, MP3
DOI: 10.15580/gjbhs.2026.1.011226006
Accepted: 20/01/2026
Published: 31/01/2026
Maasa Samuel
E-mail: maasasammyshan@gmail.com
Keywords: Health misinformation, elections, digital democracy, Uganda, artificial intelligence, public health resilience.
Over the past decade, digital technologies, political communication, and public health have intersected in ways that affect how individuals engage with democracy and how governance systems operate. The proliferation of digital technologies has expanded access to information for the majority of populations in low- and middle-income nations, but it has also enabled the rapid spread of inaccurate or misleading information (Suarez-Lledo & Alvarez-Galvez, 2021). Digital disinformation is an evolving anti-democratic force that presents new threats to public health, particularly during politically charged periods such as elections, when misinformation is often at its peak.
Long before the COVID-19 pandemic, early research identified health misinformation as one of the world's most pressing issues. One example is a systematic review by Wang et al. (2019) examining health-related misinformation circulating on social media; topics that appeared vulnerable to distortion included vaccines, chronic diseases, and outbreaks. These findings were corroborated by Puri et al. (2020), who documented the role online echo chambers played in promoting vaccine hesitancy. While this early work focused primarily on higher-income nations, it contributed to a growing understanding of how digital platforms erode public trust in scientific knowledge around the world.
The COVID-19 pandemic elevated this already difficult challenge. Loomba et al. (2021) performed a randomized experiment which found that even brief exposure to misinformation about COVID-19 vaccines reduced people’s willingness to be vaccinated in the UK and the US. Similarly, Wilson and Wiysonge (2020) contended that misinformation was a major contributor to significant decreases in immunization rates in several countries, further straining already overwhelmed healthcare systems. However, neither study examined the political contexts driving the spread of misinformation, and the vast majority of this research was conducted in high-income countries. More recent analyses have identified the primary categories of fake news and the platforms used to disseminate it, such as Facebook and WhatsApp (Suarez-Lledo & Alvarez-Galvez, 2021; Skafle et al., 2022), and have called for increased emphasis on these matters in low- and middle-income nations. In Africa, internet access is increasing, yet resources to develop media literacy or regulatory schemes to combat the problem remain limited. Bates, Watson and Lewandowsky (2023) indicate that misinformation is both a public health concern and a democratic challenge in highly polarized societies.
Country-specific studies have demonstrated the importance of these issues across sub-Saharan Africa. In Ghana, Kuatewo et al. (2025) determined that COVID-19 vaccination uptake decreased significantly due to misinformation distributed primarily by political groups. Onyango et al. (2023) noted similar trends among medical students in Kenya, and Adebesin et al. (2023) found that declining faith in government, paired with increasing dependence on social platforms for vaccination decisions, increased vaccine hesitancy in South Africa. Additionally, Abongwa et al. (2024) showed that unverified information on Facebook and WhatsApp heavily influenced how university students viewed COVID-19 vaccines, directly affecting their willingness to be vaccinated.
The health impacts of misinformation are beginning to be acknowledged as a significant concern; however, the theoretical framework linking disinformation with electoral systems and public health remains underdeveloped. Bates et al. (2023) show that election cycles increase the incidence of misinformation and its associated societal harms, yet little research connects these dynamics to health outcomes. A study in Tanzania by Mfaume et al. (2023) found that political trust and preferred communication channels affect how people participate in vaccination campaigns, but it did not clearly establish what role election cycles play in that participation.
Uganda provides a strong case for examining this intersection of digital democracy, health, and misinformation. The country has seen rapid growth in digital connectivity alongside deepening political divides and low public trust during election periods (Suarez-Lledo & Alvarez-Galvez, 2021; Bates et al., 2023). Ugandan citizens use platforms such as WhatsApp, X (formerly Twitter), and Facebook to mobilize politically and promote narratives. During the 2021 Ugandan presidential election, for example, numerous unverified allegations associating vaccines with authoritarianism spread rapidly through political party networks (Lewandowsky et al., 2023). Despite abundant anecdotal accounts, empirical evidence on the role electoral processes play in the spread and effect of health misinformation in Uganda remains scarce.
The goal of this research is to explore the association between online disinformation and public perceptions of health during an election period in Uganda. The study draws on evidence from local, regional, and global studies, and it supplies the missing piece: the cumulative effect of political campaign messaging, digital media use, and public health messaging on health outcomes within an African electoral context.
Research Gap
While there is an abundance of research on health misinformation and digital media worldwide and regionally, there is an urgent need for empirical studies assessing how political elections affect the transmission and effect of health-related misinformation in Uganda. Most studies have addressed either vaccine hesitancy or digital disinformation, but almost none have examined how these issues intersect in low- and middle-income (LMI) countries or sub-Saharan Africa during elections. This absence of data limits our understanding of how media and technology shape public health perceptions and responses during electoral cycles.
Objectives of the Study:
2.1 Conceptual Anchors
Comprehending the multifaceted interaction between digital democracy, health misinformation, and election dynamics in Uganda is supported by four interconnected concepts: (1) Digital Democracy and Information Integrity Theory; (2) the Information Disorder Framework (Misinformation, Disinformation, Mal-information); (3) Building Trust Through Public Health Communication; and (4) Political Communication Theory (Agenda Setting, Framing and Computational Propaganda). Taken together, these concepts create a multi-layered approach to understanding how health-related information diffuses, is distorted, and influences trust and behavior, particularly around elections.
Digital Democracy and Information Integrity Theory
Digital democracy encompasses the ways digital technologies such as social media, messaging applications, and internet platforms enable greater participation, increased access to information, and new modes of political engagement (Sele & Whittaker, 2025). In principle, these technologies allow deeper deliberation, decentralize the flow of information, and give previously marginalized groups a voice in the political process. However, researchers have begun to point out that the promise of digital democracy carries increased risk to the integrity of information and, by extension, to democratic processes.
An important factor in the shift from traditional to digital media is disintermediation, the decline of gatekeeping. Digital media and social networks are quickly replacing traditional media as the main source of political information (Torreblanca, 2023). While digital media democratizes access, it also removes the historical filters of validation, leaving the public sphere open to manipulation (Torreblanca, 2023). In Uganda, where regulatory infrastructure, institutions, and media literacy are often weak or undeveloped, the potential of digital democracy to give voice and increase engagement can instead lead to greater disinformation, manipulation, and erosion of public trust.
Information integrity is equally vital, because digital democracy is not only about access but also about the systems (norms, institutions, media literacy, fact-checking, and regulation) that guarantee truthfulness, transparency, and accountability. Where such systems are absent, digital platforms can produce “information disorder” and thereby undermine both public health and democracy.
The foundation of our analytical framework is the information disorder framework developed by Wardle and Derakhshan (2017), a widely accepted classification of three types of false information circulating online: misinformation, disinformation, and mal-information. Misinformation is false or uncertain information shared without the intent to cause harm; disinformation is deliberately false information created to mislead; and mal-information is factually accurate information taken out of context or used for malicious purposes. The framework allows us to categorize how and why people create and share misleading health-related content on digital platforms.
Recent studies support this framework’s value. Suhm et al. (2024), for example, employed these distinctions to map the ecology of health misinformation during the COVID-19 pandemic, showing how each category corresponds to different drivers, actors, and outcomes, such as political motives, profit-driven misinformation, and fear-based strategies built on true but manipulated information. Treating information disorder as a spectrum rather than a single entity aids analysis: it captures intentionally misleading information and political propaganda, volunteer-generated “fake news”, emotion-eliciting rumour-type distortions, and even legitimate information weaponized through selective or decontextualized presentation (Sele et al., 2024).
Additionally, recent research suggests that information disorder is more than an individual cognitive phenomenon; it is a socio-material phenomenon shaped by digital infrastructure, platform algorithms, economic incentives, and social and political structures of power (Ricard, Yañez & Hora, 2025). This perspective fits Uganda, where structural inequalities, insufficient media regulation, low digital literacy, and politically charged election cycles create a conducive environment for harmful information.
The negative impact of misinformation, disinformation, and mal-information extends beyond politics into public health. The infodemic, an overabundance of information, some of it accurate and much of it not, has emerged since the start of the COVID-19 pandemic as an ongoing obstacle to effective health responses and must be managed as a persistent problem.
The concept of trust is essential in public health communication. In fragile countries with weak institutions and significant political divides, public trust in health officials, government officials, and international health organizations is tenuous at best. Digital misinformation adds further uncertainty, increasing the risk that public health communication will fail. A framework for managing infodemics must therefore do more than correct false information: it must build trust in health systems, which means interventions must target communities and their institutions and build trust through social relationships (Ishizumi et al., 2024).
Trust is a particularly fragile dimension in the context of Africa — and Uganda in particular — due to poor health literacy, insufficient resources for disseminating public health information, and frequently partisan politicization of health issues (for example, vaccines and outbreaks). Therefore, it is essential to understand how digital media, political discourse, and health communication are interrelated in developing effective responses based on the African context.
Political communication theory explains how political actors manipulate digital ecosystems during elections to shape public health narratives. Traditional agenda-setting and framing theories are central here: partisan actors decide which issues receive prominence on the agenda (agenda-setting) and how those issues are presented (framing) to create a preferred interpretation, steering public opinion in the intended direction (McCombs & Shaw, and many modern adaptations). Today, computational propaganda greatly amplifies these digitally driven processes.
Computational propaganda is the algorithmically driven creation and manipulation of content. Scholars increasingly identify computational propaganda as one of the biggest challenges to democratic institutions in the 21st century, because it enables manipulation of public opinion at massive scale through means the public can neither see nor control. Traditional propaganda, in contrast, relied on centralized access to information and had visible elements (Olawunmi, 2025).
Because elections frequently catalyse political communication through computational propaganda, they create opportunities for misinformation and disinformation to be created, amplified, or suppressed by political actors. Likewise, where public health is already politicized (e.g., in debates over vaccination campaigns), computational propaganda can be used to deliberately manipulate health misinformation to achieve desired outcomes in civic engagement and political participation.
Figure 1: Integrated Digital-Electoral-Health Interaction Model
Combining political communication theory (especially computational propaganda) with the information disorder framework and public health communication theory provides a multifaceted lens for analyzing how health-related misinformation emerges, spreads, and affects society at politically sensitive times.
2.2 Integrative Model for This Review
Using these frameworks, this review proposes an original, triadic integrated model of the particular dynamics of health-related misinformation present within the electoral environment of Uganda (and similar sub-Saharan countries). The model combines the three main categories that interact with each other:
The three domains are not separate and distinct but dynamic and interdependent: the digital ecosystem provides the medium of communication; electoral processes create incentives and motivation for action; and health information flows serve as both a channel for and a target of political contestation.
Mapping available literature based on an integrative model allows us to systematically track the evidence gathered and to categorize the studies according to the specific domain(s) of the investigation. This model provides an opportunity to see where overlap exists, as well as the areas where the literature strongly supports the findings compared to those that need further study (i.e., it highlights gaps in the literature that are unique to Uganda).
2.3 Analytical Logic
Having established the conceptual anchors and integrative model in the previous sections, we now summarize the analytical logic underlying the extraction, synthesis, and interpretation of data. The logic comprises five interdependent components: pathways of influence, actors and networks, impact chains, vulnerability clusters, and counter-messaging architectures.
2.3.1 Pathways of Influence
Health-related information, whether accurate, false, or misleading, reaches and influences people through many pathways. During elections, the pathways along which such information travels can become problematic. The following are illustrative examples:
1. Production pathway: the original creation of health-related information by multiple sources, including political actors, activists, bots, communities of influence, health communicators, and the public.
2. Amplification pathway: how health-related information is shared with others through social media algorithms, sharing networks (e.g., WhatsApp, Facebook), virality dynamics, social bots, and paid promotion. Anti-vaccination groups have used these channels to the detriment of society.
3. Reception pathway: how individuals receive health-related information, depending on digital literacy, prior beliefs, trust in institutions, networks, political identity, and emotional state.
4. Behavior/outcome pathway: the influence of health-related information on attitudes, beliefs, trust, health behaviors (e.g., vaccination uptake), civic engagement, and voting.
5. Feedback pathway: reaction, correction, official counter-messaging, fact-checking, and further partisan health communication can in turn drive increased production and amplification of health-related information.
Understanding these pathways helps us trace the lifecycle of misinformation and disinformation in digital, electoral, and health ecosystems, from inception to impact.
2.3.2 Actors and Networks
To understand how multiple actors, individual and collective, contribute to health misinformation dynamics, we consider four categories of actors (political, platform, health, and public/community) whose contributions and connections help explain how and why health misinformation proliferates, who is harmed, and who benefits. Hersanianka et al. (2023) argue that these dynamics are shaped by the political economy of misinformation, through platform monetization and infrastructural network connections. Public/community actors include everyday citizens connected (or semi-connected) to digital communities, diaspora communities, peer-to-peer networks (e.g., WhatsApp groups), influencers, faith-based groups, and opinion leaders. Intermediary/hybrid actors include community leaders, religious leaders, micro-influencers, media houses, and traditional media, largely focused on bridging the online and offline worlds.
2.3.3 Chains of Impact
Chains of impact are complexes of events that connect the way that people are exposed to, believe in, and act on misinformation. When we use our conceptual framework of triadic comparison, we expect to identify many different chains of impact. For example:
• Health impact chains: e.g., misinformation targeting vaccines decreases trust in the medical system, which lowers vaccination rates and ultimately increases the likelihood of a disease outbreak.
• Institutional trust chains: misinformation during political campaigns erodes trust in public institutions (health, government, elections), leading to cynicism, disengagement, and polarization. Research shows that large-scale misinformation erodes the epistemic base of democracy.
• Democratic legitimacy chains: politically motivated misinformation disseminated during elections can distort political debate and individual voting choices, breaking down rational deliberation and eroding both the strength of the democratic process and the perceived legitimacy of election results, through established patterns of political agenda-setting, framing, and propaganda. Research has shown that social media misinformation can undermine the capacity to build democratic coalitions.
• Social cohesion/resilience chains: individuals and communities affected by misinformation may experience fear, aversion, and scapegoating, often where health narratives intersect with identity politics, ethnicity, or religion. The “information pandemic” literature finds that misinformation erodes state resilience and social cohesion.
By tracing these chains of impact, the review not only links misinformation to its consequences but also offers insight into the potential mechanisms producing them. Further studies will be needed to examine these processes in greater detail.
2.3.4 Vulnerability Clusters
Vulnerability clusters are sub-populations or communities whose structural, socioeconomic, cultural, or behavioral characteristics increase their exposure to health misinformation during elections. In Uganda and similar low- to middle-income countries, these include:
These groups represent the highest-risk segments within the population, and therefore provide important information for developing public health communication and election planning interventions.
2.3.5 Counter-Messaging Architectures
The analytical logic also examines counter-messaging architectures: the systems, actors, and strategies currently employed (or available) to counter, correct, and mitigate the effects of misinformation and disinformation on public health and to improve information integrity. Examples in a public health context include:
• Official public health messaging: ministry announcements, public awareness campaigns, radio/television advertisements, and community outreach programs.
• Digital fact-checking and verification networks: fact-checkers, community moderators, third-party organizations, and civil society organizations.
• Media literacy initiatives: school programmes, non-governmental organization (NGO) interventions, and digital literacy campaigns.
• Platform-level conduct and governance: content moderation policies, algorithmic de-amplification of harmful content, transparency initiatives, and reporting mechanisms.
• Community-based interventions: religious leaders, influential community members, peer educators, and community health workers, drawing on community trust to counter rumours and misinformation.
To fully appreciate counter-messaging architectures and their effectiveness, it is imperative to identify both the architectures that exist and those yet to be developed. A clear understanding of available counter-messaging strategies is especially important given Uganda's particular circumstances: limited infrastructure, limited institutional resources, and political leadership that does not always align with public health priorities.
2.4 Why This Theoretical Framework Matters: Relevance to Uganda
The theoretical foundation outlined in this document has specific implications for how information is interpreted in Uganda.
The research will provide data demonstrating the relationship between Uganda’s rapidly changing social media landscape, the ongoing digital transformation of its political system, and the conflicting sources of crisis-management information unique to Uganda. Such information may come from politicians, health officials, community leaders, and foreign media, and may be shaped by competition among politicians, political parties, and various factions within Uganda.
This document investigates how the convergence of digital technologies, the politicization of public health information, and existing eGovernment systems has created conditions in which citizens’ fears about the lack of trustworthy health information can be politicized and exploited to advance the agendas of various political factions.
The integrated framework allows stakeholders to assess the factors involved in the spread of health misinformation in Uganda, particularly as presented in the academic literature, by reviewing prior studies of the dynamics and interactions between at-risk and high-risk communities in the region. By assessing the existing literature through an integrative framework, stakeholders can understand how these dynamics operate in Uganda: who is at risk, what interventions have been implemented, and with what success. This assessment reveals not only what is already understood but also what remains unknown, or even unknowable, particularly in the Ugandan context, where data are limited or non-existent.
An integrative framework thus provides analytical clarity and relevance for interventions, regulations, and communications for all stakeholders involved in countering health misinformation, including researchers, policymakers, electoral commissions, healthcare providers, and civil society organizations.
2.5 Theoretical/Conceptual Framework
In summary, this review shows how digital democracy theory, the information disorder framework, public health communication and trust theory, and political communication theory together illuminate the relationships between digital ecosystems, electoral processes, and health information flows, and how these relationships influence one another.
The framework helps researchers make sense of how health misinformation emerges, spreads, and ultimately affects both public health and democratic legitimacy during national elections in Uganda and similar contexts. It guides the collection, synthesis, and interpretation of data, identifying where evidence is sufficient and where more empirical study is needed to support or refute existing theories. Ultimately, the framework supports an interdisciplinary body of knowledge, combining communication studies, public health, and political science, that can drive policy-oriented research.
3.1 Review Type
This study conducts a scoping review of the available literature on health-related misinformation in Uganda during election cycles and explores how such misinformation intersects with the emerging fields of digital democracy and political communication. Scoping reviews differ from traditional systematic reviews in that they do not address narrow research questions or aggregate effect sizes; instead, they provide a robust framework for investigating complex, multi-faceted subject areas spanning varied research methodologies and types of evidence (Munn et al., 2018; Peters et al., 2020). Given the limited empirical evidence for low-income countries like Uganda, and the diversity of digital technologies, electoral processes, and community-level public health interventions involved, we use a scoping review method drawing on both peer-reviewed articles and grey literature, including qualitative, quantitative, and mixed-methods studies.
This review follows the PRISMA 2020 guidelines and the PRISMA-ScR extension (Tricco et al., 2018), ensuring transparency and reproducibility through an organized, documented process for searching, screening, data extraction, and synthesis. Applying these standards assures readers that the methods are sound, which is especially important for scholars seeking publication in high-quality, peer-reviewed journals. The PRISMA framework also helps clarify study identification and selection for a highly interdisciplinary topic such as digital misinformation in the context of electoral health.
3.2 Protocol and Registration
The review protocol was developed and documented before the literature search began. Given the scoping nature of this study and the rapid advancement of the digital health literature, the study was not registered with PROSPERO. However, we follow an established methodological plan that outlines inclusion/exclusion criteria, search strategies, data extraction methods, quality assessment techniques, and the approach to synthesis. Any divergence from the protocol is clearly stated in the supplementary appendix, ensuring transparency. A pre-specified protocol reduces the potential for bias in study selection and data interpretation, a best practice for high-quality scoping reviews (Pham et al., 2014).
3.3 Eligibility Criteria
The criteria for determining which articles were suitable for this review followed the standard PCC (Population, Concept, Context) approach (Peters et al., 2020), since several different populations could fit into this study:
• Language and date restriction: Only studies published in English between 1 January 2018 until 31 December 2025 were included, based on recent developments of digital platforms, social media and health misinformation research. This timeframe encompasses the literature generated in response to COVID-19 (“infodemic”) and is essential for understanding current situations in Uganda.
3.4 Information Sources
Several different types of sources were consulted to ensure complete coverage of the topic.
1. Electronic Databases: Peer-reviewed journal articles were identified using PubMed, Scopus, Web of Science, AJOL, Google Scholar and ProQuest.
2. Grey Literature: NGO reports (e.g., Africa Check, PesaCheck), policy briefings, electoral monitoring reports and MOH reports were also reviewed.
3. Digital Monitoring Sources: Social media analytics tools (e.g., CrowdTangle), online forums, and news aggregators were used to find empirical and observational studies on the distribution of false information. Database searches were conducted systematically, and the reference lists of all selected studies were manually searched to identify additional sources not indexed in the primary databases.
3.5 Search Strategy
A combination of Boolean operators, keywords, and controlled vocabulary (MeSH terms) supported the search strategy. For example, the PubMed search string reads: (“health misinformation” OR “disinformation” OR “malinformation” OR “false health claims”) AND (“digital media” OR “social media” OR “WhatsApp” OR “Facebook” OR “Twitter”) AND (“Uganda” OR “East Africa”) AND (“election” OR “electoral period” OR “political campaign”). Beyond this example, synonyms for health misinformation and spelling variations for regional and platform-specific terminology were also included. Searches were refined in consultation with a health information specialist, and when the final strategy was adapted for other databases, the syntax was modified to improve sensitivity without sacrificing specificity.
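For illustration only, a search string of this kind can be assembled programmatically from its concept blocks, which makes adapting the syntax for other databases less error-prone. The sketch below simply restates the four term groups given in the text; it is not the authoritative strategy used in the review.

```python
# Sketch: assemble the PubMed search string from its four concept blocks.
# Term lists restate the groups given in the text above.
misinfo = ["health misinformation", "disinformation", "malinformation",
           "false health claims"]
platform = ["digital media", "social media", "WhatsApp", "Facebook", "Twitter"]
place = ["Uganda", "East Africa"]
context = ["election", "electoral period", "political campaign"]

def or_block(terms):
    """Join quoted terms with OR and wrap the block in parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# Concept blocks are combined with AND, mirroring the string in the text.
query = " AND ".join(or_block(b) for b in [misinfo, platform, place, context])
print(query)
```

Keeping the term lists separate from the combining logic makes it straightforward to swap in database-specific field tags or truncation symbols when porting the strategy to Scopus or Web of Science.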
3.6 Study Selection
After all citations had been retrieved from the database, they were imported into Covidence where duplicates were removed. The process of selecting studies involved a two-step screening procedure:
1. Title and Abstract Screening: Two independent reviewers evaluated titles and abstracts against the inclusion criteria. Disagreements were resolved through discussion or, where necessary, by a third reviewer.
2. Full-Text Screening: The eligibility of the articles was assessed using the complete text. Exclusion reasons were recorded and displayed according to the PRISMA 2020 flow diagram conventions (Page et al., 2021).
Cohen’s kappa statistic was used to measure inter-rater reliability, with scores over 0.80 considered excellent agreement.
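Cohen's kappa corrects observed agreement for the agreement expected by chance. As a reference, the sketch below computes it from a 2x2 table of two screeners' include/exclude decisions; the counts used are invented for illustration and are not taken from this review.

```python
# Sketch: Cohen's kappa for two screeners' include/exclude decisions.
# The 2x2 counts passed in below are hypothetical, purely for illustration.
def cohens_kappa(both_inc, r1_only, r2_only, both_exc):
    n = both_inc + r1_only + r2_only + both_exc
    p_observed = (both_inc + both_exc) / n
    # Chance agreement from each reviewer's marginal include/exclude rates.
    p_inc = ((both_inc + r1_only) / n) * ((both_inc + r2_only) / n)
    p_exc = ((both_exc + r2_only) / n) * ((both_exc + r1_only) / n)
    p_chance = p_inc + p_exc
    return (p_observed - p_chance) / (1 - p_chance)

kappa = cohens_kappa(both_inc=180, r1_only=10, r2_only=12, both_exc=835)
print(round(kappa, 3))  # values above 0.80 indicate excellent agreement
```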
3.7 Data Extraction
A standardized extraction form was created so that every item was extracted in an identical structure. The form captured:
• Title, authors, publication date, DOI, and source
• Study location and population studied
• Topic area — health issues (e.g., vaccines, COVID-19, maternal health)
• Type of misinformation (misinformation, disinformation, mal-information)
• Digital platforms analyzed
• Electoral context — pre-election, campaign, or post-election
• Actors involved — political, health, NGOs
• Outcomes — public trust, vaccine uptake, behavioral change, polarization
• Methodology and study design
• Key findings, study limitations, and quality of evidence
Two independent reviewers completed all data-extraction forms, resolving any differences through discussion until consensus was reached. The clarity and uniformity of the extraction forms were assessed during pilot tests.
3.8 Quality Appraisal
Although scoping reviews do not typically require formal quality appraisal, this study used the Mixed Methods Appraisal Tool (MMAT, 2018 version) to evaluate the methodological robustness of the included empirical studies and the Critical Appraisal Skills Programme (CASP) checklists for qualitative studies. Systematic reviews included in this scoping review were assessed with the AMSTAR 2 instrument. Each study was assigned a quality rating (high, moderate, or low), which guided the interpretation of the evidence and the authors' confidence in the findings.
3.9 Data Synthesis
The synthesis combined descriptive mapping and thematic synthesis, supported by narrative integration and a confidence assessment:
1. Descriptive Mapping: Data from the included studies were organized in a table with columns for year of publication, data-collection location, population sampled (e.g., random or representative sample), digital platform(s) studied, election type, and the type of health misinformation shared. Frequencies and distributions were presented in tables and charts to support trend identification and gap analysis.
2. Thematic Synthesis: Guided by the study's conceptual framework, findings were coded by digital ecosystem characteristics, electoral processes, misinformation pathways, vulnerability clusters, and counter-messaging frameworks. From these codes, patterns of interaction across studies were established to build an integrated understanding of how digital information platforms, electoral periods, and health outcomes interact.
3. Narrative Integration: The thematic coding was complemented by narrative integration, which situated the findings within the broader Ugandan electoral and health context. To increase the validity of the themes, findings were triangulated, wherever possible, with other sources of information, including grey literature and social media analytics.
4. Confidence Assessment: CERQual was used to assess the level of confidence to be placed in the synthesized qualitative findings (Lewin et al., 2018). The quantitative findings were summarized narratively, as the quantitative studies differed too widely in design and measurement for a formal pooled analysis.
This combined approach not only provides a detailed mapping of the health misinformation literature but also interprets this information in a way that is useful for the development of policies, practices and future studies relating to Uganda’s complex digital and political environment.
4.1 Study Selection
The first search of six databases (PubMed, Scopus, Web of Science, AJOL, Google Scholar and ProQuest) yielded 1276 records. An additional 142 records were added from grey literature searches including reports from NGOs (Africa Check, PesaCheck) and Ministry of Health briefings.
After duplicates were removed, 1,037 records remained for title and abstract screening. At this stage, 812 records were excluded as irrelevant to Uganda, focused on offline misinformation, or lacking an electoral context.
Of the 225 articles reviewed in full text, 162 were excluded because they dealt solely with high-income countries (n=48), lacked empirical data (n=61), or did not concern digital platforms (n=53). The final review included 63 studies: peer-reviewed articles (n=42), grey literature (n=15), and policy reports (n=6).
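The flow counts reported above can be checked for internal consistency. A quick sketch, using the numbers exactly as reported in this section:

```python
# Consistency check on the PRISMA flow numbers reported above.
database_records = 1276
grey_records = 142
after_dedup = 1037
title_abstract_excluded = 812
fulltext_assessed = 225
fulltext_excluded = {"high-income only": 48, "no empirical data": 61,
                     "not digital platforms": 53}
included = {"peer-reviewed": 42, "grey literature": 15, "policy reports": 6}

duplicates_removed = database_records + grey_records - after_dedup
assert after_dedup - title_abstract_excluded == fulltext_assessed
assert fulltext_assessed - sum(fulltext_excluded.values()) == 63
assert sum(included.values()) == 63
print(f"duplicates removed: {duplicates_removed}")
```

Every stage reconciles: the 1,418 retrieved records imply 381 duplicates were removed, and both the full-text exclusions and the inclusion categories sum to the stated totals.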
4.2 Study Characteristics
The included studies cover the period from 2018 to 2025, spanning the expansion of digital platforms and the surge of misinformation following the COVID-19 pandemic. Uganda is the primary study location (41 articles), followed by comparative studies in Kenya (8), Ghana (6), Tanzania (4), and South Africa (4). The populations studied included general citizens (27), health care workers (15), social media users (12), journalists (5), and political campaign participants (4).
The most commonly studied digital platforms were WhatsApp (34), Facebook (28), and Twitter/X (16), with some studies examining interactions across multiple platforms. The main health topics were COVID-19 vaccination (23), routine vaccination (14), malaria prevention (8), and maternal health (6). Study designs included cross-sectional surveys (22), qualitative case studies (18), mixed-methods studies (12), and systematic reviews (11).
Geographically, the studies were conducted in urban areas such as Kampala, Gulu, and Mbale, as well as rural areas where digital penetration is increasing but public health communication remains minimal. The evidence shows that digital democracy is not equally available to Uganda's citizens and that levels of digital marginalization vary by social class and geographic area.
4.3 Quality Appraisal
The quality assessment of studies using MMAT, CASP, and AMSTAR 2 indicated that 27 of the studies obtained a high-quality rating, while 21 received a moderate-quality rating, and 15 received a low-quality rating. Most of the research rated as low quality had small sample sizes, relied on self-reported data, or lacked adequate context for examining the impact of politics on misinformation. Most of the high-quality studies blended digital trace data with survey or interview data to develop a deeper understanding of how misinformation propagates.
The CERQual analysis indicated moderate to high confidence in the synthesized findings, suggesting agreement across studies despite variation in their methodological rigor. Combining grey literature with peer-reviewed publications produced a more comprehensive view of observable misinformation patterns in Uganda and of the impacts experienced by individuals living there.
4.4 Theme Analysis
Thematic analysis identified five dominant themes consistent with the research questions and the conceptual framework set out in the introduction. These themes are:
4.4.1 Nature and Content of Health Misinformation
Health misinformation in Uganda during electoral periods came in various forms and was embedded in the context of the elections. Many messages circulated by political parties on Facebook and WhatsApp blended political and health-related themes; in particular, discussions in party-run Facebook and WhatsApp groups framed the "vaccine as a political tool" and targeted young voters and opposition supporters (Bates et al., 2023; Lewandowsky et al., 2023). Commonly identified misinformation themes were:
• False claims linking vaccines and immunization to sterilization campaigns;
• Conspiracy theories related to the pandemic, such as claims that COVID was deliberately introduced to change election outcomes; and
• A political view of health campaigns, where government-sponsored health programs were viewed as tools used by the ruling political party.
Qualitative investigations provided evidence of the emotional elements associated with these messages, particularly the use of fear, social identity cues, and moral framing (Adebesin et al., 2023).
4.4.2 Misinformation Actors and Misinformation Amplification Mechanisms
Political actors, online influencers, and partisan digital communities played a key role in amplifying misinformation. A study by Onyango et al. (2023) conducted in Kampala found that individuals working in political capacities purposely disseminated health misinformation via WhatsApp groups clustered around younger voters, often in coordination with offline efforts at political rallies. Healthcare workers and civil society members occasionally attempted counter-disinformation strategies, but these attempts were frequently hindered by limited reach; moreover, because their counter-messages were viewed as aligned with governmental agendas, their overall impact was further diminished (Kuatewo et al., 2025). Echo chambers and algorithmic amplification, primarily on Facebook and X, were identified as the two main mechanisms through which misinformation persists.
4.4.3 Effects of Health Misinformation on Trust and Health-Related Behaviors
Health-related misinformation eroded public trust in health institutions and reduced engagement with health interventions. For instance, during election campaigns in Gulu and Mbale, the prevalence of misinformation about vaccine-associated health risks left over 40 per cent of the community reluctant to receive vaccines (Abongwa et al., 2024). Distrust of health messages was further compounded by political division, leading people to interpret health messages through the lens of their political beliefs. A rise in health misinformation on social media in the months leading up to elections coincided with an overall decrease in immunization rates, based on data tracked by the Ministry of Health (Bates et al., 2023).
4.4.4 Vulnerable Populations/Demographic Patterns
Youth aged 18 to 35 were identified by multiple researchers as the demographic most vulnerable to misinformation, especially in urban and semi-urban areas with high levels of social media usage (Lewandowsky et al., 2023). Rural communities, despite lower connectivity, also experienced many instances of misinformation, which reached them through community leaders and WhatsApp "forwarding".
Common vulnerability themes across the studies included low digital media literacy, limited skills in critically evaluating information, and existing distrust of political figures. Pregnant women faced an increased risk of harm from exposure to reproductive health misinformation, pointing to intersectional vulnerabilities in Uganda (Adebesin et al., 2023).
4.4.5 Counter-Messaging & Public Health Intervention Effectiveness
Most counter-messaging efforts were fragmented and reactive rather than proactive. The Ministry of Health ran a series of social media campaigns and supported fact-checking platforms, but the exposure these campaigns received was minimal compared with the exposure to politically driven misinformation. The evidence points to a need for proactive, community-embedded approaches that engage local leaders and youth influencers and produce content in culturally appropriate forms (Kuatewo et al., 2025; Onyango et al., 2023).
There was evidence that combining community outreach with online interventions builds higher levels of trust and blunts misinformation more effectively than online-only interventions. Messages that paired factual health information with sensitivity to the political frame in which it was communicated, thereby minimizing perceived bias, were the most effective.
4.5 Patterns, Trends, and Gaps
Several significant patterns were identified among the studies:
1. Platform use shapes the dynamics of misinformation spread. Misinformation was more prevalent on WhatsApp than on the other platforms, whereas politically framed content was more common on Facebook; X/Twitter influenced the spread of misinformation mainly among urban elites and media professionals.
2. Misinformation spiked during the three months before elections in Uganda, usually coinciding with major campaign events.
3. Health messaging combined with political narratives has been shown to have a strong emotional appeal.
4. There is a lack of longitudinal studies on how exposure to health misinformation affects behavior, and few studies assess the effectiveness of interventions addressing politically framed content in Uganda.
In conclusion, there is a significant triadic relationship between the digital ecosystem, the electoral process, and public health outcomes. Hence, there is an urgent need for integrated policies and communication strategies tailored to Uganda's context.
This scoping review demonstrates the complex, multi-faceted relationship between digital democracy, electoral processes, and public health in Uganda. Synthesizing data from 63 studies, including peer-reviewed research, grey literature, and policy reports, the findings indicate that health misinformation is more than a public health challenge: it exists within a political context and becomes more potent during elections. The discussion that follows interprets these findings and positions them within the digital democracy and information integrity framework, the information disorder model, and the political communication and public health trust frameworks detailed in Section 2.
5.1 Understanding the Nature and Impact of Health Misinformation
The review supports the proposition that health misinformation during Ugandan elections is a product of context and is often politically charged. Politically framed health messages typically combine public health content (e.g., vaccines and COVID-19 guidance) with claims about health governance (the use of government-provided health care to achieve political ends). This observation is consistent with Bates et al. (2023) and Lewandowsky et al. (2023), who point out that politically framed misinformation is particularly damaging to those most susceptible to it, as it leverages fears, moral judgments, and pre-existing partisan identities, enhancing information flow through networked societies and increasing the likelihood of sharing.
For example, during the 2021 electoral campaign, numerous WhatsApp and Facebook posts circulated in Kampala and surrounding districts claiming that COVID-19 vaccines were a covert means of targeting opposition supporters. These claims reinforced public distrust of health authorities. Such findings illustrate the information disorder framework's intersection of misinformation (false information that spreads through networks without intent to harm), disinformation (deliberately false information), and mal-information (true information used with malicious intent to achieve political objectives) (Wardle & Derakhshan, 2018). The result is a cognitive environment in which individuals must navigate competing narratives, often producing confusion, distrust, and disengagement from official government health guidance.
5.2 Electoral Periods as Catalysts for Misinformation
Electoral cycles were consistently linked with increased health misinformation. Thematic analysis showed that the three-month period before local elections represents a "critical amplification window" for digitally mediated false information, especially on social media platforms including WhatsApp, Facebook, and Twitter/X. In urban areas such as Kampala and Gulu, there is evidence of coordinated misinformation timed to political campaign rallies, debate announcements, and government vaccination announcements, demonstrating that misinformation is often strategically disseminated by politicians and political parties to influence citizen opinion in the four-week period immediately preceding an election (Onyango et al., 2023; Abongwa et al., 2024).
This is consistent with political communication theory, in particular the agenda-setting model and the framing model, which describe how political actors intentionally manipulate and frame citizens’ perceptions through the use of marketing tactics that elevate certain narratives at the expense of others (McCombs & Shaw, 2018). Therefore, given the fact that during the electoral mobilization period in Uganda, political mobilization and public health messaging coincide, this creates the best opportunity for misinformation to thrive — especially when digital media create the connection between emotional impact and political framing.
5.3 Misinformation Dissemination Actors and Networks
The research describes a diverse system of actors disseminating health misinformation: political representatives, partisan online communities, and sometimes well-intentioned friends or family members, all spreading false or misleading information through different channels. The review found WhatsApp to be particularly important for distributing health misinformation because its private, encrypted format inhibits outside moderation and fact-checking.
Health workers and civil society members attempted to counteract the spread of false or misleading information, but their messages often suffered from perceived partiality, arrived after misinformation had already peaked, had limited visibility, and did not travel far enough within social networks (Kuatewo et al., 2025). These findings underscore the value of network analysis for understanding how misinformation spreads: misinformation is not just a content problem but a structural one, shaped by the social, political, and technological networks through which it travels.
5.4 Effects on Public Trust & Health Behavior
Misinformation significantly affected trust in public health institutions and individual health choices such as vaccination. Across multiple studies, in communities where misinformation about vaccination was heavily politically framed, approximately 40% of individuals expressed some form of vaccine hesitancy. Rural and urban populations also differed in their access to digital information and overall health literacy (Abongwa et al., 2024; Bates et al., 2023).
Another effect of the rise of misinformation, especially on social media, has been a significant reduction in public health trust, particularly among youth and politically polarized communities. Individuals who view health communications through a political lens are less likely to participate in vaccination campaigns, keep routine immunization schedules, or comply with public health advisories. These findings are consistent with the public health trust framework of Wilson & Wiysonge (2020), which holds that the trustworthiness of public health messages depends not only on their accuracy but also on the perceived credibility of the messenger and the alignment of the message with community values.
5.5 Vulnerable Populations and Demographic Patterns
Findings suggest that youth (aged 18-35) are particularly vulnerable owing to heavy social media use, peer influence, and lower critical media literacy (Lewandowsky et al., 2023). Women are especially vulnerable to reproductive health misinformation, particularly pregnant and postpartum women, indicating intersecting vulnerabilities based on gender, literacy, and sociocultural norms (Adebesin et al., 2023).
Rural populations are also vulnerable: although less digitally connected, they receive messages through community relay channels, for instance WhatsApp groups created by local opinion leaders or politicians. This suggests that interventions must combine online and offline methods of distributing information, for example by working through those same WhatsApp groups and local opinion leaders.
5.6 Effectiveness of Counter-Misinformation Efforts
Counter-misinformation efforts in Uganda remain reactive, fragmented, and underfunded. The Ministry of Health's use of social media and fact-checking organizations such as Africa Check and PesaCheck have added value but have limited reach compared with politically motivated misinformation.
Research indicates that the timing of interventions, the credibility of the messenger, and platform-appropriate messaging all play important roles. Interventions that pair online messages with offline community support, using local leaders, youth influencers, and culturally relevant narratives, are effective at reducing misinformation (Kuatewo et al., 2025). The findings suggest that digital democracy in Uganda should not focus only on the availability of information: citizens also need accessible, trustworthy, and contextually framed information, especially during times of heightened political sensitivity.
5.7 Patterns, Trends, and Policy Implications
The studies reviewed above yield several key observations:
1) Dynamics differ across platforms: WhatsApp is a major channel for spreading misinformation in personal communications, Facebook amplifies political messaging, and X (formerly Twitter) influences elite discourse and journalistic production more than ordinary citizens or the democratic process itself.
2) There are high spikes of misinformation during pre-election periods, suggesting a need to plan ahead and implement proactive strategies in the lead-up to political elections.
3) Political framing of health content is more emotionally appealing to viewers, and therefore, more likely to be shared than neutral (nonpolitical) content.
Given the patterns and trends noted above, counter-misinformation strategies should be considered that:
a) are aligned with anticipated pre-election periods; and
b) provide targeted, culturally appropriate counter-messaging.
5.8 Research Gaps and Future Directions
The research gaps and potential areas for future research include:
1. The scarcity of longitudinal studies tracking changes in behavior after exposure to misinformation;
2. The lack of rigorous evaluation studies of interventions addressing health-related misinformation in Uganda, particularly those rooted in politically driven narratives; and
3. How TikTok and Telegram fit into the larger picture of the misinformation ecosystem, i.e., how and why health-related misinformation is propagated through these platforms.
To build on our limited understanding of how to mitigate the spread and impact of health-related misinformation during Ugandan elections, future research will need to consider the interplay of multiple platforms, behavioral and other interventions, and the evaluation of policy initiatives.
5.9 Limitations of the Review
There are some limitations to this scoping study.
One limitation is that only studies published in English have been used; therefore, potentially relevant articles published in other languages have been missed.
Secondly, because the digital landscape is evolving so rapidly, older studies (for example, those conducted before 2020) have limited generalizability.
Finally, because the review relies on secondary data, it may not capture the subtleties and nuances of the sociocultural contexts in which fake news and misinformation are received. Triangulation across peer-reviewed articles, grey literature, and social media analytics somewhat mitigated these limitations.
6.1 Conclusion
This scoping review has mapped and synthesized what is currently known about the overlap between digital democracy and public health, and about how health misinformation behaves during elections in Uganda. The evidence reflects that health misinformation is much more than an epidemiological or technical challenge; it is also a social, political, and institutional issue, exacerbated by Uganda's digital environment and political context. From this body of evidence, several significant conclusions follow.
The conclusions drawn here support the theoretical and analytical frameworks used in this review. The digital democracy and information integrity framework shows how political agendas intersect with the flow of health-related information, while the information disorder framework clarifies the distinctions among misinformation, disinformation, and malinformation. Finally, the public health trust framework indicates that addressing misinformation involves more than correcting facts: it requires building credibility, developing relational trust, and creating a social context in which to communicate with the public.
6.2 Policy Recommendations
Based on the evidence summarized in this review, several policies are suggested for managing health misinformation during elections in Uganda:
1. Governments and health departments should develop proactive, election-aware communication strategies. Anticipatory messaging, sent out in the three-month period leading up to an election when misinformation is most prevalent, decreases the likelihood that false political statements will influence voters.
2. Investment in real-time monitoring tools such as CrowdTangle, social listening dashboards, and AI-driven misinformation detection can help identify, track, and respond quickly to emerging misinformation narratives; partnering with local technology companies and universities can improve the contextual accuracy of such tools.
3. To effectively reach target populations offline or in hybrid formats, i.e., through a combination of digital and traditional media, strong partnerships with local opinion leaders, faith leaders, community health workers, and youth influencers are essential to ensure that culturally relevant and contextually appropriate ways of delivering public health messaging are used (Sele & Zongo, 2025).
4. Youth and women can become less susceptible to misinformation through targeted interventions that build digital literacy, the ability to critically assess information sources, and fact-checking skills (Sele et al., 2024). School curricula, vocational training, and university programs that include media literacy are expected to have long-term effects on building digital resilience among these groups.
5. It will be imperative that public health departments, election commissions, civil society groups, and technology platforms work together to create coherent frameworks for responding to the spread of misinformation, which include creating early warning systems, developing counter-messaging campaigns, and assessing the accuracy of messaging during election cycles.
6. Free speech should be a major consideration in developing the legal framework for regulating the digital marketplace in Uganda. Guidelines should be developed to establish a standard for digital accountability, transparency, and provide incentives to technology companies to flag or mark misleading content in a timely manner during elections. All of these standards and guidelines should follow the principles of human rights and democracy in order to prevent problems related to censorship and overreaching regulatory measures.
6.3 Research Recommendations
Based on this review, there are several critical recommendations for future research:
1. Longitudinal studies should be conducted to measure how exposure to health misinformation during election cycles affects behavior, including vaccination uptake, engagement with healthcare, and trust in government.
2. Implementation studies should be conducted to identify effective ways to combat political health misinformation.
3. Future research should investigate the role of new social media platforms (e.g., TikTok, Telegram) as well as methods by which misinformation spreads through these newer platforms, as digital media ecosystems are evolving constantly.
4. There is a need for more qualitative work to understand how individuals and communities interpret, construct, and act on misinformation; this qualitative research will improve the development of culturally relevant prevention strategies.
5. Evaluating government- and platform-led efforts to combat health misinformation during past election periods will yield best-practice evidence to inform guideline development and refine strategies for managing health misinformation.
6.4 Concluding Remarks
Health disinformation during Ugandan elections is a multifaceted and complicated problem affecting trust, democracy, and public health. Furthermore, there is currently no clear consensus on what defines health misinformation. Social media platforms, political communication, and a range of cultural factors all play an integral role in allowing this type of disinformation to persist.
For Uganda to effectively combat the negative impacts of health disinformation, stakeholder groups will need to work together in a coordinated, timely, and locally grounded approach to implementation. This will in turn help ensure that, as public health is defended, democracy also continues to flourish.
In countries like Uganda, public health and digital democracy are interconnected; addressing health-related disinformation therefore requires more than correcting false information. It demands an approach that considers strategic, ethical, and socio-political dimensions. It is hoped that this study will foster increased collaboration among those working in these areas and encourage the continued development of effective strategies to strengthen public health and democracy in Uganda.
Maasa Samuel is a medical practitioner and public health scholar who has a wealth of experience working within a variety of settings, including healthcare and policy. His scholarly work aims to improve society through the development and application of evidence-based interventions. In addition to his clinical training, Maasa has a comprehensive understanding of the public health system and how health policy is developed. His academic interests include researching and studying the following areas: health misinformation, digital health, and the socio-political factors that affect public health outcomes. He is a scholar with the Eagle Scholars Forge, an initiative of Sele Media Africa which is a premier practical academic development program dedicated to raising African and Caribbean scholars through rigorous practical-based training in scholarly writing, research, and publication. Maasa is also a devoted parent who derives a great deal of motivation from his family as he continues to work toward building healthier, more informed communities.
Maasa, S (2026). Digital Democracy, Public Health, and Disinformation: Managing Health Misinformation during Electoral Periods in Uganda. Greener Journal of Biomedical and Health Sciences, 9(1): 6-22, https://doi.org/10.15580/gjbhs.2026.1.011226006.