Misinformation is a growing concern in psychology, the cognitive sciences, and the social sciences more broadly. The concept covers myths that circulate in various domains, such as health and politics, and misinformation persists globally because many people continue to rely on it. For example, the rumor that the Zika virus was genetically modified was health-related misinformation that lacked scientific evidence. Misinformation continues to exert psychological influence on an audience even after it has been refuted, because the audience believed it before the correction. As a result, myth-busting attempts may fail, since the process of correcting misinformation remains complex and incompletely understood (Pluviano et al., 2017). After exposure to inaccurate information, people show traces of that content in their problem-solving and decision making. Even when accurate information is presented, they find it difficult to reject the inaccuracy because of confusion and continued reliance on the falsehood. According to Pluviano et al. (2017), misinformation is most effectively corrected using debunking, inoculation, and well-designed graphs and pictures. This paper presents how myth-busting helps correct misinformation, discussing debunking, well-designed graphs and visual forms, and neutralizing techniques such as inoculation. In addition, it offers insight into how theoretical limitations, such as attitude bolstering in response to graphs and pictures, the restrictive objectives of inoculation, and poor elaboration in debunking, prevent myth-busting from fully correcting misinformation.
Myth-busting is a technique for correcting misinformation by exposing myths and debunking false information with facts. However, repeating a myth often contributes to its acceptance because repetition makes it feel familiar (Pluviano et al., 2017). The content-focused model of judgment therefore calls for a strong argument against a particular piece of misinformation to decrease its acceptance in the public domain. For instance, when refuting myths related to immunization, healthcare practitioners are expected to use effective information that reinforces the accuracy of vaccination. Despite such strategies, many people still accept false information as true. Pluviano et al. (2017) highlight that different facts addressing a particular myth should be used: participants should be exposed to more facts and evidence that counteract the myths circulating in the public domain. This framework helps people gain knowledge about specific health topics and recall accurate information based on the factual material presented. Chan et al. (2017) argue that the audience plays an integral role in reducing credulity. Human memory is a reconstructive process, and Chan et al. (2017) claim that it is therefore vulnerable to external and internal influences. For misinformation to be corrected, then, the corrective message should prompt systematic reasoning, which increases its impact on the audience.
Myth-busting applies various patterns and techniques to arrive at correct information. One method of correcting misinformation is the use of well-designed graphs and visual forms that attract the audience's attention (Pluviano et al., 2017). These techniques enable people to process information effectively and facilitate the recollection of the truth, and graphical and visual presentations offer an easier way to process complex verbal information. Ecker et al. (2017) reiterate that a fact box with visualized information, encompassing transparent risk communication, treatment, screening, and a scientific summary, should be used because it is easy to understand. The fact box presents the benefits and shortcomings of a particular treatment plan in plain frequencies, which helps avoid misleading statements that may cause misunderstandings. A fact box thus enables people to make informed choices by improving their healthcare-related knowledge of a vaccination's benefits and side effects.
The persistence of misinformation can also be reduced through the use of detailed debunking messages. Chan et al. (2017) argue that labeling incorrect information as myth and providing credible information is crucial in curbing existing misinformation. The authors apply a mental-model account to debunking false information: the model postulates that a debunking message should be sufficiently detailed to allow the audience to abandon the previous account and build a new mental model. Berinsky (2017) adds that refuting rumors surrounding health care using unlikely sources can, under particular circumstances, increase people's willingness to reject misinformation. The credibility of sources, according to Berinsky (2017), is an effective tool for debunking rumors. Misinformation acquires power through familiarity, and directly refuting a rumor can inadvertently facilitate its diffusion and increase its fluency. Therefore, a credible source not directly linked to the misinformation is important in achieving the credibility required. Notably, health-related misinformation originating from the political class can be debunked by credible sources such as religious institutions.
Misinformation can also be reduced through neutralizing methods such as inoculation. Cook et al. (2017) note that pre-existing attitudes influence how people respond to new information. The authors reiterate that people should be protected against misinformation by exposing them to a refuted version of the incorrect message, in line with inoculation theory. For example, for misinformation involving vaccines, inoculation should equip people with counterargument messages that build resistance to further misinformation. According to Cook et al. (2017), inoculation combines a warning of an impending threat with a refutation of the anticipated argument and the fallacy it contains. Inoculation is thus an effective way of conveying resistance to inaccurate information. Rapp and Salovich (2018) claim that inoculation messages are important in behavior-change interventions; the method helps reduce the influence of conspiracy theories by increasing skepticism towards claims based on them. Inoculation neutralizes the effect of misinformation by shedding light on how the wrong information developed and by prompting people to become accurately informed, enhancing their ability to believe accurate information that had previously been undermined by misinformation.
Myth-busting techniques are essential in reducing the effects of misinformation; however, they exhibit theoretical limitations. Notably, correcting wrong information through visual and graphical representation might not be effective (Rapp & Salovich, 2018). Regardless of a correction's validity, people tend to reject counterarguments, holding on to worldviews and personal ideologies that override the existing facts. Schwarz et al. (2016) observe that, from a cognitive-consistency perspective, attitude bolstering may occur despite the measures in place. People can engage in defensive strategies, such as biased searching for supporting evidence, which may result in source derogation and, subsequently, reactance.
Debunking messages also exhibit theoretical challenges. One limitation of debunking is the elaboration of the misinformation itself, which can reduce acceptance of the correction or cause it to fail. As debunking theory suggests, elaborating on a particular myth allows the audience to build mental models around it, which can bias the processing of the desired information. The debunking method also requires detailed scrutiny for optimal acceptance to be achieved (Pluviano et al., 2017). If those carrying out the myth-busting fail to offer satisfactory information, skepticism arises, making it hard to convince the public that the circulating information is wrong. Similarly, offering multiple explanations can also result in the continued influence of misinformation.
Neutralizing approaches such as inoculation are also not entirely effective in myth-busting. Inoculation theory restricts the objectives of myth-busting: the strategy is designed to deal with behavioral issues affected by misinformation (Ecker et al., 2017). For instance, the theory was developed to reduce the effects of conspiracy theories on people facing a behavioral threat. Another shortcoming of the inoculation method concerns its effectiveness. The theoretical framework is only applicable to people who hold differing pre-existing attitudes, and thus the method is not relevant for issues affecting a control group (Schwarz et al., 2016). Neutralizing information may also confuse the target audience, leaving them unable to tell which information is correct and which is false, which may result in subsequent dependence on false information.
Rationale
Myth-busting is key in reducing the effects of misinformation. However, debunking messages also have theoretical challenges and limitations when confronting misinformation. This calls for the development of mental models that can explain why there is bias in the processing of desired information.
Hypothesis
Researchers theorize that inoculation is a strategy that effectively deals with behavioral issues affected by misinformation (Ecker et al., 2017). The hypothesis for this study will be: the higher the amount of the brain used, the more talented and intelligent a person becomes. The independent variable will be brain capacity, defined as the percentage of the brain used. The dependent variables will be talent and intelligence.
Methods
Sampling
I will use a random sampling method, recruiting my sample from my classmates; selecting members of the class at random helps prevent bias in my results.
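To make this step concrete, the following is a minimal sketch in Python of simple random sampling from a class roster, assuming a hypothetical roster of 40 students and an illustrative sample size of 20; the roster, names, and sample size are assumptions for illustration and are not specified in the proposal.

```python
import random

# Hypothetical class roster; in the actual study this would be the real class list.
class_roster = [f"Student_{i}" for i in range(1, 41)]

SAMPLE_SIZE = 20  # illustrative only; the proposal does not fix a sample size

# Simple random sampling without replacement: every classmate has an equal
# chance of selection, which is what helps prevent selection bias.
random.seed(42)  # fixed seed so the draw can be reproduced
sample = random.sample(class_roster, SAMPLE_SIZE)

print(sample)
```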
Measures
I will use a myth-debunking fact sheet that describes the myths and then debunks them in a two-column format. In addition, I will use a one-question test measuring the amount of misinformation retained.
One-question test item: Humans use 10% of their brain.
Response options: Absolutely true / I think it's true / I'm not sure / I think it's false / Absolutely false.
MYTH: Humans only use 10% of their brain. Humans use only a small portion of their brain. Different people use different areas of the brain, but no one uses all of it. This accounts for the differences in human talent and intelligence: which areas of the brain are used. But we all have untapped potential.
FACT: The average human uses 100% of the brain on a daily basis, and there are no "silent areas" in a normal, healthy human brain. Throughout an average day, evidence suggests humans use most, if not all, of their brains, just as they use most, if not all, of their muscles. At certain points in the day, parts of the brain may be less active, and some tasks may only require some areas of the brain to perform them, but no part of the brain is "unused."
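To illustrate how responses to the one-question test could be coded for analysis, the following is a minimal sketch assuming a 1-5 scoring scheme in which higher scores indicate more retained misinformation; the numeric coding is an assumption for illustration, not something the proposal specifies.

```python
# Hypothetical 1-5 coding of the one-question test; higher = more misinformation retained.
RESPONSE_SCORES = {
    "Absolutely false": 1,
    "I think it's false": 2,
    "I'm not sure": 3,
    "I think it's true": 4,
    "Absolutely true": 5,
}

def score_response(response: str) -> int:
    """Return the numeric misinformation score for one participant's answer."""
    return RESPONSE_SCORES[response]

# Example: a participant who still endorses the 10% myth after the intervention.
print(score_response("I think it's true"))  # prints 4
```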
Procedures
A pre-test, post-test, control group design will be used for this study. In this design, units of the study are randomly allocated to an experimental group and a control group (Chan et al., 2017); participants will be allocated to two groups, control and experimental. All participants will be volunteers, and each will complete a demographic questionnaire collecting information on their age, gender, education, and ethnicity.
I will randomly assign participants to a control group and an experimental group. Upon assignment, I will give the participants the fact sheet describing the myths as a pretest and then administer the debunking sheet. The control group will get fact charts about brain capacity and its relationship to intelligence, while the experimental group will get the misinformation. Lastly, I will administer the one-question test measuring retained misinformation as a post-test.
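A minimal sketch of the assignment and data-collection flow is given below, reusing the hypothetical roster from the Sampling section and the assumed 1-5 scoring from the Measures section; the participant names, random seed, and scores are placeholders for illustration, not actual study data.

```python
import random

# Hypothetical participants drawn in the sampling step.
participants = [f"Student_{i}" for i in (3, 7, 12, 18, 25, 31)]

# Randomly assign each participant to the control or experimental group.
random.seed(7)  # fixed seed so the assignment can be reproduced
shuffled = random.sample(participants, len(participants))
half = len(shuffled) // 2
groups = {p: "control" for p in shuffled[:half]}
groups.update({p: "experimental" for p in shuffled[half:]})

# Record pre-test and post-test scores (1-5, higher = more misinformation retained).
# The scores here are random placeholders; real data would come from participants.
records = []
for p in participants:
    pre_score = random.randint(1, 5)
    post_score = random.randint(1, 5)
    records.append({"participant": p, "group": groups[p],
                    "pre": pre_score, "post": post_score})

# Compare the mean change in misinformation score between the two groups.
for group in ("control", "experimental"):
    changes = [r["post"] - r["pre"] for r in records if r["group"] == group]
    print(group, sum(changes) / len(changes))
```

In the actual study, the pre-test and post-test values would be the participants' coded responses rather than random placeholders.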
Given the nature of the study, other variables are likely to affect its outcome. Some of these variables include anxiety, mood, differing background knowledge, and intelligence. These variables are likely to affect the study outcome because it will be difficult to tell whether a student's attitude, intelligence, or anxiety influences their interest and attention or their perception of the brain (Ramirez et al., 2018). Besides, other factors, such as the subject matter and the students' perception of it, can equally play a role. The students' schedules and understanding of the course may also affect their participation.
References
Berinsky, A. J. (2017). Rumors and health care reform: Experiments in political misinformation. British Journal of Political Science, 47(2), 241–262.
Chan, M. S., Jones, C. R., Hall Jamieson, K., & Albarracín, D. (2017). Debunking: A meta-analysis of the psychological efficacy of messages countering misinformation. Psychological Science, 28(11), 1531–1546.
Cook, J., Lewandowsky, S., & Ecker, U. K. H. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLoS One, 12(5), 1–17.
Ecker, U. K. H., Hogan, J. L., & Lewandowsky, S. (2017). Reminders and repetition of misinformation: Helping or hindering its retraction? Journal of Applied Research in Memory and Cognition, 6(2), 185–192.
Pluviano, S., Watt, C., & Della Sala, S. (2017). Misinformation lingers in memory: Failure of three pro-vaccination strategies. PLoS One, 12(7), 1–12.
Ramirez, G., Shaw, S. T., & Maloney, E. A. (2018). Math anxiety: Past research, promising interventions, and a new interpretation framework. Educational Psychologist, 53(3), 145–164.
Rapp, D. N., & Salovich, N. A. (2018). Can't we just disregard fake news? The consequences of exposure to inaccurate information. Policy Insights from the Behavioral and Brain Sciences, 5(2), 232–239.
Schwarz, N., Newman, E., & Leach, W. (2016). Making the truth stick & the myths fade: Lessons from cognitive psychology. Behavioral Science & Policy, 2(1), 85–95. https://doi.org/10.1353/bsp.2016.0009