Review
Abstract
Background: The use of health-related online peer support groups to support self-management of health issues has become increasingly popular. The quality of information and advice may have important implications for public health and for the utility of such groups. There is some evidence of variable quality of web-based health information, but the extent to which misinformation is a problem in online peer support groups is unclear.
Objective: We aimed to gain insight into the quality of information and advice about health conditions in online peer support groups and to review the tools available for assessing the quality of such information.
Methods: A scoping review was undertaken following the Joanna Briggs Institute scoping review methodology. We searched electronic databases (MEDLINE [Ovid], CINAHL, Web of Science, ASSIA, ProQuest Dissertations and Theses, and Google Scholar) for literature published before November 2023, as well as citations of included articles. Primary research studies, reviews, and gray literature that explored the quality of information and advice in online peer support groups were included. Titles and abstracts were independently screened by 2 reviewers. Data were extracted and tabulated, and key findings were summarized narratively.
Results: A total of 14 (0.45%) of the 3136 articles identified were included. Of these 14 articles, 10 (71%) were primary research articles comprising diverse quality appraisal methodologies, and 4 (29%) were review articles. All articles had been published between 2014 and 2023. Across the literature, there was more evidence of poor quality information and misinformation than of good quality information and advice, particularly around long-term and life-threatening conditions. There were varying degrees of misinformation about non–life-threatening conditions and about mental health conditions. Misinformation about noncommunicable diseases was reported as particularly prevalent on Facebook. Fellow online peer support group users often played an active role in correcting misinformation by replying to false claims or providing correct information in subsequent posts. Quality appraisal tools were reported as being used by researchers and health care professionals in appraising the quality of information and advice, including established tools for the appraisal of health-related information (eg, DISCERN, HONcode criteria, and Journal of the American Medical Association benchmark criteria). None of the tools reported were specifically designed to appraise online peer support group content.
Conclusions: While there is good quality information and advice exchanged between users in online peer support groups, our findings show that misinformation is a problem, which is a matter of public health concern. Confidence in the quality of information shared may determine the utility of online peer support groups for patients and health care professionals. Our review suggests that clinical and academic experts in health conditions could play a valuable role in ensuring the quality of content. Several quality appraisal tools are available to support such an initiative.
doi:10.2196/71140
Introduction
Background
In recent years, there has been a rapid increase in people’s use of web-based resources aimed at meeting their health information needs [ , ]. Seeking health information and advice from the internet has been described as the interaction of an individual (eg, patient, caregiver, or professional) with or through an electronic device or communication technology to access or transmit health information [ ]. A large amount of web-based health information is generated by health care providers; pharmaceutical companies; and other public, private, and charitable health organizations through websites aimed at the public. At the same time, an increasing volume of web-based health content is being generated by individuals living with or caring for others with a health condition, via online peer support groups [ - ].

Online peer support groups, also known as online support communities, are virtual spaces where individuals exchange health-related information (ie, factual knowledge that is not based on a person’s opinion); advice (ie, a recommendation for action based on someone’s experience); and peer support (ie, sharing personal experiences and emotional, social, or practical support with one another), often through asynchronous, text-based computer-mediated communication platforms (eg, discussion forums and social media sites) [ , , ]. Exchanging health-related information and advice in online peer support groups has been found to empower people living with or caring for others with health conditions: learning new methods for managing their conditions, feeling that their experiences are validated, reducing a sense of loneliness, and improving their navigation of health care services [ - , - ]. Online peer support groups have been described as a health care resource that is complementary to formal health care, with patients feeling like equal contributors in the patient–health care provider relationship [ ]. Interactions through online peer support groups about health can also lead to the identification of health-related problems that health care professionals have not yet considered, and sharing health-related experiences through such groups has led to campaigns to improve health care [ ].

While online peer support groups have become increasingly popular, there is growing concern among health care professionals and researchers regarding the accuracy and safety of some health information on the internet [ - ]. The concept of information quality has multiple dimensions, centered on trustworthiness and scientific correctness [ ]. Increasing attention has been given to exploring the quality of health-related information and advice exchanged in online peer support groups, particularly around the medical accuracy of content or, conversely, the spread of misinformation [ - ]. Health-related misinformation has been defined as a health-related claim of fact that is false due to a lack of scientific evidence [ ] and is usually attributed to misconceptions. Exposure to misinformation in online peer support groups, and following false information and advice, can detrimentally impact health outcomes [ - ]. Several public health incidents, including hospitalizations and deaths, arose from misinformation spreading across social web-based platforms during the COVID-19 pandemic [ ]. Poor quality information often reproduces within discussion threads in online peer support groups, where like-minded people come together and similar topics circulate (the “echo chamber effect”) [ ]. People with lower levels of health literacy may be particularly vulnerable to the detrimental impacts of misinformation [ , ].

Rationale for the Study
With the rapid spread of misinformation in online peer support groups posing significant threats to public health, a comprehensive scoping review of the quality of information and advice and the methods used in quality appraisals is needed. Knowledge about the quality of information shared in online peer support groups is directly relevant for patients, carers, and health care professionals in supporting the management of health conditions. In addition, an understanding of the quality appraisal methods is important, as they serve as potential moderators of the quality of information and advice provided.
Review Objective
The objective of this scoping review was to map and summarize evidence about the quality of information and advice in online peer support groups across a range of health conditions and to review the available tools used to assess the quality of such information and advice.
Review Questions
The following were our research questions (RQs):
- RQ 1: What is known about the quality of information and advice exchanged within online peer support groups about health conditions?
- RQ 1.1: To what extent is misinformation about health conditions a problem within online peer support groups?
- RQ 1.2: How have researchers assessed the quality of information and advice exchanged in online peer support groups about health conditions?
Methods
Overview
Scoping reviews aim to identify and map the key concepts within a research area regarding its nature, features, and volume. Such reviews often address a broad RQ and include a variety of study designs [ , ]. This scoping review was conducted in accordance with the Joanna Briggs Institute methodology for scoping reviews [ - ] and followed the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) checklist for reporting [ ]. The review protocol was registered in January 2024 on the Open Science Framework Registries [ ]. The PCC (Population, Concept, and Context) framework [ ] was followed as a guide to construct the review objectives, search strategy, and eligibility criteria. Covidence systematic review software was used for all review stages [ ].

Search Strategy and Procedure
The principal reviewer (BMT) and second reviewer (EP) developed the search strategy, with assistance from a university research librarian. The search strategy aimed to identify published and unpublished literature and focused on 3 central concepts: (1) online support groups, (2) quality assessment, and (3) health-related information and advice. An initial limited search was conducted in the Web of Science and MEDLINE (Ovid) databases to identify the literature on this topic and keywords referring to the 3 central concepts of the search strategy ( ). Records identified through this initial search were reviewed to identify relevant literature, and text words contained in the titles, abstracts, and keywords of relevant literature were used to identify additional keywords to develop the final search strategy ( ). An adapted search strategy was developed for each database.

Online support group search terms
- online support group
- online support groups
- online community
- online communities
- discussion forum
- discussion forums
- social media
- social networking sites
Quality assessment search terms
- quality assessment
- quality
- assessment
- medical accuracy
- accuracy
- appraisal
Health-related information search terms
- information
- advice
- health
We searched electronic databases (MEDLINE [Ovid], CINAHL, Web of Science, and Applied Social Sciences Index and Abstracts) and sources of unpublished and gray literature (Google Scholar and ProQuest Dissertations and Theses) from inception to November 2023. Google Scholar collates results from across the internet and does not offer the advanced search functions of the other included databases. Previous research indicated that most gray literature begins to appear after approximately 20 to 30 pages of Google Scholar results; therefore, the first 100 results from page 20 onward were screened [ ]. The reference lists of included review articles and citations of all included articles were hand-searched for additional relevant literature. Where full texts were not available, authors were contacted to request a copy.

Eligibility Criteria
Inclusion and exclusion criteria reflected the focus of this scoping review on the quality of information and advice about health conditions in online peer support groups ( ).

Inclusion criteria
- Population
- Synchronous or asynchronous text-based online peer support groups, where individuals can exchange information, advice, and support about health conditions within an online community or group or through a discussion thread about a given topic (eg, Facebook support groups, Reddit, Twitter [rebranded as X], Instagram, WhatsApp, Mumsnet, Health Unlocked, Patient, and charity-owned online peer support groups)
- Concept
- The quality of information and advice about health conditions in online peer support groups (eg, medical accuracy, comparison of information with evidence-based guidelines and relevant literature, readability and usability of content, and degree of misinformation)
- Quality assessment tools and questionnaires (eg, DISCERN tool, Health on the Net Code of Conduct, and the Journal of the American Medical Association benchmark) and their application to information and advice about health conditions in online peer support groups
- Context
- Primary peer-reviewed studies and reviews using quantitative, qualitative, or mixed methods design; dissertations and theses; and gray literature that explored the quality of information and advice about health conditions in online peer support groups
- References within included review articles and citations of included articles
- Studies from any geographic location, focusing on any type of health condition, and published at any time and in any language
Exclusion criteria
- Population
- Video-based social media sites, which do not involve joining an online peer community or group or the use of discussion threads (eg, YouTube and TikTok)
- Concept
- The quality of information and advice on websites that only provide static information that cannot be contributed to or discussed in an online peer support group format (eg, the National Health Service website, WebMD, National Institutes of Health, Mayo Clinic, and health-related charity websites).
- The quality of information and advice that is from an online peer support group that is not focused on a particular health condition (eg, health promotion, dieting, and pregnancy).
- The quality of any health-related support services organized by health care professionals (ie, online support interventions conducted as part of a research trial).
- The quality of information and advice in non–online-based support groups (eg, in-person support groups).
- Context
- Conference abstracts and proceedings, commentaries, books, and editorials.
Selection of Included Articles
All identified citations were collated and uploaded into the review software, and duplicates were removed. Two independent reviewers (BMT and EP) screened titles and abstracts against the inclusion criteria. Where the abstract signaled potential eligibility, the full article was retrieved and imported into Covidence, read, and assessed for inclusion. Reasons for the exclusion of full-text articles were recorded. Any disagreements at each stage of the selection process were resolved through discussion.
Data Extraction
Data were extracted by BMT using a data extraction tool developed by the wider review team (BMT, NSC, JLC, JL, and EP). The data extraction tool included predetermined study characteristics (ie, authors, year, country, research design methodology, and literature aims), the conditions in focus of each online peer support group, key findings, details of quality assessment tools, and study limitations. After extracting data from each included article, BMT sent the completed data extraction to a second reviewer (EP) for consensus. The extracted study characteristics were discussed before agreeing on the final extracted content to be analyzed and synthesized.
Data Analysis
Data extracted from included literature were tabulated and summarized narratively. Given the diversity and heterogeneity of the literature concerning methodologies and findings, a narrative summary of findings was chosen to provide a comprehensive understanding of the literature. BMT uploaded the full texts of included articles into NVivo (Lumivero) qualitative data analysis software [ ] and coded their content according to the 3 RQs of this scoping review. Codes were analyzed descriptively and discussed with additional reviewers (EP and NSC). Final summaries of each RQ were developed with the wider review team (BMT, NSC, JLC, JL, and EP).

Results
Included Articles
The search strategy yielded 3136 results, of which 238 (7.59%) were duplicates. Of the 2898 titles and abstracts reviewed, 2848 (98.27%) did not meet the inclusion criteria. The full texts of the remaining 50 (1.73%) articles were assessed, and a further 36 (72%) were excluded because they did not meet the inclusion criteria (n=35, 70%) or were not attainable after contacting the author (n=1, 2%). Overall, 14 (28%) articles were included in our review.
presents a PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) [ ] flow diagram detailing the inclusion process.
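The record counts reported above can be cross-checked arithmetically. The short sketch below uses only the figures stated in the Results text:

```python
# Quick arithmetic check of the PRISMA flow reported above.
# All figures are taken directly from the Results text.
identified = 3136
duplicates = 238
screened = identified - duplicates          # titles and abstracts reviewed
excluded_screening = 2848                   # did not meet inclusion criteria
full_text = screened - excluded_screening   # full texts assessed
excluded_full_text = 35 + 1                 # criteria not met + unattainable
included = full_text - excluded_full_text

print(screened, full_text, included)  # 2898 50 14
```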
Characteristics of Included Studies
Of the 14 included articles, 11 (79%) had been published since 2020. In total, 10 (71%) were primary research articles that reported a quality appraisal of information and advice about health conditions in online peer support groups, and 4 (29%) were review articles. A total of 6 (43%) articles originated from the United States; 4 (29%) from the United Kingdom; and the remaining 4 (29%) from Ghana, Norway, Spain, and Korea (1 each). The included articles focused on a variety of health conditions, which sometimes fell into more than 1 category, including long-term conditions (n=14, 100%), acute conditions (n=7, 50%), physical health conditions (n=13, 93%), mental health conditions (n=3, 21%), and viral infections (n=6, 43%). Several web-based platforms were explored, including Facebook (n=8, 57%), X (formerly known as Twitter; n=7, 50%), and Reddit (n=6, 43%).
provides further details about the included articles.

Study and year | Country | Aims | Research design methodology | Web-based platforms explored | Conditions in focus
Slick et al [ ], 2023 | United States | | Extraction of discussion threads from Facebook groups and quality assessment | Facebook | Sickle cell disease
Afful-Dadzie et al [ ], 2023 | Ghana | | Literature review and content analysis | Twitter (rebranded as X), Facebook, Instagram, and WeChat | Cancer, dental or oral care, diabetes, heart condition or disease, sexual health, anorexia, alcohol, spinal disorder, eye, and inflammatory bowel disease
Ulep et al [ ], 2022 | United States | | Systematic review and descriptive analysis | Twitter, Facebook, Instagram, Reddit, and others unnamed | Hearing loss, tinnitus, and vestibular disorders
Skafle et al [ ], 2022 | Norway | | Rapid review and thematic analysis with narrative qualitative synthesis | Several social media platforms: WhatsApp, Facebook, Instagram, Twitter, and others | COVID-19
Pollack et al [ ], 2022 | United States | | Extraction of discussion threads from Reddit and use of artificial intelligence (machine learning) to evaluate misinformation | Reddit | Obesity
Farnood et al [ ], 2022 | United Kingdom | | Extraction of discussion threads from online health forums and quality assessment | Public domain internet discussion forums (names not reported) | Heart failure
Towne et al [ ], 2021 | United States | | Extraction of discussion threads from Facebook groups and content analysis | Facebook | Endometriosis
Suarez-Lledo and Alvarez-Galvez [ ], 2021 | Spain | | Systematic review (analysis method not reported) | Several social media platforms: Twitter, Facebook, MySpace, WhatsApp, and VK (formerly VKontakte) | Noncommunicable diseases, pandemics, and eating disorders
ElSherief et al [ ], 2021 | United States | | | Twitter, Reddit, and Drugs-Forum | Opioid use disorder
Du et al [ ], 2021 | United States | | Extraction of discussion threads from Reddit and use of artificial intelligence (machine learning) to evaluate misinformation | Reddit | Human papillomavirus infection
Jo et al [ ], 2020 | Korea | | Extraction of questions and answers from an online discussion forum and use of artificial intelligence (structural topic model) to evaluate content | The Naver questions and answers forum | COVID-19
Sepehripour et al [ ], 2017 | United Kingdom | | Scoring of social media sites regarding quality | Facebook and Twitter | Microtia
Cole et al [ ], 2016 | United Kingdom | | Extraction of discussion threads from online discussion forum websites and quality assessment | Reddit, Mumsnet, and Patient | HIV infection, diabetes, and chicken pox
McGregor et al [ ], 2014 | United Kingdom | | Extraction of discussion threads from social media platforms and analysis using a combination of content analysis and simple thematic analysis | The International Glaucoma Association forum, Facebook, Twitter, Patient, and Care Opinion | Glaucoma
Narrative Summary of Findings
RQ 1: What Is Known About the Quality of Information and Advice Exchanged Within Online Peer Support Groups About Health Conditions?
Overview
The included articles explored a variety of long-term, acute, physical, and mental health conditions ( [ - ]). The quality of information and advice in online peer support groups varied according to the condition in focus and across web-based platforms. There were more reports of misinformation or mixed quality information about health conditions in online peer support groups than of good quality information.

Good Quality Information and Misinformation About Long-Term Conditions
In total, 10 (71%) included articles [ - , - , - ] explored the quality of information and advice about long-term physical health conditions in online peer support groups. Studies reported that content about long-term life-threatening conditions, such as cancer and diabetes, raised the most substantial concerns regarding the quality of information, highlighting fake, misleading, or inaccurate information [ , ]. With regard to cancer, information about treatments and surgical procedures was particularly described as poor quality [ ]. Conversely, 1 (7%) article [ ] found that information about diabetes in online peer support groups was of reasonable quality and rarely contained inaccurate information. For heart failure, most of the information exchanged in online peer support groups has been described as low quality and not aligned with evidence-based guidelines, with some described as potentially harmful [ ]. Examples of harmful information and advice included instances where users confirmed diagnoses for other users that were incorrect and not supported by existing evidence or guidelines [ ]. Similarly, for sickle cell disease, most of the information and advice was rated as medically inaccurate, such as in relation to hydroxyurea medication, managing pain, and the use of natural supplements [ ]. Non–life-threatening conditions, such as endometriosis [ ], microtia [ ], and glaucoma [ ], were found to contain little misinformation in comparison with the life-threatening conditions. Some cases of misinformation were reported in glaucoma-focused online peer support groups, ranging from discussions about non–evidence-based treatments, such as carnosine eye drops and complementary therapies, to more dangerous advice around the use of medical marijuana [ ]. For ear-related disorders, a greater amount of misinformation about tinnitus was reported than about hyperacusis, although the specific types of information exchanged were not reported [ ].

Good Quality Information and Misinformation About Viruses
There were mixed findings about virus infections, reported in 6 (43%) articles [ , , , , , ]. For COVID-19, information and advice about physical symptoms was assessed as appropriate [ ], whereas information about the COVID-19 vaccine was reported as misinformed [ , ], with misinformation about side effects such as infertility, chronic illness, changes in DNA, physical deformities, and mental illness, as well as conspiracy theories [ ]. Information and advice about human papillomavirus infection has been reported as being of mixed quality [ , ]. Examples of misinformation included the safety of the human papillomavirus vaccine, vaccine adverse effects, conspiracy theories about the government and pharmaceutical companies, and the prevalence of human papillomavirus infection [ ]. With regard to chicken pox, some non–evidence-based discussions around vaccination and herbal remedies have been reported, but there was little evidence of poor quality or inaccurate information and advice [ ]. Information and advice about HIV infection has similarly been reported to be of reasonably high quality, with only a very small proportion considered factually incorrect or harmful [ ].

Varying Degrees of Misinformation About Mental Health Conditions
In total, 3 (21%) articles [ , , ] reported on the quality of information and advice in online peer support groups relating to mental health conditions. One study on opioid use [ ] reported little misinformation, despite anticipating the reproduction of a commonly held misunderstanding that medication-assisted treatment simply replaces one drug with another. Another study [ ] found a moderate amount of misinformation related to eating disorders, such as normalizing eating disorders, changing eating habits, and presenting beauty as the final goal. Content related to eating disorders is frequently hidden or not evident to the public, as pro–eating disorder communities tend to use their own codes to reach specific audiences; thus, the prevalence of misinformation related to eating disorders in online peer support groups may be even greater than what is clearly visible [ ].

The Quality of Information and Advice According to Web-Based Platforms
Across the included articles, 18 different web-based platforms were represented ( [ - ]). The most commonly explored web-based platform was Facebook. One review reported that, up to 2023, more studies had explored Facebook than any other web-based platform, while other platforms, such as WhatsApp, Instagram, WeChat, and country-specific platforms, are becoming more popular for seeking information and advice about health conditions [ ]. Misinformation about noncommunicable diseases and treatments has been reported as particularly prevalent on Facebook [ ]. Misinformation about vaccines and drugs has been reported as most prevalent on X [ , ]. On Reddit, misinformation about the human papillomavirus vaccine has been reported to cluster in “subreddits” [ ], which are smaller communities themed around a given topic [ ], although the degree of misinformation on Reddit has been reported to have decreased over time, possibly due to public health campaigns and an increase in internet verification skills among users [ ]. In comparison with information and advice about health conditions in online peer support groups, the quality of the web-based platform itself was scarcely explored across the included articles.

RQ 1.1: To What Extent Is Misinformation About Health Conditions a Problem Within Online Peer Support Groups?
Overview
Health-related misinformation was a central focus across many of the included articles. Some included articles focused solely on misinformation and did not explore good quality information. Attempts to address the spread of misinformation in online peer support groups were also reported.
The Focus on Health-Related Misinformation in the Literature
Misinformation was reported to varying degrees in all 14 (100%) articles, although misinformation received more attention than the general quality of information. In total, 5 (36%) articles [ , , - ] specifically focused on variants of misinformation circulating in online peer support groups about various health conditions and did not consider the quality of information more broadly. As detailed earlier, misinformation in online peer support groups has been identified regarding long-term life-threatening conditions [ , , , ], viral infections [ , , , ], and mental health conditions [ , ]. Researchers have emphasized that in the event of a novel infectious disease outbreak, where there is great uncertainty among the public, obtaining accurate information can be extremely difficult [ ].

The Activities of Users in Minimizing Health-Related Misinformation
The degree of the problem of misinformation in online peer support groups can be reduced when fellow users address or correct false claims. Users have been observed correcting misinformation provided by other users by replying to false claims or by providing correct information in subsequent posts [ , , ]. In addition, some users have been observed signposting fellow users to additional resources that are reliable and helpful, in response to misinformation [ ]. Some platforms hosting online peer support groups, such as Reddit, operate a voting system on each discussion thread, whereby posts that contain good quality information and advice can be upvoted to the top of the discussion thread and lower quality posts can be demoted [ ]. The need for health care professionals to take an active role in correcting misinformation in online peer support groups has been widely emphasized, a role that would involve dispelling false medical information that has the potential to spread further on the web [ , , , ]. Organized web-based public health campaigns, composed of factual, evidence-based messages, have also been promoted to combat misinformation [ - ].

RQ 1.2: How Have Researchers Assessed the Quality of Information and Advice Exchanged in Online Peer Support Groups About Health Conditions?
Overview
Processes for appraising the medical accuracy of content varied in terms of the types of professionals conducting the quality appraisal, whether a quality appraisal tool was used to record the appraisal, and the quality indicators considered when appraising content.
Involvement of Health Care Professionals in Quality Appraising Information and Advice
In total, 9 (64%) articles [ , - , - ] reported involving academic or health care professionals in quality appraisal. Discussions in online peer support groups were appraised by researchers or clinical academics researching the conditions in focus [ - , ], by secondary care health care professionals who were specialists or consultants in the conditions [ , - ], by general practitioners [ , ], and by patients and carers with lived experience of managing the conditions in focus [ ].

Comparison of Information and Advice With Medical Guidelines
Evidence-based guidelines relevant to the condition were often used in determining the quality of information and advice exchanged [ , - ]. Guidelines mentioned included those of the Scottish Intercollegiate Guidelines Network, National Institute for Health and Care Excellence, American Board of Obesity, and National Institutes of Health–National Heart, Lung, and Blood Institute. Current evidence in peer-reviewed scientific journals [ ] and information provided on trusted charity websites dedicated to the condition in focus [ ] were also used as a guide for quality appraising content.

Use of Quality Appraisal Tools for Recording the Assessment
In total, 5 (36%) articles reported using specific quality appraisal tools [
, - ]. The DISCERN instrument [ ], the Health on the Net Code of Conduct (HONcode) [ , ], and the Journal of the American Medical Association (JAMA) benchmark [ ] ( ) have been used in information quality evaluation across multiple social media and online peer support group platforms [ ]. None of these tools were specifically designed to appraise online peer support group content, although they were still reported as being used for this purpose by health care professionals and patient users investigating the quality of such content. The DISCERN instrument, which contains 16 items, focuses on rating the quality of written consumer health information about treatment choices [ ]; it was not devised for web-based information specifically, nor for the assessment of information and advice in online peer support groups. The HONcode criteria contain 8 principles that are used as a guide in assessing the reliability and usefulness of medical information on health-related websites [ , ]. The HONcode criteria have also been used to appraise content in online peer support groups [ , ], despite being devised for traditional health information websites. Similarly, the JAMA benchmark [ ] contains 4 core standards that have been used [ ] as a guide in assessing health information on websites. One article reported developing its own tool specifically for the purpose of their research study instead of using an established quality appraisal tool [ ]. Some quality appraisal studies reported combining 2 or more of the aforementioned quality appraisal tools to assess information and advice in online peer support groups [ ]; however, the rationale behind this was not explained.

Quality appraisal tool | Aim of tool | Criteria or items
DISCERN [ ] | To help users of consumer health information judge the quality of written information about treatment choices | 16 items
HONcode [ , ] | To assess the reliability and usefulness of medical information on the internet | 8 principles
JAMA benchmark [ ] | To evaluate the reliability and accuracy of medical websites | 4 core standards
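Checklist-style instruments of this kind lend themselves to a simple structured record: a reviewer rates each item and the per-item ratings are summarized into an overall judgment. The following is a minimal, hypothetical sketch of such a record; the item names and scoring bands are invented for illustration and are not the actual DISCERN, HONcode, or JAMA items.

```python
# Minimal sketch of a checklist-style quality appraisal record,
# loosely modeled on instruments such as DISCERN. Item names and
# the 1-5 scale are hypothetical, not the published instrument items.

def appraise(post_text: str, ratings: dict) -> dict:
    """Summarize per-item ratings (1 = low, 5 = high) into an
    overall quality score, as a reviewer might record them."""
    if not ratings:
        raise ValueError("at least one item rating is required")
    for item, score in ratings.items():
        if not 1 <= score <= 5:
            raise ValueError(f"rating for {item!r} must be between 1 and 5")
    overall = sum(ratings.values()) / len(ratings)
    if overall >= 4:
        band = "high"
    elif overall >= 2.5:
        band = "moderate"
    else:
        band = "low"
    return {
        "excerpt": post_text[:60],  # short excerpt for the appraisal record
        "items": dict(ratings),
        "overall": round(overall, 2),
        "band": band,
    }

# Example: appraising a fabricated peer support group post
result = appraise(
    "Try cutting out all carbohydrates; it cured my diabetes.",
    {"cites_sources": 1, "currency": 2, "relevance": 4, "accuracy": 1},
)
# result["overall"] is 2.0, placing the post in the "low" band
```

In practice, the included studies applied such checklists manually rather than computationally; the sketch simply illustrates how checklist items aggregate into an overall quality judgment.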
Quality Mark Indicators for Appraising Information and Advice in Online Peer Support Groups
There are specific quality mark indicators that are important and exclusive to the assessment of information and advice about health conditions in online peer support groups [
, , , , , - ]. The source of the information and advice [ , ], currency [ , ], relevance [ , , ], accuracy [ , , , ], sensibility and coherence [ , ], and the usefulness of content [ , ] were reported as key quality indicators in appraising information and advice. Many of these quality indicators mirror criteria comprising the quality appraisal tools outlined in the preceding section. In addition, graded categories ranging from high-quality evidence down to low-quality information [ ] and “fact vs misinformation” [ ] have also been used to appraise content. Moreover, web-based platform–specific metrics such as “number of posts and reposts,” “number of followers,” “likes,” and “author profile” have also been used in assessing the quality of content [ ].

Use of Artificial Intelligence
In total, 4 (29%) studies [
, - ] used artificial intelligence to identify and extract relevant information when assessing the quality of health information and advice. Forms of machine learning, including natural language processing [ ] and topic modeling [ ] (a technique from artificial intelligence that attempts to extract general concepts from large datasets algorithmically [ ]), have been used to identify specific concepts of misinformation around certain health conditions in online peer support groups [ , - ].

Discussion
Summary of Evidence in Context of Wider Literature
The objective of this scoping review was to gain insight into the quality of information and advice in online peer support groups about a range of health conditions, and to review the available tools used to quality-assess such information and advice. This scoping review has identified that the quality of information and advice in online peer support groups varies according to the condition and to the degree of health care expert moderation present. While there was some evidence of good quality information and advice being exchanged [
, - ], there was more evidence of misinformation and of low-quality information and advice, particularly around long-term and life-threatening conditions [ , , , ], which is consistent with the wider literature [ , ]. There were varying degrees of misinformation about non–life-threatening conditions [ , , , ], viral infections [ , , , , ], and mental health conditions [ , ]. In the event of a public health concern where there is uncertainty about the disease, obtaining accurate information can be extremely difficult [ ].

Among platforms hosting health-related online peer support groups, Facebook was reported to present the most misinformation about noncommunicable diseases and treatments, which may be because posts offer more space to describe experiences with disease symptoms and treatments [
]. X, on the other hand, appears to exhibit more misinformation about vaccines and drugs [ , ], which may be because the natural dynamism of the platform makes it an ideal medium for discussions around different political or ideological perspectives [ ]. In addition, the algorithms used by web-based platforms such as Facebook and X, which place content on users’ feeds matching their web-based search behaviors and engagement with other posts, may mean that those whose beliefs align with misinformation are more likely to retransmit such content [ ]. Frictionless design features built into web-based platforms, such as the one-touch forward option on WhatsApp that enables users to forward messages simultaneously to multiple users, may also explain the prevalence of misinformation on the web [ ].

Platforms such as Facebook and X do not typically monitor misinformation, which may allow misinformation about various health conditions to continue to spread on such platforms. However, in the event of a public health concern such as during the COVID-19 pandemic, platforms including Facebook, X, Reddit, Instagram, and WhatsApp introduced measures to mitigate misinformation, such as placing warning labels on content about the pandemic, removing content, and banning accounts [
]. This suggests that such interventions are possible, although they are not rolled out routinely and thus may not be prioritized by platforms. The extent of the problem of misinformation in online peer support groups can also be reduced by fellow online peer support group users addressing and correcting misinformation [ , , ]. Current moderation solutions for misinformation rely heavily on reporting mechanisms and human-led moderation, although post-by-post moderation based on individual users has been described as unsustainable [ ]. Organized public health campaigns composed of factual, evidence-based messages distributed in the online world could also tackle the problem of misinformation [ - ]. Digital literacy–focused interventions that aim to teach people strategies to recognize misinformation have demonstrated effective outcomes, as have interventions in which platforms make content classed as misinformation less visible and disincentivize the creation and sharing of such content [ ].

Studies have typically involved researchers, clinical academics, and specialist clinicians with expertise in specific conditions to conduct quality appraisals of information and advice in online peer support groups [
- , - ]. This has been undertaken by comparing information and advice with evidence-based guidelines [ , - ], scientific journal articles [ ], and information provided on trusted third sector websites [ ]. Quality appraisal instruments, including the DISCERN instrument [ ], the HONcode criteria [ , ], and the JAMA benchmark [ ], have been used as recording tools for appraising the quality of information and advice exchanged around a variety of conditions and disciplines [ ], despite not being designed specifically for the online social world. While the criteria comprising such tools were not originally developed for online peer support group content, the DISCERN instrument, in particular, has been described as suitable for evaluating any text-based health-related information pertaining to treatment [ ]. At present, the DISCERN instrument remains in use, with more than 150 published studies having reported use of the tool in evaluating health information [ ], and it is recognized as a reliable tool used by both health care professionals and patient users [ ]. In this review, specific quality indicators for appraising information and advice have been outlined, many of which mirror criteria forming the quality appraisal instruments discussed in this review, such as relevance [ , , ], sources of evidence [ , ], currency [ , ], accuracy [ , , , ], coherence [ , ], and the usefulness of content [ , ]. Research undertaken since conducting this review also highlights accuracy, credibility, and reliability as key quality indicators for appraising information and advice in online peer support groups [ ]. This suggests that the DISCERN instrument [ ], the HONcode criteria [ , ], and the JAMA benchmark [ ] could be appropriate quality appraisal tools for academics and health care professionals to use as a guide for appraising the quality of information and advice in online peer support groups.
On occasions, these tools have been combined to assess information and advice in online peer support groups; thus, there may also be a need for novel instruments that focus solely on online peer support group content and consider the nature and architecture of such content [ ]. Other academics have argued that because these tools were created to be applied across different health-related domains (and are therefore called “general instruments”), they may be better suited to evaluating factors associated with the formality of content or the design of websites than to evaluating the substance of content (eg, its accuracy and completeness), which may be better assessed by domain-specific instruments created by researchers for individual studies and based on medical guidelines, the literature, and their expertise [ ].

Strengths and Limitations of the Review
This is the first comprehensive scoping review considering the findings of a diverse and heterogeneous pool of literature and exploring the quality of information and advice in online peer support groups about various health conditions. This scoping review was conducted in accordance with the Joanna Briggs Institute methodology [
- ]. In doing so, this scoping review has successfully identified and mapped current key concepts around the quality of information and advice about various health conditions in online peer support groups, taking a broad, inclusive approach to evidence and accommodating literature comprising a range of diverse questions and study designs.

There are also some limitations that should be considered. First, although 6 databases were searched, it is possible that some relevant studies were not identified. Second, although the search strategy was developed with the support of a specialist librarian, there are additional search terms that could have been used, such as “chat rooms” and “computer-mediated support” [ ], which were excluded from the search strategy as they yielded a considerable amount of irrelevant literature. Not including these terms means that some relevant literature may have been missed. However, to alleviate these potential limitations, Google Scholar was also searched, as were the reference lists of all the included studies. This yielded additional studies that were included in our review. Third, articles focusing only on information and advice about health conditions on video-based social media sites were excluded from this review, as this paper focused on rich written information and advice generated in online communities dedicated to peer-to-peer support, rather than comments on videos. Therefore, it is possible that some additional data were missed from comments sections on videos. Future research could compare the quality of health-related information and advice provided on web-based video platforms with that in traditional online peer support groups. Finally, the second part of the objective of this review was to review the available tools used to assess the quality of information and advice in online peer support groups. While this scoping review identified and mapped the available information about such tools, the articles themselves did not provide a substantive amount of information about quality appraisal tools, which limited the extent to which it was possible to review them based on the included articles. Wider literature has been drawn on in the discussion of this paper.

Recommendations for Clinical Practice and Future Research
Where patients seek information and advice in the web-based world is a matter that requires more attention and improved conversations in consultations between health care professionals and patients [
, , ]. Health care professionals need to pay particular attention to supporting people with lower levels of health literacy [ , ] and older people, who may be particularly vulnerable to the confusion and anxiety generated by encountering misinformation [ , ]. To do this, health care professionals need to be better informed about what online peer support groups are, how they operate, and the benefits and harms that they can offer patients. There is a need for health care professionals to be more aware of content discussed in online peer support groups [ , - , ] and, furthermore, to take an active role in correcting misinformation [ , , , , ]. This activity is a current gap in provision and could represent a future career opportunity for health care professionals seeking alternatives to traditional clinical work, helping to meet the health information and support needs of the public. In addition, there may be an opportunity for health care professionals in training to complete a quality appraisal task of health information available on the web and possibly to provide advice (within their field of competence) in online peer support groups. While some degree of moderation of information and advice in online peer support groups has been shown to be helpful in correcting misinformation, the extent to which health care professionals could or should be involved in moderating content as part of their duty of care was not explored in the included articles. Future research is needed to determine the acceptability of, and potential methods for, health care professionals moderating and contributing content in online peer support groups, from the perspectives of health care professionals and patient users.
In addition, the type of health care professional who would be most effective in assessing the quality of content in online peer support groups, which may include general practitioners, nurses, secondary care clinicians, or others, needs to be further explored. Moreover, future research could explore the time, training, and financial resources required for health care professionals to quality appraise information and advice in online peer support groups, to better understand how feasible incorporating such a task would be alongside traditional clinical duties. Future research is also needed to assess the acceptability and usability of common quality appraisal tools used in appraising information and advice in online peer support groups, such as the DISCERN instrument [ ], the HONcode criteria [ , ], and the JAMA benchmark [ ], for health care professionals to use as a guide in such practice. There is a need for a novel instrument tailored specifically to appraise health-related information in online peer support groups, incorporating items focusing on reliability, credibility, accuracy, currency, coherence, and usefulness as the basis of the tool.

Conclusions
There is increasing concern regarding the quality of information and advice about health conditions exchanged in online peer support groups. While there is good quality information and advice exchanged between users, misinformation is also a problem, with some users being ill-advised about common long-term and life-threatening conditions, viral infections, and mental health conditions. The extent of the problem of misinformation can be reduced by fellow users in online peer support groups addressing and correcting misinformation, and through public health campaigns organized in the online world. Health care professionals and patients need to discuss the use of online peer support groups more regularly in consultations. Furthermore, there is a need for health care professionals to play a more active role in quality-assuring health-related information and advice in online peer support groups. This could be facilitated using established quality appraisal tools such as the DISCERN instrument, the HONcode criteria, and the JAMA benchmark as a guide in appraising content. Future research is needed on approaches to educating health care professionals around the benefits and harms that online peer support groups can offer patients, along with investigating the acceptability of and potential methods for health care professionals moderating and contributing content in online peer support groups.
Acknowledgments
This study was funded by the UK National Institute for Health and Care Research (NIHR) School for Primary Care Research. The views expressed are those of the authors and not necessarily those of the NIHR or the Department of Health and Social Care.
The authors thank Amy McEwan, a University of Exeter research librarian, who supported them in developing the search strategy for this scoping review.
Authors' Contributions
BMT and EP developed the search strategy, with assistance from a university research librarian. BMT and EP screened all titles, abstracts, and full texts. BMT also hand-searched the reference lists of included review articles and citations of all included articles for additional relevant literature. BMT, NSC, JLC, JL, and EP developed a data extraction tool for the included literature. BMT extracted data from the included literature and sent it to EP for consensus. BMT coded data from the included literature, which was discussed with EP and NSC. BMT, NSC, JLC, JL, and EP developed final summaries of each research question. BMT wrote up this review paper, to which NSC, JLC, JL, and EP contributed.
Conflicts of Interest
None declared.
Final search strategy for all included databases and sources to obtain literature exploring the quality of information and advice about health conditions in online peer support groups.
DOCX File, 27 KB

Health conditions explored and reported quality of information and advice across the 14 included articles, exploring the quality of information and advice in online peer support groups.

DOCX File, 35 KB

Web-based platforms explored across the 14 included articles exploring the quality of information and advice about health conditions in online peer support groups.

DOCX File, 33 KB

PRISMA-ScR checklist.

PDF File (Adobe PDF File), 101 KB

References
- Wollmann K, van der Keylen P, Tomandl J, Meerpohl JJ, Sofroniou M, Maun A, et al. The information needs of internet users and their requirements for online health information-a scoping review of qualitative and quantitative studies. Patient Educ Couns. Aug 2021;104(8):1904-1932. [CrossRef] [Medline]
- Luo A, Qin L, Yuan Y, Yang Z, Liu F, Huang P, et al. The effect of online health information seeking on physician-patient relationships: systematic review. J Med Internet Res. Feb 10, 2022;24(2):e23354. [FREE Full text] [CrossRef] [Medline]
- Robinson TN, Patrick K, Eng TR, Gustafson D. An evidence-based approach to interactive health communication: a challenge to medicine in the information age. Science Panel on Interactive Communication and Health. JAMA. Oct 14, 1998;280(14):1264-1269. [CrossRef] [Medline]
- Le LH, Hoang PA, Pham HC. Sharing health information across online platforms: a systematic review. Health Commun. Jul 03, 2023;38(8):1550-1562. [CrossRef] [Medline]
- Tausczik Y, Huang X. Knowledge generation and sharing in online communities: current trends and future directions. Curr Opin Psychol. Dec 2020;36:60-64. [CrossRef] [Medline]
- Berkanish P, Pan S, Viola A, Rademaker Q, Devine KA. Technology-based peer support interventions for adolescents with chronic illness: a systematic review. J Clin Psychol Med Settings. Dec 11, 2022;29(4):911-942. [FREE Full text] [CrossRef] [Medline]
- Frey E, Bonfiglioli C, Brunner M, Frawley J. Parents' use of social media as a health information source for their children: a scoping review. Acad Pediatr. May 2022;22(4):526-539. [FREE Full text] [CrossRef] [Medline]
- Newman K, Wang AH, Wang AZ, Hanna D. The role of internet-based digital tools in reducing social isolation and addressing support needs among informal caregivers: a scoping review. BMC Public Health. Nov 09, 2019;19(1):1495. [FREE Full text] [CrossRef] [Medline]
- Sweet KS, LeBlanc JK, Stough LM, Sweany NW. Community building and knowledge sharing by individuals with disabilities using social media. Comput Assist Learn. Jul 30, 2019;36(1):1-11. [CrossRef]
- Johansson V, Islind AS, Lindroth T, Angenete E, Gellerstedt M. Online communities as a driver for patient empowerment: systematic review. J Med Internet Res. Feb 09, 2021;23(2):e19910. [FREE Full text] [CrossRef] [Medline]
- Hwang EH, Singh PV, Argote L. Knowledge sharing in online communities: learning to cross geographic and hierarchical boundaries. Organ Sci. Dec 2015;26(6):1593-1611. [CrossRef]
- Allen C, Vassilev I, Kennedy A, Rogers A. Long-term condition self-management support in online communities: a meta-synthesis of qualitative papers. J Med Internet Res. Mar 10, 2016;18(3):e61. [FREE Full text] [CrossRef] [Medline]
- Ziebland S, Wyke S. Health and illness in a connected world: how might sharing experiences on the internet affect people's health? Milbank Q. Jun 18, 2012;90(2):219-249. [FREE Full text] [CrossRef] [Medline]
- Treadgold BM, Teasdale E, Muller I, Roberts A, Coulson N, Santer M. Parents and carers' experiences of seeking health information and support online for long-term physical childhood conditions: a systematic review and thematic synthesis of qualitative research. BMJ Open. Dec 28, 2020;10(12):e042139. [FREE Full text] [CrossRef] [Medline]
- Griffiths F, Dobermann T, Cave JA, Thorogood M, Johnson S, Salamatian K, et al. The impact of online social networks on health and health systems: a scoping review and case studies. Policy Internet. Dec 2015;7(4):473-496. [FREE Full text] [CrossRef] [Medline]
- Toolkit for tackling misinformation on noncommunicable disease: forum for tackling misinformation on health and NCDs. World Health Organization. URL: https://www.who.int/europe/publications/i/item/WHO-EURO-2022-6260-46025-66542 [accessed 2024-05-29]
- Keselman A, Arnott Smith C, Murcko AC, Kaufman DR. Evaluating the quality of health information in a changing digital ecosystem. J Med Internet Res. Feb 08, 2019;21(2):e11129. [FREE Full text] [CrossRef] [Medline]
- Daraz L, Morrow AS, Ponce OJ, Beuschel B, Farah MH, Katabi A, et al. Can patients trust online health information? A meta-narrative systematic review addressing the quality of health information on the internet. J Gen Intern Med. Sep 21, 2019;34(9):1884-1891. [FREE Full text] [CrossRef] [Medline]
- Wagner T, Howe CJ, Lewis B, Adame T. Is your WebLitLegit? Finding safe and good health information on the internet. Health Lit Res Pract. Apr 2022;6(2):e151-e158. [FREE Full text] [CrossRef] [Medline]
- Al-Jefri M, Evans R, Uchyigit G, Ghezzi P. What is health information quality? Ethical dimension and perception by users. Front Med (Lausanne). Sep 20, 2018;5:260. [FREE Full text] [CrossRef] [Medline]
- Johansson P, Enoch F, Hale SA, Vidgen B, Bereskin C, Margetts HZ, et al. How can we combat online misinformation? A systematic overview of current interventions and their efficacy. SSRN Journal. Preprint posted online December 7, 2023. [FREE Full text] [CrossRef]
- Borges do Nascimento IJ, Pizarro AB, Almeida JM, Azzopardi-Muscat N, Gonçalves MA, Björklund M, et al. Infodemics and health misinformation: a systematic review of reviews. Bull World Health Organ. Sep 01, 2022;100(9):544-561. [FREE Full text] [CrossRef] [Medline]
- Muhammed TS, Mathew SK. The disaster of misinformation: a review of research in social media. Int J Data Sci Anal. 2022;13(4):271-285. [FREE Full text] [CrossRef] [Medline]
- Chou WY, Oh A, Klein WM. Addressing health-related misinformation on social media. JAMA. Dec 18, 2018;320(23):2417-2418. [CrossRef] [Medline]
- Hajli MN, Sims J, Featherman M, Love PE. Credibility of information in online communities. J Strat Mark. May 22, 2014;23(3):238-253. [CrossRef]
- Zhao Y, Da J, Yan J. Detecting health misinformation in online health communities: incorporating behavioral features into machine learning based approaches. Inf Process Manag. Jan 2021;58(1):102390. [CrossRef]
- Ma TJ, Atkin D. User generated content and credibility evaluation of online health information: a meta analytic study. Telemat Inform. Aug 2017;34(5):472-486. [CrossRef]
- Adams SA. Revisiting the online health information reliability debate in the wake of "web 2.0": an inter-disciplinary literature and website review. Int J Med Inform. Jun 2010;79(6):391-400. [CrossRef] [Medline]
- Petrič G, Cugmas M, Petrič R, Atanasova S. The quality of informational social support in online health communities: a content analysis of cancer-related discussions. Digit Health. Feb 20, 2023;9:20552076231155681. [FREE Full text] [CrossRef] [Medline]
- Akbolat M, Amarat M, Ünal Ö, Şantaş G. A survey of health information seeking by cancer patients indicates some problems over medical explanations and terminology. Health Info Libr J. Mar 23, 2023;40(1):29-41. [CrossRef] [Medline]
- Sudau F, Friede T, Grabowski J, Koschack J, Makedonski P, Himmel W. Sources of information and behavioral patterns in online health forums: observational study. J Med Internet Res. Jan 14, 2014;16(1):e10. [FREE Full text] [CrossRef] [Medline]
- Coleman A. 'Hundreds dead' because of COVID-19 misinformation. BBC News. URL: https://www.bbc.com/news/world-53755067 [accessed 2024-05-29]
- Peterson J, Pearce PF, Ferguson LA, Langford CA. Understanding scoping reviews: definition, purpose, and process. J Am Assoc Nurse Pract. Jan 2017;29(1):12-16. [CrossRef] [Medline]
- Anderson S, Allen P, Peckham S, Goodwin N. Asking the right questions: scoping studies in the commissioning of research on the organisation and delivery of health services. Health Res Policy Syst. Jul 09, 2008;6(1):7. [FREE Full text] [CrossRef] [Medline]
- Tricco AC, Lillie E, Zarin W, O'Brien K, Colquhoun H, Kastner M, et al. A scoping review on the conduct and reporting of scoping reviews. BMC Med Res Methodol. Feb 09, 2016;16(1):15. [FREE Full text] [CrossRef] [Medline]
- Pollock D, Peters MD, Khalil H, McInerney P, Alexander L, Tricco AC, et al. Recommendations for the extraction, analysis, and presentation of results in scoping reviews. JBI Evid Synth. Mar 01, 2023;21(3):520-532. [CrossRef] [Medline]
- Peters MD, Marnie C, Tricco AC, Pollock D, Munn Z, Alexander L, et al. Updated methodological guidance for the conduct of scoping reviews. JBI Evid Synth. Oct 2020;18(10):2119-2126. [CrossRef] [Medline]
- Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. Oct 02, 2018;169(7):467-473. [FREE Full text] [CrossRef] [Medline]
- Treadgold B, Coulson N, Campbell J, Lambert J, Pitchforth E. Quality of health-related information and advice in online support groups: protocol for a scoping review. Open Science Framework. URL: https://osf.io/u9xfy [accessed 2024-05-29]
- The world's no 1 systematic review tool. Covidence Systematic Review Software. URL: http://www.covidence.org/ [accessed 2024-05-29]
- Haddaway NR, Collins AM, Coughlin D, Kirk S. The role of Google Scholar in evidence reviews and its applicability to grey literature searching. PLoS One. Sep 17, 2015;10(9):e0138237. [FREE Full text] [CrossRef] [Medline]
- NVivo qualitative data analysis software (version 14). Lumivero. URL: https://lumivero.com/products/nvivo/ [accessed 2024-05-29]
- Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. Mar 29, 2021;372:n71. [FREE Full text] [CrossRef] [Medline]
- Slick N, Bodas P, Badawy SM, Wildman B. Accuracy of online medical information: the case of social media in sickle cell disease. Pediatr Hematol Oncol. Mar 28, 2023;40(2):99-107. [CrossRef] [Medline]
- Afful-Dadzie E, Afful-Dadzie A, Egala SB. Social media in health communication: a literature review of information quality. Health Inf Manag. Jan 04, 2023;52(1):3-17. [CrossRef] [Medline]
- Ulep AJ, Deshpande AK, Beukes EW, Placette A, Manchaiah V. Social media use in hearing loss, tinnitus, and vestibular disorders: a systematic review. Am J Audiol. Sep 21, 2022;31(3S):1019-1042. [CrossRef]
- Skafle I, Nordahl-Hansen A, Quintana DS, Wynn R, Gabarron E. Misinformation about COVID-19 vaccines on social media: rapid review. J Med Internet Res. Aug 04, 2022;24(8):e37367. [FREE Full text] [CrossRef] [Medline]
- Pollack CC, Emond JA, O'Malley AJ, Byrd A, Green P, Miller K, et al. Characterizing the prevalence of obesity misinformation, factual content, stigma, and positivity on the social media platform reddit between 2011 and 2019: infodemiology study. J Med Internet Res. Dec 30, 2022;24(12):e36729. [FREE Full text] [CrossRef] [Medline]
- Farnood A, Johnston B, Mair FS. An analysis of the diagnostic accuracy and peer-to-peer health information provided on online health forums for heart failure. J Adv Nurs. Jan 09, 2022;78(1):187-200. [FREE Full text] [CrossRef] [Medline]
- Towne J, Suliman Y, Russell KA, Stuparich MA, Nahas S, Behbehani S. Health information in the era of social media: an analysis of the nature and accuracy of posts made by public Facebook pages for patients with endometriosis. J Minim Invasive Gynecol. Sep 2021;28(9):1637-1642. [CrossRef] [Medline]
- Suarez-Lledo V, Alvarez-Galvez J. Prevalence of health misinformation on social media: systematic review. J Med Internet Res. Jan 20, 2021;23(1):e17187. [FREE Full text] [CrossRef] [Medline]
- ElSherief M, Sumner SA, Jones CM, Law RK, Kacha-Ochana A, Shieber L, et al. Characterizing and identifying the prevalence of web-based misinformation relating to medication for opioid use disorder: machine learning approach. J Med Internet Res. Dec 22, 2021;23(12):e30753. [FREE Full text] [CrossRef] [Medline]
- Du J, Preston S, Sun H, Shegog R, Cunningham R, Boom J, et al. Using machine learning-based approaches for the detection and classification of human papillomavirus vaccine misinformation: infodemiology study of reddit discussions. J Med Internet Res. Aug 05, 2021;23(8):e26478. [FREE Full text] [CrossRef] [Medline]
- Jo W, Lee J, Park J, Kim Y. Online information exchange and anxiety spread in the early stage of the novel coronavirus (COVID-19) outbreak in South Korea: structural topic model and network analysis. J Med Internet Res. Jun 02, 2020;22(6):e19455. [FREE Full text] [CrossRef] [Medline]
- Sepehripour S, McDermott AL, Lloyd MS. Microtia and social media: patient versus physician perspective of quality of information. J Craniofac Surg. May 2017;28(3):643-645. [CrossRef] [Medline]
- Cole J, Watkins C, Kleine D. Health advice from internet discussion forums: how bad is dangerous? J Med Internet Res. Jan 06, 2016;18(1):e4. [FREE Full text] [CrossRef] [Medline]
- McGregor F, Somner JE, Bourne RR, Munn-Giddings C, Shah P, Cross V. Social media use by patients with glaucoma: what can we learn? Ophthalmic Physiol Opt. Jan 31, 2014;34(1):46-52. [CrossRef] [Medline]
- Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health. Feb 01, 1999;53(2):105-111. [FREE Full text] [CrossRef] [Medline]
- Boyer C, Frossard C, Gaudinat A, Hanbury A, Falquetd G. How to sort trustworthy health online information? Improvements of the automated detection of HONcode criteria. Procedia Comput Sci. 2017;121:940-949. [CrossRef]
- Boyer C, Selby M, Scherrer JR, Appel R. The health on the net code of conduct for medical and health websites. Comput Biol Med. Sep 1998;28(5):603-610. [CrossRef] [Medline]
- Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the internet. JAMA. Apr 16, 1997;277(15):1244. [CrossRef]
- Doupe P, Faghmous J, Basu S. Machine learning for health services researchers. Value Health. Jul 2019;22(7):808-815. [FREE Full text] [CrossRef] [Medline]
- Alipour S, Nikooei S, Hosseinpour R, Yavari Barhaghtalab MJ. Evaluation of the quality and accuracy of breast cancer knowledge among persian language websites. BMC Health Serv Res. Dec 20, 2022;22(1):1560. [FREE Full text] [CrossRef] [Medline]
- Reidy C, Klonoff DC, Barnard-Kelly KD. Supporting good intentions with good evidence: how to increase the benefits of diabetes social media. J Diabetes Sci Technol. Sep 16, 2019;13(5):974-978. [FREE Full text] [CrossRef] [Medline]
- Bapaye JA, Bapaye HA. Demographic factors influencing the impact of coronavirus-related misinformation on WhatsApp: cross-sectional questionnaire study. JMIR Public Health Surveill. Jan 30, 2021;7(1):e19858. [FREE Full text] [CrossRef] [Medline]
- Krishnan N, Gu J, Tromble R, Abroms LC. Research note: examining how various social media platforms have responded to COVID-19 misinformation. Harvard Kennedy School Misinformation Review. URL: https://doi.org/10.37016/mr-2020-85 [accessed 2024-05-29]
- Ghalavand H, Nabiolahi A. Exploring online health information quality criteria on social media: a mixed method approach. BMC Health Serv Res. Oct 30, 2024;24(1):1311. [FREE Full text] [CrossRef] [Medline]
- Zhang Y, Sun Y, Xie B. Quality of health information for consumers on the web: a systematic review of indicators, criteria, tools, and evaluation results. J Assoc Inf Sci Technol. Apr 29, 2015;66(10):2071-2084. [CrossRef]
- Coulson NS, Buchanan H. The role of online support groups in helping individuals affected by HIV and AIDS: scoping review of the literature. J Med Internet Res. Jul 26, 2022;24(7):e27648. [FREE Full text] [CrossRef] [Medline]
- Vivion M, Reid V, Dubé E, Coutant A, Benoit A, Tourigny A. How older adults manage misinformation and information overload - a qualitative study. BMC Public Health. Mar 21, 2024;24(1):871. [FREE Full text] [CrossRef] [Medline]
- Scherer LD, Pennycook G. Who is susceptible to online health misinformation? Am J Public Health. Oct 2020;110(S3):S276-S277. [FREE Full text] [CrossRef] [Medline]
Abbreviations
HONcode: Health on the Net Code of Conduct
JAMA: Journal of the American Medical Association
PCC: Population, Concept, and Context
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews
RQ: research question
Edited by A Schwartz; submitted 13.01.25; peer-reviewed by O Afolayan, E Galvin; comments to author 19.02.25; revised version received 11.03.25; accepted 18.04.25; published 16.05.25.
Copyright©Bethan M Treadgold, Neil S Coulson, John L Campbell, Jeffrey Lambert, Emma Pitchforth. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 16.05.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.