WikiJournal of Medicine/Does the packaging of health information affect the assessment of its reliability? A randomized controlled trial protocol


Abstract

Background: Wikipedia is frequently used as a source of health information. However, the quality of its content varies widely across articles. The DISCERN tool is a brief questionnaire developed in 1996 by the Division of Public Health and Primary Health Care of the Institute of Health Sciences, University of Oxford. Its developers claim it provides users with a valid and reliable way of assessing the quality of written information. However, the DISCERN instrument's reliability in measuring the quality of online health information, particularly whether its scores are affected by reader biases about specific publication sources, has not yet been explored.

Methods: This study is a double-blind randomized assessment of a Wikipedia article versus a BMJ literature review using a modified version of the DISCERN tool. Participants will include physicians and medical residents from four university campuses in Ontario and British Columbia and will be randomized into one of four study arms. Inferential statistics tests (paired t-test, multi-level ordinal regression, and one-way ANOVA) will be conducted with the data collected from the study.

Outcomes: The primary outcome of this study will be whether a statistically significant difference in DISCERN scores exists, which would indicate whether the packaging of health information influences how its quality is assessed.



Non-technical summary
The internet, and Wikipedia in particular, is an important way for professionals, students and the public to obtain health information. The DISCERN tool was developed in 1996 to help users assess the quality of the health information they find.

The ability of DISCERN to measure the quality of online health information has been supported by research, but the role of bias has not necessarily been accounted for. Does how the information is packaged influence how the information itself is evaluated? This study will compare the scores assigned to articles in their original format with the scores assigned to the same articles in a modified format in order to determine whether the DISCERN tool is able to overcome bias.

A significant difference in ratings between original and inverted articles will suggest that the DISCERN tool lacks the ability to overcome bias related to how health information is packaged.

Background

Introduction

The internet is a crucial source of health information for health practitioners, students and the public. In the online information landscape, Wikipedia, a widely accessible and free encyclopedia, stands out as one of the most frequently consulted sources of online health information. Despite this frequent use, the health content of Wikipedia varies widely in quality. It is therefore important for all consumers of Wikipedia's health content to consider the quality of a specific article before applying its content.

This is specifically the aim of the DISCERN instrument, first developed in 1996 by the Division of Public Health and Primary Health Care of the Institute of Health Sciences, University of Oxford. DISCERN is a brief questionnaire that provides users with a valid and reliable way of assessing the quality of written information.[1] The original tool consists of 16 questions: the first 15 address targeted aspects of the publication, and the 16th asks for the respondent's overall impression of its quality. The DISCERN instrument has been widely adopted as a tool to measure the quality of online health information, as demonstrated by its application in 244 published studies, including quality assessments of Wikipedia articles.

Nevertheless, the accuracy of DISCERN when it is used in a blinded study has not yet been evaluated. It therefore remains unknown whether the use of DISCERN to evaluate health information is affected by reader bias about specific publication sources. The authors hypothesize that an individual's responses to the questions in the DISCERN instrument might be influenced by their perception of the document's publisher; specifically, that the same information will be rated differently depending on where the reader perceives it to have come from. This study aims to answer the question: does the packaging of health information affect the assessment of its reliability using the DISCERN instrument? To this end, the authors will conduct a double-blind randomized assessment of the same information packaged as a Wikipedia article versus a BMJ literature review. Participants will use the DISCERN instrument to assess the information's reliability; however, the instrument has been modified (Appendix A) to provide clear language for participants and to remove questions that relate specifically to treatments or interventions.

Participants, including physicians and medical residents, will be asked to evaluate and compare the quality of either both articles in their original formats or both articles in inverted formats. Differences in article scoring between the two groups will allow the authors to determine DISCERN's ability to overcome bias related to an article's publication source.

Literature review

The quality of Wikipedia's health content has received the vast majority of the academic attention paid to Wikipedia as a health information resource. Reports of Wikipedia's quality in the academic literature generally focus on its suitability for patients or the general health consumer, for students in the health sciences, or for professionals in the field of health and wellness.

To date, topics included in assessments of Wikipedia's content for patient education or consumer health include gastroenterology,[2] nephrology,[3] cancer,[4][5][6][7] autoimmune disorders,[8] medicinal drugs[9][10][11][12][13] or herbal supplements,[14] pathology informatics,[15] surgery,[16][17] toxicology,[18][19] nutrition,[20][21][22] complementary and alternative medicine,[23] hearing loss,[24] and mental health or the brain.[25][26] These assessments evaluate readability, reliability, and accuracy or completeness, and specifically discuss their findings in relation to the public consumer or patient. Of those that include results – some conference proceedings do not – there is some agreement that Wikipedia is suitable for patients, and a 2010 study found that, while Wikipedia is not necessarily the superior resource, it is the preferred one.[5]

There is strong evidence in the literature that students enrolled in health and medicine programs are highly likely to use or have used Wikipedia to supplement their education. Herbert et al. (2015) present evidence suggesting that most medical students use Wikipedia at a moderate or high rate (67%), but this investigation reports a response rate of 21%, so the findings cannot be generalized.[27] Judd and Kennedy (2011) found that medical students used Google in 69% of biomedical sessions in a computer laboratory and Wikipedia in 51% of those same sessions.[28] While the study notes an interesting trajectory whereby students' reliance on Wikipedia decreases each year from first year to third year, actual Wikipedia use remains prominent throughout students' progression through the curriculum. At Queen's University[29] and UCSF,[30] Wikipedia is used in formal education as a learning tool for evidence-based medicine. Overall, however, there is a lack of consensus in the literature about Wikipedia's suitability for health education. Some studies conclude Wikipedia is suitable for students,[31][32] while many conclude it is not.[33][34][35][36][37]

A minority of evaluations of Wikipedia's health content consider its suitability for health care workers, and the outcomes of these studies are also inconsistent. Park, Masupe, Joseph, et al. (2016) report that Botswanan health care workers' perceptions of Wikipedia's quality are divided at best. Further, participants in the Botswana study described Wikipedia's medical content as valuable simply because it is freely available and, through a relationship with telecommunications companies, remained accessible when internet access was lost;[38] however, that arrangement has been defunct since 2018.[39] As a surgical reference, Wikipedia has been found to be accurate, albeit incomplete, and an appropriate resource.[40] Conversely, the drug information on Wikipedia has been deemed variable in comparison with Micromedex and is therefore considered an inappropriate drug reference for professionals.[14]

Methods and design

Design

This is a factorial, double-blind, randomized controlled trial to determine whether the way an article is packaged affects the score it receives when the DISCERN tool is used to evaluate its reliability and quality. The study will involve four intervention arms:

  • Arm 1: will use DISCERN to evaluate an original BMJ article first and an original Wikipedia article second (control group A)
  • Arm 2: will use DISCERN to evaluate an original Wikipedia article first and an original BMJ article second (control group B)
  • Arm 3: will use DISCERN to evaluate a BMJ article formatted as a Wikipedia article first and a Wikipedia article formatted as a BMJ article second (experiment group A)
  • Arm 4: will use DISCERN to evaluate a Wikipedia article formatted as a BMJ article first and a BMJ article formatted as a Wikipedia article second (experiment group B)

Controlling the order in which the articles are read, as prescribed in Arms 1 and 2 and again in Arms 3 and 4, will allow the researchers to determine whether a sequence effect may have influenced the scoring of the articles. The study involves four Canadian medical schools, three in Ontario and one in British Columbia, allowing for recruitment of medical faculty and students with the relevant knowledge and experience to complete the study intervention. Consenting participants will be asked to attend one session, organized at their home institution and supervised by one of the co-investigators, who will ensure that participants do not have access to any outside materials while completing the study intervention.

Settings

This study will be conducted on four university campuses in Ontario and British Columbia that include a medical school and that are within reasonable proximity of the researchers' home campuses, facilitating in-person administration of participant packets. These institutions are:

  1. Michael G. DeGroote School of Medicine: McMaster University (Hamilton, ON)
  2. Faculty of Medicine, University of Toronto (Toronto, ON)
  3. Schulich School of Medicine & Dentistry, Western University (London, ON)
  4. Faculty of Medicine, University of British Columbia (Vancouver, BC)

Participants and recruitment

Participants will include faculty from the four medical institutions listed above. Participant recruitment will combine a purposive approach, in which individuals who meet the inclusion criteria are contacted directly by e-mail or telephone, with study advertisement using paper and electronic posters.

Individuals who wish to take part in the study will be required to read, complete, and sign a consent form prior to attending the supervised session. Consent forms will be stored by the co-investigators. Participants will be able to withdraw their consent at any time prior to the commencement of data analysis.

Sample size

The four medical schools included in this study report a combined total of approximately 4,770 full- and part-time faculty members (Table 1).

Table 1. Number of faculty members reported by medical schools included in the study

  Michael G. DeGroote School of Medicine, McMaster University (Hamilton, ON): >700[41]
  Department of Medicine, University of Toronto (Toronto, ON): 800[42]
  Schulich School of Medicine & Dentistry, Western University (London, ON): 2,681[43]
  Faculty of Medicine, University of British Columbia (Vancouver, BC): 589[44]
  Total: 4,770

To achieve the desired confidence level of 90% and a margin of error of 5%, the authors will randomly select 336 participants from the pool of faculty members recruited for the study. If fewer than 336 participants are recruited, the authors will instead use a convenience sampling method until at least 336 participants have been enrolled.

The estimated sample size to produce statistically significant results was calculated using the following formula:

Sample Size = (Distribution of 50%) / ((Margin of Error 5% / Confidence Level Score 90%)²)

            = (0.5 × (1 − 0.5)) / ((0.05 / 1.9)²)

            = 0.25 / 0.0006925

            = 361.01

True Sample = (Sample Size × Population) / (Sample Size + Population − 1)

            = (361.01 × 4,770) / (361.01 + 4,770 − 1)

            = 1,722,021.6606 / 5,130.0108

            = 335.676
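
For transparency, the calculation above can be reproduced programmatically. The following Python snippet is a minimal illustrative sketch, not part of the protocol; it applies the same formula with the values quoted above (including the confidence level score of 1.9 used by the authors) followed by the finite population correction.

    # Minimal sketch of the sample size calculation described above.
    # Input values (p = 0.5, margin of error = 0.05, confidence level
    # score = 1.9, population = 4,770) are taken from the protocol.

    def sample_size(p, margin_of_error, confidence_score):
        """Unadjusted sample size for an estimated proportion."""
        return (p * (1 - p)) / ((margin_of_error / confidence_score) ** 2)

    def true_sample(n, population):
        """Apply the finite population correction."""
        return (n * population) / (n + population - 1)

    n = sample_size(p=0.5, margin_of_error=0.05, confidence_score=1.9)
    n_adjusted = true_sample(n, population=4770)

    print(round(n, 2))           # ~361.01
    print(round(n_adjusted, 2))  # ~335.68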

Randomization

Each recruited participant's name, contact information, and corresponding participant ID will be kept in a separate, encrypted, password-protected MS Excel spreadsheet. Once the recruitment phase is complete, the authors will use the RANDBETWEEN function in MS Excel to randomly select 336 participants. If fewer than 336 participants are recruited, the authors will employ a convenience sampling method until 336 participants have been enrolled.

A total of 84 participant packets will be created for each arm of the study, each labeled with a unique number from 001 to 336. An independent volunteer who is not involved in administering the study will enter the numbers 001 to 336 in MS Excel. Using the RANDBETWEEN function, 84 numbers will be randomly selected four times, and each group of 84 numbers will be assigned to Arm 1, Arm 2, Arm 3, or Arm 4, respectively. The volunteer will then fill each numbered packet with the documents relevant to the arm to which it has been assigned.
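
As an illustration of this allocation step (not part of the protocol, and using Python rather than MS Excel), the sketch below performs an equivalent assignment of the 336 packet numbers to the four arms: the numbers are shuffled and split into four groups of 84, which is the intended result of the repeated random selections described above.

    import random

    # Illustrative sketch of the packet-to-arm allocation described above:
    # 336 packet numbers are split at random into four groups of 84.
    packet_numbers = [f"{n:03d}" for n in range(1, 337)]  # "001" .. "336"
    random.shuffle(packet_numbers)

    arms = {
        f"Arm {i + 1}": sorted(packet_numbers[i * 84:(i + 1) * 84])
        for i in range(4)
    }

    for arm, packets in arms.items():
        print(arm, packets[:5], "...")  # first few packet numbers in each arm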

Using the same encrypted and password-protected MS Excel spreadsheet described above, the researchers will track the envelope numbers distributed during administration of the study and to whom each envelope number is assigned. This record will be used exclusively to remove a participant's data from the study in the event they withdraw their consent. Neither the participants nor the researchers will know to which arm each participant has been assigned.

Interventions

Eligible and consenting participants will be randomized into one of four arms as outlined in the study design.

Participants will be required to attend a 30- to 60-minute session supervised by study investigators (JH, DS or LR), during which they will receive their participant package. Each package will include the pre-participation survey (Appendix B), the DISCERN instrument, two articles placed in the order in which they should be read according to the arm to which the envelope number has been assigned, and the post-participation questionnaire. All materials, including articles and questionnaires, will be collected by investigators at the end of the allocated time. Following collection of the articles and DISCERN questionnaires, participants will be asked to complete a short additional questionnaire about their prior knowledge of the articles.

Proposed outcome measures

Primary

The modified DISCERN instrument is composed of 10 questions covering the depth of content, scientific accuracy, completeness, justification or evidence given, and readability. Users respond to each question on a scale from 1 to 5, where 1 represents serious or extensive shortcomings, 3 signifies potentially important but not serious shortcomings, and 5 indicates minimal shortcomings. The final grade for an article is determined through a composite score of the 10 questions. The primary outcome of our study is the comparison of the difference in scores between the two original articles with the difference in scores between the two modified articles. If these two differences are significantly different from one another, it may be concluded that DISCERN is not effective in overcoming bias related to an article's publication source.
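
To make the primary outcome concrete, the Python sketch below is illustrative only: the scores shown are hypothetical, and the composite score is taken here as the simple sum of the ten item ratings, an assumption since the protocol does not fix the aggregation method. It forms the two between-article differences that will be compared.

    # Illustrative only: composite DISCERN scoring and the primary contrast.
    # Each completed questionnaire yields ten item ratings from 1 to 5.
    def composite_score(item_ratings):
        assert len(item_ratings) == 10 and all(1 <= r <= 5 for r in item_ratings)
        return sum(item_ratings)

    # Hypothetical ratings: original formats (control arms) ...
    bmj_original = composite_score([5, 4, 5, 4, 4, 5, 3, 4, 4, 5])
    wp_original = composite_score([4, 3, 4, 4, 3, 4, 3, 3, 3, 4])
    # ... and inverted formats (experiment arms).
    bmj_as_wp = composite_score([4, 4, 4, 4, 4, 4, 3, 4, 3, 4])
    wp_as_bmj = composite_score([4, 4, 4, 4, 3, 4, 3, 4, 4, 4])

    diff_original = bmj_original - wp_original  # difference between original articles
    diff_modified = bmj_as_wp - wp_as_bmj       # difference between modified articles

    # The primary analysis asks whether these two differences diverge significantly.
    print(diff_original, diff_modified)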

Secondary

The academic backgrounds and expertise of our participants may mean they have previous knowledge of or familiarity with the articles used in this study. The chance of this occurring cannot be eliminated and must be considered in the data analysis. Therefore, our secondary outcome will address the potential un-blinding of participants by using a short questionnaire (Appendix C) to determine whether subjects recognized one or both of the articles from previous reading. We will also determine whether the order in which the modified articles were read had an impact on participants' grading.

Data collection, storage, and analysis

Primary assessment

A modified version of the DISCERN instrument will be used to collect data from participants (Appendix A). All responses to each DISCERN questionnaire will be entered into SPSS and separated into four groups: BMJ as BMJ, BMJ as WP, WP as WP, WP as BMJ.

The following inferential statistical tests may be conducted with the collected data:

  1. Paired t-test to determine whether the mean difference in individual DISCERN scores in Arms 1 and 2 is statistically different from that in Arms 3 and 4. This test will not consider the effect that sequence may have on the DISCERN scores for each article.
  2. Multi-level ordinal regression using all four arms to determine whether the order in which the two articles are read by participants influenced their assessment of each article.
  3. One-way ANOVA to determine whether the difference between DISCERN scores within Arms 1 and 2 is significantly different from that within Arms 3 and 4 (an illustrative analysis sketch follows this list).
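
The following Python sketch illustrates how the first and third tests could be run on per-participant composite scores grouped by arm. It is an assumption for demonstration only: the protocol specifies SPSS, the data here are randomly generated, and the multi-level ordinal regression is omitted.

    import numpy as np
    from scipy import stats

    # Hypothetical data: for each arm, one row per participant with the
    # composite DISCERN scores of the first and second article read.
    rng = np.random.default_rng(0)
    arms = {f"Arm {i}": rng.integers(10, 51, size=(84, 2)) for i in range(1, 5)}

    # Paired t-test within an arm: do the same participants score the two
    # articles differently?
    for name, scores in arms.items():
        t, p = stats.ttest_rel(scores[:, 0], scores[:, 1])
        print(f"{name}: paired t = {t:.2f}, p = {p:.3f}")

    # One-way ANOVA across the four arms on the within-participant score
    # differences (first article minus second article).
    diffs = [scores[:, 0] - scores[:, 1] for scores in arms.values()]
    f, p = stats.f_oneway(*diffs)
    print(f"ANOVA across arms: F = {f:.2f}, p = {p:.3f}")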

The following hypotheses will be tested:

  • Null hypothesis A: There is no difference in DISCERN scores between the BMJ article as a BMJ article and the Wikipedia article as a Wikipedia article
    • Data types: independent variable (nominal data): document name; dependent variable (ratio data): DISCERN score
  • Null hypothesis B: There is no difference in DISCERN scores between the WP → BMJ article and the BMJ → WP article
    • Data types: independent variable (nominal data): document name; dependent variable (ratio data): DISCERN score
  • Null hypothesis C: There is no difference in DISCERN scores between the WP → BMJ article and the BMJ as BMJ article
    • Data types: independent variable (nominal data): document name; dependent variable (ratio data): DISCERN score
  • Null hypothesis D: There is no difference in DISCERN scores between the BMJ → WP article and the WP as WP article
    • Data types: independent variable (nominal data): document name; dependent variable (ratio data): DISCERN score

Secondary assessment

  • Study subject questionnaire (see: Appendix B and Appendix C): Descriptive statistics using SPSS
  • Does the order in which the articles are assessed affect the outcome?

Additional information

Competing interests

DS and JH are Wikipedians. They research Wikipedia and contribute to its content. In an effort to minimize the risk of conflict of interest, the Wikipedia article evaluated by participants in this study was selected by the researchers because its content had no contributions from DS and minimal contributions from JH.

Author contributions

  • Design of study (JH, LR, DS)
  • Preparation and revision of manuscript (LR, DS)
  • Approval of submitted manuscript (JH, LR, DS)

Ethics statement

This study has been approved by the Hamilton Integrated Research Ethics Board (HIREB) under project ID 8228.

Funding

The authors have not received funding for this study.

References

  1. "Welcome to DISCERN". DISCERN Online. DISCERN Project. Retrieved 2020-09-09.
  2. Czarnecka-Kujawa, Kasia; Abdalian, Rupert; Grover, Samir C. (2008-04). "M1042 The Quality of Open Access and Open Source Internet Material in Gastroenterology: Is Wikipedia Appropriate for Knowledge Transfer to Patients?". Gastroenterology 134 (4): A-325–A-326. doi:10.1016/S0016-5085(08)61518-8. https://linkinghub.elsevier.com/retrieve/pii/S0016508508615188. 
  3. Thomas, Garry R.; Eng, Lawson; de Wolff, Jacob F.; Grover, Samir C. (2013-03). "An Evaluation of Wikipedia as a Resource for Patient Education in Nephrology". Seminars in Dialysis 26 (2): 159–163. doi:10.1111/sdi.12059. http://doi.wiley.com/10.1111/sdi.12059. 
  4. Qureishi, A.; Sharma, A. (2012). "Can Wikipedia replace traditional patient information leaflets? Comparing the internet to official patient information resources in thyroid cancer". European Surgery - Acta Chirurgica Austriaca 44 (Suppl. 247): 35. doi:10.1007/s10353-012-0163-y. 
  5. Leithner, A.; Maurer-Ertl, W.; Glehr, M.; Friesenbichler, J.; Leithner, K.; Windhager, R. (2010). "Wikipedia and osteosarcoma: a trustworthy patients' information?". Journal of the American Medical Informatics Association 17 (4): 373–374. doi:10.1136/jamia.2010.004507. PMID 20595302. https://academic.oup.com/jamia/article-lookup/doi/10.1136/jamia.2010.004507. 
  6. Rajagopalan, Malolan S.; Khanna, Vineet K.; Leiter, Yaacov; Stott, Meghan; Showalter, Timothy N.; Dicker, Adam P.; Lawrence, Yaacov R. (2011). "Patient-Oriented Cancer Information on the Internet: A Comparison of Wikipedia and a Professionally Maintained Database". Journal of Oncology Practice 7 (5): 319–323. doi:10.1200/JOP.2010.000209. PMID 22211130. http://ascopubs.org/doi/10.1200/JOP.2010.000209. 
  7. Biggs, T. C.; Jayakody, N.; Best, K.; King, E. V. (2018-06). "Quality of online otolaryngology health information". The Journal of Laryngology & Otology 132 (6): 560–563. doi:10.1017/S0022215118000774. ISSN 0022-2151. https://www.cambridge.org/core/product/identifier/S0022215118000774/type/journal_article. 
  8. Watad, Abdulla; Bragazzi, Nicola Luigi; Brigo, Francesco; Sharif, Kassem; Amital, Howard; McGonagle, Dennis; Shoenfeld, Yehuda; Adawi, Mohammad (2017-07-18). "Readability of Wikipedia Pages on Autoimmune Disorders: Systematic Quantitative Assessment". Journal of Medical Internet Research 19 (7): e260. doi:10.2196/jmir.8225. ISSN 1438-8871. PMID 28720555. PMC PMC5539385. http://www.jmir.org/2017/7/e260/. 
  9. Clauson, Kevin A; Polen, Hyla H; Boulos, Maged N Kamel; Dzenowagis, Joan H (2008-12). "Scope, Completeness, and Accuracy of Drug Information in Wikipedia". Annals of Pharmacotherapy 42 (12): 1814–1821. doi:10.1345/aph.1L474. ISSN 1060-0280. http://journals.sagepub.com/doi/10.1345/aph.1L474. 
  10. Hunter, Julia Alexandra; Lee, Taehoon; Persaud, Navindra (2018-07-02). "A comparison of the content and primary literature support for online medication information provided by Lexicomp and Wikipedia". Journal of the Medical Library Association 106 (3). doi:10.5195/JMLA.2018.256. ISSN 1558-9439. PMID 29962913. PMC PMC6013145. http://jmla.pitt.edu/ojs/jmla/article/view/256. 
  11. Koppen, Laura; Phillips, Jennifer; Papageorgiou, Renee (2015-07). "Analysis of reference sources used in drug-related Wikipedia articles". Journal of the Medical Library Association 103 (3): 140–144. doi:10.3163/1536-5050.103.3.007. ISSN 1536-5050. PMID 26213506. PMC PMC4511054. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4511054/. 
  12. Lavsa, Stacey M.; Corman, Shelby L.; Culley, Colleen M.; Pummer, Tara L. (2011-04). "Reliability of Wikipedia as a medication information source for pharmacy students". Currents in Pharmacy Teaching and Learning 3 (2): 154–158. doi:10.1016/j.cptl.2011.01.007. https://linkinghub.elsevier.com/retrieve/pii/S1877129711000086. 
  13. Reilly, Timothy; Jackson, William; Berger, Victoria; Candelario, Danielle (2017-03). "Accuracy and completeness of drug information in Wikipedia medication monographs". Journal of the American Pharmacists Association 57 (2): 193–196.e1. doi:10.1016/j.japh.2016.10.007. https://linkinghub.elsevier.com/retrieve/pii/S1544319116308652. 
  14. Phillips, Jennifer; Lam, Connie; Palmisano, Lisa (2014-07). "Analysis of the accuracy and readability of herbal supplement information on Wikipedia". Journal of the American Pharmacists Association 54 (4): 406–414. doi:10.1331/JAPhA.2014.13181. https://linkinghub.elsevier.com/retrieve/pii/S1544319115302156. 
  15. Kim, Ji Yeon; Gudewicz, Thomas M.; Dighe, Anand S.; Gilbertson, John R. (2010). "The pathology informatics curriculum wiki: Harnessing the power of user-generated content". Journal of Pathology Informatics 1 (1): 10. doi:10.4103/2153-3539.65428. ISSN 2153-3539. PMID 20805963. PMC PMC2929539. http://www.jpathinformatics.org/text.asp?2010/1/1/10/65428. 
  16. Devgan, Lara; Powe, Neil; Blakey, Brittony; Makary, Martin (2007-09). "Wiki-Surgery? Internal validity of Wikipedia as a medical and surgical reference". Journal of the American College of Surgeons 205 (3): S76–S77. doi:10.1016/j.jamcollsurg.2007.06.190. https://linkinghub.elsevier.com/retrieve/pii/S1072751507009520. 
  17. Modiri, Omeed; Guha, Daipayan; Alotaibi, Naif M.; Ibrahim, George M.; Lipsman, Nir; Fallah, Aria (2018-03). "Readability and quality of wikipedia pages on neurosurgical topics". Clinical Neurology and Neurosurgery 166: 66–70. doi:10.1016/j.clineuro.2018.01.021. https://linkinghub.elsevier.com/retrieve/pii/S0303846718300271. 
  18. Ayes, K. B.; Bardsley, C. H. (2010). "Wikipedia Information for Toxicologic Emergencies: How Reliable Is It?". Clinical Toxicology 48 (6): 635. doi:10.3109/15563650.2010.493290. ISSN 1556-3650. http://www.tandfonline.com/doi/full/10.3109/15563650.2010.493290. 
  19. Ayes, K. B.; Bardsley, C. H.; Jantz, J. M.; Frederick, W. A. (2011). "Wikipedia Information for Toxicologic Emergencies Involving Household Products, Plants and Envenomations: How reliable is it?". Clinical Toxicology 49 (6): 609. doi:10.3109/15563650.2011.598695. http://www.tandfonline.com/doi/full/10.3109/15563650.2011.598695. 
  20. Messner, Marcus; DiStaso, Marcia W.; Jin, Yan; Meganck, Shana; Sherman, Scott; Norton, Sally (2014). "Influencing public opinion from corn syrup to obesity: A longitudinal analysis of the references for nutritional entries on Wikipedia". First Monday 19 (11). doi:10.5210/fm.v19i11.4823. https://journals.uic.edu/ojs/index.php/fm/article/view/4823. 
  21. Sanz-Valero, J.; Cabrera-Hernández, L.; Wanden-Berghe, C.; Culebras-Fernandez, J.M. (2013-09). "PP188-MON The popularization of food and nutritional sciences: Wikipedia versus a general encyclopedia". Clinical Nutrition 32 (Suppl. 1): S192. doi:10.1016/S0261-5614(13)60499-9. https://linkinghub.elsevier.com/retrieve/pii/S0261561413604999. 
  22. Sanz-Valero, J.; Wanden-Berghe, C.; Guardiola-Wanden-Berghe, R. (2012-09). "PP154-MON Nutrition and metabolism in Wikipedia: Presence and adequacy of English and Spanish terminology". Clinical Nutrition Supplements 7 (1): 198–199. doi:10.1016/S1744-1161(12)70493-3. https://linkinghub.elsevier.com/retrieve/pii/S1744116112704933. 
  23. Koo, Malcolm (2014). "Complementary and Alternative Medicine on Wikipedia: Opportunities for Improvement". Evidence-Based Complementary and Alternative Medicine 2014: 1–4. doi:10.1155/2014/105186. ISSN 1741-427X. PMID 24864148. PMC PMC4016830. http://www.hindawi.com/journals/ecam/2014/105186/. 
  24. Simpson, Andrea; Le, Michelle; Malicka, Alicja N. (2018-10-02). "The Accuracy and Readability of Wikipedia Articles on Hearing Loss". Journal of Consumer Health on the Internet 22 (4): 323–336. doi:10.1080/15398285.2018.1542251. ISSN 1539-8285. https://www.tandfonline.com/doi/full/10.1080/15398285.2018.1542251. 
  25. Reavley, N. J.; Mackinnon, A. J.; Morgan, A. J.; Alvarez-Jimenez, M.; Hetrick, S. E.; Killackey, E.; Nelson, B.; Purcell, R. et al. (2012-08). "Quality of information sources about mental disorders: a comparison of Wikipedia with centrally controlled web and printed sources". Psychological Medicine 42 (8): 1753–1762. doi:10.1017/S003329171100287X. ISSN 0033-2917. https://www.cambridge.org/core/product/identifier/S003329171100287X/type/journal_article. 
  26. Stankus, Tony; Spiegel, Sarah E. (2010-08-31). "Wikipedia, Scholarpedia, and References to Journals in the Brain and Behavioral Sciences: A Comparison of Cited Sources and Recommended Readings in Matching Free Online Encyclopedia Entries". Science & Technology Libraries 29 (3): 258–265. doi:10.1080/0194262X.2010.497711. ISSN 0194-262X. https://doi.org/10.1080/0194262X.2010.497711. 
  27. Herbert, Verena G; Frings, Andreas; Rehatschek, Herwig; Richard, Gisbert; Leithner, Andreas (2015-12). "Wikipedia – challenges and new horizons in enhancing medical education". BMC Medical Education 15 (1): 32. doi:10.1186/s12909-015-0309-2. ISSN 1472-6920. PMID 25879421. PMC PMC4384304. https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-015-0309-2. 
  28. Judd, Terry; Kennedy, Gregor (2011-03). "Expediency-based practice? Medical students' reliance on Google and Wikipedia for biomedical inquiries: Students' reliance on Google and Wikipedia". British Journal of Educational Technology 42 (2): 351–360. doi:10.1111/j.1467-8535.2009.01019.x. http://doi.wiley.com/10.1111/j.1467-8535.2009.01019.x. 
  29. Murray, Heather; Walker, Melanie; Maggio, Lauren; Dawson, Jennifer (2018-06). "24 Wikipedia medical page editing as a platform to teach evidence-based medicine". Oral Sessions 23: A12–A13. doi:10.1136/bmjebm-2018-111024.24. http://ebm.bmj.com/lookup/doi/10.1136/bmjebm-2018-111024.24. 
  30. Azzam, Amin; Bresler, David; Leon, Armando; Maggio, Lauren; Whitaker, Evans; Heilman, James; Orlowitz, Jake; Swisher, Valerie et al. (2017-02). "Why Medical Schools Should Embrace Wikipedia: Final-Year Medical Student Contributions to Wikipedia Articles for Academic Credit at One School". Academic Medicine 92 (2): 194–200. doi:10.1097/ACM.0000000000001381. ISSN 1040-2446. PMID 27627633. PMC PMC5265689. http://journals.lww.com/00001888-201702000-00022. 
  31. Schweitzer, N. J. (2008-04-15). "Wikipedia and Psychology: Coverage of Concepts and Its Use by Undergraduate Students". Teaching of Psychology 35 (2): 81–85. doi:10.1080/00986280802004594. ISSN 0098-6283. http://www.tandfonline.com/doi/abs/10.1080/00986280802004594. 
  32. Haigh, Carol A. (2011-02). "Wikipedia as an evidence source for nursing and healthcare students". Nurse Education Today 31 (2): 135–139. doi:10.1016/j.nedt.2010.05.004. https://linkinghub.elsevier.com/retrieve/pii/S0260691710000924. 
  33. Azer, Samy A. (2014-02). "Evaluation of gastroenterology and hepatology articles on Wikipedia: Are they suitable as learning resources for medical students?". European Journal of Gastroenterology & Hepatology 26 (2): 155–163. doi:10.1097/MEG.0000000000000003. ISSN 0954-691X. http://journals.lww.com/00042737-201402000-00004. 
  34. Azer, Samy A.; AlSwaidan, Nourah M.; Alshwairikh, Lama A.; AlShammari, Jumana M. (2015-10). "Accuracy and readability of cardiovascular entries on Wikipedia: are they reliable learning resources for medical students?". BMJ Open 5 (10): e008187. doi:10.1136/bmjopen-2015-008187. ISSN 2044-6055. PMID 26443650. PMC PMC4606442. http://bmjopen.bmj.com/lookup/doi/10.1136/bmjopen-2015-008187. 
  35. Antivalle, M.; Battellino, M.; Ditto, M.C.; Varisco, V.; Chevallard, M.; Rigamonti, F.; Batticciotto, A.; Atzeni, F. et al. (2014-06). "SAT0585 Evaluation of Wikipedia Rheumatology Articles as A Learning Resource for Medical Students". Annals of the Rheumatic Diseases 73 (Suppl. 2): 801.3–802. doi:10.1136/annrheumdis-2014-eular.5610. ISSN 0003-4967. http://ard.bmj.com/lookup/doi/10.1136/annrheumdis-2014-eular.5610. 
  36. Jetty, Prasad; Yacob, Michael M.; Lotfi, Shamim (2014-11). "The Wikipedia Medical Student: Comparing the Quality of Vascular Surgery Topics Across Two Commonly Used Educational Resources". Journal of Vascular Surgery 60 (5): 1404. doi:10.1016/j.jvs.2014.08.030. https://linkinghub.elsevier.com/retrieve/pii/S0741521414015559. 
  37. Azer, Samy A. (2015-03). "Is Wikipedia a reliable learning resource for medical students? Evaluating respiratory topics". Advances in Physiology Education 39 (1): 5–14. doi:10.1152/advan.00110.2014. ISSN 1043-4046. https://www.physiology.org/doi/10.1152/advan.00110.2014. 
  38. Park, Elizabeth; Masupe, Tiny; Joseph, Joseph; Ho-Foster, Ari; Chavez, Afton; Jammalamadugu, Swetha; Marek, Andrew; Arumala, Ruth et al. (2016-11). "Information needs of Botswana health care workers and perceptions of wikipedia". International Journal of Medical Informatics 95: 8–16. doi:10.1016/j.ijmedinf.2016.07.013. https://linkinghub.elsevier.com/retrieve/pii/S1386505616301745. 
  39. "Wikipedia Zero". Wikimedia Foundation Governance Wiki. Wikimedia Foundation. 2018-10-05. Retrieved 2020-09-23.
  40. Devgan, Lara; Powe, Neil; Blakey, Brittony; Makary, Martin (2007-09). "Wiki-Surgery? Internal validity of Wikipedia as a medical and surgical reference". Journal of the American College of Surgeons 205 (3): S76–S77. doi:10.1016/j.jamcollsurg.2007.06.190. ISSN 1072-7515. https://doi.org/10.1016/j.jamcollsurg.2007.06.190. 
  41. "Welcome from the Chair". Department of Medicine. McMaster University. Archived from the original on 2020-06-29. Retrieved 2020-09-09.
  42. "Welcome to the Department of Medicine, University of Toronto". Department of Medicine. University of Toronto. 2016-03-04. Retrieved 2020-09-09.
  43. "Facts and Figures". Schulich School of Medicine & Dentistry. Western University. Retrieved 2020-09-09.
  44. "Faculty of Medicine". UBC Research + Innovation. University of British Columbia. Archived from the original on 2020-04-14. Retrieved 2019-12-13.

Appendices

Appendix A. Modified DISCERN Instrument for article evaluation

Are the aims and objectives of the topic clearly stated at the beginning of the article?

No Partially Yes
1 2 3 4 5

Does the article cover the needed subtitles and key concepts related to the topic?

No Partially Yes
1 2 3 4 5

Is the information provided throughout the article scientifically correct and in agreement with current valid resources and textbooks?

No Partially Yes
1 2 3 4 5

Is the article neutral and not based on personal views?

No Partially Yes
1 2 3 4 5

Is the article balanced and unbiased?

No Partially Yes
1 2 3 4 5

Is it clear what sources of information were used to compile the publication (references, links to professional web sites)?

No Partially Yes
1 2 3 4 5

Has the article been regularly updated and amended?

No Partially Yes
1 2 3 4 5

Are there key areas in the article that are complete and do not need further additions?

No Partially Yes
1 2 3 4 5

Do the images, figures, and tables provided in the article support the information given and enhance the understanding of points raised?

No Partially Yes
1 2 3 4 5

What is your overall rating of the whole article as a source of information to medical students?

Serious or extensive shortcomings Potentially important but not serious shortcomings Minimal shortcomings
1 2 3 4 5

Appendix B. Pre-participation survey

(1) What is your current level of practice?

(a) Undergraduate medical student (select this option if you are a current student in an MD program, or an MD graduate who has not yet passed your license exam)

(b) Resident

(c) MD, post-residency

(d) MD + additional education (e.g. PhD, Masters)

(e) Other (please specify): _____________

(2) If applicable, what is your specialty (in practice or research or both): _________________________

Appendix C. Post-participation questionnaire

To be completed after you have finished the DISCERN instrument:

(1) Were either of the two articles you read familiar to you in any way? (Yes/No)

If you answered “No” to question one (1) above, you may return the questionnaire to the investigator now.

If you answered “Yes”, please answer the two questions below.

(2) If you answered yes to question one (1) above, which article(s) were familiar to you with respect to their content:

  • Both articles
  • The Wikipedia article only
  • The BMJ article only

(3) Which article did you read first (e.g. Wikipedia or BMJ)?