Research article / Open Access
DOI: 10.31488/bjcr.1000105
Readability of online keratinocyte carcinoma patient education materials
Nishita Maganty1, Muneeb Ilyas1, Nan Zhang2, and Amit Sharma*1
1. Department of Dermatology, Mayo Clinic Arizona, 13400 E. Shea Blvd. Scottsdale, AZ 85259, USA
2. Department of Biostatistics, Mayo Clinic Arizona, USA
*Corresponding author: Nishita Maganty, Department of Dermatology, Mayo Clinic Arizona, 13400 E. Shea Blvd. Scottsdale, AZ 85259, USA, Tel: (480)-292-0662;
Abstract
Objective: Keratinocyte carcinoma (KC) is the most common cancer in the United States. A substantial portion of the US population relies on internet-based content for health information. KC online education material should therefore be produced at a reading level that is comprehensible to laypersons. The aim of this study is to evaluate the readability and accessibility of currently available online information on keratinocyte carcinoma. Design/Setting/Methods: Online searches for “basal cell carcinoma” and “squamous cell carcinoma of skin” were performed using Google. The top ten relevant websites for each topic were evaluated. Readability of KC educational material was assessed using eight well-established tests. Additionally, for each website, the number of images, number of advertisements, mobile-friendliness, and translatability were analyzed. Results: A total of 77 and 66 articles were identified for basal cell carcinoma and squamous cell carcinoma, respectively. The average grade-reading level of the websites was 12.01 for basal cell carcinoma and 12.38 for squamous cell carcinoma. Conclusion: Online patient education material for KC exceeds the sixth-grade reading level recommended by the American Medical Association and the National Institutes of Health and may be too difficult for most patients to comprehend. KC online education material should be modified to be easily comprehensible by the general population.
Introduction
Keratinocyte carcinoma (KC) is the most common malignancy in the United States and is associated with significant morbidity and health care expenditures [1-3]. Despite increased awareness of the harmful effects of ultraviolet light exposure, the incidence of KC has increased by 3-8% annually [3,4]. To reduce disease-associated morbidity and mortality, it is imperative that patients are appropriately educated on early detection methods, modifiable risk factors, and treatment options for skin cancer.
It is estimated that 50-80% of Americans use the internet to seek out health care information, and many report that they do so frequently [5,6]. Online health information is inexpensive and enables patients to make health care decisions [7,8]. Given that the majority of Americans use the internet to obtain health care information [5,6], it is critical that online information is accurate and understandable to laypersons. Yet, patient education materials are often written in a manner too complex for the average patient to understand [9,10].
The average American adult reads at a seventh- to eighth-grade level, and material written above this level is considered difficult to read [11,12]. One study [11] examined patient education material for five health-related causes of death and found that 75% of the online information was above a ninth-grade reading level. This suggests that online patient education material may be too difficult for the average American to read and understand. The American Medical Association (AMA) and National Institutes of Health (NIH) recommend that patient education reading material be written at a sixth-grade reading level or below [13]. Grade reading level can be assessed using readability tests. Commonly utilized tests and the variables they assess are outlined in Table 1 [14-21].
Given the increasing incidence of KC, it is essential that the readability of online patient education material is appropriate. Furthermore, factors such as mobile-friendliness, translatability, and commercial advertising have not been analyzed in prior evaluations of online skin cancer-related patient education material. The primary aim of this study is to assess the readability of the most commonly used online patient education resources for KC. The secondary aim is to evaluate the number of images, mobile-friendliness, translatability, and advertising found on online KC patient education resources.
Table 1. Readability Analysis Tests.
Name | Description | Variables Assessed |
---|---|---|
Bormuth Cloze Mean | Removes selected words from sample and asks readers to fill in missing words. Score of 60% or better indicates text is at independent reading level. | Sentence complexity |
Coleman-Liau | Used to gauge readability of text. Designed to be easily calculated mechanically from samples of hard-copy text and outputs a US grade reading level. | Word length, Sentence length |
Flesch-Kincaid | Predicts the grade level necessary to understand text using word length and sentence length. Commonly used in the field of education. | Word length, Sentence length |
Flesch Reading Ease | Calculates ease of reading text with scores from 1 to 100. The higher the score, the easier the text is to read. Score of 60 to 70 is considered plain English. | Syllables per word, Sentence length |
Gunning Fog | Employs sentence length and the proportion of polysyllabic words to determine US grade level. | Sentence length, Proportion of polysyllabic words |
New Dale-Chall | Calculates US grade level based on sentence length and “hard” words, which are words that do not appear on a specially designed list of common words familiar to most 4th-grade students. | Sentence length, “Hard” words |
New Fog Count | Measures reading ease and calculates US grade level by counting easy words (one or two syllables) once and hard words (three or more syllables) three times, relative to the number of sentences. | Sentence length, Words with three or more syllables |
Raygor Estimate | Calculates the U.S. grade level by counting the average number of sentences and letters per 100 words and plotting them on a graph. The intersection of the two variables determines the reading level of the text. | Number of sentences, Letters per 100 words |
SMOG | Predicts US grade level based on the number of polysyllabic words and the number of sentences. | Number of sentences, Number of polysyllabic words |
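For reference, the published formulas underlying two of the grade-level tests in Table 1, the Flesch-Kincaid Grade Level and SMOG, are shown below. These standard published forms are included here only for illustration; they are not reproduced in the original table.

```latex
% Flesch-Kincaid Grade Level (outputs a US grade level)
\mathrm{FKGL} = 0.39\left(\frac{\text{total words}}{\text{total sentences}}\right)
              + 11.8\left(\frac{\text{total syllables}}{\text{total words}}\right) - 15.59

% SMOG grade (outputs a US grade level)
\mathrm{SMOG} = 1.0430\sqrt{\text{polysyllable count}\times\frac{30}{\text{sentence count}}} + 3.1291
```

Both formulas output a US grade level directly, which is why scores above roughly 6 indicate text exceeding the AMA/NIH recommendation.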
Methods
The study was approved by the Institutional Review Board of Mayo Clinic. The online patient education materials for KC were collected by performing an internet search on Google (www.google.com). Location services were disabled and user data was erased prior to beginning the search. Searches for the terms ‘basal cell carcinoma (BCC)’ and ‘squamous cell carcinoma (SCC) of the skin’ were conducted on March 19, 2017. The top ten websites returned for each search were compiled. Sponsored sites were excluded to eliminate bias.
Within each website, a variety of topics including detection, treatment, and management were found. An article was defined as relevant material that was accessible within one click from the homepage. The number of images on each webpage of each website was recorded. Advertisements on each webpage were counted and categorized as pharmaceutical advertisements, cosmetic advertisements, or “other”. Mobile-friendliness was assessed using the Nibbler tool (http://nibbler.silktide.com). A website with a Nibbler score above 7 out of 10 can be accessed and viewed on a mobile device and was considered mobile-friendly. Translatability was determined by whether the website had an option to translate pages using Google Translate or other translation services.
Readability analysis was performed using Readability Studio Professional Edition v2012.0 (Oleander Software Ltd, Vandalia, OH). Of the 15 default readability tests proposed by the software, we excluded redundant tests and tests that assessed similar variables. Eight well-established tests fit our inclusion criteria and were used to evaluate the readability of the online articles: Bormuth Cloze Mean, Coleman-Liau Index, Flesch-Kincaid Grade Level, Gunning Fog Index, New Dale-Chall, New Fog Count, Raygor Estimate, and SMOG. Each of these eight tests assesses different qualities of readability, providing a comprehensive analysis.
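Although the analysis relied on commercial software, grade-level estimates of this kind can be approximated with a short script. The sketch below is illustrative only and is not the pipeline used in this study: it implements the published Coleman-Liau and Gunning Fog formulas with a naive vowel-group syllable heuristic, so its output will differ slightly from dictionary-based tools such as Readability Studio.

```python
import re

def coleman_liau(text: str) -> float:
    """Coleman-Liau index: 0.0588*L - 0.296*S - 15.8,
    where L = letters per 100 words and S = sentences per 100 words."""
    words = re.findall(r"[A-Za-z]+", text)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    letters = sum(len(w) for w in words)
    n = max(1, len(words))
    return 0.0588 * (letters / n * 100) - 0.296 * (sentences / n * 100) - 15.8

def count_syllables(word: str) -> int:
    """Very rough heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def gunning_fog(text: str) -> float:
    """Gunning Fog: 0.4 * (words/sentences + 100 * complex_words/words),
    where complex words have three or more syllables."""
    words = re.findall(r"[A-Za-z]+", text)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    n = max(1, len(words))
    complex_words = sum(1 for w in words if count_syllables(w) >= 3)
    return 0.4 * (n / sentences + 100 * complex_words / n)

sample = ("Basal cell carcinoma is the most common type of skin cancer. "
          "It rarely spreads, but early treatment prevents local damage.")
print(f"Coleman-Liau grade: {coleman_liau(sample):.1f}")
print(f"Gunning Fog grade:  {gunning_fog(sample):.1f}")
```

Dictionary-based syllable counting and more careful sentence segmentation improve accuracy, but the structure of the calculation is the same.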
Statistical analysis
Each website with more than one webpage was analyzed using a one-sided, one-sample t-test. A p-value < 0.05 was deemed statistically significant. The one-sided, one-sample t-test was used to determine if the Bormuth Cloze Mean score was significantly lower than 60%. For the remaining readability tests, the one-sided, one-sample t-test was used to determine if the scores were significantly greater than 6, the recommended reading grade level for health information. The Kruskal-Wallis test was used to determine if the scores for a given readability test differed between the different websites.
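The same hypothesis tests can be expressed with standard open-source statistical software. The sketch below uses SciPy (version 1.6 or later, for the `alternative` argument to `ttest_1samp`) on illustrative per-page scores rather than the study data:

```python
import numpy as np
from scipy import stats

# Illustrative per-page grade levels for two hypothetical websites
# (made-up values, not the study data).
site_a = np.array([9.9, 10.3, 9.6, 10.1, 9.8])
site_b = np.array([15.8, 16.4, 17.0, 16.1])

# One-sided, one-sample t-test: is the mean grade level significantly
# greater than the recommended sixth-grade level?
t_a = stats.ttest_1samp(site_a, popmean=6, alternative="greater")
t_b = stats.ttest_1samp(site_b, popmean=6, alternative="greater")
print(f"Site A: t = {t_a.statistic:.2f}, p = {t_a.pvalue:.4f}")
print(f"Site B: t = {t_b.statistic:.2f}, p = {t_b.pvalue:.4f}")

# For the Bormuth Cloze Mean, the alternative is "less" against a cutoff of 60.
cloze_a = np.array([39.8, 41.2, 38.5, 40.1])
print(f"Cloze vs 60: p = {stats.ttest_1samp(cloze_a, popmean=60, alternative='less').pvalue:.4f}")

# Kruskal-Wallis test: do scores for a given readability test differ
# between websites?
h, p = stats.kruskal(site_a, site_b)
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.4f}")
```

In this framing, the one-sided test against 6 corresponds to the Pvalue2 entries in Tables 2 and 4, the test against 60 corresponds to Pvalue1, and the Kruskal-Wallis p-value corresponds to Pvalue3.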
Results
Basal cell carcinoma
The ten most popular websites for BCC included a total of 77 articles. The mean grade-reading level for all 77 articles was 12.01, which is above the reading level of the average US adult. The eight readability test results are summarized in Table 2. None of the websites achieved a Bormuth Cloze score at or above 60%, indicating the text on all the websites was too difficult to read. The seven readability tests that assess grade-reading level had mean scores that ranged from 9.4 to 13.3. Additionally, every website had a SMOG score greater than 9 and Coleman-Liau and Gunning Fog scores greater than 8.
Five of the ten websites had more than one webpage, allowing for statistical analysis. None of these five websites met our readability criteria on any of the eight tests, with the exception of the American Academy of Dermatology website on the New Fog Count test.
As shown in Table 3, the total number of images per webpage ranged from 0 to 19. Seventy percent of websites had fewer than two images per webpage. The total number of advertisements per webpage ranged from 0 to 5, with an average of 1.7 advertisements per page. Fifty percent of websites had two or more advertisements per webpage. Of the ten websites, only skincancer.org was not mobile-friendly. Three of the ten websites did not offer an option to translate text into another language.
Table 2. Summary of Readability Scores with Statistical Analysis for BCC.
Information | AAD | ACS | Cancer Center | Mayo | MedNet | Medline | Medscape | SCF | WebMD | Wikipedia | Pvalue3 |
---|---|---|---|---|---|---|---|---|---|---|---|
Total pages | 5 | 13 | 1 | 4 | 1 | 1 | 29 | 5 | 1 | 1 | |
Bormuth Cloze Mean; Mean(SD), Pvalue1 | 39.80 (1.92), <.0001 | 36.62 (3.84), <.0001 | 32 | 32.25 (2.87), 0.0002 | 27 | 39 | 23.86 (2.42), <.0001 | 30.80 (3.49), <.0001 | 40 | 24 | <.0001 |
Coleman Liau; Mean(SD), Pvalue2 | 8.88 (1.36), 0.0046 | 9.32 (1.45), <.0001 | 9.5 | 11.08 (1.18), 0.0016 | 14.3 | 8.8 | 14.88 (1.33), <.0001 | 11.60 (1.13), 0.0002 | 8.1 | 14.1 | <.0001 |
Flesch Kincaid; Mean(SD), Pvalue2 | 7.32 (0.34), 0.0005 | 8.65 (1.56), <.0001 | 12.1 | 10.13 (0.89), 0.0013 | 11.5 | 7.6 | 15.01 (1.15), <.0001 | 11.52 (2.33), 0.0031 | 7.1 | 15.5 | <.0001 |
Gunning Fog; Mean(SD), Pvalue2 | 8.36 (0.84), 0.0016 | 9.75 (1.72), <.0001 | 9.8 | 12.60 (0.97), 0.0004 | 11.7 | 8.3 | 17.22 (1.64), <.0001 | 12.88 (1.66), 0.0004 | 8.5 | 17 | <.0001 |
New Dale Chall; Mean(SD), Pvalue2 | 7.50 (1.41), 0.0383 | 9.04 (1.66), <.0001 | 11.5 | 11.63 (1.84), 0.0044 | 14 | 7.5 | 15.66 (0.77), <.0001 | 11.80 (3.09), 0.0069 | 7.5 | 16 | <.0001 |
New Fog Count; Mean(SD), Pvalue2 | 5.08 (0.75), 0.9745 | 8.23 (1.90), 0.0006 | 8.1 | 9.00 (2.05), 0.0306 | 5.2 | 5.7 | 12.48 (1.90), <.0001 | 10.12 (2.09), 0.0058 | 6.3 | 13.7 | <.0001 |
Raygor Estimate; Mean(SD), Pvalue2 | 6.75 (0.50), 0.0288 | 8.46 (1.81), 0.0002 | 11 | 10.33 (0.58), 0.0029 | NE | 8 | 16.03 (1.74), <.0001 | 13.00 (2.35), 0.0013 | 7 | 17 | <.0001 |
SMOG; Mean(SD), Pvalue2 | 9.94 (0.34), <.0001 | 10.28(1.33), <.0001 | 11.2 | 12.18 (0.34), <.0001 | 12.6 | 9.8 | 16.33 (1.01), <.0001 | 13.12 (1.52), 0.0002 | 9.5 | 16.4 | <.0001 |
Pvalue1 is from a one-sided, one-sample t-test of whether the Bormuth Cloze Mean score is significantly different from 60 (cut-off score for easy reading level; the alternative hypothesis is that the score is lower than 60) for each web source. Pvalue2 is from a one-sided, one-sample t-test of whether the scores are significantly different from 6 (cut-off grade for easy reading level; the alternative hypothesis is that the score is higher than 6) for each web source. The Kruskal-Wallis test was used to test whether the scores differ significantly among the different web sources, and the p-value for each score is summarized in Pvalue3. When only one page describing BCC was available from a web source, only the scores are displayed in Table 2; standard deviation and the one-sample t-test cannot be computed in these cases.
Table 3. Images, Advertisements, Mobile-Friendliness, and Translatability for each BCC website
Parent Website | Organization | Articles | Total Images | Images per Page | Total Advertisements | Advertisements per Page | Mobile-Friendly | Translatable |
---|---|---|---|---|---|---|---|---|
http://www.skincancer.org/ | Skin Cancer Foundation | 6 | 14 | 2.33 | 13 | 2.17 | N | Y |
http://www.webmd.com/ | WebMD | 4 | 0 | 0 | 20 | 5 | Y | N |
http://www.mayoclinic.org/ | The Mayo Clinic | 5 | 3 | 0.6 | 10 | 2 | Y | Y |
http://www.medicinenet.com/ | MedicineNet | 5 | 9 | 1.8 | 15 | 3 | Y | N |
https://en.wikipedia.org/ | Wikipedia | 1 | 11 | 11 | 0 | 0 | Y | Y |
http://emedicine.medscape.com/ | Medscape | 36 | 29 | 0.81 | 0 | 0 | Y | Y |
https://patient.info/ | Patient Platform Limited | 1 | 1 | 1 | 5 | 5 | Y | Y |
https://www.cancer.org/ | American Cancer Society | 13 | 1 | 0.08 | 0 | 0 | Y | Y |
https://www.aad.org/ | American Academy of Dermatology | 5 | 8 | 1.6 | 0 | 0 | Y | N |
http://www.dermnetnz.org/ | DermNet NZ | 1 | 19 | 19 | 0 | 0 | Y | Y |
Squamous cell carcinoma
The ten most popular websites for SCC included a total of 66 articles. The mean grade-reading level for all 66 articles was 12.38, which is well above the reading level of the average US adult. The eight readability test results are summarized in Table 4. All of the websites had Bormuth Cloze means well below the 60% threshold, indicating the text is difficult to read. The readability tests that assess grade-reading level had mean scores ranging from 10.1 to 13.6. All of the websites had Coleman-Liau and Gunning Fog reading levels greater than 8 and SMOG scores greater than 9. Six of the ten websites had more than one webpage, allowing for statistical analysis. Four of the six websites did not meet our readability criteria on any of the eight tests, although the American Academy of Dermatology and WebMD websites met our readability criteria on a limited number of tests.
Table 4. Summary of Readability Scores with Statistical Analysis for SCC.
Information | AAD | ACS | DermNet NZ | Mayo | MedNet | Medscape | Patient.info | SCF | WebMD | Wikipedia | Pvalue3 |
---|---|---|---|---|---|---|---|---|---|---|---|
Total pages | 5 | 13 | 1 | 5 | 1 | 36 | 1 | 6 | 4 | 1 | |
Bormuth Cloze Mean; Mean(SD), Pvalue1 | 40.20 (2.39), <.0001 | 36.62 (3.84), <.0001 | 30 | 36.00(4.18), 0.0001 | 33 | 24.00(4.44), <.0001 | 31 | 33.67(4.55), <.0001 | 42.50 (5.69), 0.0043 | 24 | <.0001 |
Coleman Liau; Mean(SD), Pvalue2 | 8.64 (1.15), 0.0034 | 9.32 (1.45), <.0001 | 13.2 | 9.94 (1.59), 0.0026 | 11.5 | 14.99 (2.34), <.0001 | 12.6 | 10.43 (1.56), 0.0005 | 8.08 (2.87), 0.1223 | 13.5 | <.0001 |
Flesch Kincaid; Mean(SD), Pvalue2 | 7.18 (0.87), 0.0193 | 8.65 (1.56), <.0001 | 11.1 | 8.86 (1.38), 0.0049 | 9.7 | 15.27 (1.97), <.0001 | 11.5 | 10.10(2.05), 0.0022 | 7.15 (3.81), 0.2942 | 16.5 | <.0001 |
Gunning Fog; Mean(SD), Pvalue2 | 8.60 (0.80), 0.0010 | 9.75 (1.72), <.0001 | 13.1 | 10.46 (1.21), 0.0006 | 11.3 | 16.22 (1.99), <.0001 | 13.4 | 11.37(2.84), 0.0029 | 8.70 (3.32), 0.1011 | 16.2 | <.0001 |
New Dale Chall; Mean(SD), Pvalue2 | 7.30 (1.48), 0.0608 | 9.05 (1.65), <.0001 | 14 | 9.90 (2.19), 0.0082 | 14 | 15.06 (1.32), <.0001 | 11.5 | 10.58(2.25), 0.0021 | 6.00 (1.00), 0.5000 | 16 | <.0001 |
New Fog Count; Mean(SD), Pvalue2 | 5.42 (1.07), 0.8537 | 8.23 (1.90), 0.0006 | 6.4 | 7.18 (0.65), 0.0078 | 5.6 | 11.48 (2.85), <.0001 | 7.7 | 8.62 (2.43), 0.0231 | 7.03 (3.57), 0.3031 | 12.9 | <.0001 |
Raygor Estimate; Mean(SD), Pvalue2 | 6.80 (0.84), 0.0497 | 8.46 (1.81), 0.0002 | NE | 9.00 (2.00), 0.0142 | NE | 15.45 (2.25), <.0001 | NE | 10.20 (2.17), 0.0062 | 6.75 (3.50), 0.3486 | 17 | <.0001 |
SMOG; Mean(SD), Pvalue2 | 9.92 (0.58), <.0001 | 10.28 (1.33), <.0001 | 12.7 | 11.20 (1.02), 0.0002 | 11.4 | 16.09 (1.43), <.0001 | 13.2 | 12.13 (1.76), 0.0002 | 9.48 (2.77), 0.0436 | 17.3 | <.0001 |
Pvalue1 is from a one-sided, one-sample t-test of whether the Bormuth Cloze Mean score is significantly different from 60 (cut-off score for easy reading level; the alternative hypothesis is that the score is lower than 60) for each web source. Pvalue2 is from a one-sided, one-sample t-test of whether the scores are significantly different from 6 (cut-off grade for easy reading level; the alternative hypothesis is that the score is higher than 6) for each web source. The Kruskal-Wallis test was used to test whether the scores differ significantly among the different web sources, and the p-value for each score is summarized in Pvalue3. When only one page describing SCC was available from a web source, only the scores are displayed in Table 4; standard deviation and the one-sample t-test cannot be computed in these cases.
Table 5. Images, Advertisements, Mobile-Friendliness, and Translatability for each SCC website.
Website | Organization | Articles | Total Images | Images per Page | Total Advertisements | Advertisements per Page | Mobile-Friendly | Translatable |
---|---|---|---|---|---|---|---|---|
http://www.skincancer.org/ | Skin Cancer Foundation | 5 | 7 | 1.4 | 0 | 0 | N | Y |
http://www.mayoclinic.org/ | The Mayo Clinic | 4 | 5 | 1.25 | 8 | 2 | Y | Y |
https://www.cancer.org | American Cancer Society | 13 | 1 | 0.08 | 0 | 0 | Y | Y |
http://www.webmd.com/ | WebMD | 2 | 0 | 0 | 8 | 4 | Y | N |
http://www.medicinenet.com | MedicineNet | 4 | 4 | 1 | 12 | 3 | Y | N |
https://en.wikipedia.org/ | Wikipedia | 1 | 11 | 11 | 0 | 0 | Y | Y |
http://www.cancercenter.com/ | Cancer Treatment Centers of America | 1 | 0 | 0 | 0 | 0 | Y | N |
https://www.aad.org | American Academy of Dermatology | 5 | 9 | 1.8 | 0 | 0 | Y | N |
http://emedicine.medscape.com/ | Medscape | 30 | 15 | 0.5 | 9 | 0.3 | Y | Y |
https://medlineplus.gov/ | MedlinePlus | 1 | 8 | 8 | 0 | 0 | Y | Y (Spanish only) |
As shown in Table 5, the total number of images per webpage ranged from 0 to 11 with an average of 2.5 images per webpage. Only 20% of websites had two or more images per webpage. The total number of advertisements per webpage ranged from 0 to 4, with webmd.com having the highest number. Three websites had two or more advertisements per page. Of the ten websites, only skincancer.org was not mobile-friendly. Four of the ten websites did not offer an option to translate text into another language.
Discussion and Conclusion
In this study, the average reading level for the ten most popular BCC and SCC websites was above the twelfth grade. This is significantly higher than the sixth-grade reading level recommended by the NIH and AMA. Additionally, nearly half of the websites did not provide translation services for another language. As one in five US residents speaks a language other than English at home [22], it is important to include translation options so these individuals have equal access to health information. This discrepancy could result in a lower level of medical comprehension among non-English-speaking patients and, thus, unequal access to health care-related information.
For BCC, we found that 70% of websites had fewer than two images per webpage. For SCC, 80% of websites had fewer than two images per webpage. Images that are linked closely with text increase attention to and recall of health information when compared to text alone [23]. Additionally, patients with low literacy skills are more likely to comprehend health care information when provided with both text and images [23]. Using images in patient education material can therefore improve health outcomes for patients with limited health literacy.
Finally, advertisements on health education webpages are often distracting and can detract from patient education. In the BCC website search, 50% of websites had two or more advertisements per webpage. For SCC, 30% of sites had two or more advertisements per webpage. The presence of advertisements online can increase cognitive workload, which leads users to experience the ads as intrusive and distracting [24]. Advertisements located close to the main body of text lead to increased disruption of reading and attention [25]. Furthermore, patients may mistrust websites with advertisements [26]. These findings suggest that advertisements on a health education webpage should be used sparingly and placed away from the main body of text.
This study demonstrates that KC online patient education material exceeds the AMA- and NIH-recommended sixth-grade reading level and is too difficult for many readers. Additionally, features such as images, translatability, and mobile-friendliness can be improved to increase reading comprehension among laypersons.
To provide guidance for future patient education website design, we summarize seven best practices for designers and educators in Table 6.
Table 6. Best practices for producing online patient education resources.
Best Practices for Making a Patient Education Website |
---|
1. Text should be written at a sixth-grade reading level and evaluated using the Bormuth Cloze and SMOG readability tests |
2. Readability level should be provided on the website |
3. Translation services should be available |
4. Multiple images per webpage are beneficial |
5. Reduce number of advertisements |
6. Ensure mobile device accessibility |
7. Text should include definitions or glossaries when appropriate |
The best practices for making a patient education website on KC include writing the text at the AMA- and NIH-recommended sixth-grade level. In addition, the readability of the text should be analyzed using readability software to confirm that the text is at the target level. We feel that the Bormuth Cloze Mean and SMOG tests are best suited as tools to analyze the readability of online health care information. This is evidenced in our study, as both tests were consistent in their scoring of different websites. Additionally, the Bormuth Cloze Mean test focuses on sentence complexity, while SMOG emphasizes sentence number and structure. When developing a patient education website, these two tests should be utilized, as together they provide a strong global assessment of readability. This helps ensure that laypersons can read and comprehend the text, leading to improved health literacy. To increase readability for non-English speakers, a translation service with a diverse set of languages should be linked to the website. Increasing the number of relevant images and reducing the number of advertisements can also improve reading comprehension. As nearly two-thirds of Americans own a smartphone, and 19% of Americans rely on a smartphone for accessing online services, patient education websites should be easily viewable on a mobile device. Finally, providing definitions or glossaries can increase reading comprehension [27]. By following these recommendations, we are confident that patients will have improved access to and understanding of KC education material.
Conflict of Interest Disclosure
None
Funding Source
Haub Family Career Development Award
Acknowledgements
Haub Family Career Development Award.
References
- Barton V, Armeson K, Hampras S, et al. Nonmelanoma skin cancer and risk of all-cause and cancer-related mortality: a systematic review. Arch Dermatol Res. 2017; 309: 243-251.
- Karimkhani C, Boyers LN, Dellavalle RP, et al. It’s time for “keratinocyte carcinoma” to replace the term “nonmelanoma skin cancer”. J Am Acad Dermatol. 2015; 72: 186-187.
- Madan V, Lear JT, Szeimies RM. Non-melanoma skin cancer. Lancet. 2010; 375: 673-685.
- Rogers HW, Weinstock MA, Feldman SR, et al. Incidence Estimate of Nonmelanoma Skin Cancer (Keratinocyte Carcinomas) in the U.S. Population, 2012. JAMA Dermatol. 2015; 151: 1081-1086.
- Fox S, Rainie L, Horrigan J. The Online Health Care Revolution: How the Web Helps Americans Take Better Care of Themselves. 2000.
- Baker L, Wagner TH, Singer S, et al. Use of the Internet and e-mail for health care information: results from a national survey. JAMA. 2003; 289: 2400-2406.
- Mathur S, Shanti N, Brkaric M, et al. Surfing for scoliosis: the quality of information available on the Internet. Spine (Phila Pa 1976). 2005; 30: 2695-2700.
- Storino A, Castillo-Angeles M, Watkins AA, et al. Assessing the Accuracy and Readability of Online Health Information for Patients With Pancreatic Cancer. JAMA Surg. 2016; 151: 831-837.
- Wallace LS, Lennon ES. American Academy of Family Physicians patient education materials: can patients read them? Fam Med. 2004; 36: 571-574.
- Schoof ML, Wallace LS. Readability of American Academy of Family Physicians patient education materials. Fam Med. 2014; 46: 291-293.
- Walsh TM, Volsko TA. Readability assessment of internet-based consumer health information. Respir Care. 2008; 53: 1310-1315.
- AHRQ. Tip 6. Be Cautious About Using Readability Formulas. 2015.
- Eltorai AE, Han A, Truntzer J, et al. Readability of patient education materials on the American Orthopaedic Society for Sports Medicine website. Phys Sportsmed. 2014; 42: 125-130.
- Flesch R. A new readability yardstick. J Appl Psychol. 1948; 32: 221-233.
- Coleman M, Liau TL. A computer readability formula designed for machine scoring. J Appl Psychol. 1975; 60: 283.
- Svider PF, Agarwal N, Choudhry OJ, et al. Readability assessment of online patient education materials from academic otolaryngology-head and neck surgery departments. Am J Otolaryngol. 2013; 34: 31-35.
- Chall JS, Dale E. Readability revisited: The new Dale–Chall readability formula. Brookline Books. 1995.
- Kincaid JP, Fishburne Jr RP, Rogers RL, et al. Derivation of new readability formulas (automated readability index, fog count and flesch reading ease formula) for navy enlisted personnel. Naval Technical Training Command Millington TN Research Branch. 1975.
- Baldwin RS, Kaufman RK. A concurrent validity study of the Raygor readability estimate. J Reading. 1979; 23: 148-153.
- Zheng J, Yu H. Readability Formulas and User Perceptions of Electronic Health Records Difficulty: A Corpus Study. J Med Internet Res. 2017; 19: e59.
- McKenna MC, Stahl KAD. Assessment for reading instruction: Guilford Publications. 2015.
- CIS. One in Five U.S. Residents Speaks Foreign Language at Home, Record 61.8 million. 2014.
- Houts PS, Doak CC, Doak LG, et al. The role of pictures in improving health communication: a review of research on attention, comprehension, recall, and adherence. Patient Educ Couns. 2006; 61: 173-190.
- Burke M, Hornof A, Nilsen E, et al. High-cost banner blindness: Ads increase perceived workload, hinder visual search, and are forgotten. Transactions on Computer-Human Interaction. 2005; 12: 423–445.
- Simola J, Kuisma J, Oorni A, et al. The impact of salient advertisements on reading and attention on web pages. J Exp Psychol Appl. 2011; 17: 174-190.
- Sillence E, Briggs P, Harris PR, et al. How do patients evaluate and make use of online health information? Soc Sci Med. 2007; 64: 1853-1862.
- Cole R. The understanding of medical terminology used in printed health education materials. Health Educ J. 1979; 38: 111-121.
Received: June 10, 2018;
Accepted: June 25, 2018;
Published: June 29, 2018.
To cite this article: Maganty N, Ilyas M, Zhang N, Sharma A. Readability of online keratinocyte carcinoma patient education materials. British Journal of Cancer Research. 2018; 1:2.
© Maganty N, et al. 2018.