Addressing and evaluating health literacy in mHealth: a scoping review
Introduction
The United States (U.S.) Department of Education’s most recent survey revealed that 36% of adults have basic or below basic health literacy and only 12% have proficient health literacy (1). The annual costs associated with low health literacy in the U.S. are estimated to be up to 238 billion dollars. Deficiencies in health literacy are linked to medical errors, illness, and disability as well as compromised public health (2,3). A more recent report on health literacy conducted in eight European Union member states found that 47% of participants had problematic or inadequate health literacy (4). Among low- to middle-income countries, the proportion of individuals with inadequate health literacy ranges from 32% in Ghana (5) to 52% in both Afghanistan (6) and the Philippines (7). Health literacy emerged as a concept in the 1980s after major public health campaigns failed to effect positive health changes in populations with limited education or economic instability (8). However, as demonstrated by the dearth of current national data, substantial efforts to improve health literacy are still needed (9). As such, Healthy People 2030 identified health literacy as a priority area of focus for the improvement of public health (10).
Early definitions of health literacy focused on reading ability, but a much broader understanding of health literacy has evolved over the years (11). For example, Healthy People 2030 defines health literacy as complex, extending beyond the individual to include families, corporations, systems, and communities (10). Healthy People 2030 also posits that health literacy comprises abilities that go beyond reading and understanding material, encompassing one’s ability to analyze information; accurately interpret symbols, charts, and diagrams; weigh risks and benefits; and assimilate this information to make informed decisions (10). This expansive definition captures the multi-faceted nature of health literacy and reflects the importance of the context in which health information is accessed.
It is no surprise that with the advent of the Internet in the 1990s, health-related websites in the 2000s, and mobile health apps in the 2010s, innovative modes of accessing information have created new challenges in addressing health literacy. The delivery of health information no longer occurs in a siloed clinic environment; information is accessible in a variety of formats. For instance, information delivered over the Internet and related technologies is commonly referred to as eHealth (12), whereas mHealth refers to the use of mobile devices (such as mobile phones, tablets, monitoring devices, and other wireless technology) to support both individual and public health (13). Regardless of the specific delivery modality, the amount of health information available to the public and the means by which one can access it have increased substantially. However, this increase has made addressing health literacy a more difficult pursuit.
In an effort to synthesize the evidence in this relatively new area of research on mHealth and health literacy, a scoping review was planned. At the time of this scoping review, many reviews related to health literacy for mHealth or eHealth had been published, with significant variation in topics such as the health literacy levels of specific apps (14,15); the effects of mHealth-based interventions on health literacy (16,17); and the health literacy of mobile apps for cancer (18,19), diabetes (20), heart disease (21,22), chronic pain management (23), and mental health (24,25). While previous reviews have explored health literacy in the context of mHealth or mHealth interventions, to our knowledge, none used a theoretical framework specific to health literacy to systematically examine comprehensive aspects of health literacy within mHealth.
Our original intention with this scoping review was to identify a health literacy evaluation tool that could be used to evaluate mobile app content. However, we were unable to identify such a tool. It became clear that it was first necessary to understand how health literacy needs have been addressed (e.g., through design or accommodations) in mHealth to inform the development of such a tool. Therefore, the purpose of this scoping review was to answer the following questions: (I) How is health literacy addressed in mHealth app development? and (II) How is evaluation of health literacy addressed in mHealth apps? The first question pertains to development or design features described in the literature that account for literacy variability within an mHealth app, whereas the second pertains to how mHealth apps have been evaluated with respect to health literacy. Based on our previous expertise in this area, literacy variability can be addressed in mHealth in a variety of ways: some apps use techniques or design strategies to accommodate literacy needs, whereas others factor in literacy suitability by evaluating health literacy directly and examining its relationship to the app. The research team therefore felt that both questions were necessary to capture the various ways in which the relationship between health literacy and mHealth has been described.
Objectives of the study
A scoping review of the literature was conducted to identify existing tools and criteria for evaluating mobile health apps from a health literacy perspective. Given the broad nature of the subject matter, a scoping review allowed the researchers to examine how health literacy is addressed in mHealth research, particularly as it pertains to mobile app development and design.
Methods
Guidelines
We followed the Joanna Briggs Institute (JBI) Scoping Review guidelines to conduct this review (26). The JBI guidelines are comprehensive and updated periodically.
Protocol and registration
A search of the literature was performed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Scoping Review guidelines (27) (available at https://mhealth.amegroups.com/article/view/10.21037/mhealth-22-11/rc). The reporting required for the PRISMA-ScR is consistent with other guidance for scoping reviews provided by the JBI. Please see the following link for complete search details: https://digitalcommons.unmc.edu/search/11/.
Eligibility criteria
Articles were eligible for inclusion if they were written in English and addressed general health literacy or mHealth/digital/eHealth literacy. Articles also needed to meet one of the following inclusion criteria: (I) collected literacy information in order to incorporate literacy into the design and/or modification of an app or (II) collected literacy information to describe the population being studied. Because we were interested in articles dealing with the design and evaluation of mHealth tools, we excluded grey literature, book chapters, and commentaries, with the expectation that articles of interest would most likely be found in peer-reviewed journals. Conference abstracts were excluded due to the anticipated difficulty of obtaining complete information about the study. Articles on literacy regarding specific health conditions (e.g., mental health literacy) were excluded because our primary interest was health literacy that is not specific to a given health condition.
Information sources and search strategy
Embase, Scopus, PubMed, CINAHL, and PsycINFO databases were searched on March 17, 2021. The following words and their permutations were used for the search: health literacy, mobile health application, and tool or model. Searches were not restricted by publication year. Two reviewers independently reviewed the reference lists of included articles to identify additional potentially eligible articles, and the team then decided whether to include each article. All articles retrieved by the search strategies were compiled and deduplicated using RefWorks and Zotero.
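For illustration, the deduplication step can be represented with a minimal sketch that matches records by DOI or normalized title, analogous to the automated deduplication performed by RefWorks and Zotero. The file name and column names below are assumptions for illustration only and are not part of the review protocol.

```python
# Minimal sketch of reference deduplication by DOI or normalized title,
# analogous to the automated deduplication performed by RefWorks and Zotero.
# The input file "search_results.csv" and its column names are illustrative
# assumptions, not part of the actual review workflow.
import csv
import re


def normalize(text: str) -> str:
    """Lowercase a title and strip punctuation/whitespace so near-identical titles match."""
    return re.sub(r"[^a-z0-9]", "", text.lower())


def deduplicate(records):
    seen, unique = set(), []
    for rec in records:
        # Prefer the DOI as the matching key; fall back to the normalized title.
        key = rec.get("doi", "").strip().lower() or normalize(rec.get("title", ""))
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique


if __name__ == "__main__":
    with open("search_results.csv", newline="", encoding="utf-8") as f:
        records = list(csv.DictReader(f))
    unique = deduplicate(records)
    # In this review, 1,010 retrieved records were reduced to 716 unique abstracts.
    print(f"{len(records)} records -> {len(unique)} unique")
```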
Selection of sources of evidence
Figure 1 presents the PRISMA-ScR scoping review process. Initially, 1,010 abstracts were identified and 294 were removed as duplicates, resulting in 716 unique abstracts. All abstracts were independently reviewed by two individuals to determine each article’s eligibility. If an article was found to be ineligible, the reviewer noted the reason(s) using the eligibility criteria. In cases of disagreement on eligibility, the pair met to discuss and resolve conflicts. In any instance in which the pair was unable to resolve a conflict, the larger group evaluated the article and came to a consensus. A master sheet was created and data on all full-text articles were documented (e.g., author, year, article type, study objectives, sample description, study design, whether literacy information was collected, type of literacy collected/addressed, literacy tool name, results related to literacy, and whether literacy was incorporated into app design/modification). During the full-article review, 100 additional articles were identified from the reference lists of the articles reviewed. These were screened for eligibility using the same dual-reviewer process for abstracts and full articles. A total of 57 articles were identified for inclusion in the scoping review.
A qualitative review of the 57 identified articles revealed two distinct categories. The first category of articles focused on creating and/or modifying an app with accommodations for health literacy concerns. The second category focused on estimating health literacy levels or examining the association between literacy and outcomes such as intention to use a health app. Because the first category was considered most important for understanding how apps have been designed or evaluated for those with limited health literacy, the decision was made to focus on this category of articles (n=32) in this scoping review. The second category of articles (n=25) will be addressed in a second manuscript using the same process.
Data charting and synthesis of results
Data were summarized and analyzed using a data extraction table. The data extraction table included the following categories: (I) authors; (II) year of study; (III) study objective; (IV) sample description; (V) methods; (VI) results related to health literacy level; and (VII) health literacy integration recommendations. Sample descriptions included location of the study, sample size, age, sex, race/ethnicity, and education levels. Methods included the type of study (app development, app testing, review of apps, review of app studies), data collection methods, the type of literacy (e.g., general, technology), and the literacy tool(s) used. Pairs of reviewers reviewed the full-text articles independently and met to resolve conflicting results. Further differences identified in the initial charting were resolved among all the reviewers. To summarize the data, we first extracted information that addressed each data element (e.g., sample description). We then standardized the description of the data; for example, for location we reported the state for U.S. studies, and for sex we reported the percentage of female participants.
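As an illustration only, the charting structure described above can be represented as a simple record, with one example of the standardization step (reporting sex as the percentage of females). The class name, field names, and example values below are hypothetical and do not represent the authors’ actual charting tool.

```python
# Hypothetical sketch of the data extraction record (categories I-VII above),
# with one illustrative standardization step; not the authors' actual tooling.
from dataclasses import dataclass


@dataclass
class ExtractionRecord:
    authors: str
    year: int
    objective: str
    sample_description: str            # location, sample size, age, sex, race/ethnicity, education
    methods: str                       # study type, data collection, literacy type, literacy tool(s)
    literacy_results: str
    integration_recommendations: str


def percent_female(n_female: int, n_total: int) -> str:
    """Standardize sex information as the percentage of female participants."""
    return f"{100 * n_female / n_total:.1f}% female"


# Entirely hypothetical entry showing how a charted row might be standardized.
example = ExtractionRecord(
    authors="Example et al.",
    year=2020,
    objective="Develop and evaluate a self-management app",
    sample_description=f"Nebraska, USA; N=20; {percent_female(12, 20)}",
    methods="App development and testing; focus groups; general literacy; no literacy tool",
    literacy_results="Not reported",
    integration_recommendations="Write in plain language; add audio narration",
)
print(example)
```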
Health literacy framework
The framework for this scoping review needed to capture two critical concepts: health literacy and app/mobile platform design from a low-literacy user perspective. Previous studies have concluded that there is a lack of a common definition and common methods of operationalizing health literacy as applied to digital platforms (28).
To address this gap, Monkman and Kushniruk developed a set of heuristics in 2013 for evaluating mobile health applications (29). These heuristics were based on the U.S. Department of Health & Human Services, Office of Disease Prevention and Health Promotion publication entitled “Health Literacy Online” (HLO), first published in 2010 (30). The original (2010) and updated (2015) HLO documents serve as our framework for examining how health literacy is addressed in mHealth app development (30). The major advantage of the HLO documents is that they provide clear, actionable content that is usable by those designing apps for health consumers (31).
Six broad categories of strategies are identified in the HLO document: (I) what we know about users with limited health literacy skills; (II) write actionable content; (III) display content clearly on the page; (IV) organize content and simplify navigation; (V) engage users; and (VI) test your site with users with limited literacy skills. The first category is the only one that changed significantly from the original 2010 version to the current 2015 version. The strategies identified in the articles reviewed in this paper were mapped to the strategies identified in the HLO to highlight those that appear to be most salient. For details about each of the HLO strategies, please visit https://health.gov/healthliteracyonline/.
Results
Selection of sources of evidence
There were 1,010 records identified through database searching and an additional 100 identified from reference lists. After the abstract and full-text review, 32 articles were included in our final list for analysis (Figure 1).
Characteristics of evidence sources
Table 1 summarizes the sample characteristics, methods, and results, and Table 2 summarizes the HLO recommendations. Of the 32 articles reviewed, two were reviews of mobile app studies (32,33), two were reviews of publicly available mobile apps (34,35), nine described the development and evaluation of an app (36-44), five described the development of an app (45-49), one described a heuristic evaluation of a mobile health app (29), and the remaining 13 were evaluations of an app. The majority of mobile app development/evaluation studies were conducted in the U.S. The study populations included in the app development/evaluation studies ranged from the general population to primary care clinic patients and providers to patients with specific health conditions or treatments (e.g., dual antiplatelet therapy, chronic kidney disease).
Table 1
Primary author [year] | Objective(s) | Sample description | Methods | Results re. health literacy level | Health literacy integration into design | HLO 2015 strategies (2010 strategy*) |
---|---|---|---|---|---|---|
Abujarad [2018] | To describe how an mHealth tool was designed, developed, and evaluated for advancing the informed consent process | Connecticut, USA N=14 Patients and researchers from an asthma clinic and a university institutional review board member Age: 21–74 years 55.7% female 66.7% white 77.8% Bachelor+ |
Development and evaluation of app User-centered design methodology Focus groups, pilot of app General literacy Literacy tool: not used |
Not reported | Desired reading level was at 8th grade level | 2.6: Write in plain language |
Text-to-speech translation is a key feature of VIC and is achieved by online and automated text-to-speech translation | 3.11: Make your site accessible to people with disabilities |
Text-to-speech interfaces address literacy issues and make the IC process an option for inexperienced computer users | 5.1: Share information through multimedia |
Bahadori [2020] | To assess the readability of the information provided within total hip replacement and total knee replacement apps to understand more about the impact this could have on patients | Location: UK N=15 apps |
Systematic review of apps General literacy GFI, FKGL, FRES |
Only one app reached “easy to read” criteria across all three indices | Consider specific needs of target population | 1.2*: Understand their motivations |
Target a GFI and FKGL of 6 and FRES of 70 | 2.6: Write in plain language | |||||
Decrease number/length of sentences | 3.1: Limit paragraph size. Use bullets & short lists | |||||
Involve patients in app development and user acceptance testing | 6.1: Recruit users w. limited literacy/health literacy skills | |||||
Monitor patient experience to see if readability needs to be improved | 6.4: Test whether your content is understandable and actionable | |||||
Ben-Zeev [2013] | To describe the development of a smartphone illness self-management system for people with schizophrenia. | Illinois, USA N=8 Practitioners from a psychiatric rehabilitation agency Age: not reported Sex: not reported Race/ethnicity: not reported Education: not reported |
Development of app Survey, focus group General literacy Literacy tool: not used |
Not reported | Apps and technological systems must be usable by people with low literacy levels and cognitive impairments | 1.1: Reading & cognitive processing challenges |
Deploying existing mHealth resources intended for the general population may prove problematic | 1.4: Mobile considerations 6.1: Recruit users w. limited literacy/health literacy skills 6.4: Test whether your content is understandable & actionable |
Bender [2016] | To describe promoters’ and health care providers’ current practices and experiences disseminating health education and perceptions regarding visuals promoting physical activity and limiting sedentary behavior for a visually enhanced low-text mHealth app | Location: California, USA N=21 Eligibility: bilingual healthcare providers in low-income Latino communities Age: mean 41.2 years 86% female Race/ethnicity: not reported Education: not reported |
Evaluation of app Focus groups, qualitative interviews Health literacy Literacy tool: not used |
Not formally assessed in clinics, but difficulty frequently observed in ad-hoc assessments Limited health educational materials with visual aids exist |
Use visuals with simple text and culturally tailored themes and imaging | 3.8: Use images that help people learn |
Make sure the meaning of your image is clear to all users | |
Formalize health literacy measurements | NA | |||||
Boyd [2015] | To describe the design, methodology, limitations, and results of the MyIDEA tablet app | Illinois, USA N=5 Advisers of drug-eluting stent patients Age: not reported Sex: not reported Race/ethnicity: not reported Education: not reported |
Development of app App development Health literacy Literacy tool: not used |
Identified by 5 patient advisors as a key attribute of the patient population | Write text at a sixth-grade reading level and provide narration as an additional method for individuals with low literacy to understand text-based information | 2.6: Write in plain language |
Include audio and images as supplemental information for people below a sixth-grade reading level | 3.8: Use images that help people learn | |||||
Casey [2014] | To explore patients’ views and experiences of using smartphones to promote physical activity in primary care | UK N=12 Smartphone owners Age: 17–62 (mean 42) years 75% female Race/ethnicity: not reported Education: not reported |
Evaluation of app Interviews Smartphone literacy Literacy tool: not used |
83% had emailed on phone, 33% had downloaded apps previously | Reduction, or simplifying a task to influence behavior, was evident by the reports that the app was easy to use, required basic numerical literacy, and was highly visible on the home screen | 4.1: Create a simple & engaging homepage 4.2: Label & organize content with your users in mind 4.3: Create linear information paths 4.7: Provide easy access to home & menu pages |
Ceasar [2019] | To utilize focus groups for gathering qualitative data to inform the development of an app that promotes physical activity among African American women in Washington, DC | Washington DC and Maryland, USA N=16 African American women in low-income areas of Washington DC Age: 51–74 (mean 62.1) years Bachelor+: 63% |
Development of app Focus group Technology literacy Literacy tool: not used |
Technology literacy identified as a perceived barrier towards using apps to promote physical activity | Use focus groups as a collaborative tool to inform app development | 2.1: Identify user motivations and goals. 6.1: Recruit users with limited literacy skills 6.4: Test whether your content is understandable and actionable |
Increase relatability with local information | 5.3: Provide tailored information | |||||
Check-ins or IT support to address technical difficulties | 6.4: Test whether your content is understandable and actionable 6.8: Test on mobile |
Chaudry [2013] | To (1) determine whether the interface design can help a low literacy population accurately estimate portion sizes of various liquids; and (2) to confirm the successful results of our previous study when the interface is high fidelity |
Indiana, USA African American patients with chronic kidney disease Study 1 (n=10): mean age 58 years, 40% female Study 2 (n=18): mean age 53 years, 72% female Education: not reported |
Development and evaluation of app Health literacy REALM |
Literacy only assessed in study 1 because participants were embarrassed to complete the test in front of peers 6/10 patients read below a 9th grade level People with varying literacy skills were able to comprehend and navigate the design of the interface to search and select specific portion sizes |
Recommend using literacy tests other than REALM to reduce discomfort when speaking aloud | 1.1: Reading and cognitive processing challenges 6.2: Identify and eliminate logistical barriers to participation |
Connelly [2016] | To provide a case study of design of an ecological momentary assessment mobile app for a low-literacy population. | New York, USA N=41 Farming population of Mexican American women, primarily Spanish speakers, none completed college Ages: 18–45 years for Phases 1–3, mean age 28.8 years for Phase 3 |
Development and evaluation of app Focus groups Health literacy SAHL-S&E, NVS |
Phase 1: Mean SAHL 16.1, 3/8 women scored as low literacy, all women struggled to complete tasks Phase 2: Literacy not assessed Phase 3: Mean SAHL 14.2, 4/11 women scored as low literacy. Mean NVS score of 1 Phase 4: 5 of 7 participants had low literacy skills |
Differences in health literacy better identified with NVS than SAHL with usability best tested in situ | 1.1: Reading and cognitive processing challenges 6.3: Create plain language testing materials |
Interface has larger pictures with short labels and could be read aloud | 3.8: Use images that help people learn | |||||
Mobile app focus groups to explore app design | 6.1: Recruit users with limited literacy skills | |||||
Provide a case study of design of an ecological momentary assessment mobile app for a low-literacy population | 6.4: Test whether your content is understandable and actionable | |||||
Iterative, user-centered design process with focus groups was essential for designing the app rather than merely replacing words with icons and/or audio | 6.4: Test whether your content is understandable and actionable | |||||
Coughlin [2017] | To develop an app to provide women with information about how they can reduce their risk of breast cancer through healthy behaviors. | Washington and Georgia, USA N=5 Women interested in breast cancer risk reduction Age: not reported Race/ethnicity: not reported Education: not reported |
Development of app eHealth literacy Literacy tool: not used |
Not reported | Varying levels of eHealth literacy will be addressed by using simple navigation features and providing straightforward instructions about how to use the app and connect it to commercially available products | 1.2: Understanding navigation 2.5: Provide specific action steps |
It will be possible for women to use the app without interfacing with commercial Internet sites | NA | |||||
Dev [2019] | To present feedback on a family planning app. | Kenya N=42 Postpartum women (n=25) and family planning providers (n=17) from maternal and child health clinic Age: 14–21 years for patients (n=15) and 18–58 years for providers Education: not reported |
Development and evaluation of app Interviews General literacy and technology literacy Literacy tool: not used |
General and technological literacy were seen as potential barriers | Increasing graphics, audio, and video were recommended to overcome literacy barriers | 5.1: Share information through multimedia. 5.2: Design intuitive interactive graphics and tools |
Dunn-Lopez [2020] | To determine: readability, types of functions, and linkage to authoritative sources of evidence for self-care focused mHealth apps targeting heart failure available in the Apple and Google Play Stores | N=10 apps Inclusion: mHealth apps targeting patients with heart failure in the Apple and Google Play app stores |
Review of apps General literacy FKGL |
Average reading grade level 9.35 Only 1 app had a reading grade level of <6th grade |
Essential elements in providing health literate content at a 6th grade reading level include plain language, short sentences, brief paragraphs, bulleted or numbered lists, and actionable content | 2.5: Provide specific action steps 2.6: Write in plain language 3.1: Limit paragraph size. Use bullets and short lists |
Fontil [2016] | To (I) adapt the literacy level and cultural relevance of online program content for low-income, underserved populations; and (II) test the feasibility and acceptability of the modified program | California, USA Sample size not reported Low-income prediabetes patients at a large safety net clinic Age: not reported Sex: not reported Race/ethnicity: not reported Education: not reported |
Development and evaluation of app Focus groups Technology literacy Literacy tool: not used |
Not reported | In addition to simplifying overall language, we simplified explanations of scientific concepts, preserving core concepts while improving understandability | 2.3: Describe the health behavior – just the basics |
To address concerns about the complexity of the curriculum, we adapted the readability level of each lesson (originally 9th grade or higher) to mostly a 5th-grade level or below | 2.6: Write in plain language | |||||
Creating technical assistance tools for various stages of the program to address lower technology literacy | 6.5: Use moderators who have experience with users with limited literacy skills | |||||
Gibbons [2014] | To explain health information technology (HIT) universal design principles derived from the human factors engineering literature that can help to overcome potential usability and/or patient safety issues that are associated with unrecognized, embedded assumptions about cultural groups when designing HIT systems | Review study | Reviews of app studies Health literacy, IT literacy Literacy tool: not used |
Not reported | Use disparities-oriented use cases when designing an app | NA |
Use symbols that have been found to be common across culture | 3.8: Use images that help people learn | |||||
Include a target population with low health literacy during usability evaluation | 6.1: Recruit users with limited literacy skills—and limited health literacy skills | |||||
Giunti [2018] | To examine how mHealth can facilitate physical activity among those with multiple sclerosis (MS) and understand the motivational aspects behind adoption of mHealth solutions for MS. | Switzerland Patients with MS and healthcare professionals N=12 patients, 12 professionals Age: 35–62 years for patients, 26–64 years for professionals 50% female Race/ethnicity not reported Bachelors+: 33% |
Evaluation of app Mixed methods: focus group, interview, survey eHealth literacy eHEALS |
Patient eHealth literacy median score 17.75 (IQR 11–28.5) | Personas were created to represent persons with MS at different eHealth and health literacy levels | 3.8: Use images that help people learn |
Huang [2015] | To enhance foreign visitors’ ability to communicate and exchange information with local physicians by developing an effective patient-physician communication mobile system | Austria and Taiwan Sample size not reported Foreign students seeking medical care and physicians Age: not reported Sex: not reported Race/ethnicity: not reported Education: not reported |
Evaluation of app Case studies and interviews Health literacy Literacy tool: used but the name of the tool not reported |
Foreign patients scored significantly higher after exposure to eHealth system compared to before the exposure | The voice-to-text bilingual function will be used to assist the patients with low health literacy | 5.1: Share information through multimedia |
Lord [2016] | To explore provider and staff perceptions of implementation of the A-CHESS mobile recovery support app with clients in 4 addiction service settings | United States N=12 Clinicians and administrators from 4 agencies that serve people with substance use disorders Age: 25–53 years 50% female 91% white Education: not reported |
Evaluation of app Qualitative interviews, deductive analysis guided by CFIR model General literacy and technology literacy Literacy tool: not used |
Not reported | Use speech-to-text functionality to help individuals with low literacy | 5.1: Share information through multimedia |
Mackert [2017] | To explore the perceived role of men in prenatal health, use of an e-health application, and participant suggested ways of improving the application | Texas, USA N=23 General population of adult males Age: mean 26.0 years 52% white 100% some postsecondary ed |
Evaluation of app Semi-structured interview Health literacy NVS |
Average NVS score 5.3, suggesting the sample had adequate levels of health literacy App can balance broad applicability and individualization, regardless of users’ level of health literacy |
Need app design to be engaging and interactive, from adding videos and games inside the application, to personalizing the experience, to changing font size and color | 5.1: Share information through multimedia 5.2: Design intuitive interactive graphics and tools |
Encouraged dynamic personalization allowing users to input personal data | 5.3: Provide tailored information | |||||
Miller [2017] | To determine whether patients from vulnerable populations could successfully navigate and complete an mHealth patient decision aid | North Carolina, USA Patients due for colorectal cancer screening N=450 Age: 50–74 years 53.8% female 37.6% African American Education: not reported |
Evaluation of app Secondary usability analysis Health literacy Literacy tool: validated item, “How confident are you filling out medical forms by yourself?” |
36.9% with limited health literacy | Design apps for those with low health literacy and low computer literacy: use a simple interface displaying only one question per screen with large response buttons, similar to what would be found at an automated teller machine or self-checkout kiosk | 1.4: Mobile considerations 5.4: Create user-friendly forms and quizzes |
Use simple language and include audio narration to assist those with literacy barriers | 2.6: Write in plain language 5.1: Share info through multimedia |
Monkman [2013] | To (I) adapt a set of existing guidelines for the design of consumer health Web sites into evidence-based heuristics; and (II) apply the heuristics to evaluate a mobile app | NA | Evaluation of app using heuristics Health literacy Literacy tool: not used |
As the heuristic evaluation yielded valuable recommendations for improving the app, this approach (based on modifying evidence-based design guidelines) to developing heuristics for investigating usability and health literacy appeared to be successful | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | 1.4: Mobile considerations 2.2: Put the most important information first 2.4: Stay positive. Include the benefits of taking action. 2.5: Provide specific action steps 2.6: Write in plain language 2.7: Check content for accuracy Category 3: display: 3.1–3.11 Category 4: organize: 4.1–4.10 5.1: Share information through multimedia 5.3: Provide tailored information 5.5: Consider social media sharing options |
Mueller [2020] | To develop, pilot, and assess a serious game for mobile devices that teaches geohazard, maternal, and neonatal health messages | Nepal N=71 Age: mean 40 years Education: informal to bachelor’s degree |
Development and evaluation of app Observation and focus groups Development and field evaluation of a game designed for individuals with low literacy General literacy Literacy tool: not used |
Not reported | Co-design of images with intended population | 1.1*: ID your users. Who are they? |
Pictogram sets can be switched out for newly designed pictograms that are contextualized and localized to other study areas, countries, or topics | 3.8: Use images that help people learn (choose realistic images) | |||||
Muscat [2020] | To develop an intervention that addressed health literacy for Australian adults with kidney failure requiring dialysis to promote active patient participation in CKD management and decision-making | Location: Australia No participants other than the research team |
Development of app Health literacy Literacy tool: not used |
Not reported | Calculate readability statistics | 5.4: Create user-friendly forms & quizzes 6.4: Test whether your content is understandable & actionable |
Apply the Patient Education Materials Assessment Tool | 6.4: Test whether your content is understandable & actionable | |||||
Supplement written content with audiovisual formats | 5.1: Share information through multimedia | |||||
Incorporate micro-learning and interactive quizzes | 5.4: Create user-friendly forms & quizzes | |||||
Improve literacy skills with question prompt lists, volitional help sheets, and skills training | NA | |||||
Ownby [2012] | To evaluate the extent to which an electronic intervention targeting health literacy and organized by the elements of the Information-Motivation-Behavioral Skills model could improve patients’ health literacy and medication adherence. | Location: Florida, USA N=124 Persons with HIV Age: 20–67 (mean 47.1) years 29% female 63.4% African American, 36.6% white 10.5% college graduates |
Evaluation of app Health literacy TOFHLA |
Mean TOFHLA: numeracy 46.02, reading 42.46, total: 88.48 6 participants with inadequate health literacy, 10 marginal, 108 adequate Intervention led to greater increases in adherence among those with lower numeracy and lower baseline adherence |
Present numeric dosing data in a graphic calendar format | 5.2: Design intuitive interactive graphics and tools |
Poduval [2018] | To determine whether there was evidence of a digital divide when a Web-based self-management program for type II diabetes mellitus was integrated into routine care | London, UK N=330 Adults with type 2 diabetes Age: mean 58.4 years 44.5% female 45.5% white 48.8% bachelors or more |
Evaluation of app Retrospective analysis General and digital health literacy Literacy tool: not used |
Not reported | Consideration of literacy levels and audio/visual media for usability | 2.6: Write in plain language 5.1: Share information through multimedia |
Text written for people with a reading age of 12, all essential information was provided in video as well as text format | 2.6: Write in plain language 5.1: Share information through multimedia |
Personal stories included | 5.3: Provide tailored information | |||||
Povey [2020] | To use a participatory design research approach to understand adaptations which might improve the engagement, reach and acceptability of this resource from the perspective of Aboriginal and Torres Strait Islander youth | Torres Strait N=120 Co-design group: Islander youth (n=45) aged 10– 18 years, 47% female, 93% in secondary school Survey: Islander people (n=75), 51% under 18, 60% female |
Evaluation of app Mixed methods, participatory design General literacy and mental health literacy Literacy tool: not used |
Not reported | Engagement via humor, music, vibrant colors, relatable images, and stories about positive change | 2.4: Stay positive. Include benefits of taking action 5.1: Share info through multimedia |
Audio and intuitive visuals encouraged for lower literacy | 3.8: Use images that help people learn (choose realistic images – photos of “real” people) 5.1: Share information through multimedia |
Sustain use with customization, interactive activities, and challenges/records of progress over time and options for sharing | 5.3: Provide tailored information | |||||
Povey [2016] | To explore Aboriginal and Torres Strait Islander community members’ experiences of using two culturally responsive e-mental health apps and identify factors which influence the acceptability of these approaches | Torres Strait N=9 People identifying as Aboriginal/Torres Strait Islanders who spoke English and did not have a severe level of mental illness. Age: 18–60 years 66.7% female Education: not reported |
Development and evaluation of app Qualitative focus group General literacy Literacy tool: not used |
Not reported | Graphics and animation perceived as supporting motivation | 2.4: Stay positive. Include benefits of taking action (give users motivation to make a change) |
Culturally relevant graphics, voices, animation, and optional short video clips may assist in engagement with content and overcome literacy issues. | 3.8: Use images that help people learn (choose realistic images) | |||||
Schnall [2015] | To understand the perceived ease of use, usefulness, risk and trust that contribute to behavioral intention to use a mobile app for meeting the healthcare needs of persons living with HIV (PLWH) | Location: New York, USA N=80 PLWH and clinicians/case managers PLWH (n=50): ages 18–59 years, 26% female, 52% Black, 50% Hispanic Clinicians/case managers (n=30): ages 23–62 years, 83.3% female, 56.7% white, 20% Hispanic Education: not reported |
Evaluation of app Qualitative participatory design via focus groups, app evaluation General and technology literacy Literacy tool: not used |
Not measured | App should not rely on internet connectivity | NA |
Present information with simplicity | 2.2: Put the most important information first 2.3: Describe the health behavior – just the basics 2.6: Write in plain language |
Siedner [2015] | To identify predictors of uptake of an mHealth application for a low-literacy population of people living with HIV (PLWH) in rural Uganda and to evaluate the efficacy of various short message service (SMS) text message formats to optimize the balance between confidentiality and accessibility | Location: SW Uganda N=385 PLWH undergoing CD4 testing who could access a mobile phone Age: median 32 years 65.2% female 10.6% post-secondary |
Evaluation of app Secondary data analysis from randomized clinical trial General literacy Ability to read a complete sentence |
Confirmed literacy at the time of enrollment was a robust predictor of SMS text message receipt, identification, and appropriate response for PLWH in rural Uganda | Coded messages, which obviate the need for literacy, were as effective as direct messages and might augment privacy | 1.1: Reading & cognitive processing challenges |
In-person confirmation of mobile phone competency was highly predictive and should be considered for future similar interventions where possible. End-user characteristics, particularly literacy and technology experience, are important predictors of an mHealth intervention for PLWH in rural Uganda |
1.1*: ID your users. Who are they? | |||||
Thorough assessments of end-user written literacy and technology experience should be made before and during implementation design | 6.1: Recruit users with limited literacy skills – and limited health literacy skills | |||||
Coded messages can have similar efficacy as text messages, while maintaining confidentiality | 6.4: Test whether your content is understandable and actionable | |||||
Sox [2010] | To create an interface for parents of children with ADHD to enter disease-specific information to facilitate data entry with minimal task burden | Massachusetts, USA N=17 English or Spanish speaking parents who are primary caretakers of a school-aged child with ADHD Age: not reported Sex: not reported Race/ethnicity: not reported Education: not reported |
Development and evaluation of app Needs analysis, usability testing, and performance testing Health literacy TOFHLA |
2/10 participants with lower health literacy (<81) | Alternative text explanations and audio files to support lower health literacy | 5.1: Share information through multimedia |
Acknowledge tension between expectations of a highly-educated parent and a parent with limited health knowledge | NA | |||||
Srinivas [2019] | To report a case study involving the design and evaluation of a mobile ecological momentary assessment (EMA) tool that supports context-sensitive EMA-reporting of location and social situations accompanying eating and sedentary behavior | Midwest, USA N=59 Obese women a referred to HealthyMe program Age: 35–64 years 83% Black, 17% White 61% college |
Development and evaluation of app Focus groups, semi-structured interviews, prototype testing, 2 field trials Health literacy NVS |
59.3% low health literacy | Specific to reducing burden while capturing a user response, we suggest designing a system that uses simple-worded, direct questions with fewer words that are easier to read and quicker for the participant to understand and has simple response options that are easier to read, quicker for the participant to understand and select from | 3.1: Limit paragraph size. Use bullets & short lists 5.2: Design intuitive interactive graphics & tools 5.4: Create user-friendly forms & quizzes |
Wildenbos [2018] | To synthesize literature on aging barriers to digital (health) computer use, and explain, map and visualize these barriers in relation to the usability of mHealth by means of a framework | NA | Review of app studies Scoping review Computer literacy Literacy tool: not used |
Cognitive barriers impact satisfaction via diminishing age-dependent abilities (numeracy & representational fluency); motivational barriers impact learnability via diminishing age-dependent computer literacy |
Designers, programmers, and developers should be encouraged to create mHealth interventions with inclusive design, flexible enough to be usable by people with no limitations as well as by people with functional limitations related to disabilities or old age | 3.11: Make your site accessible to people with disabilities |
The MOLD-US framework can aid mHealth designers in inclusive design efforts. The visual overview of MOLD-US enables a quick assessment of aging barriers and medical conditions that involve deteriorating capacity | 3.11: Make your site accessible to people with disabilities | |||||
Wildenbos [2019] | To assess usability problems older patients encounter in two mHealth apps, aiming to show the value of MOLD-US, a recent aging barriers framework, as a classification tool to identify the intrinsic causes of these problems | Netherlands N=23 Age >50 years, can read Dutch Sex: not reported Education: not reported |
Evaluation of app Case-study Computer literacy Think Aloud method |
28 highly severe usability issues of the mHealth apps were identified. Their core natures were related to motivational and cognitive barriers of older adults. Participants had difficulty understanding the app navigation structure and missed important text, buttons, and icon elements |
Cognitive load should be minimized, e.g., by a clear navigational structure and by aligning the interface with older adults’ expectations | 4.2: Label and organize content with your users in mind |
Advise to put more emphasis on addressing motivational barriers of older adults within user interface design and guidelines | 1.2*: Understanding their motivations. Why are they here? 2.1: Identify user motivations & goals. Why are they here? |
User-interface design elements such as font size and buttons should be adjusted to the older adult user population. | 3.3: Use a readable font that’s at least 16 pixels | |||||
Advise to involve older populations as co-creators in the requirements analysis and design phases when developing mHealth | 6.1: Recruit users with limited literacy skills – and limited health literacy skills | |||||
Usability evaluation approaches may need adjustments to prevent reporter bias and become better suited for testing mHealth services with the older adult and chronically ill patient populations | 6.2: ID & eliminate logistical barriers to participation | |||||
Colored information visuals explaining navigation and consequences of decision could be used as a decision aid tool since these types of visuals have a positive effect on the accuracy of the decisions made by older adults in eHealth tools | NA | |||||
Using feedback messages in interfaces should not only inform users on (the result of) their actions but should also offer the user options to recover from wrong actions and return to previously retrieved information or actions | 1.2: Understanding Navigation | |||||
A clear (video) instruction on how to use an app should be given when older users register for an app, including an aid to return to this instruction at any point in an app’s usage | NA
*, Recommendation from 2010 guidelines. CKD, chronic kidney disease; HIV, human immunodeficiency virus; ADHD, attention-deficit hyperactivity disorder; eHEALS, eHealth Literacy Scale; CFIR, Consolidated Framework for Implementation Research; VIC, Patient Centered Virtual Multimedia Interactive Informed Consent tool; IQR, Interquartile range; GFI, Gunning Fog Index; FKGL, Flesch-Kincaid Grade Level; FRES, Flesch Reading Ease Score; NVS, Newest Vital Sign; REALM, Rapid Estimate of Adult Literacy in Medicine; SAHL-S&E, Short Assessment of Health Literacy-Spanish and English; TOFHLA, Test of Functional Health Literacy in Adults; HLO, Health Literacy Online; NA, not applicable.
Table 2
HLO strategy | Author/year | Recommendation |
---|---|---|
Section 1 [2015]: What we know about users with limited literacy skills | ||
1.1 Reading & cognitive processing challenges | Ben-Zeev [2013] | Apps and technological systems must be usable by people with low literacy levels and cognitive impairments |
Chaudry [2013] | Recommend using literacy tests other than REALM to reduce discomfort when speaking aloud | |
Connelly [2016] | Differences in health literacy better identified with Newest Vital Sign than Short Assessment of Health Literacy with usability best tested in situ | |
Siedner [2015] | Coded messages, which obviate the need for literacy, were as effective as direct messages and might augment privacy | |
1.2 Understanding navigation | Coughlin [2017] | Varying levels of eHealth literacy will be addressed by using simple navigation features and providing straightforward instructions about how to use the app and connect it to commercially available products |
Wildenbos [2019] | Using feedback messages in interfaces should not only inform users on (the result of) their actions but should also offer the user options to recover from wrong actions and return to previously retrieved information or actions | |
1.3 Using search | N/A | NA |
1.4 Mobile considerations | Ben-Zeev [2013] | Deploying existing mHealth resources intended for the general population may prove problematic |
Miller [2017] | Design apps for those with low health literacy and low computer literacy: use a simple interface displaying only one question per screen with large response buttons, similar to what would be found at an automated teller machine or self-checkout kiosk | |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
Section 1 [2010]: Learn about your users & their goals | ||
1.1 ID your users. Who are they? | Mueller [2020] | Co-design of images with intended population |
Siedner [2015] | In-person confirmation of mobile phone competency was highly predictive and should be considered for future similar interventions |
End-user characteristics, particularly literacy and technology experience, are important predictors of an mHealth intervention for PLWH in rural Uganda | |
1.2 Understanding their motivations. Why are they here? | Bahadori [2020] | Consider specific needs of target population |
Wildenbos [2019] | Advise to put more emphasis on addressing motivational barriers of older adults within user interface design and guidelines | |
1.3 Understanding their goals. What are they trying to do? | N/A | NA |
Section 2: Write actionable content | ||
2.1 Identify user motivations & goals. Why are they here? | Ceasar [2019] | Use focus groups as a collaborative tool to inform app development |
Wildenbos [2019] | Advise to put more emphasis on addressing motivational barriers of older adults within user interface design and guidelines | |
2.2 Put the most important information first | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
Schnall [2015] | Present information with simplicity | |
2.3 Describe the health behavior – just the basics | Fontil [2016] | In addition to simplifying overall language, we simplified explanations of scientific concepts, preserving core concepts while improving understandability |
Schnall [2015] | Present information with simplicity | |
2.4 Stay positive. Include the benefits of taking action | Povey [2020] | Engagement via humor, music, vibrant colors, relatable images, and stories about positive change |
Povey [2016] | Graphics and animation perceived as supporting motivation | |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
2.5 Provide specific action steps | Dunn-Lopez [2020] | Essential elements in providing health literate content at a 6th grade reading level include plain language, short sentences, brief paragraphs, bulleted or numbered lists, and actionable content |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
Coughlin [2017] | Varying levels of eHealth literacy will be addressed by using simple navigation features and providing straightforward instructions about how to use the app and connect it to commercially available products | |
2.6 Write in plain language | Abujarad [2018] | Desired reading level was at 8th grade level |
Bahadori [2020] | Target a Gunning Fog Index and Flesch-Kincaid Grade Level of 6 and a Flesch Reading Ease Score of 70 |
Boyd [2015] | Write text at a sixth-grade reading level and provide narration as an additional method for individuals with low literacy to understand text-based information | |
Dunn-Lopez [2020] | Essential elements in providing health literate content at a 6th grade reading level include plain language, short sentences, brief paragraphs, bulleted or numbered lists, and actionable content | |
Fontil [2016] | To address concerns about the complexity of the curriculum, we adapted the readability level of each lesson (originally 9th grade or higher) to mostly a 5th-grade level or below | |
Miller [2017] | Use simple language and include audio narration to assist those with literacy barriers | |
Poduval [2018] | Consideration of literacy levels and audio/visual media for usability | |
Text written for people with a reading age of 12, all essential information was provided in video as well as text format | ||
Schnall [2015] | Present information with simplicity | |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
2.7 Check content for accuracy | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
Section 3: Display content clearly on the page | ||
3.1 Limit paragraph size. Use bullets & short lists | Bahadori [2020] | Decrease number/length of sentences |
Dunn-Lopez [2020] | Essential elements in providing health literate content at a 6th grade reading level include plain language, short sentences, brief paragraphs, bulleted or numbered lists, and actionable content | |
Srinivas [2019] | Specific to reducing burden while capturing a user response, we suggest designing a system that uses simple-worded, direct questions with fewer words that are easier to read and quicker for the participant to understand and has simple response options that are easier to read, quicker for the participant to understand and select from | |
3.2 Use meaningful headings | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
3.3 Use a readable font that’s at least 16 pixels | Wildenbos [2019] | User-interface design elements such as font size and buttons should be adjusted to the older adult user population |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
3.4 Use white space & avoid clutter | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
3.5 Keep the most important information above the fold – even on mobile | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
3.6 Use links effectively | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
3.7 Use color or underline to ID links | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
3.8 Use images that help people learn | Bender [2016] | Use visuals with simple text and culturally tailored themes and imaging |
Boyd [2015] | Include audio and images as supplemental information for people below a sixth-grade reading level | |
Connelly [2016] | Interface has larger pictures with short labels and could be read aloud | |
Gibbons [2014] | Use symbols that have been found to be common across culture | |
Giunti [2018] | Personas were created to represent persons with MS at different eHealth and health literacy levels | |
Mueller [2020] | Pictogram sets can be switched out for newly designed pictograms that are contextualized and localized to other study areas, countries or topics | |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
Povey [2020] | Audio and intuitive visuals encouraged for lower literacy | |
Povey [2016] | Culturally relevant graphics, voices, animation, and optional short video clips may assist in engagement with content and overcome literacy issues | |
3.9 Use appropriate contrast | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
3.10 Make web content printer friendly | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
3.11 Make your site accessible to people with disabilities | Abujarad [2018] | Text-to-speech translation is a key feature of VIC and is achieved by online and automated text-to-speech translation
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
Wildenbos [2018] | Encourage designers, programmers, and developers to create mHealth interventions with inclusive design, flexible enough to be usable by people with no limitations as well as by people with functional limitations related to disabilities or old age
The MOLD-US framework can aid mHealth designers in inclusive design efforts. The visual overview of MOLD-US enables a quick assessment of aging barriers and medical conditions that involve deteriorating capacity | ||
3.12 Make websites responsive | NA | NA |
3.13 Design mobile content to meet mobile user’s needs | NA | NA |
Section 4: Organize content & simplify navigation | ||
4.1 Create simple & engaging homepage | Casey [2014] | Reduction, or simplifying a task to influence behavior, was evident by the reports that the app was easy to use, required basic numerical literacy, and was highly visible on the home screen |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
4.2 Label & organize content with your users in mind | Casey [2014] | Reduction, or simplifying a task to influence behavior, was evident by the reports that the app was easy to use, required basic numerical literacy, and was highly visible on the home screen |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
Wildenbos [2019] | Cognitive load should be minimized, i.e., by a clear navigational structure and aligning an interface with expectations of older adults | |
4.3 Create linear information paths | Casey [2014] | Reduction, or simplifying a task to influence behavior, was evident by the reports that the app was easy to use, required basic numerical literacy, and was highly visible on the home screen |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
4.4 Give buttons meaningful labels | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
4.5 Make clickable elements recognizable | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
4.6 Make sure the browser “Back” button works | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices
4.7 Provide easy access to home & menu pages | Casey [2014] | Reduction, or simplifying a task to influence behavior, was evident by the reports that the app was easy to use, required basic numerical literacy, and was highly visible on the home screen |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
4.8 Give users options to browse | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
4.9 Include a simple search | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
4.10 Display search results clearly | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
Section 5: Engage Users | ||
5.1 Share information through multimedia | Abujarad [2018] | Text-to-speech interfaces address literacy issues and make the IC process an option for inexperienced computer users
Dev [2019] | Increasing graphics, audio, and video were recommended to overcome literacy barriers | |
Huang [2015] | The voice-to-text bilingual function will be used to assist the patients with low health literacy | |
Lord [2016] | Use speech-to-text functionality to help individuals with low literacy | |
Mackert [2017] | Need app design to be engaging and interactive, from adding videos and games inside the application, to personalizing the experience to changing font size and color | |
Miller [2017] | Use simple language and include audio narration to assist those with literacy barriers | |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
Muscat [2020] | Supplement written content with audiovisual formats | |
Poduval [2018] | Consideration of literacy levels and audio/visual media for usability | |
Text written for people with a reading age of 12, all essential information was provided in video as well as text format | ||
Povey [2020] | Engagement via humor, music, vibrant colors, relatable images, and stories about positive change | |
Audio and intuitive visuals encouraged for lower literacy | ||
Sox [2010] | Alternative text explanations and audio files to support lower health literacy | |
5.2 Design intuitive interactive graphics & tools | Dev [2019] | Increasing graphics, audio, and video were recommended to overcome literacy barriers |
Mackert [2017] | Need app design to be engaging and interactive, from adding videos and games inside the application, to personalizing the experience to changing font size and color | |
Ownby [2012] | Present numeric dosing data in a graphic calendar format | |
Srinivas [2019] | Specific to reducing burden while capturing a user response, we suggest designing a system that uses simple-worded, direct questions with fewer words that are easier to read and quicker for the participant to understand and has simple response options that are easier to read, quicker for the participant to understand and select from | |
5.3 Provide tailored information | Ceasar [2019] | Increase relatability with local information |
Mackert [2017] | Encouraged dynamic personalization allowing users to input personal data | |
Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices | |
Poduval [2018] | Personal stories included | |
Povey [2020] | Sustain use with customization, interactive activities, and challenges/records of progress over time and options for sharing. | |
5.4 Create user-friendly forms & quizzes | Miller [2017] | Design apps for those with low health literacy and low computer literacy: use a simple interface displaying only one question per screen with large response buttons, similar to what would be found at an automated teller machine or self-checkout kiosk |
Muscat [2020] | Calculate readability statistics | |
Incorporate micro-learning and interactive quizzes | ||
Srinivas [2019] | Specific to reducing burden while capturing a user response, we suggest designing a system that uses simple-worded, direct questions with fewer words that are easier to read and quicker for the participant to understand and has simple response options that are easier to read, quicker for the participant to understand and select from | |
5.5 Consider social media sharing options | Monkman [2013] | Although the majority of the recommendations from the HLO guide for Web sites were applicable for assessing mobile usability, the heuristics generated in this study may benefit from being complemented with other evidence-based heuristics specific to mobile devices |
Section 6: Test your site with users with limited literacy skills | ||
6.1 Recruit users with limited literacy skills – and limited health literacy skills | Bahadori [2020] | Involve patients in app development and user acceptance testing |
Ben-Zeev [2013] | Deploying existing mHealth resources intended for the general population may prove problematic | |
Ceasar [2019] | Use focus groups as a collaborative tool to inform app development | |
Connelly [2016] | Mobile app focus groups to explore app design | |
Gibbons [2014] | Include a target population with low health literacy during usability evaluation | |
Siedner [2015] | Thorough assessments of end-user written literacy and technology experience should be made before and during implementation design | |
Wildenbos [2019] | Advise to involve older populations as co-creators in the requirements analysis and design phases when developing mHealth | |
6.2 ID & eliminate logistical barriers to participation | Chaudry [2013] | Recommend using literacy tests other than REALM to reduce discomfort when speaking aloud |
Wildenbos [2019] | Usability evaluation approaches may need adjustments to prevent reporter bias and become better suited for testing mHealth services with the older adult and chronically ill patient populations | |
6.3 Create plain language testing materials | Connelly [2016] | Differences in health literacy better identified with Newest Vital Sign than Short Assessment of Health Literacy with usability best tested in situ |
6.4 Test whether your content is understandable and actionable | Bahadori [2020] | Monitor patient experience to see if readability needs to be improved |
Ben-Zeev [2013] | Deploying existing mHealth resources intended for the general population may prove problematic | |
Ceasar [2019] | Use focus groups as a collaborative tool to inform app development | |
Check-ins or IT support to address technical difficulties | ||
Connelly [2016] | Iterative, user-centered design process with focus groups was essential for designing the app rather than merely replacing words with icons and/or audio | |
Provide a case study of design of an ecological momentary assessment mobile app for a low-literacy population | ||
Muscat [2020] | Calculate readability statistics | |
Apply the Patient Education Materials Assessment Tool | ||
Siedner [2015] | Coded messages can have similar efficacy as text messages, while maintaining confidentiality | |
6.5 Use moderators who have experience with users with limited literacy skills | Fontil [2016] | Creating technical assistance tools for various stages of the program to address lower technology literacy |
6.6 Pretest your moderator’s guide | NA | NA |
6.7 Use multiple strategies to make sure participants understand what you want them to do | NA | NA |
6.8 Test on mobile | Ceasar [2019] | Check-ins or IT support to address technical difficulties |
HLO, Health Literacy Online; REALM, Rapid Estimate of Adult Literacy in Medicine; NA, not applicable; PLWH, persons living with HIV; IC, informed consent; IT, information technology.
Results of individual sources of evidence
To answer the first research question, “How is health literacy addressed in mHealth app development?”, we described the characteristics of the study samples, methods, and results in Table 1 and included in Table 2 the recommendations authors made to address health literacy. To answer the second research question, “How is evaluation of health literacy addressed in mHealth apps?”, we included the health literacy tool used in each study and presented in Table 2 the recommendations authors made to evaluate health literacy.
Synthesis of results
Evaluation of health literacy
Neither of the two reviews of app studies nor the five studies that described the development of an app used a health literacy tool (32,33,45-49). Of the two reviews of publicly available apps, one used the Gunning Fog Index (GFI) (34) and the other used the Flesch-Kincaid Grade Level (FKGL) (35). Of the nine studies that described the development and evaluation of an app, five did not use any health literacy tool (36,39-42), whereas the other four used at least one health literacy tool such as the Rapid Estimate of Adult Literacy in Medicine (REALM) (37), the Test of Functional Health Literacy in Adults (TOFHLA) (43), the Short Assessment of Health Literacy-Spanish and English (SAHL-S&E) (38), or the Newest Vital Sign (NVS) (38,44). Of the 13 studies that described the evaluation of an app and the one study that described a heuristic evaluation of an app, seven did not use any health literacy tool (29,50-55) and three used an existing tool, namely the eHealth Literacy Scale (eHEALS) (56), the NVS (57), or the TOFHLA. The remaining evaluation studies had participants read a complete sentence (58), used the “Think Aloud” method (59), implemented a validated question developed in another study (60), or used a tool but did not name it in the article (61).
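For readers less familiar with the readability indices mentioned above, the sketch below (our illustration, not code from any of the reviewed studies) shows how the FKGL, FRES, and GFI are computed from sentence, word, and syllable counts; the syllable counter is a crude vowel-group heuristic, so in practice a validated readability library would be preferable.

```python
import re

def _count_syllables(word: str) -> int:
    """Rough syllable estimate: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability_scores(text: str) -> dict:
    """Compute common readability indices from raw text.

    Uses the published formulas for the Flesch-Kincaid Grade Level (FKGL),
    Flesch Reading Ease Score (FRES), and Gunning Fog Index (GFI); syllable
    counts are approximate, so the results are only indicative.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(_count_syllables(w) for w in words)
    complex_words = sum(1 for w in words if _count_syllables(w) >= 3)

    words_per_sentence = len(words) / max(1, len(sentences))
    syllables_per_word = syllables / max(1, len(words))

    return {
        "FKGL": 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59,
        "FRES": 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word,
        "GFI": 0.4 * (words_per_sentence + 100 * complex_words / max(1, len(words))),
    }

if __name__ == "__main__":
    sample = "Take one tablet by mouth every morning. Call your clinic if you feel dizzy."
    print(readability_scores(sample))
```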
HLO categories and recommendations
At least one of the HLO categories, (I) what we know; (II) write; (III) display; (IV) organize; (V) engage; and (VI) test, was addressed in each of the 32 articles reviewed in this study. Examples of recommendations from each of the categories are given below. Details are presented in Table 2.
Category 1 What we know
This category included 13 recommendations addressed by 10 articles (29,34,37-39,41,45,58-60); five HLO strategies were incorporated across the 10 articles, with Monkman and Kushniruk (in 2013) speaking to one (29). Ben-Zeev et al. (in 2013), who conducted a survey and a focus group with psychiatric rehabilitation agency patients and practitioners, indicated that “deploying existing mHealth resources intended for the general population may prove problematic” (45). They commented that a user-friendly mobile app needs to be developed for people with schizophrenia. This app should avoid distracting and superfluous elements; use minimal steps to access content; utilize simple screen arrangements, sentence composition, and concrete wording; include memory aids (e.g., a “continue” button at the bottom of the screen); and use an interface organized with simple geometry. Bahadori et al. (in 2020), who reviewed 15 apps for patients undergoing hip and knee replacement, recommended identifying and taking into consideration the specific needs of a target population when designing the app (34). They recommended involving patients in app development and acceptability assessments. Further, they suggested ongoing monitoring of the patient experience to determine whether the readability of the content needs to be addressed.
Category 2 Write
This category included 18 recommendations addressed by 14 articles (29,34-36,39,40,42,46,47,53-55,59,60); six HLO strategies were incorporated across the 14 articles, with Monkman and Kushniruk (in 2013) speaking to five. Fontil et al. (in 2016), who conducted focus groups with low-income patients with prediabetes at a large safety net clinic, recommended simplifying overall language as well as simplifying explanations of scientific concepts (40). They also recommended adapting the readability level to a 5th grade level or below (62). Poduval et al. (in 2018) conducted a study among 330 adults with diabetes and recommended using text written for a reading age of 12 (53). Finally, Miller et al. (in 2017) suggested using a simple interface displaying only one question on each screen and using simple language (60).
Category 3 Display
This category included 15 recommendations addressed by 15 articles (29,32-36,38,41,42,44,46,50,54,56,59); four HLO strategies were incorporated across the 15 articles, with Monkman and Kushniruk (in 2013) speaking to 11. To improve the display, the following suggestions were made: decrease the number or length of sentences (34); use simple-worded, direct questions with fewer words (44); use an interface with larger pictures and short labels (38); include visuals with simple text and culturally tailored themes and images (50); and incorporate culturally relevant graphics, voices, animation, and videos (54).
Category 4 Organize
This category included five recommendations addressed by three articles (29,51,59); four HLO strategies were incorporated across the three articles, with Monkman and Kushniruk (in 2013) speaking to 10. Wildenbos et al. (in 2019), who conducted a study with adults 50 years and older, recommended adjusting user-interface elements such as buttons for the older adult population and minimizing cognitive overload for this population group (59). The users in their study found the navigation hierarchy confusing and did not know how to return to previously shown information in the app. Wildenbos et al. (in 2019) commented that cognitive overload can be addressed by using a clear navigation structure and an interface that aligns with the expectations of older adults (59). Casey et al. (in 2014) conducted interviews with smartphone owners. Participants in their study found that the app they examined was easy to use because it reduced and simplified tasks to influence behavior (51). Siedner et al. (in 2015), who conducted a study in rural Uganda, found that “ease of use plays a dominant role in technology uptake” (58).
Category 5 Engage
This category included 24 recommendations addressed by 13 articles (36,42-44,47-49,52,53,57,60-62); four HLO strategies were incorporated across the 13 articles, with Monkman and Kushniruk (in 2013) speaking to three. Two studies recommended text-to-speech and voice-to-text bilingual functions (36,61). Mackert et al. (in 2017) stated that, to increase engagement and interaction, the app designer could add videos and games inside the app and allow users to change font size and color (57). In their study of an app to promote physical activity among people with multiple sclerosis (MS), Giunti et al. (in 2018) created personas to represent persons with MS (56). They created four persona types, taking into consideration age, level of physical activity, and motivation.
Category 6 Test
This category included 21 recommendations addressed by 10 articles (32,34,37,38,40,45,47,49,58,59); four HLO strategies were used across the 10 articles. Bahadori et al. (in 2020) recommended involving patients in app development and user acceptance testing, and monitoring the patient experience to assess whether readability needs to be improved (34). Two articles recommended using “focus groups as a collaborative tool to inform app development” and “mobile app focus groups to explore app design” (38,47). Also, Fontil et al. (in 2016) suggested “creating technical assistance tools for various stages of the program to address lower technology literacy” (40).
Recommendations not aligned with HLO guidelines
Not all recommendations fell under one of the HLO categories. For example, Coughlin et al. (in 2017) suggested that users should be able to use the app without interfacing with the commercial internet (39). Muscat et al. (in 2021) stated that it is possible to improve literacy skills with “question prompt lists, volitional help sheets, and skills training” (49). Sox et al. (in 2010), who conducted a study with parents of children with attention deficit hyperactivity disorder, found “tension between expectations of a highly-educated parent and a parent with limited health knowledge” (43). Connelly et al. (in 2016) tested a mobile app with Mexican American women aged 18–45 (38). They found that differences in health literacy and numeracy were better identified with the NVS than with the SAHL. The SAHL can assess whether a participant can read health vocabulary and recognize word meaning; however, it does not test more complex comprehension skills. Povey et al. (in 2020), who conducted a mixed-methods study with Aboriginal and Torres Strait Islander youth, suggested engaging users through stories about positive health behavior changes (42).
Discussion
Our results suggest that health literacy is being considered within mHealth in a variety of ways. Much of what is being done in this space centers on what we have learned from written materials, combined with the broader, evolving definition of health literacy. When we refer to health literacy within mHealth, we are going beyond the written word to consider how content is placed on a small screen and how easily one can navigate the material. Our results suggest that incorporating the common strategies noted among the articles in this scoping review can serve as a foundational starting point for the development of a brief screening tool that addresses the expansive nature of health literacy in mHealth.
How has health literacy been addressed in mHealth app development?
The HLO guidelines serve as a framework for understanding different aspects of the health literacy of mHealth tools. By using the HLO guidelines, we were able to systematically review the articles to identify potential gaps. Our results demonstrate that no single article except that by Monkman and Kushniruk (in 2013) addressed all of the components of the HLO guidelines; the two most frequently addressed categories concerned Engagement and Testing of the mHealth content.
Within the Engage category, by far the most often recommended action for app developers was to engage users via multimedia: audio, video, and interactive graphics. While text-to-speech is addressed in the HLO as a means of supporting those with disabilities, multimedia recommendations including text-to-speech are intended for individuals with low (health) literacy. The rationale for recommending multimedia is to increase engagement with end users. For example, studies that examined the role of multimedia in general showed a positive effect of multimedia on patient engagement levels among individuals with diabetes and inflammatory bowel disease (63,64). However, it is not clear how increased end-user engagement relates to health literacy, nor what exact role multimedia plays in achieving intended outcomes. Thus, more research is needed to better understand the relationship between multimedia, engagement, and health literacy.
In the Test category, a majority of the recommendations related to ensuring the usability and acceptability of the app for the target population. This occurred through the use of focus groups (38,47), readability assessments (34,49), and usability testing with end users (32). Interestingly, many of the authors engaged individuals from their app’s target population in the development stage rather than the testing stage (the opposite of what the HLO discusses). Regardless of when the testing took place, ensuring that the material was vetted by intended end users is critical. Co-designing mHealth apps can help align the app design with end users’ behaviors and how they use the information. This may, in turn, promote adoption of mHealth and reduce barriers to mHealth use (65). Recommendations in the Test category also spoke to understanding end users’ literacy and technology characteristics (58), as well as incorporating information technology (IT) check-ins or technical assistance (40).
Recommendations pertaining to the Write and Display categories of the HLO were the next most cited by the authors, followed by What We Know and then Organize. The prominence of the Write category comes as no surprise: recommendations about simplifying writing to accommodate those with poor literacy skills have been made for decades (66-68), and the subcategory “write in plain language” is by far the most often recommended within this category. This recommendation is congruent with previously published guides on creating easy-to-understand written materials by avoiding technical jargon, lengthy sentences, lengthy paragraphs, and words that contain three or more syllables (69).
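As a rough illustration of this plain-language guidance, a content check such as the following could flag long sentences, long paragraphs, and words of three or more syllables before text is loaded into an app; the thresholds used here are our own illustrative choices rather than values prescribed by the HLO or the cited guides.

```python
import re

# Illustrative thresholds only; the HLO and the cited guides do not prescribe exact cutoffs.
MAX_WORDS_PER_SENTENCE = 15
MAX_SENTENCES_PER_PARAGRAPH = 4

def count_syllables(word: str) -> int:
    """Rough syllable estimate based on vowel groups."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def plain_language_flags(paragraph: str) -> list:
    """Flag features of a paragraph that commonly raise its reading level."""
    flags = []
    sentences = [s.strip() for s in re.split(r"[.!?]+", paragraph) if s.strip()]
    if len(sentences) > MAX_SENTENCES_PER_PARAGRAPH:
        flags.append(f"paragraph has {len(sentences)} sentences; consider splitting it")
    for s in sentences:
        words = re.findall(r"[A-Za-z']+", s)
        if len(words) > MAX_WORDS_PER_SENTENCE:
            flags.append(f"long sentence ({len(words)} words): '{s[:40]}...'")
        hard = sorted({w for w in words if count_syllables(w) >= 3})
        if hard:
            flags.append(f"words with 3+ syllables: {', '.join(hard)}")
    return flags

if __name__ == "__main__":
    sample = "Hypertension medication adherence is fundamentally important. Take your pills."
    for flag in plain_language_flags(sample):
        print(flag)
```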
Within the Display category, the most often recommended actions concerned imagery and accessibility. Images are powerful tools, and authors’ recommendations around imagery were often to provide visuals that the target audience could relate to: images of individuals who look like the end users, images of real people, and culturally appropriate images. Culturally appropriate images may be especially important as data demonstrate that those with low literacy often come from areas with the greatest socioeconomic challenges, and these areas are predominantly African American and Latinx communities (70). Recommendations around accessibility largely centered on the inclusion of screen readers and text-to-speech capabilities. While the original intent of this recommendation was no doubt to support those with physical disabilities, those with poor literacy skills would also benefit from being able to listen to, rather than having to read, the content of an app. Incorporating accessibility features with attention to inclusive design can promote usability of mHealth for consumers with and without disabilities, expanding reach to a much larger audience (71).
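As a minimal sketch of the text-to-speech capability discussed above, the snippet below reads app content aloud using the third-party pyttsx3 package; the choice of pyttsx3 and the speaking rate are assumptions for illustration, and a production mobile app would typically rely on the platform’s native speech or screen-reader APIs instead.

```python
# Minimal text-to-speech sketch (assumes the third-party pyttsx3 package is installed).
import pyttsx3

def read_aloud(content: str, words_per_minute: int = 150) -> None:
    """Read app content aloud for users who prefer listening to reading."""
    engine = pyttsx3.init()
    engine.setProperty("rate", words_per_minute)  # slower speech can aid comprehension
    engine.say(content)
    engine.runAndWait()

if __name__ == "__main__":
    read_aloud("Take one tablet by mouth every morning with food.")
```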
Despite the Organize and What We Know categories being updated to include newer research regarding cognitive processing and navigation patterns in those with limited literacy skills, these were the least addressed categories among the articles reviewed. The What We Know category discusses in depth the most recent research on how those with poor literacy skills engage with digital media, including mobile devices, yet this category was notably absent among the recommendations made in the reviewed articles. This may be because some of the recommendations in this category were moved to other categories with the HLO updates. Overall, despite some categories appearing to a lesser degree than others, our findings support the use of the HLO as a guide for app development given that every category was discussed by at least one author.
Notably, one article, Monkman and Kushniruk (in 2013), provided recommendations for all of the HLO categories except the Test category. This article differed from the other 31 articles in that it focused on applying a heuristic evaluation derived from the HLO guide to mobile apps. The recommendations most frequently incorporated into the heuristics are from the Display and Organize categories, followed by the Write, Engage, and What We Know categories. These heuristics are the closest thing to an evaluation tool that we were able to identify in this scoping review. Yet the heuristics do not lend themselves to an easily applied evaluation tool for clinicians in real-world settings. Further, the heuristics appear to be intended for use in the development stage of an app rather than as a brief health literacy evaluation tool for mHealth apps.
How is the evaluation of health literacy addressed in mHealth apps?
Several of the studies addressed health literacy through the use of a formal health literacy assessment tool such as the NVS or TOFHLA, or through a readability measure such as the Flesch Reading Ease Score (FRES), which provides an indication of reading level. The majority of the studies, however, did not use a formal assessment but rather discussed how health literacy needs could be addressed based on input from their study participants. Importantly, our results indicate that the evaluation of health literacy in mHealth was primarily end-user focused and did not appear to extensively evaluate the mHealth content itself for its fit with individuals with limited health literacy.
Notably, engaging end users in the design and testing of mHealth technologies appeared to be quite informative for adapting the mHealth application to meet the needs of a given population (65,72). Further, many of the same strategies used to account for variability in literacy levels in written materials are also pertinent to the delivery of content in mobile formats.
Developing a tool enabling clinicians to quickly evaluate apps for use within their patient populations should stem from the HLO categories that were commonly employed in the studies reviewed. For instance, ensuring that the tool can quickly and effectively determine the reading level of the mHealth app content would be important (73). Additionally, such a tool should recognize that literacy aspects in mHealth go beyond strictly written material and also encompass how the material is displayed and featured to promote user engagement. Less certain is how to incorporate (I) HLO strategies that were used infrequently; (II) recommendations that did not fit into any HLO category; or (III) the use of a formal literacy assessment. Gaps remain in understanding the minimum criteria needed to screen an app and whether certain categories better indicate app suitability from a provider perspective versus a patient perspective.
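To make the idea of such a brief screening tool more concrete, the sketch below outlines a hypothetical HLO-derived checklist with a simple scoring rule; the items, the categories retained, and the pass threshold are illustrative only and do not constitute a validated instrument.

```python
# Hypothetical sketch of a brief, HLO-derived app screening checklist.
# Items and threshold are illustrative; they are not drawn from any reviewed study.
HLO_SCREEN = {
    "Write": ["Content is written in plain language at or below a 6th grade level"],
    "Display": ["Images are culturally appropriate and support the text",
                "Text-to-speech or audio narration is available"],
    "Organize": ["Navigation is simple, with easy return to the home screen"],
    "Engage": ["Content is available in audio/video as well as text"],
    "Test": ["The app was tested with users with limited (health) literacy"],
}

def screen_app(responses: dict, pass_threshold: float = 0.8) -> bool:
    """Return True if the proportion of checklist items met reaches the threshold."""
    answers = [answer for items in responses.values() for answer in items]
    return (sum(answers) / len(answers)) >= pass_threshold

if __name__ == "__main__":
    example = {category: [True] * len(items) for category, items in HLO_SCREEN.items()}
    example["Test"] = [False]  # e.g., no literacy-focused user testing reported
    print("Suitable for low health literacy populations:", screen_app(example))
```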
Strengths and limitations
This study systematically reviewed the available evidence using the HLO framework, which made it possible to discuss concrete ways to address different aspects of health literacy. The findings could provide research and programmatic insights for designing and evaluating mHealth for individuals with a wide range of health literacy levels. This scoping review has some limitations. It excluded certain types of articles, including grey literature and conference abstracts, and was limited to English-language articles, which may have led to some degree of evidence omission and publication bias. As mentioned earlier, the Monkman and Kushniruk (in 2013) heuristics bring us closer to having a tool that could be employed to evaluate apps themselves (29), but we remain without an adequate framework for evaluating mHealth apps from a health literacy perspective beyond the broad strokes of the HLO. Although content added to the 2nd edition of the HLO addresses mobile considerations, the material focuses on how the information is delivered and accessed (e.g., a small viewing device) rather than providing direction on how to evaluate the content delivered within the app itself.
Conclusions
As healthcare and public health professionals continue to expand the use of mobile apps to educate patients and promote self-care management, we need to ensure that communication provided in mobile apps is suitable for the target patient populations (74). The development of a brief tool that can be easily and effectively used to evaluate content delivered via mHealth, and that screens for acceptability for populations with low health literacy, is warranted. Future work should focus on which HLO recommendations are most crucial to incorporate into such a tool and which are less valuable in sifting through mHealth content. Additionally, such a tool should be piloted with providers working in environments with higher rates of low literacy to determine the acceptability and feasibility of incorporating it into clinical practice.
Acknowledgments
We would like to acknowledge Kimberly Harp, MLS, for assistance with conducting the search for this scoping review.
Funding: None.
Footnote
Reporting Checklist: The authors have completed the PRISMA-ScR reporting checklist. Available at https://mhealth.amegroups.com/article/view/10.21037/mhealth-22-11/rc
Conflicts of Interest: All authors have completed the ICMJE uniform disclosure form (available at https://mhealth.amegroups.com/article/view/10.21037/mhealth-22-11/coif). The authors have no conflicts of interest to declare.
Ethical Statement: The authors are accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.
Open Access Statement: This is an Open Access article distributed in accordance with the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License (CC BY-NC-ND 4.0), which permits the non-commercial replication and distribution of the article with the strict proviso that no changes or edits are made and the original work is properly cited (including links to both the formal publication through the relevant DOI and the license). See: https://creativecommons.org/licenses/by-nc-nd/4.0/.
References
- Kutner M, Greenburg E, Jin Y, et al. The Health Literacy of America's Adults: Results from the 2003 National Assessment of Adult Literacy. NCES 2006-483. National Center for Education Statistics; 2006.
- Vernon JA, Trujillo A, Rosenbaum S, et al. Low health literacy: Implications for national health policy. Washington, DC: Department of Health Policy, School of Public Health and Health Services, The George Washington University; 2007.
- National Action Plan to Improve Health Literacy. Washington, DC: U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion; 2010.
- Sørensen K, Pelikan JM, Röthlin F, et al. Health literacy in Europe: comparative results of the European health literacy survey (HLS-EU). Eur J Public Health 2015;25:1053-8. [Crossref] [PubMed]
- Amoah PA, Phillips DR. Socio-demographic and behavioral correlates of health literacy: a gender perspective in Ghana. Women Health 2020;60:123-39. [Crossref] [PubMed]
- Harsch S, Jawid A, Jawid E, et al. Health Literacy and Health Behavior Among Women in Ghazni, Afghanistan. Front Public Health 2021;9:629334. [Crossref] [PubMed]
- Tolabing MCC, Co KCD, Mendoza OM, et al. Prevalence of Limited Health Literacy in the Philippines: First National Survey. Health Lit Res Pract 2022;6:e104-12. [Crossref] [PubMed]
- Nutbeam D. The evolving concept of health literacy. Soc Sci Med 2008;67:2072-8. [Crossref] [PubMed]
- Gupta R KC, Griessenauer C, Thomas A, et al. Overview and history of health literacy in the United States and the epidemiology of low health literacy in healthcare. The Evolution of Health Literacy: Empowering Patients through Improved Education; 2017:1-16.
- Health Literacy in Healthy People 2030: U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion. Available online: https://health.gov/healthypeople/priority-areas/health-literacy-healthy-people-2030
- Nutbeam D. Health literacy as a public health goal: a challenge for contemporary health education and communication strategies into the 21st century. Health Promotion International 2000;15:259-67. [Crossref]
- Eysenbach G. What is e-health? J Med Internet Res 2001;3:E20. [Crossref] [PubMed]
- mHealth: new horizons for health through mobile technologies: second global survey on eHealth. World Health Organization; 2011.
- Su WC, Mehta KY, Gill K, et al. Assessing the Readability of App Descriptions and Investigating its Role in the Choice of mHealth Apps: Retrospective and Prospective Analyses. AMIA Annu Symp Proc 2022;2021:1139-48. [PubMed]
- Kim H, Xie B. Health literacy in the eHealth era: A systematic review of the literature. Patient Educ Couns 2017;100:1073-82. [Crossref] [PubMed]
- El Benny M, Kabakian-Khasholian T, El-Jardali F, et al. Application of the eHealth Literacy Model in Digital Health Interventions: Scoping Review. J Med Internet Res 2021;23:e23473. [Crossref] [PubMed]
- Lin YH, Lou MF. Effects of mHealth-based interventions on health literacy and related factors: A systematic review. J Nurs Manag 2021;29:385-94. [Crossref] [PubMed]
- Verma R, Saldanha C, Ellis U, et al. eHealth literacy among older adults living with cancer and their caregivers: A scoping review. J Geriatr Oncol 2022;13:555-62. [Crossref] [PubMed]
- Coughlin S, Thind H, Liu B, et al. Mobile Phone Apps for Preventing Cancer Through Educational and Behavioral Interventions: State of the Art and Remaining Challenges. JMIR Mhealth Uhealth 2016;4:e69. [Crossref] [PubMed]
- Shan R, Sarkar S, Martin SS. Digital health technology and mobile devices for the management of diabetes mellitus: state of the art. Diabetologia 2019;62:877-87. [Crossref] [PubMed]
- Garner SL, George CE, Young P, et al. Effectiveness of an mHealth application to improve hypertension health literacy in India. Int Nurs Rev 2020;67:476-83. [Crossref] [PubMed]
- Delva S, Waligora Mendez KJ, Cajita M, et al. Efficacy of Mobile Health for Self-management of Cardiometabolic Risk Factors: A Theory-Guided Systematic Review. J Cardiovasc Nurs 2021;36:34-55. [Crossref] [PubMed]
- DeMonte CM, DeMonte WD, Thorn BE. Future implications of eHealth interventions for chronic pain management in underserved populations. Pain Manag 2015;5:207-14. [Crossref] [PubMed]
- Martinengo L, Stona AC, Tudor Car L, et al. Education on Depression in Mental Health Apps: Systematic Assessment of Characteristics and Adherence to Evidence-Based Guidelines. J Med Internet Res 2022;24:e28942. [Crossref] [PubMed]
- Brian RM, Ben-Zeev D. Mobile health (mHealth) for mental health in Asia: objectives, strategies, and limitations. Asian J Psychiatr 2014;10:96-100. [Crossref] [PubMed]
- Peters MDJ, Marnie C, Tricco AC, et al. Updated methodological guidance for the conduct of scoping reviews. JBI Evid Synth 2020;18:2119-26. [Crossref] [PubMed]
- Tricco AC, Lillie E, Zarin W, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann Intern Med 2018;169:467-73. [Crossref] [PubMed]
- Huhta AM, Hirvonen N, Huotari ML. Health Literacy in Web-Based Health Information Environments: Systematic Review of Concepts, Definitions, and Operationalization for Measurement. J Med Internet Res 2018;20:e10273. [Crossref] [PubMed]
- Monkman H, Kushniruk A. A health literacy and usability heuristic evaluation of a mobile consumer health application. Stud Health Technol Inform 2013;192:724-8. [PubMed]
- Office of Disease Prevention and Health Promotion (ODPHP). Health Literacy Online: U.S. Department of Health and Human Services; 2016. [Updated June 8, 2016].
- Monkman H, Griffith J, Kushniruk AW. Evidence-based Heuristics for Evaluating Demands on eHealth Literacy and Usability in a Mobile Consumer Health Application. Stud Health Technol Inform 2015;216:358-62. [PubMed]
- Gibbons MC, Lowry SZ, Patterson ES. Applying Human Factors Principles to Mitigate Usability Issues Related to Embedded Assumptions in Health Information Technology Design. JMIR Hum Factors 2014;1:e3. [Crossref] [PubMed]
- Wildenbos GA, Peute L, Jaspers M. Aging barriers influencing mobile health usability for older adults: A literature based framework (MOLD-US). Int J Med Inform 2018;114:66-75. [Crossref] [PubMed]
- Bahadori S, Wainwright TW, Ahmed OH. Readability of Information on Smartphone Apps for Total Hip Replacement and Total Knee Replacement Surgery Patients. J Patient Exp 2020;7:395-8. [Crossref] [PubMed]
- Dunn Lopez K, Chae S, Michele G, et al. Improved readability and functions needed for mHealth apps targeting patients with heart failure: An app store review. Res Nurs Health 2021;44:71-80. [Crossref] [PubMed]
- Abujarad F, Alfano S, Bright TJ, et al. Building an Informed Consent Tool Starting with the Patient: The Patient-Centered Virtual Multimedia Interactive Informed Consent (VIC). AMIA Annu Symp Proc 2018;2017:374-83. [PubMed]
- Chaudry BM, Connelly K, Siek KA, et al. Formative evaluation of a mobile liquid portion size estimation interface for people with varying literacy skills. J Ambient Intell Humaniz Comput 2013;4:779-89. [Crossref] [PubMed]
- Connelly K, Stein KF, Chaudry B, et al. Development of an Ecological Momentary Assessment Mobile App for a Low-Literacy, Mexican American Population to Collect Disordered Eating Behaviors. JMIR Public Health Surveill 2016;2:e31. [Crossref] [PubMed]
- Coughlin SS, Besenyi GM, Bowen D, et al. Development of the Physical activity and Your Nutrition for Cancer (PYNC) smartphone app for preventing breast cancer in women. Mhealth 2017;3:5. [Crossref] [PubMed]
- Fontil V, McDermott K, Tieu L, et al. Adaptation and Feasibility Study of a Digital Health Program to Prevent Diabetes among Low-Income Patients: Results from a Partnership between a Digital Health Company and an Academic Research Team. J Diabetes Res 2016;2016:8472391. [Crossref] [PubMed]
- Mueller S, Soriano D, Boscor A, et al. MANTRA: development and localization of a mobile educational health game targeting low literacy players in low and middle income countries. BMC Public Health 2020;20:1171. [Crossref] [PubMed]
- Povey J, Sweet M, Nagel T, et al. Drafting the Aboriginal and Islander Mental Health Initiative for Youth (AIMhi-Y) App: Results of a formative mixed methods study. Internet Interv 2020;21:100318. [Crossref] [PubMed]
- Sox CM, Gribbons WM, Loring BA, et al. Patient-centered design of an information management module for a personally controlled health record. J Med Internet Res 2010;12:e36. [Crossref] [PubMed]
- Srinivas P, Bodke K, Ofner S, et al. Context-Sensitive Ecological Momentary Assessment: Application of User-Centered Design for Improving User Satisfaction and Engagement During Self-Report. JMIR Mhealth Uhealth 2019;7:e10894. [Crossref] [PubMed]
- Ben-Zeev D, Kaiser SM, Brenner CJ, et al. Development and usability testing of FOCUS: a smartphone system for self-management of schizophrenia. Psychiatr Rehabil J 2013;36:289-96. [Crossref] [PubMed]
- Boyd AD, Moores K, Shah V, et al. My Interventional Drug-Eluting Stent Educational App (MyIDEA): Patient-Centered Design Methodology. JMIR Mhealth Uhealth 2015;3:e74. [Crossref] [PubMed]
- Ceasar JN, Claudel SE, Andrews MR, et al. Community Engagement in the Development of an mHealth-Enabled Physical Activity and Cardiovascular Health Intervention (Step It Up): Pilot Focus Group Study. JMIR Form Res 2019;3:e10944. [Crossref] [PubMed]
- Dev R, Woods NF, Unger JA, et al. Acceptability, feasibility and utility of a Mobile health family planning decision aid for postpartum women in Kenya. Reprod Health 2019;16:97. [Crossref] [PubMed]
- Muscat DM, Lambert K, Shepherd H, et al. Supporting patients to be involved in decisions about their health and care: Development of a best practice health literacy App for Australian adults living with Chronic Kidney Disease. Health Promot J Austr 2021;32:115-27. [Crossref] [PubMed]
- Bender MS, Martinez S, Kennedy C. Designing a Culturally Appropriate Visually Enhanced Low-Text Mobile Health App Promoting Physical Activity for Latinos: A Qualitative Study. J Transcult Nurs 2016;27:420-8. [Crossref] [PubMed]
- Casey M, Hayes PS, Glynn F, et al. Patients' experiences of using a smartphone application to increase physical activity: the SMART MOVE qualitative study in primary care. Br J Gen Pract 2014;64:e500-8. [Crossref] [PubMed]
- Lord S, Moore SK, Ramsey A, et al. Implementation of a Substance Use Recovery Support Mobile Phone App in Community Settings: Qualitative Study of Clinician and Staff Perspectives of Facilitators and Barriers. JMIR Ment Health 2016;3:e24. [Crossref] [PubMed]
- Poduval S, Ahmed S, Marston L, et al. Crossing the Digital Divide in Online Self-Management Support: Analysis of Usage Data From HeLP-Diabetes. JMIR Diabetes 2018;3:e10925. [Crossref] [PubMed]
- Povey J, Mills PP, Dingwall KM, et al. Acceptability of Mental Health Apps for Aboriginal and Torres Strait Islander Australians: A Qualitative Study. J Med Internet Res 2016;18:e65. [Crossref] [PubMed]
- Schnall R, Higgins T, Brown W, et al. Trust, Perceived Risk, Perceived Ease of Use and Perceived Usefulness as Factors Related to mHealth Technology Use. Stud Health Technol Inform 2015;216:467-71. [PubMed]
- Giunti G, Kool J, Rivera Romero O, et al. Exploring the Specific Needs of Persons with Multiple Sclerosis for mHealth Solutions for Physical Activity: Mixed-Methods Study. JMIR Mhealth Uhealth 2018;6:e37. [Crossref] [PubMed]
- Mackert M, Guadagno M, Lazard A, et al. Engaging Men in Prenatal Health Promotion: A Pilot Evaluation of Targeted e-Health Content. Am J Mens Health 2017;11:719-25. [Crossref] [PubMed]
- Siedner MJ, Santorino D, Haberer JE, et al. Know your audience: predictors of success for a patient-centered texting app to augment linkage to HIV care in rural Uganda. J Med Internet Res 2015;17:e78. [Crossref] [PubMed]
- Wildenbos GA, Jaspers MWM, Schijven MP, et al. Mobile health for older adult patients: Using an aging barriers framework to classify usability problems. Int J Med Inform 2019;124:68-77. [Crossref] [PubMed]
- Miller DP Jr, Weaver KE, Case LD, et al. Usability of a Novel Mobile Health iPad App by Vulnerable Populations. JMIR Mhealth Uhealth 2017;5:e43. [Crossref] [PubMed]
- Huang E, Liao SF, Chen SL. e-Health Informed Foreign Patient and Physician Communication: The Perspective of Informed Consent. In: Ortuño F, Rojas I, editors. Bioinformatics and Biomedical Engineering. Cham: Springer International Publishing; 2015.
- Ownby RL, Waldrop-Valverde D, Caballero J, et al. Baseline medication adherence and response to an electronically delivered health literacy intervention targeting adherence. Neurobehav HIV Med 2012;4:113-21. [Crossref] [PubMed]
- Elsabrout K. Increasing diabetic patient engagement and self-reported medication adherence using a web-based multimedia program. J Am Assoc Nurse Pract 2018;30:293-8. [Crossref] [PubMed]
- van Deen WK, Khalil C, Dupuy TP, et al. Assessment of inflammatory bowel disease educational videos for increasing patient engagement and family and friends' levels of understanding. Patient Educ Couns 2022;105:660-9. [Crossref] [PubMed]
- Saparamadu AADNS, Fernando P, Zeng P, et al. User-Centered Design Process of an mHealth App for Health Professionals: Case Study. JMIR Mhealth Uhealth 2021;9:e18079. [Crossref] [PubMed]
- Clear Communication: National Institutes of Health; 2021 [updated July 7, 2021]. Available online: https://www.nih.gov/institutes-nih/nih-office-director/office-communications-public-liaison/clear-communication/clear-simple
- Leroy G, Endicott JE, Kauchak D, et al. User evaluation of the effects of a text simplification algorithm using term familiarity on perception, understanding, learning, and information retention. J Med Internet Res 2013;15:e144. [Crossref] [PubMed]
- Teaching Patients with Low Literacy Skills. Second ed. Philadelphia: J. B. Lippincott Company; 1996.
- Simply put: A guide for creating easy-to-understand materials. In: Centers for Disease Control and Prevention (U.S.). Office of the Associate Director for Communication. Strategic and Proactive Communication Branch. Third ed. Atlanta, Georgia: Centers for Disease Control and Prevention; 2009.
- Literacy Gap Map: Barbara Bush Foundation. Available online: https://map.barbarabush.org/
- Radcliffe E, Lippincott B, Anderson R, et al. A Pilot Evaluation of mHealth App Accessibility for Three Top-Rated Weight Management Apps by People with Disabilities. Int J Environ Res Public Health 2021;18:3669. [Crossref] [PubMed]
- Hilliard ME, Hahn A, Ridge AK, et al. User Preferences and Design Recommendations for an mHealth App to Promote Cystic Fibrosis Self-Management. JMIR Mhealth Uhealth 2014;2:e44. [Crossref] [PubMed]
- Dawson RM, Felder TM, Donevant SB, et al. What makes a good health 'app'? Identifying the strengths and limitations of existing mobile application evaluation tools. Nurs Inq 2020;27:e12333. [Crossref] [PubMed]
- Emerson MR, Harsh Caspari J, Notice M, et al. Mental health mobile app use: Considerations for serving underserved patients in integrated primary care settings. Gen Hosp Psychiatry 2021;69:67-75. [Crossref] [PubMed]
Cite this article as: Emerson MR, Buckland S, Lawlor MA, Dinkel D, Johnson DJ, Mickles MS, Fok L, Watanabe-Galloway S. Addressing and evaluating health literacy in mHealth: a scoping review. mHealth 2022;8:33.