Published in Vol 4, No 3 (2021): Jul-Sep

Preprints (earlier versions) of this paper are available at https://preprints.www.mybigtv.com/preprint/24553, first published.
Satisfaction, Usability, and Compliance With the Use of Smartwatches for Ecological Momentary Assessment of Knee Osteoarthritis Symptoms in Older Adults: Usability Study


Original Paper

1Department of Pharmacy, University of Toulouse, Toulouse, France

2Department of Aging and Geriatric research, University of Florida, Gainesville, FL, United States

3Department of Epidemiology, University of Florida, Gainesville, FL, United States

4Department of Biomedical Engineering, University of Florida, Gainesville, FL, United States

5Google, Mountain View, CA, United States

6Department of Computer and Information Science and Engineering, University of Florida, Gainesville, FL, United States

7Department of Community Dentistry and Behavioral Science, University of Florida, Gainesville, FL, United States

*all authors contributed equally

Corresponding Author:

Charlotte Rouzaud Laborde, PharmD, PhD

Department of Pharmacy

University of Toulouse

Hopital Paule de Viguier

330 Avenue de Grande Bretagne, TSA 70034

Toulouse, 31059

France

Phone: 33 625088692

Fax: 33 561771055

Email: charlotte.laborde@yahoo.fr


Background: Smartwatches enable physicians to monitor the symptoms, behavior, and environment of patients with knee osteoarthritis. Older adults experience fluctuations in their pain and related symptoms (mood, fatigue, and sleep quality) that smartwatches are ideally suited to capture remotely in a convenient manner.

Objective: The aim of this study was to evaluate satisfaction, usability, and compliance with the real-time, online assessment and mobility monitoring (ROAMM) mobile app, designed for smartwatches, in individuals with knee osteoarthritis.

Methods: Participants (N=28; mean age 73.2, SD 5.5 years; 70% female) with reported knee osteoarthritis were asked to wear a smartwatch with the ROAMM app installed. They were prompted to report their prior night’s sleep quality in the morning, followed by ecological momentary assessments (EMAs) of their pain, fatigue, mood, and activity in the morning, afternoon, and evening. Satisfaction, comfort, and usability were evaluated using a standardized questionnaire. Compliance with answering EMAs was calculated after excluding time when the watch was not being worn for technical reasons (eg, while charging).

Results: Most participants reported that the displayed text was large enough to read (22/26, 85%), and all participants found it easy to enter ratings using the smartwatch. Approximately half of the participants found the smartwatch to be comfortable (14/26, 54%) and would consider wearing it as their personal watch (11/24, 46%). Most participants were satisfied with its battery charging system (20/26, 77%). A majority of participants (19/26, 73%) expressed their willingness to use the ROAMM app for a 1-year research study. The overall EMA compliance rate was 83% (2505/3036 responses). Compliance was lower among the 10 of 26 participants who did not regularly wear a wristwatch (71% vs 88% among regular wearers) and among the 4 of 26 who found the text too small to read (60% vs 86% among those who did not).

Conclusions: Older adults with knee osteoarthritis positively rated the ROAMM smartwatch app and were generally satisfied with the device. The high compliance rates coupled with the willingness to participate in a long-term study suggest that the ROAMM app is a viable approach to remotely collecting health symptoms and behaviors for both research and clinical endeavors.

JMIR Aging 2021;4(3):e24553

doi:10.2196/24553




Mobile devices are becoming commonplace in patient-based research [1]. Their ability to capture sensor data and enable interaction with participants in both observational and interventional studies makes mobile devices a powerful tool to augment traditional data collection approaches [2]. For example, these devices passively record activity with an accelerometer and location via GPS sensors to track physical activity and mobility. This information could be useful in understanding patients’ symptoms in the free-living environment. Such knowledge would be ideal for patients with osteoarthritis, who exhibit variable pain experiences that may also interact with their mood and fatigue levels [3,4]. Coupled with sensor-based mobility data, smart devices offer a rich portrait of the interplay between symptoms and mobility levels.

Osteoarthritis is a degenerative and progressive disease affecting approximately 250 million patients worldwide [5]. Pain experiences greatly differ between patients and are often irregular within the same patient [6]. The complexity of symptoms is partly due to the site (knee, hip, or hand), genetic predisposition, initial cause of damage (ie, injury), obesity status, level of inflammation, and environmental factors [5,7]. Traditionally, patients receive treatment after reporting pain complaints and undergoing a physical examination along with optional imaging (eg, radiographs) [8,9]. Physical activity patterns, mobility function, and symptoms are used by clinical practitioners to inform treatment decisions [8,10,11]. However, the difficulty of retrospectively assessing complex experiences like pain and the recall bias of self-assessed activity patterns present obstacles for care management of patients with osteoarthritis [12]. As a result, there has been considerable interest in using smart mobile devices—phones and wearables—for ascertaining symptoms and objective activity measures to inform practitioners [13]. In 2019, approximately 30 to 40 apps were designed for logging pain symptoms, but only one-fifth of those apps engaged the patients for whom they were designed [14]. Moreover, none were designed solely for a smartwatch interface. Mobile devices and smart wearables have the potential to better characterize symptoms in the free-living world, but involvement of end users (eg, patients) is necessary for appropriate design and long-term adoption.

New tools are needed to collect symptoms, experiences, and patterns of mobility and activity in real time in the free-living environment. Ecological momentary assessment (EMA) is a data collection method originally developed by Larson and Csikszentmihalyi in 1983 [15] for the psychological assessment of what activities people engage in, how they feel, and what they are thinking during their daily lives. It was developed because people are poor at reconstructing psychological experiences after they have occurred [16,17]. Rather, EMA captures experiences in the moment in a real-world environment and is potentially more representative of reality [18]. EMAs were first collected using paper diaries, followed by dedicated electronic diaries [19]. Recently, however, smartphone and smartwatch apps have become a pervasive means of assessing medical symptoms [20,21]. Work by Murphy and Smith demonstrated that tracking activity patterns with daily EMA fatigue reports yielded insights into the manifestation of activity-induced fatigue in participants with knee or hip osteoarthritis [22]. Another recent report used a purpose-built smartwatch app to prompt older adults with knee osteoarthritis to report their pain 4 to 5 times per day for approximately 3 months. Results demonstrated that older adults wore the watch for 75% of the study duration and answered 50% to 60% of the twice-daily prompts to rate their pain. Despite some drawbacks, including battery drain and technical issues, participants generally thought the watch was convenient and acceptable [23]. Although this previous work is encouraging, additional research is clearly needed to document smartwatch satisfaction, usability, and compliance for knee osteoarthritis symptoms.

The large increase in mobile medical apps has prompted the US Food and Drug Administration (FDA) to release a guidance statement [24]. The FDA is clearly supportive of evaluating patient-reported outcomes [25]; however, the framework for regulating mobile medical apps is still in its infancy [24]. Moreover, FDA guidance documents state that any patient-based software should undergo evaluation for overall design, usability, and acceptability for use in clinical care and research settings [26]. In that regard, the objective of our study was to evaluate satisfaction, usability, and compliance with the real-time, online assessment and mobility monitoring (ROAMM) mobile app designed for smartwatches. This study builds on initial input from interviews about the ROAMM app interface and usability in both patients and practitioners [27,28]. We hypothesized that older adults with knee osteoarthritis would provide positive satisfaction and usability ratings while being compliant with wearing the smartwatch and answering EMA prompts over an approximately 2-week evaluation period.


Participant Recruitment and Visit Design

Community-dwelling older adults aged 65 years and older with symptomatic unilateral or bilateral knee osteoarthritis were enrolled in the study. Recruitment sources included community advertisements and participant-based registries. Exclusion criteria included significant cognitive impairment, neurological conditions that severely inhibited mobility, inability to communicate because of severe hearing loss or speech disorder, terminal illness with a life expectancy of less than 12 months, severe pulmonary disease, renal failure with hemodialysis, severe psychiatric disorder (eg, bipolar disorder, schizophrenia), excessive alcohol use (>14 drinks per week), drug addiction, or treatment for cancer (radiation or chemotherapy) within the past year. All participants provided written informed consent, and the protocol was approved by the University of Florida Institutional Review Board.

Participants were asked to attend 2 clinic visits: one at baseline and another approximately 2 weeks later. After providing written informed consent, participants were administered the Mini-Mental Status Examination and then instructed on how to use the ROAMM app as previously described [27,29]. Participants were provided a simple user guide on how to use the wireless charging station and USB cable. They were also provided with a demographic questionnaire and an “exit” questionnaire that asked about their satisfaction with watch functionality and usability (see Multimedia Appendix 1), to be completed at the end of the second week. At the second visit, participants were asked to return the smartwatch and the completed questionnaires.

ROAMM App and EMA

The ROAMM app was developed at the University of Florida to enable real-time capture of patient-generated information. The smartwatch app collects wearable sensor (accelerometer and GPS) data simultaneously with symptom EMAs, as described previously [27]. Briefly, the ROAMM app is composed of a server and a smartwatch app that are remotely connected through a secure HTTPS protocol. This integrated framework is designed and developed to perform several tasks, including remote data collection, storage, retrieval, and analysis. The primary goal of this project was to evaluate usability, satisfaction, and compliance with wearing the smartwatch and responding to EMA prompts in free-living conditions. Participants were asked to wear and charge the smartwatch every day for approximately 2 weeks during waking hours. The ROAMM app was programmed to prompt the participant three times a day in a stratified random manner within prespecified windows: 8:00-11:59, 12:00-15:59, and 16:00-19:59.
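
As a concrete illustration of this prompting scheme, the following Python sketch draws one random prompt time within each of the three prespecified windows. This is not the ROAMM app's actual scheduling code (the app runs on a Samsung smartwatch); only the window boundaries come from the text above, and the function name and data layout are assumptions for illustration.

```python
import random
from datetime import datetime, timedelta

# Prompt windows from the Methods text: morning, afternoon, evening.
WINDOWS = [("08:00", "11:59"), ("12:00", "15:59"), ("16:00", "19:59")]

def schedule_prompts(day, seed=None):
    """Return one randomly chosen prompt time per window for the given day."""
    rng = random.Random(seed)
    prompts = []
    for start, end in WINDOWS:
        t0 = day.replace(hour=int(start[:2]), minute=int(start[3:]), second=0, microsecond=0)
        t1 = day.replace(hour=int(end[:2]), minute=int(end[3:]), second=0, microsecond=0)
        minutes_into_window = rng.randint(0, int((t1 - t0).total_seconds() // 60))
        prompts.append(t0 + timedelta(minutes=minutes_into_window))
    return prompts

# Example: three prompt times for one day, one per window.
print(schedule_prompts(datetime(2021, 7, 14), seed=1))
```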

While wearing the watch, participants were prompted in the morning to report their prior night’s sleep quality. Thereafter, EMAs of pain, fatigue, mood, and activity were assessed throughout the day. Participants used the rotating bezel on the Samsung Gear S3 to dial in responses and then saved their responses by pressing a button located on top of the bezel. Rating scales were chosen based on the previous literature and the ability to scale down the content for the watch interface [30-34]. In the morning, participants rated their previous night’s sleep quality on a scale of 0 to 10 [35,36], with the following anchors: 0 to 1, “very poor”; 2, “poor”; 3 to 4, “OK”; 5 to 8, “well”; and 9 to 10, “very well.” EMA pain was evaluated using a valid and reliable numerical rating scale—the 11-point Box Scale (BS-11) of pain intensity that ranges from 0 to 10 [37,38]. There is a wide variety of versions of this scale, which differ in their inclusion of text anchors [39]. Because of the small watch face, we preferred to include more anchors than the traditional numeric scales. The following text anchors were shown as the participant rotated the dial: 0, “none”; 1 to 3, “mild”; 4 to 5, “moderate”; 6 to 7, “severe”; 8 to 9, “very severe”; and 10, “worst possible.” A depiction of the interface is shown in Figure 1 and in our previous publications [27,28].

Figure 1. Depiction of watch face with visual analog scale used to rate pain intensity.

Fatigue severity was also assessed on a scale of 0 to 10 using the abovementioned anchors, in line with other similar validated scales previously reported [40-42]. Mood ratings were scaled slightly differently to more closely follow previously validated visual analogue scales [43,44]. By default, the zero value for “neutral” was placed at the bottom of the screen; rotation to the right reported negative mood ratings, with text anchors “negative” for –1 to –3 and “very negative” for –4 to –5. Rotation to the left reported positive mood ratings. Finally, participants rotated the bezel to choose an icon representing one of the following activity categories that they were presently engaged in: lying down, standing, walking, sitting, and other activities (representing other possible activities such as gardening and exercise). Thus, participants were prompted to report pain, fatigue, mood, and activity three times per day. To reduce burden, prompts were delivered in a contiguous manner—one after another. The total time to answer a set of prompts was very short, typically <30 seconds.
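
For clarity, the sketch below encodes the anchor labels described above as simple lookup functions. It is purely illustrative and not the ROAMM interface code; the pain/fatigue and sleep anchors are taken from the text, whereas the positive-side mood labels (not spelled out above) are an assumption mirroring the negative side.

```python
def pain_or_fatigue_anchor(rating):
    """Text anchor for the 0-10 pain (BS-11) and fatigue scales described above."""
    if rating == 0:
        return "none"
    if rating <= 3:
        return "mild"
    if rating <= 5:
        return "moderate"
    if rating <= 7:
        return "severe"
    if rating <= 9:
        return "very severe"
    return "worst possible"

def sleep_anchor(rating):
    """Text anchor for the 0-10 sleep quality scale."""
    if rating <= 1:
        return "very poor"
    if rating == 2:
        return "poor"
    if rating <= 4:
        return "OK"
    if rating <= 8:
        return "well"
    return "very well"

def mood_anchor(rating):
    """Text anchor for the -5..+5 mood scale (0 = neutral at the bottom of the screen)."""
    if rating == 0:
        return "neutral"
    if -3 <= rating <= -1:
        return "negative"
    if rating < -3:
        return "very negative"
    # Positive-side labels are not given in the text; these mirror the negative side.
    return "positive" if rating <= 3 else "very positive"

print(pain_or_fatigue_anchor(6), sleep_anchor(7), mood_anchor(-4))
# -> severe well very negative
```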

ROAMM Exit Questionnaire to Evaluate Satisfaction and Usability

A 13-item exit questionnaire was administered at the end of the second week of the study (see questionnaire in Multimedia Appendix 1). The questions dealt with wearing comfort (eg, size, weight, wristband material), usability of the ROAMM app (eg, responding to prompts, font size, battery life), ease of using the inductive charger, and willingness to participate in future research studies. Participants were also asked to provide feedback to improve the app and its usability. Questions that used a 4-point Likert scale were collapsed into two categories for statistical analysis (eg, “very satisfied and satisfied” vs “somewhat satisfied and not satisfied”). Some questions asked participants to select all options that applied. Participants were also asked to provide any additional opinions of the ROAMM app and the smartwatch. Responses to this question were categorized into 4 major areas: technical issue; usability or functionality issue; size, weight, or display issue; and no issue (ie, positive opinion).

ROAMM EMA Compliance

Compliance with each ROAMM app prompt was calculated in two ways. First, a raw compliance rate was calculated as the number of actual responses divided by the total number of possible responses assuming the watch was delivering the EMAs during programmed times:

(Total responses / Total number of possible responses) × 100.

Second, it was important to adjust the compliance rate so as not to penalize participants for potential technical issues or for times when the watch was not being worn (ie, when charging). For this calculation, time windows with <3 hours of sensor data (ie, the watch was turned off during a time when an EMA could be delivered) or with >30 minutes of charging were flagged. Flagged time windows were not counted against the participant for nonresponsiveness (ie, they were not included in the denominator of the compliance rate). We considered this form of “adjusted” compliance in the stratified analysis described below. Only days with >3 hours of data, signifying sufficient time to judge compliance, were considered in the analysis.
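
A minimal sketch of both compliance calculations is given below, assuming a simple list-of-dictionaries layout for the prompt windows. Only the flagging rules (<3 hours of sensor data, >30 minutes of charging) and the two formulas come from the text; the data structure, field names, and the handling of flagged-but-answered windows are assumptions for illustration.

```python
def compliance_rates(windows):
    """
    windows: one dict per prompt window, eg
      {"answered": True, "sensor_hours": 4.2, "charging_minutes": 5}
    Returns (raw, adjusted) compliance as percentages.
    """
    total = len(windows)
    answered = sum(1 for w in windows if w["answered"])

    # Flagging rules from the text: too little sensor data or prolonged charging.
    def flagged(w):
        return w["sensor_hours"] < 3 or w["charging_minutes"] > 30

    # Flagged, unanswered windows are not counted against the participant,
    # ie, they are dropped from the denominator of the adjusted rate.
    adjusted_denominator = total - sum(1 for w in windows if flagged(w) and not w["answered"])

    raw = 100 * answered / total if total else 0.0
    adjusted = 100 * answered / adjusted_denominator if adjusted_denominator else 0.0
    return raw, adjusted

example = [
    {"answered": True,  "sensor_hours": 5.0, "charging_minutes": 0},
    {"answered": False, "sensor_hours": 1.0, "charging_minutes": 0},  # watch off: flagged
    {"answered": False, "sensor_hours": 4.0, "charging_minutes": 0},  # true missed prompt
]
print(compliance_rates(example))  # -> approximately (33.3, 50.0)
```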

Data Analysis

Comparisons of dichotomous responses on the patient satisfaction surveys were described as proportions and analyzed using the Fisher exact test. Questions that contained multiple answers or free text were tallied, but formal statistical comparisons were not performed owing to the low number of responses. Adjusted compliance rates were compared using the Student t test between two groups and one-way analysis of variance with post hoc tests for more than two comparisons. Differences and associations were considered statistically significant at an α level of <.05.
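
The following sketch, using made-up data rather than study results, shows how the analyses described above could be run in Python with scipy: dichotomizing the 4-point Likert responses, a Fisher exact test on a 2x2 table, a Student t test between two groups, and one-way analysis of variance across more than two groups. All variable names and counts here are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Collapse "very satisfied"/"satisfied" vs "somewhat satisfied"/"not satisfied".
def dichotomize(likert_responses):
    return ["satisfied" if r in ("very satisfied", "satisfied") else "not satisfied"
            for r in likert_responses]

# Fisher exact test on a hypothetical 2x2 table (questionnaire response vs compliance group).
table = np.array([[14, 2], [6, 4]])  # made-up counts for illustration
odds_ratio, p_fisher = stats.fisher_exact(table)

# Student t test comparing adjusted compliance between two groups (made-up data).
group_a = [0.90, 0.85, 0.88, 0.92]
group_b = [0.70, 0.75, 0.65, 0.72]
t_stat, p_t = stats.ttest_ind(group_a, group_b)

# One-way ANOVA across more than two groups (made-up data).
f_stat, p_anova = stats.f_oneway(group_a, group_b, [0.80, 0.82, 0.78])

print(p_fisher, p_t, p_anova)  # each would be compared against the alpha level of .05
```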


Characteristics of the Study Population

Table 1 provides demographic characteristics of the 27 of 28 participants who completed the demographic questionnaire. Their mean age was 73.2 (SD 5.5) years, with 19 (70%) female participants, 21 (78%) White participants, and 24 (89%) participants with a college-level education. Participants were moderately active, and most were overweight (n=10, 37%) or obese (n=9, 33%).

Table 1. Characteristics of study participants (N=27).
Characteristic Participants
Age (years), mean (SD) 73.2 (5.5)
Sex, female, n (%) 19 (70)
Race, n (%)

White 21 (78)

Other 6 (22)
Education level, n (%)

College education 24 (89)

Other 3 (11)
Living status, n (%)

Lives alone 6 (22)

Other 21 (78)
Housing, n (%)

Single-family home 22 (82)

Other 5 (19)
Morphology

Height (m), mean (SD) 1.7 (0.1)

Weight (kg), mean (SD) 80 (21.4)

BMI (kg/m2), mean (SD) 28.3 (5.5)


Obese (BMI ≥30 kg/m2), n (%) 9 (33)


Overweight (BMI 25-30 kg/m2), n (%) 10 (37)


Normal (BMI 18.5-25 kg/m2), n (%) 8 (30)
Physical activity, n (%)

No regular leisure-time physical activity 4 (15)

Some leisure-time physical activity 13 (48)

Regular leisure-time physical activity 9 (33)
Bill Payment, n (%)

Somewhat difficult or very difficult time paying bills 13 (48)

Not very difficult 14 (52)

ROAMM Exit Questionnaire to Evaluate Satisfaction and Usability

Of the 26 participants, 81% (21) reported that they would be willing to wear the smartwatch while sleeping, and 85% (22) reported the text was large enough to read (Table 2). Moreover, all 26 participants reported it was easy to enter ratings using the smartwatch. About 77% (20/26) of the participants reported that the smartwatch ran out of battery while they were wearing it. Similar proportions of participants did and did not regularly wear a wristwatch (16/26, 62% vs 10/26, 38%; P=.16) and would or would not wear the smartwatch as their personal watch (11/24, 46% vs 13/24, 54%; P=.77).

Approximately half of the participants (14/26, 54%) reported the smartwatch was “very comfortable” or “comfortable” (Table 3). A follow-up question asking participants how the smartwatch comfort could be improved received the following responses: no changes (n=7), reduce the weight of the watch (n=11), improve the wristband clasp function (n=7), reduce the display size (n=6), change the wristband material (n=6), reduce the wristband size (n=5), and other (size, weight, display, and motion detection) (n=8). Despite these criticisms, a majority of the participants reported that they were satisfied with the function of the watch (19/26, 73%; P=.002) and with charging the battery (20/26, 77%; P<.001; Table 3).

Table 2. Real-time, online assessment and mobility monitoring exit questionnaire.
Question Participants,a n (%) P valueb

Response: yes Response: no
Do you regularly wear a wristwatch? 16 (62) 10 (38) .16
Would you wear the Samsung smartwatch as your personal watch? (n=24) 11 (46) 13 (54) .77
For research purposes, would you occasionally wear the watch while sleeping? 21 (81) 5 (19) <.001
Was the text large enough to read? 22 (85) 4 (15) <.001
Was it easy to enter the ratings using the smartwatch? 26 (100) 0 (0) N/Ac
Did you charge it every night? 26 (100) 0 (0) N/A
Did the watch ever run out of battery (ie, battery died) while you were wearing it? 20 (77) 6 (23) <.001

aTotal number of participants is 26, unless otherwise noted in the row header.

bFisher exact test.

cN/A: not applicable.

Table 3. Real-time, online assessment and mobility monitoring exit questionnaire (continued).
Question Participants (N=26), n (%) P valuea
How satisfied were you with the function of the watch (ie, you were able to tell date/time easily)? .002

Very satisfied and satisfied, n (%) 19 (73)

Somewhat satisfied and not satisfied, n (%) 7 (27)
How satisfied were you with the charging of the battery of the Samsung smartwatch? <.001

Very satisfied and satisfied, n (%) 20 (77)

Somewhat satisfied and not satisfied, n (%) 6 (22)
How comfortable was the Samsung smartwatch to wear on a daily basis? .78

Very comfortable and comfortable 14 (54)

Somewhat or not comfortable 12 (46)
How likely are you to participate in a 1-year research study asking you to wear the Samsung smartwatch daily? .002

Very likely, likely or somewhat likely 19 (73)

Not likely 7 (27)

aFisher exact test.

Furthermore, a majority of the participants (19/26, 73%; P=.002) expressed their willingness to use the ROAMM app for a 1-year research study. In a follow-up question that asked participants the reasons for responding “not likely” or “somewhat likely” (n=11), participants cited lack of comfort (n=5), the watch not being stylish (n=3), the watch getting in the way (n=4), the screen being hard to read (n=3), the screen being unresponsive (n=4), privacy issues (n=1), technical issues (n=5), and size or weight issues (n=1). However, some of these participants were willing to wear the smartwatch for 1 month (n=5) or 3 months (n=1). Only 3 participants reported being unwilling to wear the watch at all.

All participants were asked to provide additional comments on the ROAMM app and the smartwatch. Those who opted to respond commented on technical issues (battery charging: n=10; temperature of the watch being too hot: n=2) and usability issues (resetting the watch: n=5; unresponsive screen: n=1; and size, weight, or display issues: n=7). There were positive opinions about the health monitoring aspects (n=4) and the ability to use the device as a phone or for email and calendar use (n=2).

ROAMM EMA Compliance Rates

Twenty-eight participants wore the smartwatch for a mean of 13.9 (SD 0.4) days. When considering only those days with >3 hours of wear time, participants wore the watch for a mean of 11.3 (SD 0.6) days. In total, 316 days of data were recorded, along with 2505 smartwatch responses. The raw compliance rate was 61% (2505/4108) and the adjusted compliance rate was 83% (2505/3036). Across the different windows throughout the day, the adjusted compliance rate was 86% (1004/1161) in the morning, 79% (800/1016) in the afternoon, and 77% (701/908) in the evening; details of the adjusted compliance rate according to EMA responses in each window are shown in Table 4.

Table 4. Adjusted compliance rates according to ecological momentary assessment responses across the three evaluation windows.
Evaluation window Sleep Pain Mood Fatigue Activity
Morning .93 .875 .873 .87 .82
Afternoon N/Aa .84 .823 .83 .71
Evening N/A .815 .795 .81 .71

aN/A: not applicable.

Average adjusted compliance for EMA prompts was similar for pain, mood, fatigue, activity, and sleep (P=.14), although compliance was consistently lowest for reporting activity, which was the final question of the bundle. Moreover, average adjusted compliance rates were similar across the three time windows (P=.92). We explored potential reasons for compliance differences in a stratified analysis. Adjusted compliance was lower among those who did not regularly wear a wristwatch (71% vs 88%; P=.03) and higher among those who thought the text was large enough to read (86% vs 60%; P=.01) (Figure 2). No differences in adjusted compliance rates were observed for participants who reported higher satisfaction levels, those who were more likely to wear the watch for a 1-year study, those who would wear the smartwatch as a personal watch, and those who reported that the smartwatch ran out of battery (Figures 2 and 3).

Figure 2. Adjusted compliance average according to responses from the real-time online assessment and mobility monitoring app exit questionnaire for Yes and No responses.
Figure 3. Adjusted compliance average according to responses from the real-time online assessment and mobility monitoring app exit questionnaire for Likert's responses.

Gerontechnology is a relatively new concept that aims to promote health and well-being through technology that considers older adults’ needs and preferences [45]. The ROAMM app was developed based on these guiding principles and was designed to capture information about gerontological symptoms in the free-living environment. To ensure the technology is appropriate for this population, our research team and others have conducted focus groups to gather feedback about gero-friendly visualization (eg, display size) and functionality [27,46-49]. In the next phase of this study, we evaluated the technology in a small target sample. In this context, the purpose of this study was to evaluate the ROAMM smartwatch app for usability, satisfaction, and compliance in a patient population of older adults with knee osteoarthritis. Subsequent paragraphs interpret the results within the framework of gerontechnology and compare the current results to the existing literature. Based on our exit questionnaire, a majority of participants positively rated the ROAMM app display and functionality (eg, rotating dial). About half of the participants felt the smartwatch was uncomfortable, but almost three-fourths were likely to participate in a long-term study asking them to wear the smartwatch. Additionally, EMA compliance rates reported here were similar to those in a recent meta-analysis that pooled data from 701 participants across 12 EMA studies [50]. The high EMA compliance rates also indicate that older adults were able to use the app in free-living conditions. Participants also responded that it was easy to enter information using the rotating bezel, that the text was sufficiently large, and that they were satisfied with charging the smartwatch, which they effectively did every night. These responses culminated in a high likelihood of participating in research asking them to wear the smartwatch in a 1-year study—a goal for research related to health monitoring. However, it should be noted that willingness to participate in a long-duration study might not translate into long-term compliance. Overall, our results suggest that older adults with knee osteoarthritis were generally satisfied with the ROAMM app and smartwatch, but improved comfort and wearability are needed before long-term studies are planned.

Battery drain was a consistent issue observed during the study. The ROAMM app collects sensor data simultaneously with EMA data. We previously reported that battery life is most affected by the GPS sensor, with approximately 1% battery drain per collected sample [29]. This drain multiplies when multiple sensors collect data simultaneously and is further affected when the screen is activated during EMA responses. In a similar study, investigators from the KOALAP (Knee Osteoarthritis, Linking Activity and Pain) study also struggled to ensure the smartwatch battery lasted through the day—about 15 hours. They also found that the limited battery life significantly impacted engagement with the smartwatch [51]. Additional innovation is needed in battery technology, smart sensor triggering (eg, activate the accelerometer during movement only, activate GPS outside a geofence), and energy efficiency to ensure that apps like ROAMM are capable of health monitoring for an entire waking day. Advances in sensor technology and EMA tools for health monitoring are only effective if sufficient compliance is demonstrated [52]. The compliance rates reached in this study were consistent with systematic reviews of EMA for assessing chronic pain in adults (eg, 83% [53] and 86% [54]). However, achieving good compliance is a multifactorial challenge, as it involves the type of behavioral coaching, perceived burden, demographics of the population, and the usability of the technology [55]. Regarding demographics, older adults tend to have higher compliance (88%-90% at 75 years old) than younger adults (72%-74% at 25 years old) even in technology-based evaluations, as reported in a chronic pain study [50]. In fact, an EMA-based study in older African American adults reported over a 90% compliance rate when rating their activity and stress four times per day on a smartphone [56]. There was also some evidence that fewer questions yielded higher compliance. We observed that the single sleep quality question in the morning yielded the highest compliance. In prior work, microinteraction EMAs—where people are prompted with fast, glanceable questions that can be answered in a few seconds, similar to ROAMM—were developed on smartwatches and compared to less frequent EMA prompts on smartphones. Researchers found that although prompts on the smartwatch were eight times more frequent than those on the smartphone, participants were 35% more compliant with short microinteraction EMAs on the smartwatch [57]. Participants also responded to EMAs in less time and reported the EMAs to be less distracting on the smartwatch than on the smartphone [58]. Therefore, EMAs on a smartwatch might serve as an excellent approach for longitudinal studies, a view also conveyed by the majority of older adults in our study who were willing to participate in a 1-year research study.

Stratified analysis of compliance rates yielded important information for practice and for planning future research. In general, compliance was similar between participants with different opinions of the comfort of the smartwatch and of satisfaction with the function of the smartwatch and ROAMM app. Unexpectedly, compliance was also similar among participants not likely to wear the smartwatch as their own personal watch and those who would not volunteer for a 1-year research study. Participants who regularly wore a wristwatch had significantly higher compliance than nonwearers. Furthermore, individuals who had difficulty reading the text on the watch had lower compliance than those who did not experience difficulties. In the focus group study, approximately 80% of the respondents reported the display text size was adequate [27]. Similar results were found in the current study (22/28, 79%), with participants reporting that the text was large enough. To be more inclusive and generalize to the population as a whole, future studies will need to consider whether people regularly wear watches and ensure that text size and fonts are optimized for compliance.

There are strengths and weaknesses of this study that will aid in conducting future research using smartwatch devices for monitoring health. One weakness is that this study was performed on a relatively small, homogeneous sample of older adults with knee osteoarthritis. In particular, this was a well-educated sample, and the results may not be generalizable to individuals with lower levels of education. Furthermore, we did not employ a commonly used “usability” scale for assessing the ROAMM app, which makes comparisons to the literature difficult. At the time of data collection, existing scales were not appropriate for assessing both the software and hardware of wearable devices. Moreover, despite internal pilot testing, rapid battery drainage during wear in the free-living environment remained an issue. These weaknesses are balanced by strengths such as the thorough investigation of usability and user compliance following extended use of the ROAMM app in real-world settings.

In conclusion, older adults with knee osteoarthritis positively rated and were generally satisfied with the ROAMM app on the Samsung smartwatch. Battery life remains a concern and will need to be carefully considered in future studies. Compliance rates were generally high but were influenced by personal experience wearing a watch and by text readability. After using the ROAMM app for about 2 weeks, a majority of older adults were willing to participate in a 1-year study requiring them to wear the smartwatch. Overall, the results support new opportunities to monitor health symptoms while capturing objective sensor information from a smartwatch in older adults with knee osteoarthritis.

Acknowledgments

This study was funded by the Data Science and Applied Technology (DSAT) Core of the Claude D. Pepper Older Americans Independence Center at the University of Florida (UF) (P30 AG028740). The UF Informatics Institute and the UF Clinical and Translational Science Institute contributed partial funding (R21 AG059207) supporting staff and faculty during the project.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Exit questionnaire to be filled in by the participants after 2 weeks.

PDF File (Adobe PDF File), 110 KB

References

  1. Sama PR, Eapen ZJ, Weinfurt KP, Shah BR, Schulman KA. An evaluation of mobile health application tools. JMIR Mhealth Uhealth 2014 May 01;2(2):e19 [FREE Full text] [CrossRef] [Medline]
  2. Peart DJ, Balsalobre-Fernández C, Shaw MP. Use of mobile applications to collect data in sport, health, and exercise science: a narrative review. J Strength Cond Res 2019 Apr;33(4):1167-1177. [CrossRef] [Medline]
  3. Allen KD, Coffman CJ, Golightly YM, Stechuchak KM, Keefe FJ. Daily pain variations among patients with hand, hip, and knee osteoarthritis. Osteoarthritis Cartilage 2009 Oct;17(10):1275-1282 [FREE Full text] [CrossRef] [Medline]
  4. Murphy SL, Kratz AL, Williams DA, Geisser ME. The association between symptoms, pain coping strategies, and physical activity among people with symptomatic knee and hip osteoarthritis. Front Psychol 2012;3:326 [FREE Full text] [CrossRef] [Medline]
  5. Lim SS, Vos T, Flaxman AD, Danaei G, Shibuya K, Adair-Rohani H, et al. A comparative risk assessment of burden of disease and injury attributable to 67 risk factors and risk factor clusters in 21 regions, 1990-2010: a systematic analysis for the Global Burden of Disease Study 2010. Lancet 2012 Dec 15;380(9859):2224-2260 [FREE Full text] [CrossRef] [Medline]
  6. Bartley EJ, Palit S, Staud R. Predictors of osteoarthritis pain: the importance of resilience. Curr Rheumatol Rep 2017 Sep;19(9):57 [FREE Full text] [CrossRef] [Medline]
  7. Hunter DJ, Guermazi A, Roemer F, Zhang Y, Neogi T. Structural correlates of pain in joints with osteoarthritis. Osteoarthritis Cartilage 2013 Sep;21(9):1170-1178 [FREE Full text] [CrossRef] [Medline]
  8. Kolasinski SL, Neogi T, Hochberg MC, Oatis C, Guyatt G, Block J, et al. 2019 American College of Rheumatology/Arthritis Foundation Guideline for the Management of Osteoarthritis of the Hand, Hip, and Knee. Arthritis Rheumatol 2020 Feb;72(2):220-233 [FREE Full text] [CrossRef] [Medline]
  9. Hunter DJ, Bierma-Zeinstra S. Osteoarthritis. Lancet 2019 Apr 27;393(10182):1745-1759. [CrossRef] [Medline]
  10. Juhl C, Lund H, Roos EM, Zhang W, Christensen R. A hierarchy of patient-reported outcomes for meta-analysis of knee osteoarthritis trials: empirical evidence from a survey of high impact journals. Arthritis 2012;2012:136245 [FREE Full text] [CrossRef] [Medline]
  11. Dobson F, Hinman RS, Hall M, Marshall CJ, Sayer T, Anderson C, et al. Reliability and measurement error of the Osteoarthritis Research Society International (OARSI) recommended performance-based tests of physical function in people with hip and knee osteoarthritis. Osteoarthritis Cartilage 2017 Nov;25(11):1792-1796 [FREE Full text] [CrossRef] [Medline]
  12. Daoust R, Sirois M, Lee JS, Perry JJ, Griffith LE, Worster A, et al. Painful memories: reliability of pain intensity recall at 3 months in senior patients. Pain Res Manag 2017;2017:5983721 [FREE Full text] [CrossRef] [Medline]
  13. Sim I. Mobile devices and health. N Engl J Med 2019 Sep 05;381(10):956-968. [CrossRef]
  14. Zhao P, Yoo I, Lancey R, Varghese E. Mobile applications for pain management: an app analysis for clinical usage. BMC Med Inform Decis Mak 2019 May 30;19(1):106 [FREE Full text] [CrossRef] [Medline]
  15. Larson R, Csikszentmihalyi M. The Experience Sampling Method. In: Flow and the Foundations of Positive Psychology. Dordrecht: Springer; 2014:21-34.
  16. Bradburn NM, Rips LJ, Shevell SK. Answering autobiographical questions: the impact of memory and inference on surveys. Science 1987 Apr 10;236(4798):157-161. [CrossRef] [Medline]
  17. Schwarz N. Self-reports: How the questions shape the answers. Am Psychol 1999;54(2):93-105. [CrossRef]
  18. Smyth JM, Smyth JM. Ecological momentary assessment research in behavioral medicine. J Happiness Stud 2003;4(1):35-52. [CrossRef]
  19. Shiffman S, Stone AA, Hufford MR. Ecological momentary assessment. Annu Rev Clin Psychol 2008;4:1-32. [CrossRef] [Medline]
  20. Boulos MNK, Brewer AC, Karimkhani C, Buller DB, Dellavalle RP. Mobile medical and health apps: state of the art, concerns, regulatory control and certification. Online J Public Health Inform 2014;5(3):229 [FREE Full text] [CrossRef] [Medline]
  21. McManus DD, Trinquart L, Benjamin EJ, Manders ES, Fusco K, Jung LS, et al. Design and preliminary findings from a new electronic cohort embedded in the Framingham heart study. J Med Internet Res 2019 Mar 01;21(3):e12143 [FREE Full text] [CrossRef] [Medline]
  22. Murphy SL, Smith DM. Ecological measurement of fatigue and fatigability in older adults with osteoarthritis. J Gerontol A Biol Sci Med Sci 2010 Feb;65(2):184-189 [FREE Full text] [CrossRef] [Medline]
  23. Beukenhorst AL, Howells K, Cook L, McBeth J, O'Neill TW, Parkes MJ, et al. Engagement and participant experiences with consumer smartwatches for health research: longitudinal, observational feasibility study. JMIR Mhealth Uhealth 2020 Jan 29;8(1):e14368 [FREE Full text] [CrossRef] [Medline]
  24. Policy for Device Software Functions and Mobile Medical Applications. U.S. Food and Drug Administration (FDA). 2019 Sep 27. URL:https://www.fda.gov/regulatory-information/search-fda-guidance-documents/policy-device-software-functions-and-mobile-medical-applications[accessed 2021-06-25]
  25. Patient-reported outcome measures: use in medical product development to support labeling claims. U.S. Food and Drug Administration (FDA). 2019 Dec. URL:http://www.fda.gov/regulatory-information/search-fda-guidance-documents/patient-reported-outcome-measures-use-medical-product-development-support-labeling-claims[accessed 2020-07-01]
  26. Quality System (QS) Regulation/Medical Device Good Manufacturing Practices Internet. U.S. Food and Drug Administration (FDA). URL:https://www.fda.gov/medical-devices/postmarket-requirements-devices/quality-system-qs-regulationmedical-device-good-manufacturing-practices[accessed 2020-04-28]
  27. Manini TM, Mendoza T, Battula M, Davoudi A, Kheirkhahan M, Young ME, et al. Perception of older adults toward smartwatch technology for assessing pain and related patient-reported outcomes: pilot study. JMIR Mhealth Uhealth 2019 Mar 26;7(3):e10044 [FREE Full text] [CrossRef] [Medline]
  28. Alpert JM, Manini T, Roberts M, Kota NSP, Mendoza TV, Solberg LM, et al. Secondary care provider attitudes towards patient generated health data from smartwatches. NPJ Digit Med 2020;3:27 [FREE Full text] [CrossRef] [Medline]
  29. Kheirkhahan M, Nair S, Davoudi A, Rashidi P, Wanigatunga AA, Corbett DB, et al. A smartwatch-based framework for real-time and online assessment and mobility monitoring. J Biomed Inform 2019 Jan;89:29-40 [FREE Full text] [CrossRef] [Medline]
  30. Gould D, Kelly D, Goldstone L, Gammon J. Examining the validity of pressure ulcer risk assessment scales: developing and using illustrated patient simulations to collect the data. Information point: visual analogue scale. J Clin Nurs 2001 Sep;10(5):697-706. [CrossRef] [Medline]
  31. Paul-Dauphin A, Guillemin F, Virion JM, Briançon S. Bias and precision in visual analogue scales: a randomized controlled trial. Am J Epidemiol 1999 Nov 15;150(10):1117-1127. [CrossRef] [Medline]
  32. McCormack HM, Horne DJ, Sheather S. Clinical applications of visual analogue scales: a critical review. Psychol Med 1988 Nov;18(4):1007-1019. [CrossRef] [Medline]
  33. Huskisson EC. Measurement of pain. Lancet 1974 Nov 09;2(7889):1127-1131. [CrossRef] [Medline]
  34. Downie WW, Leatham PA, Rhind VM, Wright V, Branco JA, Anderson JA. Studies with pain rating scales. Ann Rheum Dis 1978 Aug;37(4):378-381 [FREE Full text] [CrossRef] [Medline]
  35. Monk T, Reynolds C, Kupfer D, Buysse D, Coble P, Hayes A, et al. The Pittsburgh Sleep Diary. J Sleep Res 1994;3(2):120. [CrossRef]
  36. Smith MT, Wegener ST. Measures of sleep: The Insomnia Severity Index, Medical Outcomes Study (MOS) Sleep Scale, Pittsburgh Sleep Diary (PSD), and Pittsburgh Sleep Quality Index (PSQI). Arthritis & Rheumatism 2003 Oct 15;49(S5):S184-S196. [CrossRef]
  37. Jamison RN, Gracely RH, Raymond SA, Levine JG, Marino B, Herrmann TJ, et al. Comparative study of electronic vs. paper VAS ratings: a randomized, crossover trial using healthy volunteers. Pain 2002 Sep;99(1-2):341-347. [CrossRef] [Medline]
  38. Farrar JT, Young JP, LaMoreaux L, Werth JL, Poole RM. Clinical importance of changes in chronic pain intensity measured on an 11-point numerical pain rating scale. Pain 2001 Nov;94(2):149-158. [CrossRef] [Medline]
  39. Hjermstad M, Fayers P, Haugen D, Caraceni A, Hanks G, Loge J, European Palliative Care Research Collaborative (EPCRC). Studies comparing Numerical Rating Scales, Verbal Rating Scales, and Visual Analogue Scales for assessment of pain intensity in adults: a systematic literature review. J Pain Symptom Manage 2011 Jun;41(6):1073-1093 [FREE Full text] [CrossRef] [Medline]
  40. Hacker ED, Ferrans CE. Ecological momentary assessment of fatigue in patients receiving intensive cancer therapy. J Pain Symptom Manage 2007 Mar;33(3):267-275 [FREE Full text] [CrossRef] [Medline]
  41. Piper BF, Borneman T, Sun VC, Koczywas M, Uman G, Ferrell B, et al. Cancer-related fatigue: role of oncology nurses in translating National Comprehensive Cancer Network assessment guidelines into practice. Clin J Oncol Nurs 2008 Oct;12(5 Suppl):37-47 [FREE Full text] [CrossRef] [Medline]
  42. Curran SL, Beacham AO, Andrykowski MA. Ecological momentary assessment of fatigue following breast cancer treatment. J Behav Med 2004 Oct;27(5):425-444. [CrossRef]
  43. Cella DF, Perry SW. Reliability and concurrent validity of three visual-analogue mood scales. Psychol Rep 1986 Oct;59(2 Pt 2):827-833. [CrossRef] [Medline]
  44. Folstein MF, Luria R. Reliability, validity, and clinical application of the Visual Analogue Mood Scale. Psychol Med 1973 Nov;3(4):479-486. [CrossRef] [Medline]
  45. Ozsungur F. Gerontechnological factors affecting successful aging of elderly. Aging Male 2020 Dec;23(5):520-532. [CrossRef] [Medline]
  46. Mitzner TL, Savla J, Boot WR, Sharit J, Charness N, Czaja SJ, et al. Technology adoption by older adults: findings from the PRISM trial. Gerontologist 2019 Jan 09;59(1):34-44 [FREE Full text] [CrossRef] [Medline]
  47. Bergschöld JM, Neven L, Peine A. DIY gerontechnology: circumventing mismatched technologies and bureaucratic procedure by creating care technologies of one's own. Sociol Health Illn 2020 Feb;42(2):232-246. [CrossRef] [Medline]
  48. Chen K, Chan AHS. Gerontechnology acceptance by elderly Hong Kong Chinese: a senior technology acceptance model (STAM). Ergonomics 2014;57(5):635-652. [CrossRef] [Medline]
  49. Peek STM, Luijkx KG, Vrijhoef HJM, Nieboer ME, Aarts S, van der Voort CS, et al. Understanding changes and stability in the long-term use of technologies by seniors who are aging in place: a dynamical framework. BMC Geriatr 2019 Aug 28;19(1):236 [FREE Full text] [CrossRef] [Medline]
  50. Ono M, Schneider S, Junghaenel DU, Stone AA. What affects the completion of ecological momentary assessments in chronic pain research? an individual patient data meta-analysis. J Med Internet Res 2019 Feb 05;21(2):e11398 [FREE Full text] [CrossRef] [Medline]
  51. Beukenhorst AL, Parkes MJ, Cook L, Barnard R, van der Veer SN, Little MA, et al. Collecting symptoms and sensor data with consumer smartwatches (the Knee OsteoArthritis, Linking Activity and Pain Study): Protocol for a Longitudinal, Observational Feasibility Study. JMIR Res Protoc 2019 Jan 23;8(1):e10238 [FREE Full text] [CrossRef] [Medline]
  52. NA. Ecological momentary assessments and the science of behavior change. Exerc Sport Sci Rev 2017 Jan;45(1):3. [CrossRef] [Medline]
  53. Morren M, van Dulmen S, Ouwerkerk J, Bensing J. Compliance with momentary pain measurement using electronic diaries: a systematic review. Eur J Pain 2009 Apr;13(4):354-365. [CrossRef] [Medline]
  54. May M, Junghaenel DU, Ono M, Stone AA, Schneider S. Ecological momentary assessment methodology in chronic pain research: a systematic review. J Pain 2018 Jul;19(7):699-716 [FREE Full text] [CrossRef] [Medline]
  55. Hekler EB, Klasnja P, Traver V, Hendriks M. Realizing effective behavioral management of health: the metamorphosis of behavioral science methods. IEEE Pulse 2013 Sep;4(5):29-34. [CrossRef] [Medline]
  56. Fritz H, Tarraf W, Saleh DJ, Cutchin MP. Using a smartphone-based ecological momentary assessment protocol with community dwelling older African Americans. J Gerontol B Psychol Sci Soc Sci 2017 Sep 01;72(5):876-887 [FREE Full text] [CrossRef] [Medline]
  57. Intille S, Haynes C, Maniar D, Ponnada A, Manjourides J. μEMA: microinteraction-based ecological momentary assessment (EMA) using a smartwatch. In: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing. Heidelberg, Germany: ACM; 2016 Sep. p. 1124-1128. URL:http://europepmc.org/abstract/MED/30238088 [CrossRef]
  58. Ponnada A, Haynes C, Maniar D, Manjourides J, Intille S. Microinteraction ecological momentary assessment response rates: effect of microinteractions or the smartwatch? Proc ACM Interact Mob Wearable Ubiquitous Technol 2017 Sep;1(3):92 [FREE Full text] [CrossRef] [Medline]


DSAT: Data Science and Applied Technology
EMA: ecological momentary assessment
FDA: Food and Drug Administration
KOALAP: Knee Osteoarthritis, Linking Activity and Pain
ROAMM: real-time, online assessment and mobility monitoring
UF: University of Florida


Edited by J Wang; submitted 24.09.20; peer-reviewed by D Bychkov, F Lamers; comments to author 24.11.20; revised version received 12.01.21; accepted 22.04.21; published 14.07.21

Copyright

©Charlotte Rouzaud Laborde, Erta Cenko, Mamoun T Mardini, Subhash Nerella, Matin Kheirkhahan, Sanjay Ranka, Roger B Fillingim, Duane B Corbett, Eric Weber, Parisa Rashidi, Todd Manini. Originally published in JMIR Aging (https://aging.www.mybigtv.com), 14.07.2021.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Aging, is properly cited. The complete bibliographic information, a link to the original publication on https://aging.www.mybigtv.com, as well as this copyright and license information must be included.

