Research in Vestibular Science > Volume 19(4); 2020 > Article
A Mobile-Based Eye Movement Recording Application: A Randomized Study of Usability and Usefulness

Abstract

Objectives

This study was performed to assess the efficacy and feasibility of mobile application-based Frenzel tests in dizziness clinics.

Methods

We performed an investigator-initiated, blinded-outcome-assessor, parallel, randomized controlled crossover trial at Chungnam National University Hospital between August 2019 and October 2019. Certified medical staff members were randomly assigned to the intervention group (a mobile application-based Frenzel glass system, n=15) or the observation group (a conventional desktop-based Frenzel glass system, n=15); each group prepared the assigned system for eye movement recording and then switched to the other system. The primary outcome was the elapsed time, in seconds, that participants took to prepare the system for eye recording simulation. The secondary outcomes were perceived stress and satisfaction scores after completion of the operation, measured by a questionnaire using 10-point Likert scales.

Results

The mean machine preparation time for eye recording simulation was 50% shorter in the mobile application group than in the desktop group in both study periods (38.0±7.1 sec vs. 76.0±8.7 sec). We detected no carryover effect. Participants also reported lower stress while using the application than while using the desktop system (2.3±1.3 vs. 4.6±2.4; p<0.001). The application obtained a mean overall satisfaction score of 9.2 out of 10.

Conclusions

The implementation of an eye movement recording application in a dizziness examination was well adopted by users and decreased the time and stress related to machine operation.

INTRODUCTION

Dizziness and vertigo are responsible for 4% of emergency department visits [1]. International Classification of Diseases 10th Revision codes R42 (dizziness) and H81 (vestibular diseases) are among the top 20 most frequent illnesses in the 2018 hospitalization data from the Korea Health Insurance Review and Assessment Service [2]. In addition, the number of inpatients diagnosed with R42 or H81 and the total medical expenses have increased over the last 3 years [2]. This means that frontline doctors are now caring for many patients with dizziness/vertigo. At present, examinations that record eye movement without fixation usually involve Frenzel glasses connected to a desktop-based personal computer system. Compared with classical Frenzel glasses, the eyes can be observed more easily on a monitor; however, the system is large, and it is inconvenient to wait for the machine to start every time a doctor uses it. The main reasons for this slow start are that Windows devices (Microsoft, Redmond, WA, USA) are large and consume a large amount of power. This process can be streamlined by introducing mobile application-based devices.
Therefore, we conducted a randomized controlled crossover trial to assess the feasibility and efficacy of a mobile application-based Frenzel system compared with a desktop-based system, focusing on the examiner's standpoint.

MATERIALS AND METHODS

1. Trial Design and Participants

This was a randomized, controlled, crossover trial with two parallel groups to assess the clinical feasibility and efficacy of a mobile application-based Frenzel test in reducing the time to operation and eye movement recording and in improving the workload of examiners. This study was conducted from August 2019 to October 2019 at Chungnam National University Hospital in South Korea. Although the intervention could not be masked, all investigators remained unaware of the outcomes until all the data were unlocked for analysis at the end of the trial. All certified medical staff working in the department of neurology or otorhinolaryngology were eligible for inclusion in the study. Requirements for inclusion were experience with desktop-based video Frenzel glass examinations, current use of a smartphone or other mobile device, and written informed consent. All participants were assumed to have equivalent experience and competence with the desktop-based Frenzel test because of their similar training backgrounds. This trial is registered with the Clinical Research Information Service (cris.nih.go.kr; No. KCT 0004403).
This study was approved by the Institutional Review Board of Chungnam National University Hospital (No. 2019-07-078).

2. Randomization and Masking

Participants were randomized to either the mobile application device group or the desktop device group (Fig. 1). Randomization used a concealed computer-generated list of assignments with a 1:1 allocation ratio, produced by a stochastic case-assignment Python program (ver. 3.7.3; Python Software Foundation) provided by one of the authors (HJC). Written informed consent was obtained from each participant after full information disclosure prior to participation in the study. Blinding to the type of machine was maintained during recruitment to minimize preparation bias. Allocation concealment was ensured with allocation software, and assignments were not released until the participants started the experiment. All experiments were video recorded for later analysis. A post-experiment video review was performed by two reviewers (HJC and SK) who were not blinded to allocation but reviewed the footage independently, blinded to each other's reviews. The data analyst (ISK) was not blinded to treatment allocation.
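As a minimal sketch, the concealed 1:1 allocation described above could be generated in Python (the language the authors report using); the block size, arm labels, and seed here are hypothetical illustrations, not the authors' actual program.

```python
import random

def allocate(n_participants, seed=None):
    """Generate a concealed 1:1 allocation list in randomized blocks.

    Hypothetical sketch: block size, labels, and seed handling are
    assumptions, not the trial's actual assignment program.
    """
    rng = random.Random(seed)
    arms = ["mobile-first", "desktop-first"]
    block_size = len(arms)  # one slot per arm keeps the ratio exactly 1:1
    allocation = []
    for _ in range(n_participants // block_size):
        block = arms[:]     # each block contains both arms once
        rng.shuffle(block)  # random order within the block
        allocation.extend(block)
    return allocation

# Example: 30 participants -> exactly 15 per arm, in concealed order
assignments = allocate(30, seed=1)
print(assignments.count("mobile-first"), assignments.count("desktop-first"))
```

Blocked shuffling of this kind guarantees the 15:15 split reported in the trial while keeping each individual assignment unpredictable.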
Fig. 1.
Design of the study.

3. The Eye Movement Recording Application

The application was developed at Chungnam National University Hospital (Daejeon, South Korea) following a user-centered, evidence-based approach by a neurotologist (SHJ) and software developers. Based on eye movement observations, the team worked closely to identify the key functionalities and processes to be implemented. With one touch, an icon can be selected and shown on a binocular screen (Fig. 2, Supplementary Video 1). The user can start, pause, and stop the application at any time. All eye movement records are sequentially saved locally on the device in archived files, so that the information can be retrieved at any time for debriefing or medicolegal purposes. These files can also be erased or safely exported and saved.
Fig. 2.
(A) A tablet computer showing the icon of the mobile application (red circle). (B) Screenshot. (C) Bedside examination using the tablet computer. (D) Doctors can view screenshots on their smartphones.

4. Procedures

On the day of the experiment, each participant received three standardized training sessions on how to use each device. According to the random allocation, the participants were then asked to prepare for eye movement recording on the same upper torso model (Fig. 3). After this task had been completed, the crossover occurred. The experiment ended when the power was turned off. Immediately afterwards, before leaving the room, each participant was asked to complete a questionnaire about the experiment.
Fig. 3.
Standardized training session for the desktop-based Frenzel test (A) and mobile application-based Frenzel test (B).

5. Outcomes

The primary outcome was the elapsed time, in seconds, that participants took to prepare the system for eye recording simulation. The secondary outcomes were perceived stress and satisfaction scores after completion of the experiment, measured by the questionnaire using 10-point Likert scales (Supplementary Fig. 1).

6. Statistical Analysis

Power calculations were based on a 50% reduction in the time to eye movement recording, which we considered a sufficient difference to change practice. Assuming 1 minute of recording time with the application and 2 minutes with the desktop device, three participants per group had to be recruited to provide 80% power at a two-sided α level of 0.05. To prevent a potential loss of power due to misspecification of assumptions and to allow parametric tests, 30 participants were recruited. The time to eye movement recording was reported for each method in each study period. For the primary analysis, the times to eye movement recording for the two methods were compared by pooling both study periods. Paired data were then analyzed using a paired t-test for dependent groups, with a two-sided α level of 0.05. The efficacy of the application could potentially differ depending on which method was used first in the crossover design, i.e., a carryover effect. To investigate this effect, we performed sensitivity analyses comparing the mean difference in the primary outcome between the two preparation methods by study period (independent observations) using independent t-tests and by randomized group (paired observations) using paired t-tests. All differences are reported with exact 95% confidence intervals (CIs). The carryover effect was tested with a linear, multivariate, generalized estimating equation (GEE) model with an exchangeable working correlation matrix: the study period and the recording method were introduced as independent variables, and an interaction term was introduced to model a change in the recording method's effect between the first and second study periods. The null hypothesis that the interaction term was null was tested to detect a carryover effect. The secondary outcomes (stress and satisfaction scores) were reported for each method and by study period.
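The sample-size reasoning above can be reproduced with the standard two-sample normal-approximation formula. The paper does not report the assumed standard deviation, so σ = 25 seconds below is a hypothetical value chosen only to show that three participants per group suffice for a 60-second difference.

```python
import math

def n_per_group(delta, sigma, z_alpha=1.959964, z_beta=0.841621):
    """Per-group sample size for a two-sample comparison of means.

    delta: smallest difference worth detecting (seconds)
    sigma: assumed common standard deviation (hypothetical here)
    z_alpha, z_beta: normal quantiles for two-sided 5% alpha and 80% power
    """
    n = 2 * ((z_alpha + z_beta) * sigma / delta) ** 2
    return math.ceil(n)

# 1 min with the app vs. 2 min with the desktop -> delta = 60 s;
# sigma = 25 s is an assumed value, not taken from the paper
print(n_per_group(delta=60, sigma=25))  # -> 3 participants per group
```

The trial recruited 30 participants, an order of magnitude more than this minimum, precisely to guard against the misspecification of such assumptions.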
The same analyses were performed for the secondary outcomes. All statistical tests were two-sided, with a type I error risk of 0.05. We used IBM SPSS ver. 20.0 (IBM Corp., Armonk, NY, USA) for the descriptive analyses and statistical tests.
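As a sketch of the primary analysis, the paired t-test on pooled per-participant differences can be written in plain Python. The six time pairs below are invented for illustration and are not the trial's raw data.

```python
import math

def paired_t(desktop, mobile):
    """Paired t-test statistic and degrees of freedom for
    per-participant differences (desktop minus mobile, in seconds)."""
    diffs = [d - m for d, m in zip(desktop, mobile)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((x - mean) ** 2 for x in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                               # standard error
    return mean / se, n - 1

# Invented example times (seconds); not the trial's measurements
desktop = [75, 80, 70, 78, 74, 79]
mobile = [38, 41, 35, 40, 36, 39]
t_stat, df = paired_t(desktop, mobile)
print(round(t_stat, 1), df)
```

Pairing each participant's two measurements removes between-participant variability, which is why the crossover design can detect the preparation-time difference with high power.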

RESULTS

From August 2019 to October 2019, we randomly assigned 30 participants to perform eye movement recording using the mobile application-based method first (n=15) or the desktop-based method first (n=15); there were no dropouts or missing data (Fig. 1). The baseline characteristics of the two groups seemed balanced (Table 1). We observed excellent interrater agreement for video reviewing (Cronbach's alpha of 0.995 for the mobile application and 0.985 for the desktop-based system). The mean time of eye movement recording was shorter with the application than with the desktop-based system in both study periods (Table 2). Overall, the time to prepare for eye movement recording decreased by 50% (38.1 seconds; 95% CI, 34.9 to 41.2; p<0.001) in the mobile application group (Table 3). The shortened preparation time was similar in both study arms, regardless of whether participants started the experiment with the application or the desktop-based system (Fig. 4, Table 3). The questionnaire was completed by all participants. After completion of the experiment, the participants reported more stress using the desktop-based system than using the mobile application (overall perceived stress, 4.6±2.4 vs. 2.3±1.3; p<0.001) (Tables 2, 3). The application obtained a mean overall satisfaction score of 9.2 out of 10 (Tables 2, 3). Because the data did not suggest a carryover effect in the mean time to eye movement recording, stress and satisfaction during both study periods were pooled (Table 4).
Table 1.
Baseline characteristics
Characteristic App-based method first (n=15) Desktop-based method first (n=15) p-value
Age (yr) 34.6±5.9 35.9±4.3 > 0.05
Sex     > 0.05
 Female 6 (40.0) 8 (53.3)  
 Male 9 (60.0) 7 (46.7)  
Time in the dizziness clinic (yr) 7.0±5.0 7.5±4.6 > 0.05
Occupation     > 0.05
 Neurologist 12 (80.0) 13 (86.7)  
 Otolaryngologist 2 (13.3) 2 (13.3)  
 Technician 1 (6.7) 0 (0)  

Values are presented as mean±standard deviation or number (%). App, application.

Table 2.
Outcome descriptive analysis by method and study period
Period Time to record eye movement (sec) Satisfactiona) Stressa)
Period 1 and 2
 Mobile app 38.0±7.1 9.2±1.0 2.3±1.3
 Desktop 76.0±8.7 6.4±1.9 4.6±2.4
Period 1
 Mobile app 35.8±5.3 9.1±1.0 2.5±1.6
 Desktop 77.9±8.2 5.9±1.8 5.1±2.4
Period 2
 Mobile app 40.0±8.2 9.3±1.0 2.1±1.0
 Desktop 74.0±9.1 6.9±1.9 4.1±2.2

Values are presented as mean±standard deviation.

App, application.

a) Measured by 10-point Likert scale.

Table 3.
Outcome differences
Variable Time to record eye movement: difference (95% CI), p-value Satisfaction: difference (95% CI), p-value Stress: difference (95% CI), p-value
Period
 Period 1 42.1 (36.9–47.3) < 0.001 3.2 (2.1–4.3) < 0.001 –2.6 (–4.2 to –1.0) 0.002
 Period 2 34.1 (27.6–40.6) < 0.001 2.4 (1.2–3.6) < 0.001 –2.1 (–3.4 to –0.8) 0.004
Randomization group
 Mobile application first 38.4 (32.8–44.0) < 0.001 2.2 (1.0–3.4) 0.001 –2.6 (–4.2 to –1.0) 0.002
 Desktop first 37.7 (31.6–43.9) < 0.001 3.4 (2.3–4.5) < 0.001 –3.1 (–4.5 to –1.6) < 0.001
 All groups and periods 38.1 (34.9–41.2) < 0.001 2.8 (2.1–3.5) < 0.001 –2.3 (–3.3 to –1.4) < 0.001

CI, confidence interval.

Table 4.
GEE model to test for carryover effects for paired data
Variable Timea): mean difference (95% CI), p-value Satisfactiona): mean difference (95% CI), p-value Stressa): mean difference (95% CI), p-value
Univariate analysis
 Intervention
 Mobile app 0 (reference)   0 (reference)   0 (reference)  
 Desktop PC 38.1 (34.9 to 41.2) < 0.001 –2.3 (–3.3 to –1.4) < 0.001 2.8 (2.1 to 3.5) < 0.001
Multivariate analysis
 Period with mobile app
 Period 1 0 (reference)   0 (reference)   0 (reference)  
 Period 2 3.7 (–2.3 to 9.7) 0.23 –1.0 (–2.3 to 0.3) 0.13 1.0 (–0.6 to 2.6) 0.23
Intervention in period 1
 Mobile app 0 (reference)   0 (reference)   0 (reference)  
 Desktop PC 42.1 (37.3 to 46.8) < 0.001 –3.2 (–4.2 to –2.2) < 0.001 2.6 (1.2 to 4.0) < 0.001
Intervention in period 2
 Mobile app 0 (reference)   0 (reference)   0 (reference)  
 Desktop PC 34.1 (28.1 to 40.1) < 0.001 –2.4 (–3.5 to –1.4) < 0.001 2.1 (0.9 to 3.3) 0.001

GEE, generalized estimating equation; CI, confidence interval; app, application; PC, personal computer.

a) Linear GEE model with exchangeable working correlation matrix.

The time was longer in both study periods when recording with the desktop PC-based system. The desktop group reported higher stress and lower satisfaction than the mobile application group in both study periods. In period 2, the time was longer, satisfaction was lower, and stress was higher than in period 1, but the differences were not statistically significant for time (p=0.083), satisfaction (p=0.328), or stress (p=0.611).

Fig. 4.
Boxplots of the elapsed time to machine operation for eye movement recording when using the mobile application (app) compared with the desktop-based system. The solid horizontal lines denote medians and interquartile ranges; the endpoints of the whiskers indicate the range. The difference between groups in time to operation for eye movement recording was significant by the paired t-test (p<0.001). PC, personal computer.

DISCUSSION

This study compared mobile application-based eye movement recording with desktop-based recording. The application took less time and was less stressful to use than the desktop system, and overall satisfaction was higher. Although several mobile medical devices are in development, data on their clinical usefulness are lacking [3]. Mobile phones and tablets are widely available devices with the potential to test ocular motility, but their use in neurotology research has so far been rare. To our knowledge, this is the first trial to compare mobile application-based and desktop-based eye movement recording from the user's perspective. Recently, miniature video-oculography goggles have been introduced for ictal nystagmus recording; these goggles can also demonstrate the differentiating features of nystagmus in recurrent vertigo such as vestibular migraine, Menière's disease, and benign paroxysmal positional vertigo [4]. The combination of video-oculography goggles with mobile devices could facilitate telemedicine in the future. The careful examination of eye movements in the management of dizziness patients is both complex and time-consuming for many doctors [5,6]. We have noticed an overgeneralized approach to the frontline management of patients with "dizziness" or "vertigo". Recent reports suggest that emergency doctors and other primary care providers may not be entirely comfortable interpreting bedside findings associated with vestibular disorders [7,8]. Therefore, the education of frontline doctors such as residents, primary physicians, and emergency doctors is very important for improving dizziness management. Consultation with neurotology experts about eye recordings documented with such applications will be helpful for doctors who see dizziness patients, and these doctors may be more comfortable with mobile application systems. Mobile application development in this area is still at an early stage.
An important consideration is that hardware such as mobile devices and goggles should be developed together to ensure good resolution, portability, easy accessibility, and wireless operation.
Our study has some limitations. First, this was a single-center study conducted among medical staff who treat dizziness patients and had experience with the desktop-based Frenzel test, so recruitment was regionally limited. To mitigate this, all participants practiced with the desktop system and the mobile application the same number of times, and the elapsed time, satisfaction, and stress scores were then evaluated in an independent space. Second, most participants were specialists in neurology and otolaryngology (21 of 30, 70.0%). Ultimately, the goal of mobile application-based recording is to make eye movement recording easily accessible to frontline doctors who see dizzy patients. However, because the objective of this study was to compare satisfaction and stress scores between users of this mobile application and users of a conventional desktop-based recording system, a number of specialists familiar with the desktop-based system were included. Finally, the subject of our study was a human model, not a human. Nonetheless, the mobile application showed almost the same eye movements as the desktop-based system. Because the purpose of our study was not to compare the eye movements recorded by each device but to compare the time required to prepare and complete the test and the satisfaction and stress scores during the test, a human model was used instead of an actual patient. We will prepare a follow-up study to determine the accuracy of mobile application-based eye recording.
In conclusion, the adoption of Frenzel glasses with mobile application-based recording decreased the elapsed time by 50% compared with the conventional desktop-based recording system. The medical staff also reported lower stress and higher satisfaction with examinations using mobile application-based devices than with desktop-based systems. As research in this area is scarce, the results of this study will be of importance and might promote change and improve the quality of dizziness management.

CONFLICTS OF INTEREST

No potential conflict of interest relevant to this article was reported.

ACKNOWLEDGEMENTS

This work was supported by the Chungnam National University Hospital Research Fund, 2017 to 2018. The funder of the study had no role in the study design, data collection, data analysis, data interpretation, or writing of the report. The corresponding author had full access to all the data in the study and had full responsibility for the decision to submit for publication. The authors sincerely thank the participants in this study.

Supplementary Materials

Supplementary material can be found via (https://doi.org/10.21790/rvs.2020.19.4.120).

REFERENCES

1. Newman-Toker DE, Cannon LM, Stofferahn ME, Rothman RE, Hsieh YH, Zee DS. Imprecision in patient reports of dizziness symptom quality: a cross-sectional study conducted in an acute care setting. Mayo Clin Proc 2007;82:1329–40.
2. Korea Health Insurance Review and Assessment Service. 2018 hospitalization frequent illness [Internet]. Wonju: Korea Health Insurance Review and Assessment Service; 2020 [cited 2020 Nov 15]. Available from: http://opendata.hira.or.kr/op/opc/olapHifrqSickInfo.do.

3. Sim I. Mobile devices and health. N Engl J Med 2019;381:956–68.
4. Young AS, Lechner C, Bradshaw AP, MacDougall HG, Black DA, Halmagyi GM, et al. Capturing acute vertigo: a vestibular event monitor. Neurology 2019;92:e2743–53.
5. Whitman GT. Examination of the patient with dizziness or imbalance. Med Clin North Am 2019;103:191–201.
6. Huh YE, Kim JS. Bedside evaluation of dizzy patients. J Clin Neurol 2013;9:203–13.
7. Stanton VA, Hsieh YH, Camargo CA Jr, Edlow JA, Lovett PB, Goldstein JN, et al. Overreliance on symptom quality in diagnosing dizziness: results of a multicenter survey of emergency physicians. Mayo Clin Proc 2007;82:1319–28.
8. Newman-Toker DE, Stanton VA, Hsieh YH, Rothman RE. Frontline providers harbor misconceptions about the bedside evaluation of dizzy patients. Acta Otolaryngol 2008;128:601–4.