Cardiovascular and Metabolic Risk

Diagnostic Accuracy of a Device for the Automated Detection of Diabetic Retinopathy in a Primary Care Setting

Frank D. Verbraak,1 Michael D. Abramoff,2,3,4 Gonny C.F. Bausch,5 Caroline Klaver,6,7,8 Giel Nijpels,9 Reinier O. Schlingemann,10 and Amber A. van der Heijden9

  1. Department of Ophthalmology, VU Medical Center, Amsterdam, the Netherlands
  2. Department of Ophthalmology and Visual Sciences, University of Iowa Hospital & Clinics, Iowa City, IA
  3. VA Medical Center, Iowa City, IA
  4. IDx, Iowa City, IA
  5. Star-SHL, Rotterdam, the Netherlands
  6. Department of Ophthalmology, Erasmus Medical Center, Rotterdam, the Netherlands
  7. Department of Epidemiology, Erasmus Medical Center, Rotterdam, the Netherlands
  8. Department of Ophthalmology, Radboud University Medical Center, Nijmegen, the Netherlands
  9. Department of General Practice and Elderly Care Medicine, Amsterdam Public Health Research Institute, VU University Medical Center, Amsterdam, the Netherlands
  10. Department of Ophthalmology, Amsterdam Medical Center, Amsterdam, the Netherlands

Corresponding author: Frank D. Verbraak, f.verbraak@vumc.nl
Diabetes Care 2019 Apr; 42(4): 651-656. https://doi.org/10.2337/dc18-0148

Abstract

OBJECTIVE To determine the diagnostic accuracy in a real-world primary care setting of a deep learning–enhanced device for automated detection of diabetic retinopathy (DR).

RESEARCH DESIGN AND METHODS Retinal images of people with type 2 diabetes visiting a primary care screening program were graded by a hybrid deep learning–enhanced device (IDx-DR-EU-2.1; IDx, Amsterdam, the Netherlands), and its classification of retinopathy (vision-threatening [vt]DR, more than mild [mtm]DR, and mild or more [mom]DR) was compared with a reference standard. This reference standard consisted of grading according to the International Clinical Classification of DR by the Rotterdam Study reading center. We determined the diagnostic accuracy of the hybrid deep learning–enhanced device (IDx-DR-EU-2.1) against the reference standard.

RESULTS A total of 1,616 people with type 2 diabetes were imaged. The hybrid deep learning–enhanced device’s sensitivity/specificity against the reference standard was, respectively, for vtDR 100% (95% CI 77.1–100)/97.8% (95% CI 96.8–98.5) and for mtmDR 79.4% (95% CI 66.5–87.9)/93.8% (95% CI 92.1–94.9).

CONCLUSIONS The hybrid deep learning–enhanced device had high diagnostic accuracy for the detection of both vtDR (although the number of vtDR cases was low) and mtmDR in a primary care setting against an independent reading center. This allows its safe use in a primary care setting.

Introduction

With the growing prevalence of diabetes, the prevalence of diabetic retinopathy (DR) is rising as well. Screening for DR has proven to be effective in the prevention of visual loss and blindness from DR (1). National health authorities (2) and most professional organizations (3) recommend regular DR screening programs, which are usually integrated within regular diabetes care (4). Automated medical diagnosis has achieved parity with or even superiority to clinical experts’ diagnosis for an increasing number of clinical tasks, including detection of DR (5–7), and can help to improve health care efficiency, affordability, and accessibility of DR screening. Moreover, automated diagnosis reduced the diagnostic variability that was common in expert review of medical images (8).

Multiple diagnostic algorithms for the detection of DR are now commercially available, and their performance has been independently evaluated (9–12). One of these, the IDx-DR-EU-2.1 device, has been enhanced with deep learning. Deep learning, a machine learning technique that uses multilayer neural networks, has allowed substantial improvements in artificial intelligence (AI)-based diagnostic systems (13–17). Because deep learning is used to build its explicit retinopathy lesion (biomarker) detectors, the IDx-DR-EU-2.1 is a lesion-based AI system, mimicking human visual processing (13,18). While most deep learning applications associate images directly with a diagnostic output, lesion-based AI systems detect lesions and other abnormalities and are thought to be more robust to catastrophic failure from small perturbations in images (18). The lesion-based AI system achieved significantly improved diagnostic accuracy on a laboratory data set (13) and is designed to detect multiple levels of DR and diabetic macular edema (DME) according to the International Clinical Diabetic Retinopathy Severity Scale (ICDR) (13,19,20).
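To make this architectural distinction concrete, the short sketch below contrasts the two design patterns in schematic Python. It is illustrative only and does not reproduce the IDx-DR-EU-2.1 implementation; the detector names, the fusion step, and the stand-in callables are hypothetical placeholders.

```python
# Illustrative sketch only; not the IDx-DR-EU-2.1 implementation.
# `cnn`, the lesion detectors, and `fuse` are hypothetical callables that
# return likelihoods in [0, 1] for a single fundus image.

def end_to_end_dr_score(images, cnn):
    """One network maps images directly to a DR likelihood (no explicit lesions)."""
    return max(cnn(img) for img in images)

def lesion_based_dr_score(images, detectors, fuse):
    """Explicit biomarker detectors (e.g., hemorrhages, exudates) run first;
    only their combined evidence is fused into an exam-level DR likelihood."""
    evidence = {name: max(det(img) for img in images) for name, det in detectors.items()}
    return fuse(evidence)

# Tiny usage example with stand-in detectors (constants instead of real CNNs):
detectors = {"hemorrhage": lambda img: 0.2, "exudate": lambda img: 0.7}
print(lesion_based_dr_score(["img1", "img2"], detectors, fuse=lambda ev: max(ev.values())))
```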

The purpose of this study was to determine the diagnostic accuracy of the hybrid deep learning–enhanced device (IDx-DR-EU-2.1) to detect more than mild DR and/or DME (mtmDR) and vision-threatening DR or DME (vtDR), according to the ICDR grading system compared with the reference standard, in people with type 2 diabetes in a primary care setting.

Research Design and Methods

Study Design, Population, and Setting

This retrospective study included all people with type 2 diabetes who were screened in 2015 at the diagnostic center Star-SHL (Rotterdam, the Netherlands). Star-SHL is a so-called "primary care diagnostic center," a facility that provides medical diagnostics to general practitioners in the southwest region of the Netherlands. Under the guidance of general practitioners, Star-SHL counsels patients with chronic diseases, including diabetes. Study inclusion criteria were an existing diagnosis of type 2 diabetes, no previous diagnosis of DR, and the ability to undergo fundus photography. Patients were not otherwise selected and reflected the multiethnic composition of the general population of Rotterdam, of which around 15% is non-Caucasian.

Imaging

Participants underwent fundus imaging according to a strict standardized protocol (two images per eye: one macula centered and one disc centered [45° field of view]) using Topcon TRC-NW200 cameras operated by experienced Star-SHL technicians. Images were acquired at eight different sites, with identical camera settings. Pharmacological dilation was applied when the technician judged that the images did not meet the requirements for grading. Image sets for each participant were stored in a proprietary picture archiving and communication system (PACS). Approval was obtained from the Human Subjects Committee of Star-SHL to conduct the study in accordance with the tenets of the Declaration of Helsinki.

Reference Standard Grading

A reading center determined the exam quality, as well as the presence and severity of DR, according to the ICDR grading system for all exams (20,21). The reading center protocol was as follows: two experienced readers from the Rotterdam Study at Erasmus Medical Center (22–24) independently graded each exam per the ICDR grading system. Graders were masked to any algorithm outputs. Disagreements between the two readers were adjudicated by an experienced retinal specialist (F.D.V.) for the final grade. For analysis, the final ICDR grades were combined into no or mild DR (and no DME), moderate DR (mtmDR and not vtDR), or vtDR (see Supplementary Table 1). The presence of exudates or retinal thickening (if visible on nonstereo photographs) within 1 disc diameter of the fovea was taken as evidence of DME (19).
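As a reading aid, the following minimal Python sketch (purely illustrative; the reading center did not use this code) shows the two-reader-plus-adjudication workflow and a plausible collapse of ICDR grades into the analysis categories. The exact mapping is in Supplementary Table 1, which is not reproduced here, so the rules below assume the usual ICDR conventions (vtDR = severe nonproliferative DR, proliferative DR, or DME; mtmDR = moderate DR or worse, or DME).

```python
# Illustrative only; the category mapping assumes standard ICDR conventions,
# not the exact rules of Supplementary Table 1.
ICDR_NONE, ICDR_MILD, ICDR_MODERATE, ICDR_SEVERE, ICDR_PDR = 0, 1, 2, 3, 4

def final_grade(grade_a, grade_b, adjudicate):
    """Two independent reads; the retinal specialist adjudicates only on disagreement."""
    return grade_a if grade_a == grade_b else adjudicate(grade_a, grade_b)

def collapse(icdr_grade, dme_present):
    """Collapse an adjudicated ICDR grade plus DME status into the study categories."""
    if icdr_grade >= ICDR_SEVERE or dme_present:
        return "vtDR"
    if icdr_grade == ICDR_MODERATE:
        return "moderate DR (mtmDR, not vtDR)"
    return "no or mild DR"

# Example: readers disagree (mild vs. moderate); the adjudicator settles on moderate, no DME.
grade = final_grade(ICDR_MILD, ICDR_MODERATE, adjudicate=lambda a, b: ICDR_MODERATE)
print(collapse(grade, dme_present=False))   # -> "moderate DR (mtmDR, not vtDR)"
```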

Automated Detection of DR

All images for which a reference standard according to the reading center was available were graded by a deep learning–enhanced device (IDx-DR-EU-2.1), referred to here as "the device." The device's core is a lesion-based algorithm with explicit lesion detectors, enhanced by deep learning, thought to closely resemble human visual processing (13,25). The underlying algorithms have been described extensively (13,26). Briefly, the lesion-based algorithm consists of multiple mutually dependent detectors, many of them implemented as convolutional neural networks, for lesions characteristic of DR. The outputs are integrated into an index, a numerical output varying between 0 and 1, indicating the likelihood that the exam shows DR. Both images (fovea centered and optic disc centered) are colocalized and integrated using the optic disc and the larger retinal vessels as landmarks. A categorical outcome is provided: no or mild DR, moderate DR, or vtDR (see Supplementary Table 2). In contrast to the reference standard, the device puts both no DR and mild DR into one grade. If the exam is of insufficient quality, no outputs for vtDR or moderate DR are provided.
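The exam-level decision flow described above can be summarized in a few lines. The sketch below is schematic: it reduces the device's outputs to a single likelihood index with two cut-offs, and the thresholds are hypothetical placeholders, not the device's actual operating points.

```python
# Schematic only: a single DR likelihood index with two hypothetical cut-offs;
# the device's actual outputs and operating points are not reproduced here.
T_MODERATE, T_VTDR = 0.5, 0.8

def classify_exam(dr_index, sufficient_quality):
    """Map the [0, 1] likelihood index to the device's categorical outputs."""
    if not sufficient_quality:
        return "insufficient quality"      # no DR output is produced
    if dr_index >= T_VTDR:
        return "vtDR"
    if dr_index >= T_MODERATE:
        return "moderate DR"
    return "no or mild DR"                 # no DR and mild DR share one grade

print(classify_exam(0.65, sufficient_quality=True))   # -> "moderate DR"
```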

Statistical Analysis of Performance

For assessment of the interobserver agreement of the reference standard, specific agreement between the two graders was calculated for the categories moderate DR and vtDR using a method described recently (27). Specific agreement was expressed as the chance that one of the graders scored the same grade, i.e., moderate DR or vtDR, as the other grader. The 95% CIs for specific agreement were obtained by bootstrap resampling using 1,000 bootstrap replicates.
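For readers who want to reproduce this kind of agreement statistic, a minimal sketch follows (in Python rather than R, which was used for the published analysis). It uses a common formulation of specific agreement for a single category, twice the number of exams both graders assigned to the category divided by the total number of times either grader assigned it, with a percentile bootstrap over exams.

```python
import numpy as np

rng = np.random.default_rng(0)

def specific_agreement(grades_a, grades_b, category):
    """2 * (exams both graders called `category`) / (grader A's count + grader B's count)."""
    a = np.asarray(grades_a) == category
    b = np.asarray(grades_b) == category
    denom = a.sum() + b.sum()              # times either grader assigned the category
    return 2 * np.sum(a & b) / denom if denom else np.nan

def bootstrap_ci(grades_a, grades_b, category, n_boot=1000, alpha=0.05):
    """Percentile bootstrap CI over exams (1,000 replicates, as in the analysis above);
    resamples without the category yield NaN and are ignored."""
    grades_a, grades_b = np.asarray(grades_a), np.asarray(grades_b)
    n = len(grades_a)
    stats = [
        specific_agreement(grades_a[idx], grades_b[idx], category)
        for idx in (rng.integers(0, n, n) for _ in range(n_boot))
    ]
    return np.nanquantile(stats, [alpha / 2, 1 - alpha / 2])

# Toy usage with three exams graded by both readers (made-up labels):
a = ["no/mild", "moderate", "vtDR"]
b = ["no/mild", "no/mild", "vtDR"]
print(specific_agreement(a, b, "vtDR"), bootstrap_ci(a, b, "vtDR"))
```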

With use of the ICDR classification, sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV), with their 95% CIs, were calculated for the device outputs no or mild DR, mtmDR, and vtDR, compared with the corresponding ICDR reference standard classifications of no or mild DR, moderate DR, and vtDR (20).

The analysis was based on the exact binomial distribution. Exams graded as of insufficient quality by either the ICDR reference standard or the device were excluded from the diagnostic accuracy analysis.
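As an illustration of these calculations (the published analysis was done in R; the snippet below is a Python sketch), sensitivity, specificity, PPV, and NPV can be computed from 2x2 counts, with exact binomial (Clopper-Pearson) 95% CIs obtained from beta-distribution quantiles. The counts in the usage line are hypothetical, not taken from Table 1.

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact binomial (Clopper-Pearson) CI for a proportion of k successes in n trials."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

def diagnostic_accuracy(tp, fp, fn, tn):
    """Point estimates and exact 95% CIs from a 2x2 table of device vs. reference standard."""
    return {
        "sensitivity": (tp / (tp + fn), clopper_pearson(tp, tp + fn)),
        "specificity": (tn / (tn + fp), clopper_pearson(tn, tn + fp)),
        "PPV": (tp / (tp + fp), clopper_pearson(tp, tp + fp)),
        "NPV": (tn / (tn + fn), clopper_pearson(tn, tn + fn)),
    }

# Usage with hypothetical counts (for illustration only):
print(diagnostic_accuracy(tp=40, fp=60, fn=10, tn=890))
```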

As has been our standard in the past (13,19,28), we show all images of false negatives. Diagnostic accuracy is reported according to the Standards for Reporting of Diagnostic Accuracy Studies (STARD) as updated in 2015 (29). Analyses were conducted in R (30).

Results

Between 1 January 2015 and 31 December 2015, 1,616 participants were imaged. Mean age was 63 years (SD 11.3), and 53% of the participants were male (see STARD diagrams [Figs. 1 and 2]).

Figure 1. STARD diagram for the device vtDR output (29).

Figure 2. STARD diagram for the device mtmDR output (29).

Of these 1,616 participants, the images of 191 (11.7%) were graded as of insufficient quality by the reference standard. Of the 1,425 participants with exams of sufficient quality, 1,187 (83.3%) had no DR, 167 (11.7%) had mild DR, 55 (3.9%) had moderate DR, and 16 (1.1%) had vtDR (15 of these 16 vtDR cases had DME and 1 [0.1%] had vtDR without DME, but with severe nonproliferative DR)—all according to the reference standard per the ICDR grading system. The interobserver agreement of the reference standard, expressed as specific agreement, i.e., the chance that one of the graders scored the same grade as the other, was 53% (95% CI 43–62) in case of moderate DR and 48% (95% CI 26–68) for vtDR.

The device gave an output of insufficient quality for 280 participants (17.3%) per the ICDR grading system. Of the 1,293 participants (90.6%) with exams of sufficient quality for both the reference standard and device, 1,167 (90.3%) had no or mild DR, 82 (6.4%) moderate DR, and 44 (3.4%) vtDR, including 15 (34.1%) with DME (see Table 1).

Table 1. Confusion matrix for reference standard according to ICDR grading system and device output.

The sensitivity/specificity, per the ICDR reference standard, for the device to detect vtDR was 100% (95% CI 77.1–100)/97.8% (95% CI 96.8–98.5) and mtmDR 79.4% (95% CI 66.5–87.9)/93.8% (95% CI 92.1–94.9). The PPV and NPV for vtDR were 36.4% (95% CI 28.4–45.2) and 100%, respectively. For mtmDR, the PPV and NPV were 39.7% (95% CI 33.8–45.8) and 98.9% (95% CI 98.2–99.3), respectively.
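The vtDR point estimates can be reproduced from the counts reported above, under the assumption (inferred from the text rather than copied from Table 1, which is not shown here) that all 16 reference-standard vtDR cases were among the 1,293 analyzable exams and were included in the 44 device vtDR outputs:

```python
# Worked check of the reported vtDR point estimates; cell counts are inferred
# from the text (16 reference vtDR cases, 44 device vtDR outputs, 1,293
# analyzable exams), not copied from Table 1.
tp, fn = 16, 0                    # sensitivity was 100%, so no false negatives
device_pos, n = 44, 1293
fp = device_pos - tp              # 28 false positives
tn = n - tp - fn - fp             # 1,249 true negatives
print(tp / (tp + fn))             # sensitivity = 1.000
print(tn / (tn + fp))             # specificity ~ 0.978
print(tp / (tp + fp))             # PPV ~ 0.364
print(tn / (tn + fn))             # NPV = 1.000
```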

There were 13 false negative exams for the enhanced device’s mtmDR output according to the ICDR reference standard, and all images for these participants are shown in Fig. 3. Review of the images of the 13 false negative cases in Fig. 3 indicated that these participants had a single isolated hemorrhage or cotton wool spot and had no microaneurysms.

Figure 3. Right and left eye images (two images per eye, with one disc and one fovea centered) of the 13 participants who were false negative for the mtmDR output of the device, according to the ICDR reference standard. None had vtDR or macular edema. The vtDR output did not have any false negatives.

Conclusions

The results show that a hybrid lesion-based device, with deep learning enhancements, for the automated detection of DR achieved high diagnostic accuracy in a primary care setting in a study with a predetermined protocol and an independent reference standard. These results confirm corresponding results in an earlier study of essentially the same algorithm in a laboratory setting (13). Specifically, the device achieved high sensitivity (100%) in people with vtDR, as it did not miss any case of vtDR or DME according to the ICDR grading system. It also achieved high specificity (97.8%). However, the number of vtDR cases, although representative of the studied patient population, was low and prevents definite conclusions. The device also had a sensitivity of 79.4% for the detection of mtmDR, at a specificity of 93.8%.

Applying the device in the health care system at primary care sites, where patients with diabetes are regularly seen, could improve the percentage of patients screened when indicated. In addition, such a device would improve accuracy compared with the present standard of care and would increase the number of patients with images of sufficient quality, owing to the device's direct feedback on image quality. Nongradable images can either be reviewed by a human grader or prompt direct referral to an eye care provider, so that no diagnoses of DR are missed because of images of insufficient quality and good clinical care is guaranteed. Overall, this system has the potential to reduce the sociomedical burden of DR.

Clinicians increasingly deviate from the methods used by reading centers, as defined in the original standards (31). For example, whether a single red lesion is a microaneurysm or a hemorrhage can make the difference between a mild and a moderate level of DR. These levels were used in the primary outcome studies that to a great degree still determine the management of DR, such as the Diabetic Retinopathy Study (DRS) (32), the Early Treatment of Diabetic Retinopathy Study (ETDRS) (33), and the DCCT/EDIC (Diabetes Control and Complications Trial/Epidemiology of Diabetes Interventions and Complications) studies (34). It is therefore important to use methods that are as close as possible to those of these original standards, to avoid conflicts based solely on differences in definitions. The ICDR classification used in the current study is a simplified classification based on the original ETDRS classification, which was often too complicated to use in clinical studies. It is widely accepted in the ophthalmological community and is the preferred classification in leading reading centers around the world.

A relatively low sensitivity to detect DR in a standard of care setting, using single human graders, has been shown in previous studies (11,35,36). This is also clear from the current study, with a modest interobserver agreement of the graders of roughly 50%. One of the advantages of using a device for the automated detection of DR is the consistently high diagnostic accuracy—not accomplished by single human graders.

The results also show that the diagnostic accuracy of a device to detect DR is typically lower in a real-world setting than in a laboratory setting, as we and others have shown previously (9,13,19). Image quality in published data sets is likely higher than is found in a real-world setting. Finally, there are often differences in the prevalence of DR, with laboratory studies thus far typically showing a higher prevalence than real-life studies such as this one (19). The recent studies by Gulshan et al. (16) and by Ting et al. (17) do report overlapping diagnostic accuracy values for automated screening of DR. A random sample of subjects with prospectively collected real-life images, which inherently includes a large number of poor-quality photographs, was unique to our study.

The study has limitations. The reference standard was graded from retinal color images, which lack stereo, and no macular optical coherence tomography, now a widely used method for determining the presence of DME, was available. Isolated retinal thickening may therefore be underappreciated (37), although human expert detection of DME from exudates only, in nonstereo images, has been shown to be almost as sensitive as clinical stereo biomicroscopic analysis of retinal thickening (38,39). DME prevalence and severity may be underestimated in this data set, and a reference standard including optical coherence tomography could lead to differences in a device's measured algorithmic performance.

The application of mydriatics was unfortunately not reported to the diagnostic center, and the influence of mydriatics on quality of the images could not be analyzed.

Missing diagnoses other than DR is inherent to most algorithms for automated screening. False positives for other pathologies, like venous occlusions or exudative (wet) age-related macular degeneration, will be sent to the ophthalmologist, but other, more subtle diagnoses, like glaucoma or dry age-related macular degeneration, may be missed. These diagnoses are relatively infrequent (and in many cases probably already known), so the importance of missing other diagnoses is limited and, in our opinion, acceptable.

The retrospective nature of the study can be considered to be a limitation but allowed for the analysis of an unselected, unbiased, real-life data set.

The device used in the current study has recently received U.S. Food and Drug Administration approval for providing a screening decision without the need for a clinician to also interpret the images or results, making it usable by health care providers who may not normally be involved in eye care (40).

In summary, the device had high diagnostic accuracy for the detection of vtDR and a more modest but still adequate accuracy in detection of mtmDR in a primary care setting using an independent reference standard. The diagnostic accuracy of the device therefore allows safe use in a primary care setting.

Article Information

Funding. M.D.A. is the Robert C. Watzke Professor of Ophthalmology and Visual Sciences, University of Iowa; Research to Prevent Blindness, New York, NY. This material is the result of work supported with resources and the use of facilities at the Iowa City VA Medical Center.

Contents are solely the responsibility of the authors and do not necessarily represent the official views of the Department of Veterans Affairs or the U.S. government.

Duality of Interest. This study was funded by IDx. M.D.A. is listed as inventor on patents and patent applications related to the study subject. M.D.A. is director of and shareholder in IDx. All authors, with the exception of G.N., received financial support from IDx. No other potential conflicts of interest relevant to this article were reported.

Author Contributions. F.D.V. drafted the manuscript and supervised the study. F.D.V., M.D.A., G.C.F.B., C.K., G.N., and A.A.v.d.H. were responsible for study concept and design. F.D.V., G.C.F.B., C.K., and A.A.v.d.H. interpreted data. F.D.V., G.C.F.B., and C.K. acquired data. A.A.v.d.H. analyzed data and performed statistical analysis. All authors critically revised the manuscript for important intellectual content and provided administrative, technical, or material support. F.D.V. is the guarantor of this work and, as such, had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Prior Presentation. Parts of this study were presented in abstract form at the 53rd Annual Meeting of the European Association for the Study of Diabetes, Lisbon, Portugal, 11–15 September 2017.

Footnotes

  • This article contains Supplementary Data online at http://care.diabetesjournals.org/lookup/suppl/doi:10.2337/dc18-0148/-/DC1.

  • Received January 19, 2018.
  • Accepted December 30, 2018.
  • © 2019 by the American Diabetes Association.
http://www.diabetesjournals.org/content/license

Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered. More information is available at http://www.diabetesjournals.org/content/license.

References

1. Liew G, Michaelides M, Bunce C. A comparison of the causes of blindness certifications in England and Wales in working age adults (16-64 years), 1999-2000 with 2009-2010. BMJ Open 2014;4:e004015
2. Bragge P, Gruen RL, Chau M, Forbes A, Taylor HR. Screening for presence or absence of diabetic retinopathy: a meta-analysis. Arch Ophthalmol 2011;129:435–444
3. American Academy of Ophthalmology Retina/Vitreous Panel. Preferred Practice Patterns: Diabetic Retinopathy. San Francisco, CA, American Academy of Ophthalmology, 2016
4. Scanlon PH. The English national screening programme for sight-threatening diabetic retinopathy. J Med Screen 2008;15:1–4
5. Lam JG, Lee BS, Chen PP. The effect of electronic health records adoption on patient visit volume at an academic ophthalmology department. BMC Health Serv Res 2016;16:7
6. Redd TK, Read-Brown S, Choi D, Yackel TR, Tu DC, Chiang MF. Electronic health record impact on productivity and efficiency in an academic pediatric ophthalmology practice. J AAPOS 2014;18:584–589
7. Major sector productivity and costs [Internet], 2017. Available from http://data.bls.gov/pdq/SurveyOutputServlet. Accessed 3 June 2017
8. Rahimy E. Deep learning applications in ophthalmology. Curr Opin Ophthalmol 2018;29:254–260
9. Hansen MB, Abràmoff MD, Folk JC, Mathenge W, Bastawrous A, Peto T. Results of automated retinal image analysis for detection of diabetic retinopathy from the Nakuru Study, Kenya. PLoS One 2015;10:e0139148
10. van der Heijden AA, Abramoff MD, Verbraak F, van Hecke MV, Liem A, Nijpels G. Validation of automated screening for referable diabetic retinopathy with the IDx-DR device in the Hoorn Diabetes Care System. Acta Ophthalmol 2018;96:63–68
11. Maguire MG, Daniel E, Niemeijer M, Pistilli M, Folk JC, Abramoff MD. Identifying diabetic eye disease: comparison of clinical examination by ophthalmologists to automated detection from retinal color images (Abstract). ARVO Meeting Abstracts 2015;56
12. Tufail A, Kapetanakis VV, Salas-Vega S, et al. An observational study to assess if automated diabetic retinopathy image assessment software can replace one or more steps of manual imaging grading and to determine their cost-effectiveness. Health Technol Assess 2016;20:1–72
13. Abràmoff MD, Lou Y, Erginay A, et al. Improved automated detection of diabetic retinopathy on a publicly available dataset through integration of deep learning. Invest Ophthalmol Vis Sci 2016;57:5200–5206
14. Gargeya R, Leng T. Automated identification of diabetic retinopathy using deep learning. Ophthalmology 2017;124:962–969
15. Greenspan H, van Ginneken B, Summers RM. Guest editorial: deep learning in medical imaging: overview and future promise of an exciting new technique. IEEE Trans Med Imaging 2016;35:1153–1159
16. Gulshan V, Peng L, Coram M, et al. Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. JAMA 2016;316:2402–2410
17. Ting DSW, Cheung CY, Lim G, et al. Development and validation of a deep learning system for diabetic retinopathy and related eye diseases using retinal images from multiethnic populations with diabetes. JAMA 2017;318:2211–2223
18. Lynch S, Abramoff MD. Catastrophic failure in image-based convolutional neural network algorithms for detecting diabetic retinopathy (Abstract). IOVS 2017;58:A3776
19. Abràmoff MD, Folk JC, Han DP, et al. Automated analysis of retinal images for detection of referable diabetic retinopathy. JAMA Ophthalmol 2013;131:351–357
20. Wilkinson CP, Ferris FL III, Klein RE, et al.; Global Diabetic Retinopathy Project Group. Proposed international clinical diabetic retinopathy and diabetic macular edema disease severity scales. Ophthalmology 2003;110:1677–1682
21. Diabetische retinopathie: concept richtlijn (02-05-2017) [Internet], 2017. Available from https://www.oogheelkunde.org/richtlijn/diabetische-retinopathie-multidisciplinaire-richtlijn-geautoriseerd-november-2017. Accessed 5 May 2017
22. de Voogd S, Ikram MK, Wolfs RC, et al. Is diabetes mellitus a risk factor for open-angle glaucoma? The Rotterdam Study. Ophthalmology 2006;113:1827–1831
23. Klaver CCW, Assink JJM, van Leeuwen R, et al. Incidence and progression rates of age-related maculopathy: the Rotterdam Study. Invest Ophthalmol Vis Sci 2001;42:2237–2241
24. Stolk RP, Vingerling JR, de Jong PT, et al. Retinopathy, glucose, and insulin in an elderly population. The Rotterdam Study. Diabetes 1995;44:11–15
25. Abràmoff MD, Alward WL, Greenlee EC, et al. Automated segmentation of the optic disc from stereo color photographs using physiologically plausible features. Invest Ophthalmol Vis Sci 2007;48:1665–1673
26. Abràmoff MD, Garvin MK, Sonka M. Retinal imaging and image analysis. IEEE Rev Biomed Eng 2010;3:169–208
27. de Vet HCW, Mullender MG, Eekhout I. Specific agreement on ordinal and multiple nominal outcomes can be calculated for more than two raters. J Clin Epidemiol 2018;96:47–53
28. Abràmoff MD, Reinhardt JM, Russell SR, et al. Automated early detection of diabetic retinopathy. Ophthalmology 2010;117:1147–1154
29. Cohen JF, Korevaar DA, Altman DG, et al. STARD 2015 guidelines for reporting diagnostic accuracy studies: explanation and elaboration. BMJ Open 2016;6:e012799
30. R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria, R Foundation for Statistical Computing, 2013
31. Arbel Y, Qiu F, Bennell MC, et al. Association between publication of appropriate use criteria and the temporal trends in diagnostic angiography in stable coronary artery disease: a population-based study. Am Heart J 2016;175:153–159
32. Photocoagulation treatment of proliferative diabetic retinopathy: the second report of diabetic retinopathy study findings. Ophthalmology 1978;85:82–106
33. Early Treatment Diabetic Retinopathy Study Research Group. Fundus photographic risk factors for progression of diabetic retinopathy. ETDRS report number 12. Ophthalmology 1991;98(Suppl.):823–833
34. American Diabetes Association. Executive summary: standards of medical care in diabetes–2012. Diabetes Care 2012;35(Suppl. 1):S4–S10
35. Hutchinson A, McIntosh A, Peters J, et al. Effectiveness of screening and monitoring tests for diabetic retinopathy: a systematic review. Diabet Med 2000;17:495–506
36. Owens DR, Gibbins RL, Lewis PA, Wall S, Allen JC, Morton R. Screening for diabetic retinopathy by general practitioners: ophthalmoscopy or retinal photography as 35 mm colour transparencies? Diabet Med 1998;15:170–175
37. Wang YT, Tadarati M, Wolfson Y, Bressler SB, Bressler NM. Comparison of prevalence of diabetic macular edema based on monocular fundus photography vs optical coherence tomography. JAMA Ophthalmol 2016;134:222–228
38. Rudnisky CJ, Tennant MT, de Leon AR, Hinz BJ, Greve MD. Benefits of stereopsis when identifying clinically significant macular edema via teleophthalmology. Can J Ophthalmol 2006;41:727–732
39. Rudnisky CJ, Tennant MT, Weis E, Ting A, Hinz BJ, Greve MD. Web-based grading of compressed stereoscopic digital photography versus standard slide film photography for the diagnosis of diabetic retinopathy. Ophthalmology 2007;114:1748–1754
40. Abràmoff MD, Lavin PT, Birch M, Shah N, Folk JC. Pivotal trial of an autonomous AI-based diagnostic system for detection of diabetic retinopathy in primary care offices. NPJ Digit Med 2018;1:39