Posted on May 1, 2018 in News

Assessing Cardiovascular Risk through Retinal Fundus Imaging

February 2018 saw the release of the exciting research paper ‘Prediction of cardiovascular risk factors from retinal fundus photographs via deep learning’ in Nature Biomedical Engineering. The Google team developed a new artificial intelligence technology, using deep convolutional neural networks, to assess both individual cardiovascular risk factors and the risk of a cardiac event from retinal fundus imaging.

The algorithm assesses individual risk factors (including smoking status and blood pressure) by generating a heat map that highlights the anatomical regions most relevant to each cardiovascular risk factor. It is particularly impressive that many of the risk factors the algorithm predicted, including age, gender, smoking status and systolic blood pressure, were previously not believed to be detectable in retinal images. Alongside assessing individual risk factors, the team also developed a model to predict the onset of a major adverse cardiovascular event within 5 years (1).
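The heat-map idea can be illustrated with a simple occlusion-sensitivity sketch: hide patches of an image one at a time and measure how much a model's score drops. This is not the paper's actual attention method, and `toy_predict`, the image, and all parameters here are invented purely for illustration.

```python
import numpy as np

def occlusion_heatmap(image, predict, patch=8):
    """Slide an occluding patch over the image and record how much the
    model's score drops: regions whose occlusion hurts the score most
    are the regions the model relies on."""
    h, w = image.shape
    base = predict(image)
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = image.mean()
            heat[i // patch, j // patch] = base - predict(occluded)
    return heat

# Toy stand-in for a trained CNN: the "risk score" is simply the mean
# brightness of the top-left quadrant of the image.
def toy_predict(img):
    return img[:16, :16].mean()

img = np.zeros((32, 32))
img[:16, :16] = 1.0  # bright top-left quadrant
heat = occlusion_heatmap(img, toy_predict)
print(heat.argmax())  # flat index of the most influential patch
```

In the real system the trained network replaces `toy_predict`, and the resulting map shows which retinal structures drive each risk-factor prediction.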

Current means of cardiovascular disease risk calculation include the Framingham CVD Risk Prediction Score, Pooled Cohort Equations, and Systematic Coronary Risk Evaluation. There are continued efforts to improve these risk calculators, as some have known limitations. For example, the Framingham CVD Risk Prediction Score, the most commonly used risk estimation system worldwide, has been found to overestimate risk for women (2) and has been less effective in predicting risk for elderly people (3).

This development has exciting implications for improving the identification of individuals at risk of cardiovascular disease. Cardiovascular disease poses a significant global burden as the leading cause of death worldwide. In 2015, cardiovascular disease accounted for 31% of all global deaths, three quarters of which occurred in low- and middle-income countries (4).

Through early identification, the algorithm has the potential to improve outcomes not only for patients at risk of developing cardiovascular disease, but for other conditions as well: the individual risk factors it identifies (including blood pressure, age, smoking status, gender and race) could potentially be used to calculate risk for diseases such as chronic kidney disease and diabetes.

Sources
1. Prediction of cardiovascular risk factors from retinal fundus photographs via deep learning. Nature Biomedical Engineering. https://www.nature.com/articles/s41551-018-0195-0

2. Validation of the Framingham general cardiovascular risk score in a multiethnic Asian population: retrospective cohort study. BMJ Open. http://bmjopen.bmj.com/content/5/5/e007324

3. Value and Limitations of Existing Scores for the Assessment of Cardiovascular Risk: A Review for Clinicians. Journal of the American College of Cardiology. https://www.sciencedirect.com/science/article/pii/S0735109709025029
4. WHO Cardiovascular Diseases Fact Sheet May 2017 http://www.who.int/mediacentre/factsheets/fs317/en/

Posted on Oct 28, 2016 in News

VisionAtHome Wins $750K Google Prize To Bring Eye Testing To Global Communities

A team from Melbourne’s Centre for Eye Research Australia (CERA) has won this year’s Google Impact Challenge.

Medical doctor and PhD candidate Dr William Yan successfully pitched the team’s project, which uses a software algorithm to perform accurate, evidence-based visual acuity testing via webcam and Internet connection.

VisionAtHome’s aim is to help rural, remote and mobility-impaired users access and perform eye testing at home, particularly in areas with limited or no access to ophthalmologists. As Dr Yan explains, “94% of blindness or vision loss in Indigenous Australians is preventable or treatable. Less than 1% of eye specialists work in remote Australia, but almost all these areas have access to the Internet. Time is not on our side to bring changes in infrastructure to remote Australia, given its vastness, so telemedicine is a means of bridging the gap sooner.”

Dr William Yan, Project Lead of VisionAtHome. Photo: Centre for Eye Research Australia

CERA Principal Investigator and Professor of Ophthalmic Epidemiology at the University of Melbourne, Professor Mingguang He, was delighted with the outcome, describing VisionAtHome as a “simple hand-held solution for those who live far away from eye specialists. (It) has the potential to help millions of people not only in Australia but worldwide.” Beyond Australian communities, this can expand to other areas of need globally, including the elderly, children, the physically disabled, and resource-poor settings in developing countries. In future, VisionAtHome aims to include Ishihara and visual field testing.

The Google Impact Challenge helps non-profits that use technology to solve social problems. The prize from Google will help CERA’s team translate their software to smartphone and mobile device applications, and undertake further research including clinical trials. For more information please visit cera.org.au

By: Louise Teo

Posted on Jul 13, 2016 in News

App Competition Open: Reimagining Diabetes Management via IBM Watson Health and the American Diabetes Association

The American Diabetes Association has announced a new competition for app developers proposing cognitive computing solutions for diabetes. Clinical and research data from the ADA’s vast repository will be made available via IBM Watson Health.

Announced at the ADA’s 76th Scientific Sessions in June, the competition gives app developers the chance to reimagine how diabetes is managed through Watson’s data analysis. Watson will enable healthcare providers, researchers, institutions and patients all to benefit from more streamlined and efficient data analysis. Data is deidentified and shared via Watson’s cloud system, enabling secure and timely access for app developers across all facets of healthcare to explore app-based solutions for diabetes prevention, treatment and management.

“Patients, caregivers and healthcare providers need access to cognitive tools that can help them translate that big data into action, and Watson can offer access to timely, personalized insights,” said Kyu Rhee, MD, MPP, chief health officer of IBM Watson Health. With 29 million Americans living with diabetes according to the Centers for Disease Control and Prevention, and 415 million adults affected worldwide, this partnership offers opportunities to help accelerate the fight against this chronic disease.

American Diabetes Association, IBM Watson Health

ADA SVP of medical innovation Jane Chiang, MD and IBM Watson Health chief health officer Kyu Rhee, MD at the ADA’s 76th Scientific Sessions. (Photo Credit: A.J. Sisco/Feature Photo for IBM via IBM’s website)

For competition details, head to http://watsonhealth.ibm.com/challengediabetes.

Source: https://www-03.ibm.com/press/us/en/pressrelease/49903.wss

By: Louise Teo

Posted on Jun 19, 2016 in News

Australian cancer patient app CancerAid set to launch in Asia

Australian startup CancerAid will launch in Asia with the Hong Kong Integrated Oncology Centre.

The biggest problem cancer patients and their families face is digesting the bewildering amount of information they are given throughout their journey.

CancerAid app. Photo courtesy of Dr Nikhil Pooviah, Founder of CancerAid

CancerAid aims to solve this problem by providing a secure communication and information portal for the millions of people affected by cancer worldwide.

CancerAid was founded in Sydney in 2014 by radiation oncology doctor Nikhil Pooviah. Its mobile app features, such as the Journey Organiser and Treatment Plan, help patients keep track of appointments and record information they can then share with their loved ones and treating team. The app can be personalised to each patient’s individual cancer subtype and treatment program. Patients will also be able to access help from allied health members, and to communicate with cancer specialists registered with the program.

CancerAid will launch in Australia later in 2016.

canceraid.com.au

By: Dr Louise Teo

Posted on Jun 26, 2015 in News

Peek Acuity app proven by clinical study to be just as accurate as ETDRS and Snellen chart

Results from a recent study published in JAMA Ophthalmology have shown that the Peek (Portable Eye Examination Kit) Acuity app is just as accurate as the ETDRS and Snellen charts for testing visual acuity. This makes the app one of the few mobile health applications to be proven and validated in a clinical trial.

The app was tested on 233 Kenyan adults aged 55 and above. Its results were found to be just as accurate and repeatable as the Snellen chart, and comparable in accuracy to the ETDRS chart.

In developing countries where access to specialist clinics is limited, the Peek Acuity app enables healthcare and community workers to test visual acuity using an accurate and portable system in the patient’s home or in clinic. Lead author of the study and co-founder of the Peek Acuity app, Dr Andrew Bastawrous, said: “We aimed to develop and validate a smartphone-based visual acuity test which would work in challenging circumstances, such as rural Africa, but also provide reliable enough results to use in routine clinical practice in well-established healthcare systems.”

The app features a “tumbling E”, where the letter E is displayed in 1 of 4 orientations and the patient points in the direction they perceive the arms of the E to be pointing. The “tumbling E” allows eye tests to be performed on those unable to read letters of the English alphabet, and ensures acuity is resolution-based rather than recognition-based. The app also provides alternatives to finger-counting, hand-movement and light-perception tests.
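The core tumbling-E logic can be sketched as a simple adaptive loop: show an E at a given size in a random orientation, shrink it after a correct answer and enlarge it after a wrong one. This staircase is a hypothetical illustration, not Peek Acuity’s actual algorithm, and the function names and parameters are invented.

```python
import random

# The four orientations of the "tumbling E" optotype.
ORIENTATIONS = ["up", "down", "left", "right"]

def run_acuity_test(respond, n_trials=5, start_size=1.0, step=0.5):
    """Minimal tumbling-E staircase: present an E in a random
    orientation, compare the patient's response, and adjust the
    optotype size. The final size approximates the acuity threshold."""
    size = start_size
    for _ in range(n_trials):
        shown = random.choice(ORIENTATIONS)
        answer = respond(shown, size)
        if answer == shown:
            size *= step   # correct: make the optotype smaller
        else:
            size /= step   # wrong: make it larger
    return size

# A simulated patient who always identifies the orientation correctly.
perfect = lambda shown, size: shown
print(run_acuity_test(perfect))  # 1.0 * 0.5**5 = 0.03125
```

Because the response is a pointed direction rather than a named letter, this style of test works for patients who cannot read English letters, as the post describes.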

The Peek Acuity app is currently compatible with Android and iOS devices.

By: Dr Joanne Teong

Source: http://www.peekvision.org

Link to study: JAMA Ophthalmology

Posted on Apr 8, 2015 in News

Innovative retinal imaging device turns smartphones into portable ophthalmoscopes

The D-EYE Retinal Imaging System is an innovative device that converts a smartphone into a portable, easy-to-use, affordable, and effective fundus camera.

The D-EYE device is a smartphone-sized case that fits onto an Apple iOS or Android phone. The D-EYE fundoscope lens is positioned over the smartphone’s camera and LED light source, enabling the phone to capture high definition video and still images of the fundus of the eye. The D-EYE app installed on the smartphone allows the user to store and manage patient information.

The D-EYE Retinal Imaging System offers:

  • Field of view up to 20 degrees
  • Easy viewing of optic nerve head, even without dilating eye drops, for detecting glaucoma
  • Diabetic retinopathy screening and grading
  • Hypertensive retinopathy screening and grading
  • Age-related macular degeneration screening
  • Cataract diagnosis and grading
  • Visual acuity testing for adults and children
  • Ability to record multiple images or videos
  • Optional private and secure cloud-based storage system
  • No additional external power or lighting source required

The D-EYE Retinal Imaging System was invented by Dr Andrea Russo to improve accessibility to medical screenings for people in need. The convenience and portability of the device is especially valuable for examining bed-ridden patients, children and infants. According to Dr Russo, “The D-EYE retinal screening system can be used by a variety of health professionals ranging from ophthalmologists, neurologists, general practitioners, emergency physicians and pediatricians, to school nurses and others. The system offers a quick, accurate and inexpensive way to examine the human eye and identify a variety of health conditions.”

The D-EYE Retinal Imaging System is currently compatible with iPhone 5, 5S, and 6, or Samsung Galaxy S4 and S5.

Source: http://www.d-eyecare.com

By: Dr Joanne Teong
