President's Message
I am sitting on a plane, jetting back from the English countryside after a family holiday with my daughter's large extended family of in-laws, spanning 4 generations. Ages ranged from 1 to 83 years old, and professions ranged from students to teachers, information technologists, military elite, business owners and doctors. We were bound together by a common thread - our dependence on and obsession with our smart devices. I cannot recall a single gathering where an iPhone was not used to settle a question of debate (e.g., the origin of afternoon tea as an English tradition, or the reason frogs and bats are protected species in England, with imprisonment a possible penalty for an offense - I will explain later). We used our "devices" to hire cabs and find directions, to review restaurant menus, and to find and purchase entertainment. We even used them to navigate through and reserve rides at LegoLand. We used them to review and retell the family history, and to record family history in the making. We spent a lot of time complaining about the amount of time the children spent on their "devices," and shared ideas and methods for limiting computer time and providing alternative childhood activities. The more "senior" of our gathering prided ourselves on doing things the old way - learning through study, experience and memory - and criticized the younger generation's dependence on Google. We shared the fear that critical thinking and common sense are being lost, and that the younger generations solve no problems without their iPhones. We even talked about the fear of residents and younger doctors standing at the patient's bedside, unable to make diagnoses and answer questions without a "device" in hand. On parting, we pledged to stay in touch by FaceTime and Skype.
I am very aware of the generational gap in acceptance of and respect for the impact of information technology (IT) and artificial intelligence (AI) on our lives and professions. Whether or not we acknowledge it, there is literally nothing we do from day to day that has not benefitted from IT. In medicine, it is allowing us to move forward at lightning pace. It is solving centuries-old medical questions about the etiology and treatment of diseases. IT is allowing the practice of medicine to reach every corner of the earth through virtual clinics. AI is translating and transforming massive informational input into health predictions and diagnoses. So what are the next medical breakthroughs that will become part of our everyday lives over the next few years?
Telemedicine and the virtual clinic are here - just waiting for the business of medicine to catch up with the technology. Telemedicine still needs organizational structure and insurance reimbursement to become mainstream. Although nothing can ever replace the "touch" of medicine, the benefits of telemedicine in improving patient care are obvious - reduction in the cost of time and travel, immediacy of access, reduction in the exposure to and spread of infectious diseases, and the evaluation and treatment of the infirm and the bedridden, to name a few. In my pediatric GI practice, I can't count the number of times a rash has been evaluated by (HIPAA-compliant) photo sharing, or a caretaker has been guided through the change of a gastrostomy tube by FaceTime.
Remote patient monitoring (RPM) is evolving, ranging from chest-worn cardiac devices that detect and report arrhythmias and congestive heart failure, to bed pressure sensors that alert caregivers to reposition patients and prevent pressure sores, to smart bandages that detect and report early infection, to pill packs that report regimen compliance.
Over the next couple of years, the data collected for health, sleep and activity metrics by internet apps and devices, such as Fitbit™, Apple CareKit™, and HealthKit™, will become uploadable into electronic health records (EHRs). It is predicted that 245 million wearable devices will be sold in 2019! The Office of the National Coordinator for Health Information Technology has mandated that the data be uploadable to EHRs. The American Hospital Association, the AMA, and the Healthcare Information and Management Systems Society are currently vetting and evaluating medical apps for safety and effectiveness. "Software vendors will need to develop apps to bring digestible, accurate, timely, and relevant vital-signs readings to the clinician's attention."*
In the next 2 years, artificial intelligence (AI) is expected to become commonplace and vital in analyzing the exponentially increasing data from EHRs and apps, with consequent improvements in care decisions and lives saved. AI refers to computers that can process language, develop vision, independently solve problems, and learn - actions formerly ascribed only to us humans. As an example, an artificial pancreas that continuously monitors glucose and releases insulin automatically, replicating a healthy pancreas and freeing diabetic patients to live more normal lives, was approved by the FDA in 2016. Google has recently patented a digital contact lens that can measure blood glucose levels from tears, to be coordinated with an insulin release device. AI is also revolutionizing drug discovery: the company Atomwise uses supercomputers that root out therapies from a database of molecular structures.
Last year, Atomwise launched a virtual search for safe, existing medicines in order to redesign them to treat Ebola virus. They found 2 drugs predicted by the company’s AI technology to significantly reduce Ebola infectivity. This analysis, which typically would have taken months or years, was completed in less than one day!**
Predictive analytics overlaps with AI in that it analyzes trends and then makes predictions about the future - for example, which patients with pulmonary or heart disease will be readmitted, which patients will or will not respond to a particular medication, or who is expected to require ICU admission within 36 hours. Predictive analytics provides physician-level insights without the physician needing to be present at the bedside, helping guide the clinician's medical decision-making, much as more recently adopted technologies, such as CT and EHRs, do. While these technologies will never replace clinicians, they will help deliver a high level of patient care.
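For the technically curious, most predictive-analytics tools boil a patient's data down to a single risk score. The toy sketch below illustrates the idea in Python; the feature names and weights are entirely made up for illustration - a real system learns its weights from thousands of historical patient records rather than having them set by hand.

```python
import math

# Hypothetical, hand-picked coefficients - illustration only.
# A real model learns these from historical patient outcomes.
WEIGHTS = {"age": 0.03, "prior_admissions": 0.55, "ejection_fraction": -0.04}
BIAS = -1.2

def readmission_risk(patient):
    """Return a 0-to-1 risk-style score that a heart-failure patient
    will be readmitted within 30 days (illustrative only)."""
    score = BIAS + sum(WEIGHTS[k] * patient[k] for k in WEIGHTS)
    # Logistic function squashes the raw score into the 0-1 range.
    return 1 / (1 + math.exp(-score))

low = readmission_risk({"age": 45, "prior_admissions": 0, "ejection_fraction": 60})
high = readmission_risk({"age": 80, "prior_admissions": 3, "ejection_fraction": 25})
print(round(low, 2), round(high, 2))  # the higher-risk profile scores higher
```

The clinician then sees only the output - a flag or a score - which is what allows "physician-level insight" to be delivered without a physician at the bedside.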
Developing technologies that are still several years away from mainstream adoption by the healthcare industry are “blockchain,” “augmented reality,” and “virtual reality.” Blockchain is touted to be “as revolutionary as the internet itself.”
So, what is blockchain? Blockchain was developed in 2008 to enable the exchange of the digital cryptocurrency known as Bitcoin. The secure data structure of the technology was created to enable all kinds of transactions to take place without a trusted authority - such as a bank or insurance company - serving as a middleman, allowing 2 people to trust each other without ever meeting. The general structure is that blockchain transactions are logged publicly and in chronological order. The database reflects an ever-expanding list of ordered "blocks," each time-stamped and connected to the block that came before it, thus creating a blockchain. What makes the blockchain secure is that each block cannot be changed, deleted or otherwise modified - it is an indelible record that a given transaction occurred. Blockchain's potential for data security, and its open and decentralized structure, lend themselves well to health records management and proof of identity.*** Other possible applications in medicine include drug discovery, insurance claim simplification, and collection of data on patient-reported outcomes. It is predicted that blockchain technology will lead to global interoperability and make current EHRs obsolete. Infrastructure for this technology in the healthcare arena is still in its infancy and several years away from maturity.
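The chained-block structure described above can be sketched in a few lines of code. The Python sketch below is a minimal, hypothetical illustration (the field names and "records" are invented, not any real ledger's format): each block carries a cryptographic fingerprint, or hash, of the block before it, so altering any earlier block breaks every link after it.

```python
import hashlib
import json
import time

def make_block(data, previous_hash):
    """Create a time-stamped block linked to the block before it."""
    block = {"timestamp": time.time(), "data": data, "previous_hash": previous_hash}
    # The block's own hash covers its contents AND its predecessor's hash,
    # so tampering with any earlier block invalidates every block after it.
    body = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(body).hexdigest()
    return block

def chain_is_valid(chain):
    """Verify each block still links to an unmodified predecessor."""
    for prev, curr in zip(chain, chain[1:]):
        body = json.dumps({k: prev[k] for k in ("timestamp", "data", "previous_hash")},
                          sort_keys=True).encode()
        if prev["hash"] != hashlib.sha256(body).hexdigest():
            return False  # an earlier block was altered
        if curr["previous_hash"] != prev["hash"]:
            return False  # the link between blocks is broken
    return True

# Build a tiny chain of (hypothetical) transaction records.
chain = [make_block("genesis", previous_hash="0")]
chain.append(make_block("record A", previous_hash=chain[-1]["hash"]))
chain.append(make_block("record B", previous_hash=chain[-1]["hash"]))

print(chain_is_valid(chain))           # the untampered chain verifies
chain[1]["data"] = "record A (edited)" # tamper with an earlier block...
print(chain_is_valid(chain))           # ...and verification now fails
```

Real blockchains layer distributed consensus across many computers on top of this structure; the hash chain alone is what makes past records tamper-evident.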
Augmented reality (AR) is the same technology that made Pokémon Go so popular, placing virtual figures around the world for "capture" by players of that app. AR superimposes computer-generated images, sound and other information on a user's view of the real world, thereby providing a composite view. AR headsets, goggles, and now contact lenses allow the user to project digital information over the real-life image they see. Medical applications already in use include education through the layering of text over anatomic structures, enabling medical students to peel away skin and muscle and observe the placement and function of bones and internal organs. Future uses of AR may include projection of veins over an arm for ease of phlebotomy, projection of an MRI over a patient during surgery, vital signs pushed to a head-worn display during a cardiac arrest, and facial recognition for patient identification in a crowded waiting room, with immediate projection of the patient's medical history.
Virtual reality (VR) creates a computer-generated environment that immerses the user in a simulated world. Its potential application to the medical field is open-ended! The technology is already being used in medicine to train, diagnose and treat. Current uses include exposure therapy for patients with phobias - such as claustrophobia, fear of heights, and fear of flying - by simulating the experience of standing in a crowd, sitting on a cliff, or looking out a jet window. Post-traumatic stress disorder is being treated by placing veterans in virtual reality simulations of warfare environments, helping teach them how to deal with situations that might otherwise trigger destructive behavior toward others or themselves.
Diversional therapy is being used for burn patients during dressing changes by "putting" them into a virtual world of snow and ice; one program has patients throwing snowballs at penguins. Surgical training is moving away from cadavers, turning to simulators that provide haptic feedback to trainees. Professors at the University of Texas at Dallas have created a program that helps children with autism work on social skills by putting them into job interviews or blind dates with avatars, helping them practice reading social cues and expressing socially acceptable behavior. Simulated experiences can help disabled patients experience climbing a mountain in their wheelchair, or allow a child with cancer to "swim" in an animated fish tank.
Limits of technology have been enumerated in the recent Medscape article “What Technology Changes Will Affect Your Practice Soon?” “Telemedicine may increase access to clinicians. But what happens when the suicidal patient is not physically in your office for a proper referral to psychiatry? A remote monitoring device may experience battery failure, and changes in patient status may be missed. Older versions of insulin pumps or cardiac pacemakers can be hacked and their settings changed, with possibly lethal results. Software flaws may allow personal data to leak to criminals. Poor algorithms or rushed products may fail to perform. Creating a new technology is the easy part compared with the Byzantine process by which it is discovered, discussed, chosen, bought, presented, taught, piloted, troubleshot, rolled out, stocked, replaced, and scaled up.”*
It is reassuring that as physicians, we will still be needed at the bedside for the immediate future - but definitely with our smart "devices" in hand! Which leads me to the Google searches that started my pondering of this topic in the first place: 1. Afternoon tea as an English tradition originated in 1840 with Anna, the seventh Duchess of Bedford, who became hungry around 4 PM every day. The evening meal in her house was served fashionably late, at 8 PM, so she began asking for a tray of tea, bread and butter, and cakes to be brought to her room in the late afternoons, and then began inviting friends to join her, thus establishing the fashionable social event. 2. Frogs and bats are protected species in England, with imprisonment a possible penalty for an offense, because they are insectivores that protect crops and forests against infestations, reducing - and in many regions eliminating - the need for chemical insecticides. Like us, I guess, they have not yet been put out of a job by technology!
* “What Technology Changes Will Affect Your Practice Soon?” James M. Lebret, MD, Medscape 2/1/2017
**TMF+ “The Most Exciting Technologies of 2017”